Imagine a Twitter mood light connected to a mechanical arm that draws. I want to make this amalgamation of Arduino projects, which detects and 'senses' the world's mood and represents it via a drawing. After much deliberation and research, I feel that the Twitter API has changed (even though I have found a few projects where this was possible) and it is no longer possible to tap into its data.
Another possible iteration is a device that captures emotions at the press of a button and, at the end of the day, draws an image based on the most prominent emotion of that day. Example: a few buttons are installed at the PoD, each labelled with an emotion (Happiness, Anger, Fear and more). People press them throughout the day, and at the end of the day the device tallies the most dominant 'mood' and draws a painting relating to that emotion (I think I will have to manually put in the image I want it to draw, though).
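To make the end-of-day tallying concrete, here is a minimal sketch of the logic in Python (the Arduino version would be the same idea in C++). The emotion names and function names here are my own placeholders, not part of any existing library:

```python
from collections import Counter

# Hypothetical set of emotions, one per physical button
EMOTIONS = ["happiness", "anger", "fear", "sadness"]

def record_press(log, emotion):
    """Append one button press to the day's log."""
    if emotion not in EMOTIONS:
        raise ValueError(f"unknown emotion: {emotion}")
    log.append(emotion)

def dominant_emotion(log):
    """Return the most-pressed emotion of the day, or None if no presses."""
    if not log:
        return None
    return Counter(log).most_common(1)[0][0]

# Example day: 'happiness' is pressed more than anything else
log = []
for e in ["happiness", "anger", "happiness", "fear", "happiness"]:
    record_press(log, e)
print(dominant_emotion(log))  # → happiness
```

The dominant emotion would then be used to select which pre-loaded image the arm draws.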
This could also be captured using the Vortolight method, which uses a service like IFTTT to reflect colours based on what people tweet. Essentially, instead of changing colour, I would like to 'move' the drawing arm in a way that relates to a tweet.
Example: someone tweets "I am angry today #red". This makes the Vortolight turn red; in the same manner, the robot would draw a little something based on the prompt in the tweet.
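The hashtag-to-drawing mapping could be sketched like this (again in Python as a prototype of the logic; the hashtags and drawing names are hypothetical choices of mine, and in practice IFTTT would deliver the matched tweet to the device):

```python
import re

# Hypothetical mapping from mood hashtags to drawing prompts
HASHTAG_TO_DRAWING = {
    "#red": "angry_scribble",
    "#yellow": "sun",
    "#blue": "raindrops",
}

def drawing_for_tweet(tweet):
    """Return the drawing for the first recognised mood hashtag, or None."""
    for tag in re.findall(r"#\w+", tweet.lower()):
        if tag in HASHTAG_TO_DRAWING:
            return HASHTAG_TO_DRAWING[tag]
    return None

print(drawing_for_tweet("I am angry today #red"))  # → angry_scribble
```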
I looked at a few drawing robots, and this one seemed the most viable/accurate in terms of using a MECHANIX kit and servos (besides regular electronics), whereas the Twitter part is mostly coding and web services (plus a Wi-Fi module, which should not be expensive, although the one used in some projects is).
Looking forward to starting with this after your feedback!