We ran quite a few experiments. Here are some of them.
In the first, we took a sensor that would bend a digital surface forwards or backwards based on how much you twisted it in a particular direction.
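The idea can be sketched roughly like this — assuming a normalized twist reading in [-1, 1]; the actual sensor hardware and mesh code we used aren't shown in the post, so the function and its parameters here are made up for illustration:

```python
# Hypothetical sketch: map a twist-sensor reading to surface heights.
# `twist` is assumed normalized to [-1, 1]; the real sensor and mesh
# API are not part of this example.

def bend_heights(twist, n=5, strength=2.0):
    """Heights for a strip of n surface points, bent forwards
    (positive twist) or backwards (negative twist)."""
    # Points farther from the fixed edge bend more, scaled by
    # how much the sensor was twisted.
    return [twist * strength * (i / (n - 1)) ** 2 for i in range(n)]
```

So `bend_heights(0.5)` gives a gentle forward curl and `bend_heights(-1.0)` a full backward one.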
In the second, we took a web camera; based on the image, surfaces would move toward and away from the people interacting with it. The red image is not too clear with the 'sticks', but if you look closely you can see our faces outlined outside of the 'red sticks'.
The better example is one in which the human images provided a displacement, so the height of the surface would move up and down based on parts of the face (the eyes would create sinks in the terrain, and the nose would create a protruding mesh surface).
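The displacement part boils down to mapping pixel brightness to height. A rough sketch, assuming grayscale pixel values in 0–255 (the actual camera and mesh code from the session isn't shown, so this is just the mapping idea):

```python
# Hypothetical sketch of brightness-to-height displacement.
# Dark regions (eyes) sink below zero; bright regions (nose) protrude.

def displace(gray_rows, scale=0.1):
    """Turn a 2D grid of 0-255 grayscale values into surface heights."""
    mid = 127.5  # mid-gray sits at height zero
    return [[(p - mid) * scale for p in row] for row in gray_rows]
```

A dark eye pixel (say 30) maps to a negative height, a bright nose pixel (say 220) to a positive one.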
In the third, a tweet acted as a trigger: once the signal was received, it would create a sphere of a certain radius. We also did one using the K!nect to pass a virtual ball from one person to the next, or to draw imaginary lines (like a trail) from a series of points following the motion of a person's hand.
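The hand-trail part is basically a bounded list of recent positions connected into line segments. A minimal sketch — the (x, y) points here are made up; a real setup would read them from the K!nect's hand tracking:

```python
from collections import deque

# Hypothetical sketch of the hand-trail idea: keep only the last few
# tracked hand positions, so old points fall off and the trail fades.

class Trail:
    def __init__(self, max_points=4):
        self.points = deque(maxlen=max_points)  # oldest point drops off

    def add(self, x, y):
        self.points.append((x, y))

    def segments(self):
        """Pairs of consecutive points, i.e. the line segments to draw."""
        pts = list(self.points)
        return list(zip(pts, pts[1:]))
```

Feed it hand positions frame by frame and draw `segments()` each frame; the `maxlen` keeps the trail a fixed length.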
The last was making an RGB LED change colour based on trigger words on Tw!tter, which were filtered by a defined area (longitude and latitude) from which to acquire information. We took the words "Happy", "Sad" and "Mad": red was happy, green was sad, blue was mad. Mine turned green from too many "sad"s LOL. I guess a lot of people where we were were sad? It is Sunday, after all LOL.
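The word-to-colour logic is simple to sketch, assuming we already have the tweet texts from the geo-filtered stream (fetching and the LED wiring are omitted; this is just the counting idea):

```python
# Hypothetical sketch: pick the LED colour from whichever trigger word
# shows up most in a batch of tweet texts.

COLOURS = {"happy": "red", "sad": "green", "mad": "blue"}

def led_colour(tweets):
    """Count trigger words and return the colour of the most common one."""
    counts = {word: 0 for word in COLOURS}
    for text in tweets:
        for word in counts:
            if word in text.lower():
                counts[word] += 1
    winner = max(counts, key=counts.get)
    return COLOURS[winner]
```

With a batch full of "sad" tweets, `led_colour` comes back `"green"` — which is exactly what happened to mine.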
Anyways, here are the pics of some of the stuff we did.

DANG! that sounds like fun times! I want to play virtual ball! What about virtual dodge ball?? All the fun without the risk of getting pegged in the face!
that's funny that you'd mention that; I actually have to make Pong this week for one of my programming classes LOL.
Ahahah!! That's awesome!!!