The assignment is to design and "stage" a mediated play experience in a public space. This project explores how to use an interactive projection installation as an interface for play, social connectivity, experience design, and/or space augmentation. This augmentation, or invitation to participate, must be self-contained in your mediation and cannot rely on communication from you. Instead, explore audio, gestural, and visual means of expression. Observe the context and scenarios under which your prototypes work.
1) Introduce your group and project title.
2) Give the project's nutshell, or thesis: a 1-3 sentence explanation.
3) Briefly outline for your audience what you will talk about, from beginning to end. [1-2 min]
4) Show video of your work installed; explain it in 500-1000 words.
5) Show a demo of your working project and present an analysis of your work.
Use photographs, video, and experiences to answer these questions. Create a flow and order for your answers that makes for the most interesting narrative in presenting: 1. Strengths (what did the solution do very well?)
2. Weaknesses (what could be done to improve the design?)
3. The design process (What were some of the key moments/decisions during the process? What were the trouble spots? How did the collaboration play out? Who took on what roles, and how were decisions made or conflicts resolved?)
4. Include your analysis of at least 2 new media public space installations that are related to your project. Critique these projects in relation to your own: how are they similar or different? What can be learned from these projects if you were to evolve your design further? Include visual examples of the projects described, as well as information on who produced them, when, where, and why.
5. What can you conclude both from your experiments and the ones that you studied? What are open questions remaining to be answered?
6. Can your project be applied to other public spaces? Does it travel well?
7. Would your project make a good exhibition piece? Where would it be seen? If appropriate, submit your project to venues where it can be appreciated by other groups of people. (Oct 30th)
8. What are the successful/unsuccessful parts of the iterations? What can be concluded from your successes and miscalculations? Were your assumptions correct? How did your installation affect the volume of traffic? What are the narratives that emerge out of the context? [10-15 min]
6) Solicit questions from your audience. What do you want feedback on, and from whom? [1 min]
7) Allow your audience to interact with your installation again. [optional]
I would like to get this to a point where the pieces appear more slowly and the user can move them around the screen and place them where they choose. I want the user to be able to build a puzzle with them. As of now, several are drawn to the screen at once and are placed according to where the brightest pixel is. I would also like to have several users involved at the same time.
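The brightest-pixel placement described above can be sketched in plain Java (function names like `brightestIndex` are mine, not from our sketch); in Processing the equivalent loop would run over `video.pixels` using `brightness()`:

```java
public class BrightestPixel {
    // Return the index of the brightest pixel in a grayscale frame (0-255 values).
    // A Processing sketch would compare brightness(video.pixels[i]) instead.
    static int brightestIndex(int[] gray) {
        int best = 0;
        for (int i = 1; i < gray.length; i++) {
            if (gray[i] > gray[best]) best = i;
        }
        return best;
    }

    // Convert a flat pixel index back to (x, y) for placing a puzzle piece.
    static int[] toXY(int index, int width) {
        return new int[] { index % width, index / width };
    }

    public static void main(String[] args) {
        int[] frame = { 12, 40, 250, 7, 99, 180 }; // a tiny 3x2 "frame"
        int i = brightestIndex(frame);
        int[] xy = toXY(i, 3);
        System.out.println("brightest at x=" + xy[0] + " y=" + xy[1]); // x=2 y=0
    }
}
```

For multiple simultaneous users, this single-maximum approach would need to become a search for several bright regions (or proper blob detection), since one global maximum can only place pieces for one person at a time.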
To simplify and play with the ripple idea, several waves were created which react to the presence of a person via blob detection. There were some issues with fully understanding and controlling the behavior. While it is clear you are affecting the ripple, you don't know exactly how, and this lack of understanding negates the sense of control established in the space.
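For reference, the classic two-buffer ripple rule behind effects like this can be sketched in one dimension (this is the standard water-ripple stencil, not necessarily our exact code; the damping value is a made-up tuning constant):

```java
public class Ripple1D {
    // One time-step of the two-buffer water-ripple rule.
    // cur and prev are the current and previous height fields; damping < 1 fades waves.
    static double[] step(double[] cur, double[] prev, double damping) {
        double[] next = new double[cur.length];
        for (int i = 1; i < cur.length - 1; i++) {
            next[i] = (cur[i - 1] + cur[i + 1]) - prev[i];
            next[i] *= damping;
        }
        return next;
    }

    public static void main(String[] args) {
        double[] cur = new double[11];
        double[] prev = new double[11];
        cur[5] = 100; // a blob disturbs the field at its position
        for (int t = 0; t < 20; t++) {
            double[] next = step(cur, prev, 0.95);
            prev = cur;
            cur = next;
        }
        System.out.println(cur[5]); // the energy has spread outward and faded
    }
}
```

Seeing the rule this explicitly may help with the control problem: the user's blob only ever injects energy at one point, and everything after that is the wave equation, which is why the cause-and-effect link feels murky.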
Starting fresh with the firefly concept, the blob detection ripple code was used as a basis to create fireflies independent of our previous synchronization idea. These are much more abstract fireflies that are attracted to blobs, where they generate trails and 'glow'. In use, the flies gather, projected on a person's body. A person can play with them and 'catch' them on their hands. Some patience is required, as the interaction does not handle fast movement well, and some play is necessary to establish control. The lightning precursor played a part in the implementation of this prototype.
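The attraction-with-trails behavior can be sketched as an eased pursuit (a minimal sketch, not our installed code; the easing factor and trail length are made-up tuning values). The easing is also why patience matters: a fly only closes a fraction of the remaining distance per frame, so a fast hand simply leaves it behind:

```java
import java.util.ArrayList;
import java.util.List;

public class Firefly {
    double x, y;
    final List<double[]> trail = new ArrayList<>(); // recent positions, drawn as the glow

    Firefly(double x, double y) { this.x = x; this.y = y; }

    // Ease a fraction of the remaining distance toward the blob each frame.
    void update(double blobX, double blobY, double easing) {
        x += (blobX - x) * easing;
        y += (blobY - y) * easing;
        trail.add(new double[] { x, y });
        if (trail.size() > 20) trail.remove(0); // cap the trail length
    }

    public static void main(String[] args) {
        Firefly fly = new Firefly(0, 0);
        for (int frame = 0; frame < 60; frame++) {
            fly.update(100, 50, 0.1); // a blob holds still at (100, 50)
        }
        System.out.printf("fly at %.1f, %.1f%n", fly.x, fly.y); // converges near the blob
    }
}
```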
To begin, we worked on having the audio file's volume increase while the audio noise's volume decreased. This eased the transition between the two audio components, and affected the look and feel by making the audio file not always present. When a person steps into the camera's vision, the music becomes slightly perceptible, which hopefully encourages them to fully explore exposing the sound and calming the visual noise to create a temporary, tenuous sanctuary through group effort.
Code, just audio: filter_proof.pde (will need to add an audio file for all examples)
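The gain math behind that crossfade can be sketched as a pure function (an assumption about how filter_proof.pde maps presence to volume, not a transcription of it; a simple linear crossfade rather than an equal-power one):

```java
public class Crossfade {
    // Map presence (0 = empty frame, 1 = fully occupied) to the two gains:
    // the noise fades out as the audio file fades in, easing the transition.
    static double[] gains(double presence) {
        double p = Math.max(0, Math.min(1, presence)); // clamp to [0, 1]
        double fileGain = p;
        double noiseGain = 1 - p;
        return new double[] { fileGain, noiseGain };
    }

    public static void main(String[] args) {
        for (double p : new double[] { 0.0, 0.25, 1.0 }) {
            double[] g = gains(p);
            System.out.println("presence " + p + " -> file " + g[0] + ", noise " + g[1]);
        }
    }
}
```

Because the two gains always sum to one, the room is never silent: the soundscape slides between noise and music rather than cutting between them.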
Following that, images were integrated: first a static image, and then a Muybridge progression. It would be possible to put other images and/or video in the background as well. As the visual noise decreases, the opacity of the image(s)/video increases. Regarding their role: the use of images feels very literal. The feeling is that the images ought to convey a more concrete message. With the purpose of the piece being to stimulate group interaction, images of groups are possible choices, but they feel incomplete (see still example). The Muybridge series has an appealing ambiance, but this only weakly contributes to the overall meaning (see Muybridge example).
Finally, in exploring a more abstract color treatment, a low-opacity rectangle covering the noise was applied. The blue rectangle becomes more opaque as the 'amount' of presence in the field of vision increases, i.e., as more people are there. Simultaneously, the auditory noise decreases and the audio file increases. Visually, the pattern now looks like soft moving water. The water idea connects to our other ripple concept. All of this contributes to a sense of tranquility.
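One way to turn the 'amount' of presence into the rectangle's alpha is to count foreground pixels and rescale, Processing `map()`-style (a sketch under assumptions: the 0.5 presence cap and `maxAlpha` are made-up tuning values, and `rectAlpha` is my name, not the sketch's):

```java
public class PresenceOpacity {
    // Processing-style map(): rescale a value from one range to another.
    static float map(float v, float inLo, float inHi, float outLo, float outHi) {
        return outLo + (outHi - outLo) * (v - inLo) / (inHi - inLo);
    }

    // Count "present" pixels (foreground after background subtraction), then
    // turn that fraction into the blue rectangle's alpha, saturating once
    // half the frame is occupied so a crowd doesn't blow past maxAlpha.
    static float rectAlpha(int foregroundPixels, int totalPixels, float maxAlpha) {
        float presence = foregroundPixels / (float) totalPixels;
        return map(Math.min(presence, 0.5f), 0f, 0.5f, 0f, maxAlpha);
    }

    public static void main(String[] args) {
        System.out.println(rectAlpha(0, 10000, 120f));    // empty room: fully clear
        System.out.println(rectAlpha(2500, 10000, 120f)); // one person: halfway there
        System.out.println(rectAlpha(9000, 10000, 120f)); // a crowd: caps at maxAlpha
    }
}
```

The same presence value can drive the audio crossfade, so the visual calming and the emerging music stay in lockstep.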
Lightning Ring50 is an iteration toward the beginning of the storm-scape in the lightning piece. It explores the technique for getting many clouds to follow a person via blob detection. This will later be combined with lightning and new images of clouds instead of circles. The implementation is getting worked out before the final look and feel details are perfected.
Lightning is the beginning of an interactive piece that will strike at the sudden movement of a user. It is currently scaled back to the essence of the interaction; later, more lightning bolts and clouds will be added to the storm-scape. We will be using our latest blob detection code along with speed and easing: a combination of storm clouds gathering near the latest blob and any sudden movements triggering lightning. It is a subtle wall-based interactive installation that is best shared in a space with frequent passersby who are pausing for an elevator or waiting in a line. There can be a slight competition between users for the lightning to strike them.
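The two mechanics named above, clouds easing toward the latest blob and a per-frame speed threshold that fires the strike, can be sketched together (a minimal one-axis sketch; the class name, easing factor, and speed threshold are my assumptions, not the piece's actual values):

```java
public class LightningTrigger {
    double cloudX; // the cloud eases toward the latest blob
    double prevBlobX = Double.NaN; // no blob seen yet

    // Returns true when the blob moved fast enough this frame to be "struck".
    boolean update(double blobX, double easing, double speedThreshold) {
        boolean strike = false;
        if (!Double.isNaN(prevBlobX)) {
            double speed = Math.abs(blobX - prevBlobX); // pixels per frame
            strike = speed > speedThreshold;
        }
        prevBlobX = blobX;
        cloudX += (blobX - cloudX) * easing; // clouds gather near the latest blob
        return strike;
    }

    public static void main(String[] args) {
        LightningTrigger storm = new LightningTrigger();
        storm.update(100, 0.05, 30);                // first sighting: no strike
        boolean slow = storm.update(110, 0.05, 30); // gentle drift
        boolean fast = storm.update(200, 0.05, 30); // sudden lunge
        System.out.println("slow=" + slow + " fast=" + fast); // slow=false fast=true
    }
}
```

Tuning the threshold against the frame rate is the interesting part: too low and idle fidgeting gets struck, too high and users have to lunge theatrically, which may actually suit the competitive reading of the piece.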
My first prototype of our drawing code was to add 8-bit arcade game sprites to liven up the drawing and make it a bit more pertinent to Barcade. I haven't yet decided whether this adds to or detracts from the drawing process or the finished product (if there ever is one). I also added a little video capture screen in the top left to help me figure out what the camera is actually seeing.
For the second prototype I messed around with light sources. I started off with the bright LED pen, and found it was quite difficult to control the drawing; the light refracted in such a way that the camera sometimes read two points of light (I could see this in my little cheat video capture area--hooray). I tried using my phone instead, and I had much better control.
My third prototype involved a new user who had never encountered this interface. Set up in my apartment, my user tried out the rave bracelet that Yury provided, the LED pen, and his own iPhone. The LED pen was the most difficult to use, the rave bracelet being slightly more erratic. The photos compare what Roman did with his phone with what appeared on the screen. The little bits on each side of the design were added after I took the photo, just so you know. Here is a video of the rave bracelet in use, in case you want to see that sort of thing.
It was interesting watching Roman because he started off drawing very quickly, but the drawing didn't work so well. He immediately learned to slow down a bit, and his drawing came out much better. He liked the 8-bit characters, but didn't like that they drew over themselves if he didn't move around.
The next prototype includes some image capture code (the new program code is here). Basically, we can now track how the drawing evolves over time. I'd like to be able to take a closer look at each user's work, and I think this will enable that kind of analysis.
Other Stuff still to do:
1) Identify multiple "pens" (perhaps two different phones?) to see if we can get multiple users simultaneously drawing.
2) Get the sprites to fade over time so that the drawing never becomes entirely "full." This didn't work either, but I'd really like it to. Here is the code I'm using for this--sorry, no pretty pictures.
3) Play around with the sprites so that maybe they shift out of place before fading away. This didn't work out too well.
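One way item 2 could work is to keep each sprite as an object with its own alpha and re-render the whole live list every frame, rather than accumulating stamps on the canvas (a sketch of the idea, not our broken code; the fade rate and trail cap are made-up values):

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

public class FadingSprites {
    static class Sprite {
        int x, y;
        float alpha = 255; // fully opaque when stamped
        Sprite(int x, int y) { this.x = x; this.y = y; }
    }

    final List<Sprite> sprites = new ArrayList<>();

    // Each frame, fade every sprite a little and drop fully transparent ones,
    // so the drawing never becomes entirely "full".
    void update(float fadePerFrame) {
        for (Iterator<Sprite> it = sprites.iterator(); it.hasNext();) {
            Sprite s = it.next();
            s.alpha -= fadePerFrame;
            if (s.alpha <= 0) it.remove();
        }
    }

    public static void main(String[] args) {
        FadingSprites wall = new FadingSprites();
        wall.sprites.add(new Sprite(10, 10));
        for (int frame = 0; frame < 300; frame++) wall.update(1.0f);
        System.out.println("sprites left: " + wall.sprites.size()); // 0
    }
}
```

In Processing terms this means calling background() every frame and redrawing each surviving sprite with tint(255, s.alpha); if the fade attempt was drawing without clearing the background, old opaque stamps would never actually disappear, which might be why it didn't work.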
About social connectivity: the audio will trigger responses from passersby, and those shared responses are what create the social connectivity.
Why did I choose those sounds? Because they are more uncommon and have an old-school arcade style. For example, the sound of a coin being inserted into a machine and hitting the bottom.
Used Paper Bag Low-Fi Prototype of Video Game Sound Sampler, aka Audio Vision Sketch, featuring the average Barcade consumer with a cell phone.
Low-Fi Prototype of Video Game Sound Sampler of Needed Code
Photoshop Low-Fi Prototype of Coding for Computer Audio Vision Sketch
Breaking it apart. I decided to break the code apart and just completely understand small sections. Here's what I have attempted in Processing:
1) 4 Squares: 4 squares; when your mouse rolls over one of the 4 squares, that square lights up.
2) Track the brightest pixel with the shape of a square.
3) Make 4 squares a sample player: a selection of 4 buttons to click on (upper left, upper right, bottom left, bottom right), with each button unique and generating its own sound.
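The core of step 3 is just mapping a point (the mouse, or eventually the brightest pixel) to one of four quadrants and playing that quadrant's sample. A plain-Java sketch of that mapping (the sample file names are placeholders, not our actual assets):

```java
public class FourSquares {
    // Map a point to a quadrant index:
    // 0 = upper left, 1 = upper right, 2 = lower left, 3 = lower right.
    static int quadrant(int x, int y, int width, int height) {
        int col = (x < width / 2) ? 0 : 1;
        int row = (y < height / 2) ? 0 : 1;
        return row * 2 + col;
    }

    public static void main(String[] args) {
        // Placeholder sample names; each quadrant triggers its own sound.
        String[] sounds = { "coin.wav", "jump.wav", "zap.wav", "powerup.wav" };
        int q = quadrant(500, 100, 640, 480); // a click in the upper right
        System.out.println("play " + sounds[q]); // play jump.wav
    }
}
```

Swapping the mouse coordinates for the tracked brightest pixel turns the same four-button sampler into the camera-driven version, which is exactly how the small sections recombine.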