Latest revision as of 22:26, 29 January 2016
UNRAVELLING THE INFRA-ORDINARY
INTRODUCTION
DESCRIPTION
Design/make/craft one or more objects, spaces (or both) that address changes in physical and/or social behaviour in public and private space due to digital devices. The final design must be based on findings from your initial research and should relate to a clearly articulated perspective. Examples of possible perspectives are: critical, speculative, practical, visionary or other.
KEYWORDS AND FACTS
- WI-FI
- Overflowing Information
- Exhibition Through Phone
- Out of Battery Can Change Your Plans
- Biohacking
OBSERVATION
LIBRARY
MARKTHAL
FINDINGS
At the Markthal we noticed that:
- about half of the people there would eventually take out their phone or camera to take a picture
- people walk with their camera app open, looking for photo-opportunities
- people don't take time to look at anything except through their camera, even when they're with other people
- people don't notice other people around them when on their phone
- which causes them to move very slowly and bump into other people
- people follow each other to take the same photos
- people take photos of everything, even boring/normal things
- once the picture is taken, the moment is gone
Conclusion:
- even when people are physically present in an area with interactive beings and social elements, they're not completely mentally focused on their surroundings (even though they're there as visitors).
- they're more focused on their reality through their screen and on what they can share online.
- they're always busy with their online profile and not so much with their physical self.
- people don't live in the moment or take time to actually look at where they are
At the library we noticed that:
- almost all people have either a phone or laptop at their side at all times
- most people do not have books with them
- people are still busy with Snapchat, Instagram, Facebook, etc.
Conclusions:
- people go to the library to have a quiet space to focus on their work, but cannot be detached from their digital devices.
- even when they're looking for a place to be alone, they still need to be part of what's happening.
PERSONAL EXPERIENCES
One experience that we had outside of our visit to the Markthal is when people 'interact' with you online but ignore you when you physically pass them. For example, when they like your pictures on Facebook but then don't say anything to you when you're in the same space, or even pretend not to see you.
INTENTIONS
We want to make a critical piece about our findings.
We specifically 'like' the change of behaviour when people switch between online and physical interaction with other people.
We want to show that our phones don't add anything to our personal experience, but are only there for the people we want to connect to online, and that they actually take away from our own physical experience. There's a separation between the physical and the mental: our bodies are here (still interacting with other physical beings) but our minds are in our devices.
This ability to create a different world with our devices isn't necessarily a bad thing, but to us, the way it's used is: we sacrifice our physical lives for what we want to be online.
In which world am I now?
CONCEPT
We are making an interactive installation that questions the complexity of our relationship with digital interfaces and what this relationship means in our physical and mental presence with respect to the world around us.
VISUAL REFERENCES
TESTS
Hologram test: https://www.youtube.com/watch?v=tMAu4tM1t5U
We want to make an installation that shows the switch between the real world and the digital world. We tried to imagine how to transcribe what happens when, surrounded by a crowd, you take out your phone/device and switch to a different world.
It then appears that your physical presence is still there, but your mind has disappeared into the digital sea. To convey our intentions well and to capture this feeling of someone switching from the real world to the digital one, we decided to blend different techniques, such as video mapping and holograms, which relate directly to the paradox of real virtuality while staying connected to the physical world.
At first, we wanted to make a device with different animated iPhone applications on it and a video-mapped crowd as a background. This first research pushed us to think about building a kind of square prison to trap your own hologram inside. We finally decided to simplify the device into a kind of "monolithic device" based on the proportions of the iPhone. The device is the reflection of a smartphone, and during the installation it will be presented like a smartphone on a store display.
The idea of the installation is to put this construction in a very dark space and project a video of a moving crowd on it. When spectators come into the room, they just see this strange, powerful construction with the video playing on it, and they will be tempted to approach it.
At a certain point, a kinect attached to the device will switch the video to a mapping animation. At the same time, a second kinect, connected to a computer that is in turn connected to an iPad, will show a realtime 3D model of the spectator (thanks to the kinect). The iPad projects onto a transparent plexiglass sheet, creating a hologram-like version of the spectator, trapped in the device, which follows every movement the spectator makes.
For the construction material, we decided to use white plexiglass because of its very clean white properties and its interesting reflections, which went well with our reflective hologram idea and the look of an iPhone. After making the first digital sketches and determining the size of the device, we cut it out with the laser cutter. We also cut out a square opening (like a screen) to make space for the hologram.
Material list:
- 2 kinects
- 1 iPad
- 2 computers
- 1 beamer
BUILDING
To hide all our devices (kinect, computer, iPad...) and keep it perfectly dark, we made a black box out of cardboard that we inserted into the "smartphone monolith".
This allowed us to stabilize the iPad and the transparent plexiglass for the hologram projection. We also realized that the white plexiglass itself was not very effective, because it was too flat. So we added some edges to give it more volume and to make the videomapping more interesting and three-dimensional.
Plexiglass is not an easy material to work with. We had to fold each side very carefully against a piece of wood with a heat gun and bond it with chloroform glue.
Finished construction video: https://www.youtube.com/watch?v=7fj11qeDNLs
INSTALLATION
Final video in HD or NORMAL QUALITY
STARTING OVER/ Q10
-How to make the installation more understandable and less ‘a playful experience'
-Improve technical elements (mapping software, kinect/processing, more solid device)
-Improve conceptual elements to give the right feeling to the viewer
so that, in the end, the installation will:
-Lure people in (seduce by interesting image+sound)
-Turn into an uncomfortable trap (distorting the original image+sound)
CONCEPTUAL IMPROVEMENTS
We wanted to make the moment the videos switch more subtle, so we used Processing to add a fade between them. The colored background in our prototype was too distracting, so we also got rid of that.
Another problem was that people didn't know where to go or how fast to walk, so we came up with the idea of including the ground in our projection to form a path with an animation. We decided to add a suction effect (like an infinity mirror) as an animation.
In our first version we didn't have any sound, so we used Processing to combine our image with sound, applying the same fade to the sound. We wanted the first sound to be just a normal crowd. The second sound is a heavily distorted version of the same recording, almost like wind, which then builds up to be very loud with a lot of bass, to make it very uncomfortable. Alternatively, the sound just slowly fades to nothing.
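As a sketch of the fade described above: the installation itself used Processing, but the underlying logic can be shown in a few lines of Python (the names and fade speed here are our own assumptions, not the original code). A single fade value steps toward a target each frame and drives both the video opacity and the sound volume:

```python
# Python sketch of the crossfade logic (assumed names and fade speed;
# the installation's real code was written in Processing).

def step_fade(fade, target, speed=0.25):
    """Move the fade value one small step toward the target (0.0-1.0)."""
    if fade < target:
        return min(fade + speed, target)
    return max(fade - speed, target)

def mix(a, b, fade):
    """Linear crossfade between two levels (opacity or volume)."""
    return a * (1.0 - fade) + b * fade

# Fading from the crowd (fade = 0) to the distorted animation (fade = 1):
fade = 0.0
for _ in range(4):
    fade = step_fade(fade, 1.0)
crowd_opacity = mix(255, 0, fade)      # crowd video fades out
animation_opacity = mix(0, 255, fade)  # animation fades in
```

Because one value drives everything, image and sound always fade in sync, which is what makes the switch feel subtle instead of abrupt.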
TECHNICAL IMPROVEMENTS
- We started to use MadMapper
- We started to use Processing
https://www.youtube.com/watch?v=SkcX_PIB43E
- We linked Processing to MadMapper with the Syphon plugin
- We used Cinema4D to pre-render the projection
-We decided to rebuild the device into something more solid
RESULT
This should result in a more serious installation that is solidly constructed and uses professional software that is easily adaptable for a better effect.
REBUILDING
Unfortunately the prototype broke, so we were forced to rebuild it. This gave us the opportunity to make it more solid. We made a construction out of wood around a black cardboard box and painted the wood black. We made the device out of plexiglass again with a laser cutter, then glued some wood on the inside to make it more solid. The hole in the plexiglass fits exactly on top of the cardboard box, so you can just put it on there. Everything else we made the same way as the first time. Even though we made it more solid, it was still very fragile: it cracked a little, but we were able to glue it again.
KINECT
For our prototype, we used a small application called cocoaKinect, which simply renders a realtime pointcloud like the images seen above.
Unfortunately we didn't have a lot of control over the settings. There was always a colored background, there were lines in front of the image and we couldn't change the resolution.
So for our final installation, we decided to make our own code in processing. We used processing 2.2.1 and started with the DepthMap3D template from the SimpleOpenNI library.
We set up a parameter for the kinect so that everything behind a certain point would be completely black. Whenever you step in front of the kinect, you suddenly appear as a pointcloud, as if you came through a black wall. We also altered the code so that the person changes color depending on their distance to the kinect: the person starts out red, and the closer you get to the kinect, the redder you become.
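The two rules above — a depth cutoff and a distance-to-color mapping — can be sketched like this. This is a Python illustration, not our Processing/SimpleOpenNI code, and the cutoff distances are assumed values:

```python
# Python sketch of the depth cutoff and color rule (assumed values;
# the real sketch used Processing 2.2.1 with the SimpleOpenNI library).

DEPTH_CUTOFF_MM = 2000  # assumed: points further away stay black
MIN_DEPTH_MM = 500      # assumed: closest distance the kinect reports

def visible(depth_mm):
    """A point is drawn only if it is in front of the 'black wall'."""
    return 0 < depth_mm < DEPTH_CUTOFF_MM

def redness(depth_mm):
    """Map distance to red intensity: the closer to the kinect, the redder."""
    d = min(max(depth_mm, MIN_DEPTH_MM), DEPTH_CUTOFF_MM)
    t = (DEPTH_CUTOFF_MM - d) / (DEPTH_CUTOFF_MM - MIN_DEPTH_MM)
    return int(255 * t)  # 0 at the cutoff, 255 right in front of the kinect
```

In the installation this per-point test runs over the whole depth image each frame, which is why stepping over the threshold makes you appear all at once, as if through a wall.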
After one of the feedback moments, we came to the conclusion that it could be more interesting if the image were more disturbing than just points. A more distorted version would fit our concept better. We knew there was one line in our code that created all the points, and that this could be replaced by other shapes:
beginShape(POINTS);
We changed (POINTS) to (TRIANGLES) and made sure the triangles would be empty by adding noFill():
noFill(); beginShape(TRIANGLES);
After this our hologram looked like this:
We added a delay to the image so that the viewer would lose control over it and it wouldn't feel as playful anymore.
https://www.youtube.com/watch?v=M-GnvotnqpI
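The delay idea can be sketched with a fixed-length frame buffer (a Python illustration with an assumed buffer length, not our actual Processing code): frames enter a queue, and the displayed frame is the oldest one in it, so the hologram lags behind the viewer instead of mirroring them instantly.

```python
from collections import deque

# Python sketch of the frame delay (buffer length is an assumed value;
# the installation's version was implemented in Processing).

class FrameDelay:
    def __init__(self, delay_frames=30):  # assumed: about 1 second at 30 fps
        self.buffer = deque(maxlen=delay_frames)

    def push(self, frame):
        """Store the newest frame; return the delayed frame to display."""
        self.buffer.append(frame)
        return self.buffer[0]  # oldest frame still in the buffer
```

Once the buffer is full, the displayed image trails the viewer by delay_frames - 1 frames, which is exactly what breaks the playful mirror effect.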
You can find our final processing code for the hologram here.
For our second program, we used pretty much the same code as for our prototype. We use the kinect to see where the person is in the room.
If there is no one, or the person is behind the first line, the video of the crowd is playing. Once the person crosses the line, the video switches to the animation.
The only thing that changed is that we added a fade to the video and sound, using opacity to make it fade.
Later we also made it work so that the animation would restart whenever you cross the line.
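The switching logic described above — play the crowd video until someone crosses the line, then switch to the animation and restart it on each new crossing — can be sketched like this (in Python rather than Processing; the line distance and names are assumptions):

```python
# Python sketch of the line-crossing switch (assumed threshold and names;
# the real version was part of our Processing sketch).

LINE_MM = 1500  # assumed distance of the invisible line from the kinect

class ProjectionSwitch:
    def __init__(self):
        self.showing_animation = False
        self.restarts = 0  # how many times the animation has (re)started

    def update(self, person_depth_mm):
        """person_depth_mm is None when nobody is in view."""
        crossed = person_depth_mm is not None and person_depth_mm < LINE_MM
        if crossed and not self.showing_animation:
            self.restarts += 1  # restart the animation on each new crossing
        self.showing_animation = crossed
        return "animation" if crossed else "crowd"
```

Tracking the previous state is what makes the restart happen only on the crossing itself, not on every frame the person stands inside the line.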
You can find our final processing code for interactive projection here
MAPPING
"Projection mapping, also known as video mapping and spatial augmented reality, is a projection technology used to turn objects, often irregularly shaped, into a display surface for video projection. These objects may be complex industrial landscapes, such as buildings, small indoor objects or theatrical stages. By using specialized software, a two- or three-dimensional object is spatially mapped on the virtual program which mimics the real environment it is to be projected on. The software can interact with a projector to fit any desired image onto the surface of that object.[1] This technique is used by artists and advertisers alike who can add extra dimensions, optical illusions, and notions of movement onto previously static objects. The video is commonly combined with, or triggered by, audio to create an audio-visual narrative. "
TOOLS FOR PROJECTION
HERE you will find a very good overview of tools that you can use to do projection mapping. Some of the programs are not available on Mac/Windows, but all the details are in the list at the end of the article.
For the installation, we are using MAD MAPPER as an external mapping tool. It is one of the best, but it's not free and only available on Mac. If you want to do a first quick try, VPT is a FREE and very good program with a very complete interface. There are a lot of tutorials about it, and THOMAS at the interactive station is a bit familiar with it.
Basically, you can find a lot of free tutorials on YouTube.
ANIMATIONS
For the animations, we recommend taking a look at CINEMA 4D, which is also very useful for recreating the environment that you're going to map without constantly needing a beamer and, most of all, for finding the best point of view for your beamer.
Besides CINEMA 4D, we also used Adobe After Effects, which you can couple with CINEMA 4D's 3D tools to create your animations.
Once the animations are made, you can pull them into your projection tool.
SOUND DEVELOPMENT
Using several disturbing sounds with a high bass level, we hope to mix something based on several existing music pieces:
References:
Emptyset https://www.youtube.com/watch?v=iu8tspoGuWw
Paul Jebanasam https://www.youtube.com/watch?v=19w6NNTpng4
Eric Holm https://www.youtube.com/watch?v=CkgSIaEmCWQ
Then we used sounds from freesound.org and transformed them into our own soundscape.
SKETCHING
TESTING
Because we had already made a prototype, we knew the things that took the most time. This allowed us to improve those things and test it, so it would take less time to set up. It also allowed us to make more tests to get the right feeling for the installation.
We had to find the right angle for the beamer.
We tried different animations.
SETTING UP
IN CONCLUSION
In the end, the question to be asked is whether there can be a balance between the two worlds depicted in our installation. You can interpret the digital world as a bad thing that has to be erased from our lives, but the real answer has to be much more nuanced, because there's obviously a good side to it: it can have a real purpose. The threat, to us, is that most people don't see the bad side of it anymore. Parents used to be the ones who would tell the kids not to use their devices during dinner, but nowadays the parents have their own devices to be distracted by. So the real question, and the purpose of this installation, is about our relationship to the habits of our new technologies, and whether there can be a healthy balance between our real life and our digital world.