User:Jerryestie

From DigitalCraft_Wiki
 
0902472@hr.nl
==Practice Q7: Making is Connecting==
 
For my second year project please look at [[User:Jerryestie/Making | Making is Connecting 2016]]
  
 
[[File:workjr7.jpg|500px|Caption]]
  
==Practice Q9 & Q10: Craft & Radiation==

For my third year project please look at [[User:Jerryestie/Practice | Craft and Radiation 2017]]
  
[[File:Gifcensor.gif|500px|Caption]]
 
  
=Minor: Human?=

[[File:Westworld-skele-fb-1.jpg]]

[[File:Bladerr.jpg]]

=Quarter 9: Workshops & Assignments=

==Inspiration and Own Work==
===Breakfast Glove===
 
One of my own projects, made together with Joëlle for Paper Strings and Electronic Things, was the Breakfast Glove. We wanted to make a freeze-frame high-five device: a song plays when you high-five each other, but only if you stand still during the song. Check the video here: [https://youtu.be/UF43N295ig0 Breakfast Glove]
 
 
 
===Dunes===
 
[https://www.studioroosegaarde.net/project/dune/info/ Studio Roosegaarde's Dune project] was a project here in Rotterdam with lights that responded to people walking past them.
 
 
 
==Antenna==
 
[[File:Hallo copy.jpg | 500px]]
 
 
 
==Radio Experiments==
 
===Part 1===
 
With the tuner I had the idea of finding 'weird' noises and translating them into more human sounds. So instead of looking for actual voices, I wanted to create them myself. I tried to do this in Audacity, using effects like Change Pitch.
 
 
 
[[File:audajerry1.png|500px]]
 
[[File:audajerry2.png|500px]]
 
 
 
[http://vocaroo.com/i/s0Mv0mqfzFku Original Sound File: First Test] & [http://vocaroo.com/i/s12rHJqiyUAR Edited Sound File: First Test]
 
 
 
In the first test I mostly played with pitch, and I tried to remove the static to see if that created a human feeling. It didn't, in my opinion; instead it just sounded flatter.
 
 
 
[http://vocaroo.com/i/s0UbY5D8OQfX Original Sound File: White Noise] & [http://vocaroo.com/i/s0YrF0LFZAmK Edited Sound File: White Noise Talking]
 
 
 
Next, I tried to find a weak signal where you could almost hear a conversation, but it would be too garbled to follow. It already begins to sound more human, yet some stretches of the 'conversation' have a glitchy sound to them that I couldn't get rid of. Still, just removing the noise and changing the pitch really helped.
 
 
 
[http://vocaroo.com/i/s0EVstmv3W5b Original: Noise] & [http://vocaroo.com/i/s1CJcNlD55gH Edited: Noise in Deep Space]
 
 
 
This was a 'happy accident'. Trying to make this noise more human, I instead got the sound of a starship in deep space... I changed the pitch to a very low sound and changed the speed of the noise. This, together with messing around with the bass and treble (I was actually trying to decrease the bass), produced this humming space sound.
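The pitch and speed changes used in these experiments can also be reproduced outside Audacity: playing the same samples back at a different rate changes both pitch and duration at once. A minimal numpy sketch of that idea (the 440 Hz test tone is illustrative, not one of the actual recordings):

```python
import numpy as np

def change_speed(samples, factor):
    """Resample by linear interpolation: factor < 1 slows the sound down
    and lowers its pitch, factor > 1 speeds it up and raises the pitch."""
    n_out = int(len(samples) / factor)
    positions = np.linspace(0, len(samples) - 1, n_out)
    return np.interp(positions, np.arange(len(samples)), samples)

rate = 8000                          # samples per second
t = np.arange(rate) / rate           # one second of timestamps
tone = np.sin(2 * np.pi * 440 * t)   # a 440 Hz test tone

# Played at half speed the tone lasts twice as long and sounds an
# octave lower (220 Hz), much like Audacity's Change Speed effect.
slow = change_speed(tone, 0.5)
```

Audacity's Change Pitch effect is more involved (it keeps the duration constant), but this rate trick is the "deep space" effect described above: lower pitch and slower playback together.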
 
===Part 2===
 
[[File:Buitenjr2.jpg| 300px |Looking for signals near the maritime museum]] [[File:Buitenjr1.jpg | 300px | ]]
 
 
 
For the second workshop week we went outside to try and pick up some more signals. At first it was hard to find anything, but eventually we found two truckers communicating with each other. The conversation was very casual, with some mentions of disappointing children.
 
 
 
==Workshop with Jon==

[[File:Workshopjr.jpg | 500px]]

During this workshop we passed ideas around the table to the group in front of us. In our case, we first came up with the idea of a souvenir that lets you connect to its country of origin through NFC or another connection method. The instructions we were given in return were to make a device that changes sound into water waves. By mounting a speaker underneath foamboard we were able to make the sound move, driven either by music or by a voice call (at max volume).

==Imaginary Radio Assignment==

====Introduction====

For this assignment I worked together with Naomi van Maasakkers. For inspiration we decided to look into the past instead of the future, and found a database of sci-fi devices in books from the 40s to the 70s. We used the site [http://www.technovelgy.com/ Technovelgy] for this.

====Mediaglyphic====

After looking at several options, Naomi and I decided to go with the "mediaglyphic": a device for people who can't be bothered to learn how to read, translating words into pictures. In a way, this is happening right now on the web: a lot of information is distilled into quick headlines with a single image, since nobody seems to take the time to sit down and read anymore, from the 10-second time limit on Snapchat (after which the content disappears) to 140-character news on Twitter.

====Mediaglyphic in the internet age====

We decided to combine this new era with the idea of the mediaglyphic by creating a web plugin that changes news articles into articles made of GIFs. Each new color represents a new sentence, and headlines are bigger than normal text. All GIFs were found by searching giphy.com for the word that is displayed. We also made a poster together, using a New York Times article as an example.

=====Website=====

[[File:Gifspage.jpg | 800px]]

The website can be found here: http://imaginaryradio.businesscatalyst.com/index.html

=====Poster=====

[[File:Poster_digitalcraft2.jpg | 500px]]

A moving poster can be found at: https://indd.adobe.com/view/f63629f0-bda2-4a7d-a21f-695524848298

==Workshop with MICA==

One early Tuesday morning we were put into mixed groups of WdKA and MICA students. I was in the post-radio aesthetics group. We started our morning by splitting the group into two parts: builders and concept. The builders went off to build the given device, while the rest of us were put into several other groups, but in the end we all just decided to go by the name 'conceptual group' ;). There we came up with the idea (after sparring over a few ideas) to make radio waves a tangible object while keeping their visual character. First, we used a hand-built radio receiver to record a radio sound. Next, we opened that sound file in Photoshop and converted it into 6 different images. For one of the objects we used fabric and the laser cutter: the laser cutter engraved the 6 radio images onto fabric. The other technique was knitting (an old friend from the Q7 assignment last year!), where we used the bitmap to create a 2-color knitting pattern.

Though in the end the engraving worked well, in hindsight I would have loved to spend a full week on this assignment to get its full potential out. We used a lot of 'conversion' (record audio > open audio in Photoshop > save as a different format), which meant that everything was sort of 'planned'. I would have loved to see the signal picked up by software or hardware that we created ourselves and converted into imagery in real time, without any manual labour on our part. I had a plotter in mind for this, but again, time constraints.

Another idea was to use the knitted material in a video that shows the audio timeline moving along with the knitted scarf; showing the visuals and the audio at the same time would make it easier to understand what these visuals meant (in my opinion).

All in all a fun morning; I have kept in contact with a few of the students as well, and maybe we'll work together again in the future!

[[File:Eyyyyok.jpg | 500px]]

==Week 1: Kick-Off==

===Beginning===

My group, with Jochem and Naomi, was assigned the body parts index finger, pinky and the eye(s). We started out with topics that we found fitting before coming up with a more concrete subject. Things like identity and security came up often, as well as folklore and myths involving hands and eyes: for instance, how in Japan people would cut off their pinky as a way of apology, or how ISIS followers point their index finger up to the sky.

===Initial Ideas===

There were a few quick concepts we came up with before ending up with our hand-eye discoordination box.

=====Pointing fingers=====

[[File:Wewantu.jpg]]

When a finger points at you it creates an immediate connection. It can be a bit uncomfortable, and we wanted to play with this feeling. One of the ideas was to walk into a room where fingers would follow you (through the use of a Kinect) and point at you. Another was a closed-off space where you would have to point into a camera before entering and then be confronted by the previous visitor pointing at you. There was also a variation that would connect the people pointing.

[[File:Sketchjer1.jpg]]
[[File:Sketchjer2.jpg]]

=====Games=====

There were several variations in which we played with the idea of a 2-screen/2-user installation. This set-up would have two players use the installation against each other in several ways. The 'controller' would force the player to use only the pinky and index finger in rapid succession, inducing a sort of carpal tunnel syndrome. The player who used the controller the fastest would trigger a flashing, eye-straining image on the other player's screen. Another variation was the cheat screen, which would give one player, unbeknownst to the other, a screen that could make things worse or better for both the player and their competition.

[[File:Screeeeeeeeb.png]]

===Hand-Eye Discoordination===

With some time pressure from Tim we settled on a final concept: putting the coordination between eyes and hand on trial. The question basically became "what would happen if you can't see what your hand is doing anymore?"

=====Buttons=====

We decided to find several buttons and hide them from plain sight. By using these buttons you would get visual feedback on what you were doing. We went to the Piekfijn to try and find specific buttons, so that each one would be a different sensation.

=====Arduino Coding=====

I focused on coding the Arduino and reading input from the buttons being pressed. There had to be a way to give each button a unique identifier so Processing would know which button was pressed, since all input came in as the same numbers. I did this by adding a different letter per button to the Serial.print output. This way Processing was able to differentiate all the different inputs.

=====Exhibition=====

Photos to follow.

=====Observations and Feedback=====

One thing that was immediately clear during the exhibition is that people didn't get that they had to turn around and watch the screen. They would just look at the box and play around, and only turn around after moving something or pressing a button. We did not communicate this properly or force the user to stand the way we intended.

In general most people didn't (immediately, or at all) get what the project was about or what was going on. The interactivity helped spark people's interest, but the concept wasn't easy to see or understand through our installation.

The buttons were also too hard to find; whether this was actually a good or bad thing I haven't made up my mind about yet. It feels like a middle ground right now: it could either be a lot harder and become a sort of brutal installation, or easier, but then how do you add a challenge to the lack of eye coordination?

=Quarter 10: Own project (Pre-Research)=
 
 
===Early Thoughts===
 
For Quarter 10 and this project in general I have taken an interest in NFC, a relatively new technology. In fact, I want to explore one very specific corner of it: the 'toys-to-life' industry. Several major brands have been releasing toys with NFC chips embedded in them; what these chips do is bring a toy into a video game world. While I have no interest in owning these toys myself, I have always found them interesting, especially the marketing and the ideas behind them that are communicated to kids. The idea of a living 'thing' inside your toy coming to life is of course very exciting.
 
 
 
I was inspired by the project by Boris and company (sorry, I don't know everyone's name) with the sound recordings of different technologies. I want to try to receive the audio of an NFC chip inside such a figure and find a way to automatically change this electronic sound into a more 'living' or 'organic' sound, to basically give the idea of actual 'life' inside these toys. This is just my initial idea, and I hope that during the coming weeks it will expand as I learn more and more about this technology.
 
 
 
 
 
===Checking out NFC tags===
 
[[File:Jerrrr.jpg | 700px ]]
 
 
 
By using the app Any NFC Launcher on Android you can see the ID tag of NFC cards, and launch a certain app when you scan an NFC card such as your OV chip card. One weird discovery is the Dutch identity card: it seems to send out a different NFC ID code on every read, and it's almost impossible to pick up the same one twice. I've had 6 different codes so far and haven't seen a repeat. This is presumably a privacy feature: some contactless chips deliberately respond with a randomized ID on each read so they can't be tracked by their ID alone.
 
 
 
===Random Bits===
 
[[File:Magnetic-fields.png]]
 
 
 
[https://www.youtube.com/watch?v=XECHcbakMIg  RFID Installed into arm]
 
 
 
[http://stackoverflow.com/questions/18121707/nfc-tag-to-count-number-of-taps Tracking the amount of times an NFC tag is used]
 
 
 
 
 
===Changing Direction===
 
NFC doesn't seem to hold my interest anymore; the more I learn about it, the less I like it, so I decided to do something else for Q10. I have two reasons for this. The first is that I want my digital craft project to fit within my practice of graphic design. The second is that I like the idea of turning sound into imagery (which comes back to graphic design). I developed this change of thought during the workshop with the MICA students, where we worked on post-radio aesthetics. The week after, I talked to Roel about a (badly explained) idea.
 
 
 
===Finding each other through radio===
 
The idea came from the workshop, altered a bit. I wanted to make a sort of hide & seek game where the only way of finding each other is through an FM transmitter and receiver. Each person sends out a different-sounding signal. At the end of the game you get a 'report' (this is where the graphic design comes in) of how close you were to each person: just as each person has their own sound, they also have their own shape for a pattern. If people stand within a certain radius of each other, the geometric shapes come out of their color area and into the other person's color area. In the end I thought it could also work more as an installation where people are aware of their positions relative to each other and can see the patterns being generated on a screen; this way they make the art themselves (and maybe choose their own geometric shape/color too). So this can work in two ways:
 
 
 
'''Hide and seek:'''
 
 
 
Blindfolded using only sound to find each other, with a report of sorts at the end showing your progress.
 
 
 
'''Making a design together:'''
 
 
 
Not blindfolded and aware of the shapes being made, this would be more of an installation where you can see the shapes you make and can move around to see how the design changes.
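In both modes the report hinges on one rule: if two people were within a certain radius of each other, their shapes cross over into each other's color area. A small sketch of that rule (the player names, positions and the 2-metre radius are made up for illustration):

```python
from itertools import combinations
from math import dist  # Python 3.8+

RADIUS = 2.0  # metres; arbitrary threshold for "close enough to mingle"

def mingling_pairs(positions):
    """Return the pairs of players whose shapes should cross over
    into each other's color area on the final report."""
    return [(a, b) for a, b in combinations(sorted(positions), 2)
            if dist(positions[a], positions[b]) <= RADIUS]

players = {
    "red":    (0.0, 0.0),
    "blue":   (1.5, 0.0),   # within 2 m of red
    "yellow": (8.0, 5.0),   # far away from everyone
}
print(mingling_pairs(players))  # → [('blue', 'red')]
```

Run over position logs sampled during a whole game, the same check would also give the "how close you were to each person" history for the report.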
 
 
 
[[File:Artboard 1.png  |500px ]]
 
 
 
===Jpeg to Audio, Audio to Jpeg===
 
After choosing to connect this quarter more to my graphic design practice, I played around with the idea of turning an image of the word 'hey' into an audio file that says "hey", and vice versa, going from audio to a JPEG. While audio-to-JPEG is still unknown territory for me, I 'sort of' managed to at least get an H out of the JPEG-to-audio tests.
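The trick in both directions is the same: a file is just a stream of bytes, and those bytes can be read either as pixel values or as 8-bit audio samples (this is what Audacity's "Import Raw Data" does). A sketch of the two conversions under that assumption; the actual 'hey' files aren't reproduced here:

```python
import numpy as np

def bytes_to_samples(raw: bytes) -> np.ndarray:
    """Interpret raw file bytes as unsigned 8-bit PCM, scaled to [-1, 1]."""
    return (np.frombuffer(raw, dtype=np.uint8).astype(np.float32) - 128) / 128

def samples_to_bytes(samples: np.ndarray) -> bytes:
    """Quantise [-1, 1] audio samples back into raw bytes."""
    scaled = np.clip(samples, -1, 1) * 128 + 128
    return scaled.clip(0, 255).astype(np.uint8).tobytes()

# Any file works as input: image headers become a burst of noise,
# flat colour areas become near-silence.
raw = bytes(range(256))
audio = bytes_to_samples(raw)
restored = samples_to_bytes(audio)
```

Feeding a JPEG through `bytes_to_samples` gives the screeching "image as sound" effect; writing audio samples out as bytes and opening them as raw pixel data gives the reverse direction.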
 
 
 
====Hey====
 
[[File:Hey.jpg |200px]]
 
 
 
[[File:Deheys2.jpg | 600px]]
 
 
 
[[File:Deheys.jpg | 600px]]
 
 
 
 
 
====Audio to imagery example====
 
[[File:Screen Shot 2016-11-08 at 09.01.22.png | 300px]]
 
NME Soundcloud Commercial: https://www.youtube.com/watch?v=SSgFquXRkV4
 
 
 
=Quarter 10: Facial Recognition and emotions=
 
===Concept===
 
The human body can distort the reception of a TV or audio signal. For our project we want to create an interpretation of this phenomenon: an interactive poster/installation that is influenced by the emotional state, age and gender of the viewer.
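Emotion-recognition services like the ones in the research links below typically return a set of confidence scores per detected face; the poster only needs the dominant one. A sketch of how the reaction could be chosen, where the response shape, scores, palettes and the 0.5 threshold are all invented for illustration:

```python
# Hypothetical response from an emotion-recognition API: one dict of
# confidence scores per detected face (shape loosely based on services
# like Microsoft Cognitive Services; the numbers are invented).
response = {"happiness": 0.82, "sadness": 0.03, "anger": 0.01,
            "surprise": 0.10, "neutral": 0.04}

# Illustrative mapping from dominant emotion to a poster colour scheme.
PALETTES = {"happiness": "warm yellows", "sadness": "cold blues",
            "anger": "hard reds", "surprise": "high contrast",
            "neutral": "greyscale"}

def pick_palette(scores, fallback="greyscale"):
    """Choose the palette for the strongest emotion, falling back to
    neutral greys when no emotion is detected clearly."""
    emotion = max(scores, key=scores.get, default=None)
    if emotion is None or scores[emotion] < 0.5:
        return fallback
    return PALETTES.get(emotion, fallback)

print(pick_palette(response))  # → warm yellows
```

The same selection logic would work for age or gender scores; only the mapping table changes.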
 
 
 
My research partner in this project is [[User:Naomimaria | Naomi van Maasakkers]]
 
  
The visuals also had a disconnect with what you were doing; it didn't seem like a whole product, but rather two separate things.

==Sensor==

For the sensor I want to use sound as an unlocking mechanism, using a distinct sound or sounds made in a distinct order.
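The unlock rule (distinct sounds made in a distinct order) boils down to sequence matching on whatever labels the sound detection produces. A sketch of that logic, where the labels 'clap', 'whistle' and 'knock' are placeholders for whatever sounds end up being used:

```python
from collections import deque

SECRET = ("clap", "whistle", "clap")  # placeholder unlock sequence

class SoundLock:
    """Unlocks when the last N detected sounds match the secret order."""
    def __init__(self, secret=SECRET):
        self.secret = tuple(secret)
        self.recent = deque(maxlen=len(self.secret))

    def hear(self, sound):
        """Feed one detected sound; return True once the lock opens."""
        self.recent.append(sound)
        return tuple(self.recent) == self.secret

lock = SoundLock()
for sound in ["knock", "clap", "whistle", "clap"]:
    opened = lock.hear(sound)
print(opened)  # → True
```

Because the deque only keeps the last few sounds, wrong attempts simply slide out of the window; there is no need to reset anything between tries.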
 
===Research Links===

http://www.mobileinc.co.uk/2010/03/cebit-2010-terminator-style-facial-recognition-detects-gender-age-group-mood/

[https://forum.processing.org/two/discussion/14502/create-tap-tempo-metronome Getting BPM]
 
 
http://blog.mashape.com/list-of-10-face-detection-recognition-apis/
 
 
 
http://nordicapis.com/20-emotion-recognition-apis-that-will-leave-you-impressed-and-concerned/
 
 
 
https://www.microsoft.com/cognitive-services/
 
 
 
http://lifehacker.com/build-your-own-smart-mirror-with-a-two-way-mirror-and-1739447316
 
  
[[File:Facial-recognition-600x400-e1267722305772.jpg]]
 
  
[[File:Screen Shot 2016-11-21 at 15.58.46.png | 500px ]]
----
 float bpm = 80;
 float minute = 60000;
 float interval = minute / bpm;
 int time;
 int beats = 0;
 
 void setup() {
   size(300, 300);
   fill(255, 0, 0);
   noStroke();
   time = millis();
 }
 
 void draw() {
   background(255);
 
   if (millis() - time > interval) {
     ellipse(width/2, height/2, 50, 50);
     beats++;
     time = millis();
   }
 
   text(beats, 30, height - 25);
 }
----
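The Processing sketch above goes from BPM to a beat interval (interval = 60000 / bpm); the linked forum thread on tap-tempo metronomes goes the other way, deriving BPM from tap timestamps. That inverse can be sketched as follows (the timestamps are invented for the example):

```python
def bpm_from_taps(times_ms):
    """Estimate BPM from tap timestamps in milliseconds, using the
    average interval between consecutive taps."""
    if len(times_ms) < 2:
        return None  # need at least two taps to measure an interval
    intervals = [b - a for a, b in zip(times_ms, times_ms[1:])]
    return 60000 / (sum(intervals) / len(intervals))

# Four taps, 500 ms apart, should read as 120 BPM.
print(bpm_from_taps([0, 500, 1000, 1500]))  # → 120.0
```

In the sketch, the tap timestamps would come from `millis()` on each key press, and the result would replace the hard-coded `bpm = 80`.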

Latest revision as of 11:33, 4 September 2017

Main Information

Jerry Estié

0902472

0902472@hr.nl
