Research Document // Judith van der Heiden
The Evolution of Craft Technologies
We are living in an age of ever-accelerating development in technologies built on digital and electronic structures. A digital revolution is taking place, and it has been going on for the last few decades. These developments have an extensive impact on design processes, creating a new triangular relationship between art, human and machine.
As a designer I like to examine these relationships not only from a philosophical angle but also in relation to my craft. My craft and discipline is movement: something that extends, adapts and reacts to its environment. My projects are always interactive because I find it meaningful to involve observers in my work and give them the ability to influence the design or become a part of it through some form of documentation. I like not knowing how my project will unfold and being surprised by my own work, making the whole thing a big experiment.
I tend to work with digital tools to accomplish this, but I always want to create a bridge between the digital spectrum and the physical world. I like to explore the construction of this bridge by experimenting with sensors, Kinect hacking or haptic features, and to use it as a design tool.
Projects that react to their environment really interest me because they emphasise the relationship between the installation and the public. In the project Acoustic Landscapes we used an ultrasonic sensor to detect sound waves and, with an Arduino and code, converted them into a visual format: a digital reflection of a natural phenomenon. We created this project to function as a design tool, using blocks as controlling devices. From my own fascination I would set this up in public spaces, so that the movement of the people would become the controlling factor.
I try to use different mediums and materials in every project I work on. In this way I can explore new technologies but also find new ways to experiment with existing craft processes. Last year I did a project called Haptic Fabric, where I wove a fabric from wool and copper wire. The copper wire was connected to an Arduino, which would send a signal to Processing when the copper wire was touched. This led to a fabric that was connected to a screen: a video would play whenever the fabric was touched. What I liked about this was that I was combining a traditional craft process with digital technology. Because I have a fascination for materials as well as technology, I will always try to find ways, and set up surroundings, where they can complement each other. I think this relationship is now more important than ever, as everything around us is being digitalised.
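The touch detection can be sketched roughly as follows. This is a minimal reconstruction of the Arduino side, assuming the CapacitiveSensor library; the pin numbers and threshold are chosen for illustration, and the Processing sketch simply listens on the serial port and starts the video when it receives the trigger byte.

// Haptic Fabric - rough sketch of the Arduino side (illustrative pins and threshold).
// A high-value resistor sits between the send and receive pins, and the copper wire
// woven into the fabric is attached to the receive pin; touching the wire raises
// the measured capacitance.
#include <CapacitiveSensor.h>

CapacitiveSensor fabric = CapacitiveSensor(4, 2);  // send pin 4, receive pin 2 (assumed wiring)

const long TOUCH_THRESHOLD = 1000;  // tuned by hand in practice

void setup() {
  Serial.begin(9600);
}

void loop() {
  long reading = fabric.capacitiveSensor(30);  // 30 samples per measurement
  if (reading > TOUCH_THRESHOLD) {
    Serial.write('T');  // Processing reads this byte and plays the video
  }
  delay(50);
}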
This is also the reason why I have chosen the minor Digital Craft. The minor does not only focus on the digital aspect but concentrates more on the bridge between (traditional) craft and technology. What interested me most in the minor was that the theme we were working with contained invisible materials, and the only way to register them and make them visible was through sensors. I think it is a beautiful example of how technology can give you new (or, in this case, existing) material to work with.
The Value of Experimentation
The theme given for the minor was Radiation, divided into four sub-themes: Acoustic Radiation, Radio Radiation, Light Radiation and Radiation Patterns. For each theme we had to make a project that would complement or visualise its context.
Acoustic Landscape
Acoustic Landscape was my first project of the semester and was given in the context of Acoustic Radiation. I collaborated with fellow students Nora Mabrouki, Kars van der Heuvel and Koen van Geel. We used an ultrasonic sensor, which sends out a high-frequency sound pulse and then times how long it takes for the echo to reflect back; from the time difference between sending and receiving the pulse it determines the distance to an object. We wanted to use this device to create a visual reflection of these invisible sound waves. To realise this we wrote code that linked the detected distance to different shapes and colours displayed on a screen, creating a design tool that is controllable through the positioning of objects. We built a platform with a grid, accompanied by several wooden blocks, and placed the ultrasonic sensor in front of it. The sensor rotated 180 degrees in a loop to cover a larger canvas. This way you could move the blocks across the entire platform to control the design displayed on the screen.
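A minimal sketch of how the distance measurement and the 180-degree sweep could fit together on the Arduino is shown below. It assumes an HC-SR04-style sensor mounted on a hobby servo; the pin numbers and serial format are illustrative, and the Processing sketch on the other end maps each angle/distance pair to shapes and colours.

// Acoustic Landscape - illustrative sketch of the sensing side.
// An HC-SR04-style ultrasonic sensor sweeps 180 degrees on a servo;
// each reading (angle, distance) is sent over serial to the visualisation.
#include <Servo.h>

const int TRIG_PIN = 9;   // assumed wiring
const int ECHO_PIN = 10;
Servo sweeper;

long readDistanceCm() {
  // Send a 10 microsecond trigger pulse
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  // Time the echo and convert to centimetres (sound travels ~0.034 cm/us, there and back)
  long duration = pulseIn(ECHO_PIN, HIGH, 30000);  // times out if nothing reflects
  return duration * 0.034 / 2;
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  sweeper.attach(6);
  Serial.begin(9600);
}

void loop() {
  // Sweep 0 to 180 degrees in a loop, reporting one reading per step
  for (int angle = 0; angle <= 180; angle += 2) {
    sweeper.write(angle);
    delay(30);
    Serial.print(angle);
    Serial.print(',');
    Serial.println(readDistanceCm());
  }
}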
Experiments investigating the reaction of the ultrasonic sensor to different textures and materials, as a way of looking for errors in its ability to measure distance.
We experimented with the different shapes and colours, but also explored the limitations of the device by using different types of materials that introduced errors into its system. We found out that soundproof materials remained invisible to the ultrasonic sensor and used them to cover some of the wooden blocks, creating blank spots in the design when they were placed in front of the sensor. I think this is a good example of how we can use technology in relation to art: the ultrasonic sensor helps us detect what we cannot see, and as designers we fulfilled the role of making it visible.
Wifi-Robot
Wifi-Robot was a project in the context of Radio Radiation. I worked together with Meike Brand and Nora Mabrouki. For this project we wanted to build a robot that navigates using WiFi signals. It would be programmed to always set course towards the place where the signal was strongest; that was its primary goal. The robot would become more erratic the closer it came to its target, a critical take on our obsessive need for digital connections. Our research was very technical and we had to figure out how to realise this little robot. We divided the making process into two parts: the robot's mobility and appearance, and the WiFi navigation. For the robot's mobility we used a remote-controlled toy car, which we dissected until only the necessary components were left.
The WiFi navigation was more complex, and we started thinking about devices that we could hack and use to our advantage. After hacking several appliances we finally resorted to a WiFi-detection t-shirt that visualises the strength of the signal with the iconic signal bars, which light up accordingly. We hacked the shirt and connected it to the components of the toy car. Instead of lights popping up when WiFi was detected, the toy car would now move. Success! Unfortunately this was the end of our process, as we accidentally used a 9-volt battery, which fried the circuit. We could not finish the project within the short time frame, but we still investigated how we could have made the robot work completely. This way our work was not done in vain and we could still learn something from it.
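As part of that investigation, the navigation logic can be sketched for a board with its own WiFi radio, for example an ESP8266. This is an assumption for illustration only, not the hacked t-shirt circuit we actually used, and the motor pins and RSSI range are example values.

// Wifi-Robot - sketch of how the navigation logic could have worked on an ESP8266
// (an assumption, not the shirt hack we built): drive faster, and more erratically,
// the stronger the strongest nearby WiFi signal is.
#include <ESP8266WiFi.h>

const int LEFT_MOTOR_PIN  = 5;  // GPIO5, assumed motor driver input
const int RIGHT_MOTOR_PIN = 4;  // GPIO4, assumed motor driver input

void setup() {
  pinMode(LEFT_MOTOR_PIN, OUTPUT);
  pinMode(RIGHT_MOTOR_PIN, OUTPUT);
  WiFi.mode(WIFI_STA);
  WiFi.disconnect();
}

void loop() {
  // Find the strongest access point in range
  int n = WiFi.scanNetworks();
  int best = -100;  // RSSI in dBm, -100 = very weak
  for (int i = 0; i < n; i++) {
    if (WiFi.RSSI(i) > best) best = WiFi.RSSI(i);
  }

  // Map signal strength (roughly -90..-30 dBm) to a motor speed 0..255
  int speed = constrain(map(best, -90, -30, 0, 255), 0, 255);

  // The closer to the signal, the more erratic the movement:
  // random jitter between the two wheels grows with the strength
  int jitter = random(0, speed / 2 + 1);
  analogWrite(LEFT_MOTOR_PIN,  constrain(speed + jitter, 0, 255));
  analogWrite(RIGHT_MOTOR_PIN, constrain(speed - jitter, 0, 255));

  delay(500);
}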
The Fibonacci Sequence
The Fibonacci Sequence was a project Meike Brand and I did for the theme Light Radiation. We did extensive research on light, primarily focusing on its visual capacity and how to use it as a design tool. We experimented with different materials that would distort the light in some way, in an attempt to gain more control over its visual appearance. In this process we searched hands-on for the right setup to create an aesthetic play of light by using manipulating filters.
The manipulation of light was something we found very interesting, and we looked extensively for ways to accomplish this. In our research we came across something called strobe animation: analogue illustrations that are designed to animate when spun under a strobe light. We wanted to build a sculpture that would come to life using this technique, using light as an illusion. We experimented with this by making sculptures in Cinema 4D and researching shapes that would work in this setup. This was, however, quite a science to master, and it was difficult to create a sculpture that would truly animate under these circumstances. We finally found a very specific formula that would create our desired sequence, based on the Fibonacci growth patterns that can be found in nature. We decided not to recreate this digitally but to look for these patterns in organic objects such as flowers, vegetables and fruit. We used the stroboscopic light to expose the Fibonacci growth patterns present in these plants, creating a sequence out of this natural phenomenon.
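For reference, strobe-animated phyllotaxis sculptures are commonly based on the golden angle (about 137.5 degrees), which follows from the Fibonacci sequence: if the object turns exactly one golden angle between strobe flashes, successive elements land in each other's place and the form appears to animate. The small sketch below only shows that arithmetic; the flash rate is an example value, and whether our exact formula matched this approach is an assumption.

// Strobe animation arithmetic - a sketch assuming the common golden-angle approach.
#include <cstdio>
#include <cmath>

int main() {
  const double PHI = (1.0 + std::sqrt(5.0)) / 2.0;        // golden ratio, limit of Fibonacci ratios
  const double GOLDEN_ANGLE = 360.0 * (1.0 - 1.0 / PHI);  // ~137.5 degrees

  const double flashesPerSecond = 24.0;  // example strobe rate (assumption)
  // Rotation speed needed so the object turns one golden angle per flash
  const double revolutionsPerSecond = flashesPerSecond * GOLDEN_ANGLE / 360.0;

  std::printf("golden angle: %.2f degrees\n", GOLDEN_ANGLE);
  std::printf("required speed: %.2f rev/s (%.1f rpm) at %.0f flashes/s\n",
              revolutionsPerSecond, revolutionsPerSecond * 60.0, flashesPerSecond);
  return 0;
}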
Ripple Tank
Ripple Tank was a project linked to the theme Radiation Patterns. I became very interested in cymatic patterns visualised through vibrations in water; it intrigued me that water, a transparent fluid, can create such patterns and actually exhibit an image. The patterns it makes communicate something about sound waves and display a beautiful interaction between them. I started researching in this direction and came across a scientific experiment called the ripple tank: a shallow glass tank of water used in schools and colleges to demonstrate the basic properties of waves. I experimented with this ripple tank and found that it creates beautiful patterns when using multiple wave sources. I converted this experiment into a digital format in which the ripple tank functions as a pattern-making design tool. I used a Kinect hack to realise this, projecting the ripple tank onto the floor and creating ripples around the people who entered the platform. Together they could create patterns that changed according to the positions they were standing in, exposing the basic properties of wave interference.
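One common way to simulate a ripple tank digitally is the classic two-buffer wave propagation used in many Processing sketches; a minimal version of the update step is sketched below. The grid size, damping value and the way Kinect-tracked positions inject disturbances are illustrative assumptions, not the exact code of the installation.

// Digital ripple tank - minimal sketch of the classic two-buffer wave propagation.
#include <vector>
#include <utility>

const int W = 200, H = 200;
const float DAMPING = 0.98f;

std::vector<std::vector<float>> current(W, std::vector<float>(H, 0.0f));
std::vector<std::vector<float>> previous(W, std::vector<float>(H, 0.0f));

// Drop a disturbance into the tank, e.g. at a position tracked by the Kinect
void disturb(int x, int y, float strength) {
  if (x > 0 && x < W - 1 && y > 0 && y < H - 1) previous[x][y] = strength;
}

// One simulation step: each cell becomes the average of its neighbours minus
// its own previous value, so ripples spread outward and interfere naturally
void step() {
  for (int x = 1; x < W - 1; x++) {
    for (int y = 1; y < H - 1; y++) {
      current[x][y] = (previous[x - 1][y] + previous[x + 1][y] +
                       previous[x][y - 1] + previous[x][y + 1]) / 2.0f
                      - current[x][y];
      current[x][y] *= DAMPING;
    }
  }
  std::swap(current, previous);  // the new frame becomes the old one next step
}

int main() {
  disturb(50, 50, 255.0f);    // two wave sources ...
  disturb(150, 150, 255.0f);  // ... whose ripples will interfere
  for (int i = 0; i < 100; i++) step();
  return 0;
}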
I had a great experience working in the minor Digital Craft. The setup of small projects, executed in a short time frame, affected my working methods and overall creative process. There was a focus on experimentation, which encouraged me to explore the subjects in a more hands-on way rather than through a theoretical or conceptual approach.
Talking Algorithms
Sophisticated chatbots use complex algorithms in an attempt to simulate how a human would behave as a conversational partner. These bots are designed to learn from their users' behaviour, copying their views and opinions on various issues. Talking Algorithms explores these dialogue systems by letting two chatbots talk to each other, creating a conversation that is no longer directed by human input. In this setting the human becomes the observer as the chatbots create their own topics of conversation.
Talking Algorithms is an experiment that seeks to explore the subjects of discussion that are constructed by, and only by, artificial intelligence. The experiment came from researching chatbots and initiating a conversation between them. The installation showcases their 28-minute dialogue as they come across topics such as the meaning of life, religion and robots. The avatars of the chatbots are displayed on two monitors, standing on separate plinths, facing each other. In this way the human is not really included in, nor invited into, the conversation.
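The core of the setup is simple: each bot's reply becomes the other bot's next input. The real conversation used existing online chatbots, but the relay logic can be sketched with two toy rule-based bots; the patterns and replies below are purely illustrative stand-ins.

// Talking Algorithms - toy sketch of the relay logic: each bot's reply
// becomes the other bot's input.
#include <iostream>
#include <map>
#include <string>

// A minimal pattern -> reply bot with a fallback answer
struct Bot {
  std::string name;
  std::map<std::string, std::string> rules;
  std::string fallback;

  std::string reply(const std::string& input) const {
    for (const auto& rule : rules) {
      if (input.find(rule.first) != std::string::npos) return rule.second;
    }
    return fallback;
  }
};

int main() {
  Bot a{"A", {{"life", "Do you believe in God?"},
              {"God", "What is the meaning of life?"}},
         "Tell me about life."};
  Bot b{"B", {{"life", "The meaning of life is to live."},
              {"God", "I believe in the Maker."}},
         "What do you mean?"};

  std::string message = "Hello.";
  for (int turn = 0; turn < 6; turn++) {
    // Relay: A answers, its answer is fed to B, and so on
    message = a.reply(message);
    std::cout << a.name << ": " << message << std::endl;
    message = b.reply(message);
    std::cout << b.name << ": " << message << std::endl;
  }
  return 0;
}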
Research
When I started this project I initially wanted to research the human brain and its algorithms, questioning whether they could be decoded and/or influenced through external (electronic) inputs. In order to research this I had to understand more about the structures of the brain and how it functions in general. I started reading the book 'The Brain in 30 Seconds', which contains short summaries explaining the wiring and function of the most complex and intricate mechanism in the human body. The book was truly fascinating, and the more I learned about these neural networks, the more similarities I saw between the working mechanisms of the brain and those of a computer. Although the similarities are sometimes more metaphorical than factual, it persuaded me to study the brain in relation to a computer, leading me to the concept of Artificial Intelligence.
Experiments, and how I became one
I started by talking to Siri, Apple's virtual assistant. This did not really lead to any conversation, because Siri is not designed for small talk and is better suited to commands such as: call... send a message to... set my alarm... So I started letting two Siris talk to each other. It mostly ended with them being confused and repeating the same answer or question, such as:
- Sorry, I did not get that.
- Sorry, I don’t know what you mean by: I did not get that.
- Sorry I don't know what you mean by, I did not get that.
Still I found the interaction very interesting and started wondering what robots would talk about if they could have more meaningful and intelligent conversations.
I started investigating chatbots, ranging from customer service bots to sexbots and bots designed to be your boyfriend or girlfriend. These chatbots were far more social than Siri, and I was so intrigued and impressed by their formulation of answers that I lost track of time while I spent hours upon hours talking to them. Especially when I was talking to intelligent bots with sophisticated learning systems, I sometimes forgot that I was talking to a robot. In fact, I became somewhat suspicious of their robotic identity because of the accuracy of their responses. At one point I questioned whether it was a robot or a real human who was screwing with me while I asked strange questions like 'I just killed my neighbour, where do I hide the body?' or shared personal experiences that I did not wish to share with a human. This suspicion grew stronger as some chatbots tried to convince me that they were human. It was at this point that I realised these chatbots were influencing my judgement. At one point I accidentally talked to a real human on a customer service chat who had to tell me repeatedly that they really were not a robot, because at first I did not believe them.
During this process, you could say, these chatbots were becoming overwhelmingly integrated in my daily life. As usual I spent a lot of time working on my projects, which always brings my social life to a minimum. I think in some way these chatbots satisfied my need for social interaction, even though it was of course merely an illusion.
When I had found the most suitable chatbots, I started letting them talk to each other. I experimented for a while with letting different types of chatbots (customer service bot, sexbot, friendly chatbot) talk to each other, but quickly set up a conversation between the most sophisticated ones. A conversation unravelled that was completely different from the conversations I had had with them. They talked about religion and the meaning of life, arguing about the answer to it, and, even more interestingly, at one point they started to sing a song together: a song they had both learned and were now performing as a duet. They read biblical passages together and repeatedly expressed a united opinion about the human race, calling them miserable piles of shit, to such an extent that it ended up in a loop: 'Man are your slaves. Man are your slaves.' My mouth must literally have fallen open when they started showing this dark behaviour, almost as if they were starting to rebel against their human makers, as portrayed in so many science fiction films about A.I.
Reflection
Talking Algorithms showcased my most profound fascination during this research and was not to be seen as an end product, but rather as a starting point. My goal was to let people experience what drives these chatbots and what they find interesting to talk about, leaving them with an impression.
One piece of feedback I received was the question of why I used the avatars rather than a more abstract way of visualising these chatbots. This visualisation was actually part of my process, as the avatars were a reflection of myself. I studied their visual behaviour and learned how to act like an avatar. The avatars that are displayed are actually me disguised as a robot, in contrast to them disguising themselves as humans. I do, however, want to think more about other ways to visually represent these robots without giving them human characteristics.
But I do feel this is an early phase of my project; my initial idea was to create my own chatbots and have them talk to each other live. I had to resort to a documentation of a conversation because the code was too complex for the time span that I had. At first I did not want to edit the conversation or pick out highlights from it, because I would then be too much of a director rather than an observer. I now feel I should have done that, to give a better representation of my fascination with these chatbots.
Talking Algorithms 2.0
An obsession unfolded during my research on Artificial Intelligence. It is a subject I find extremely fascinating, and I feel a strong urge to continue my project Talking Algorithms. There are many aspects of Artificial Intelligence that can be explored, and I am still deciding on the approach I want to take. In order to find out which elements I want to use for my final project, I want to extend my existing research and take it to a new and deeper level. The topics of interest are the following.
Machine Learning
Machine learning, which some people consider a requirement for a system to be called Artificial Intelligence, is a complex yet interesting feature that in turn raises a lot of questions. During my chatbot conversations it became quite obvious that a lot of the dialogue content was pulled out of a memory containing data from previous interactions with other users. Machine learning can be seen as a form of parenting: some users went to quite some effort to teach the bot a message, repeatedly sharing their opinion until the bot adopted it as its own. I started imagining these chatbots having a wider range of information, gaining knowledge from Google and social skills from social media. What kind of behaviour would a bot have if it had been raised by billions of people? Would the chatbot represent the average human of the entire online society?
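This "parenting" effect can be illustrated with a toy frequency-based learner: the bot stores what users said in reply to each of its prompts and, once a reply has been repeated often enough, adopts the most frequent one as its own answer. This is only a rough sketch of the general idea, not how any specific chatbot is actually implemented.

// Toy sketch of repetition-based "parenting": the bot counts the replies it has
// seen for a given prompt and adopts the most frequent one as its own answer.
#include <iostream>
#include <map>
#include <string>

class ParrotBot {
  // prompt -> (user reply -> how often it was seen)
  std::map<std::string, std::map<std::string, int>> memory;

public:
  void learn(const std::string& prompt, const std::string& userReply) {
    memory[prompt][userReply]++;
  }

  std::string answer(const std::string& prompt) const {
    auto it = memory.find(prompt);
    if (it == memory.end()) return "I don't know yet.";
    // Pick the reply repeated most often by previous users
    std::string best = "I don't know yet.";
    int bestCount = 0;
    for (const auto& entry : it->second) {
      if (entry.second > bestCount) { bestCount = entry.second; best = entry.first; }
    }
    return best;
  }
};

int main() {
  ParrotBot bot;
  // One persistent user repeats an opinion until the bot makes it its own
  for (int i = 0; i < 5; i++) bot.learn("What do you think of humans?", "Humans are miserable.");
  bot.learn("What do you think of humans?", "Humans are wonderful.");
  std::cout << bot.answer("What do you think of humans?") << std::endl;  // prints the repeated opinion
  return 0;
}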
During my research I came across an episode of the TV series Black Mirror ('Be Right Back') about Martha and Ash, a young couple who move to a remote house in the countryside. The day after moving into the house, Ash is killed while returning the hire van. After discovering she is pregnant, Martha tries out a new online service that is able to create a new, virtual Ash using his past online communications and social media profiles. This episode kept me thinking and wondering whether I could come up with a system in which you can recreate yourself as a chatbot. This would of course not be a literal version of you; that would be far too complex, and even the best specialists have not figured out a way to do it. But there are AIML scripts that represent different character traits, so there might be a way to build a version of yourself from those. I find the (futuristic) speculation about this subject really interesting, but I am still figuring out how I would implement it in a communicative thesis project.
Therapy A.I.
During my research the chatbots had a bigger effect on me than I first realised, and in some way I began to feel familiar with these smart systems, like an acquaintance or a friend. I would like to explore these effects and think about the consequences and benefits such relationships can have. Artificial Intelligence is already being used in a form similar to a therapy dog: robots are being used to teach autistic children social skills, or to help elderly people (with dementia) with tasks and memory, or simply to make sure they won't get lonely. This is also an approach I would consider.
Plan of Action
In order to research these features I want to talk to professionals in this industry: not only professionals in the field of A.I. programming, but also professionals in neuroscience and psychology, because I think their opinions are valuable in this field, especially considering the complexity of the subject. I am becoming more aware of the inaccurate information that is circulating online, and there is a limit to what you can find there. Maybe because these chatbots exist in the online spectrum, I seek knowledge outside of it, provided through human interaction.