Experiment 3 – Evolution

Following on from the most expansive and grandiose system I could find, I turned to another closer to home: life itself. I promise this is going to be less egocentric than making my head into a universe, so stay with me here.


Species evolution over time, evogeneao.com

Starting from the middle of the timeline at the bottom of the above image and working out and away, what does life look like other than an ongoing escalation of variety and complexity?

Other artists are experimenting along these lines. Oron Catts is an easy namedrop and some of Stelarc’s work fits into this category, but it is the work of Neil Harbisson and Moon Ribas I want to focus on for this experiment.

Some background information

Neil Harbisson was born profoundly colourblind, meaning he sees the world in shades of grey. To compensate for this he has, over a period of many years and several iterations, designed a technological device to translate the world of colour for him.


Neil Harbisson

The result is a small camera at the end of an antenna that is implanted into the back of his skull. The camera translates the spectrum of visible light into a single octave of sound and transmits that data via bone conduction to his ears – it’s basically using the same technique we use to hear our own voices.

Self-reports from Neil suggest it took some time to get used to both the physical presence of the antenna and the sound it was imparting, but years into the experiment he now claims to have ‘perfect pitch’ in the single octave of sound he uses to hear visible light. He has since started experimenting with listening to light wavelengths beyond the visible spectrum, both down into infrared and up into ultraviolet.

I have often pondered the current relationship of human beings to evolution. It seems to me that, largely, whatever we once might have had to out-evolve to survive we now out-engineer. Of course this is not a perfect way of looking at the relationship of our species to the world. I don’t need anyone to remind me of the environmental trouble we also cause, and that on that wonderful spectrum of life trending towards complexity and variety we alone are stomping on it and killing things off. Technology is in no way a perfect means of interacting with the world or solving its problems. However, the example of Neil is a nice one and we’re going to focus on the positive aspects for the time being.

What Neil has done, in short, is evolve a new sensory apparatus. Studies undertaken by different groups on Neil and other users of SSDs (Sensory Substitution Devices) report that, along with the addition of a new sense, there are changes within the brain that support this new sensory information. A good resource for this is ‘Hearing Colors: An Example of Brain Plasticity’ by Alfaro et al., 2015 – https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4396351/

What is this if not a form of engineered evolution? How else might we expand our senses?

Experiment 3

My proposal here, unmade, is the most abstract version of the idea of escalation that I’ve hit upon so far. It is an escalation – in the sense of an increase, an addition, a build-up – of sensory input.

My goal is to take a source of real-time data and map that information onto my body via transcutaneous electrical nerve stimulation. The expensive version of this is obviously a TENS unit, and the cheap and easily hackable version is sold for, uh, other reasons.

What could you do with this idea?

[1] More sensory information, more points of view

Initially I have focused on animal data sets because the idea of expanding conscious understanding of the interactions between non-human species and their environment feels like a big leap forward in an evolutionary sense. I also wonder if knowing the data is driven by a non-human, living source might help promote empathy for that data. As such, my initial explorations of this idea have focused on bees.


Bee with a tracking device, CSIRO

Pictured above is a bee fitted with a tiny tracking device courtesy of the CSIRO. Dr Paulo de Souza has been working with bee populations in Tasmania, tracking their movements as part of the Global Initiative on Honey Bee Health and to better understand the problems facing bees, like Colony Collapse Disorder. I’ve been in contact with Dr de Souza about the data he gathers but, you know how these things are, I am waiting on a reply about its possible application to a project like this.

I like bees as a data source as they are about as far from human as you can get. It’s not a stretch to say most people would enjoy being in a swarm of puppies; very few could say the same for bees. The idea of having them as a sensory input, and seeing whether that information might also foster empathy for such an alien species, is another interesting topic for discussion.

Another source of this information could be Movebank.org, an online database of animal migrations. The drawback is that it contains a lot of historical rather than real-time data.

A real-time data source feels important for this project. An abstract source of sensory input has no meaning; the mapping of data to the body needs to carry the same immediacy as the rest of our sensory input. It needs a clear and precise meaning in its mapping, low latency in its response, and overall a simple, interpretable signal. This will aid the process of moving from conscious semantic meaning – e.g. “Oh, this pulse means X” – to immediate knowledge – e.g. I know how hot it is without having to think about it.
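As a concrete sketch of that mapping idea: a minimal, hypothetical function (the ranges and intensity limits are assumptions for illustration, not values from any real device) that turns one scalar reading into one pulse intensity, clamped so spikes in the data never produce an ambiguous signal:

```python
def value_to_pulse(value, lo, hi, min_pwm=0, max_pwm=255):
    """Linearly map a sensor reading in [lo, hi] onto a pulse intensity.

    The mapping is deliberately simple – one variable, one channel –
    so the signal stays interpretable without conscious decoding.
    """
    # Clamp out-of-range readings rather than wrapping, so a spike
    # in the data saturates the signal instead of scrambling it.
    value = max(lo, min(hi, value))
    span = hi - lo
    return int(min_pwm + (value - lo) / span * (max_pwm - min_pwm))
```

Keeping the mapping linear and single-channel is a design choice in the spirit of the paragraph above: the fewer rules the body has to learn, the faster the signal can fade from “this pulse means X” into background awareness.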

[2] Monitor for hazards

Some reports of the Chernobyl disaster state that animals fled the area rapidly while humans had no clue about the danger they faced. Expanding our senses into the invisible realm of hazardous materials seems like a useful application for a variety of reasons. Many sensors exist to monitor air quality, radiation, and so on. The difference here would be the embodied experience of that data.

[3] Embody a building!

According to its documentation, the Faculty of Engineering and IT at UTS has sensors that measure indoor air quality, carbon dioxide levels, volatile organic compounds, people counting, concrete erosion, building structural movement, etc. There’s no reason you couldn’t take some of this data and map it to your body until you became sensorily ‘aware’ of it. Many shopkeepers, farmers and people who work closely with a community and an area have a kind of ‘folk knowledge’ about their surrounds. They might know the ebb and flow of traffic around their street, what the movement of people means, etc. It’s really hard to have any larger connection to your environment when you’re working in an office all day. Perhaps embodying this kind of data is one way to make us feel more connected?

[4] Performance art, a practical example

Moon Ribas, performance artist and friend of Neil Harbisson, uses an external sensory input that vibrates proportionally to current seismic activity on Earth. She uses this to perform, via dance and drumming, the often invisible movement of tectonic plates.

[5] What data can you find?

That data is freely available at https://earthquake.usgs.gov, and many other sites online offer real-time data, often collected by governments. The EU, UK, US and Australian governments all have vast amounts of publicly available data, although not a lot of it is updated in real time. A very abridged list of sources is available in this article: https://www.forbes.com/sites/bernardmarr/2016/02/12/big-data-35-brilliant-and-free-data-sources-for-2016/#69992070b54d
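As an example of how accessible this data is: USGS publishes its real-time earthquake summaries as GeoJSON feeds. A short sketch using only the standard library – the URL is the public ‘all_hour’ summary feed, and the field names follow USGS’s published GeoJSON format:

```python
import json
import urllib.request

# USGS real-time summary feed: every earthquake detected in the last hour.
FEED = "https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/all_hour.geojson"

def fetch_feed(url=FEED):
    """Download and decode one snapshot of the feed (network required)."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def latest_magnitudes(geojson):
    """Extract (magnitude, place) pairs, skipping events with no magnitude."""
    return [
        (f["properties"]["mag"], f["properties"]["place"])
        for f in geojson["features"]
        if f["properties"]["mag"] is not None
    ]
```

Polling `fetch_feed()` every minute or so and pushing the magnitudes into a body mapping would give something close to the low-latency, interpretable signal this project calls for.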

Parts, but not time

Pictured are some of the resources I have to make this project happen. They are presented here to take some of the ‘theoretical’ edge off this idea. Everything is ready to start assembling and programming, but unfortunately the time frame for this course does not allow its completion.

Top to bottom, left to right:

Electrically conductive adhesive pads.
Placement of these would be systematic and determine the mapping of data to the body.

Bluetooth-enabled Arduino board ‘Bluno Beetle’.
This is used as an interface between a mobile phone – which would connect to the internet to collect data – and the electronics delivering the sensory stimulus.

LiPo battery charger.
A small USB and battery lug so you can charge the battery to keep the system running and power the Bluno Beetle.

LiPo battery.
Although it’s not ideal to have something so potentially volatile near the skin, we do it all the time when holding a mobile phone.

TENS-style device
The cheap ‘I bought this off AliExpress’ version. This can be broken down further. It does not need to be so large.

Cables.
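Wired together, these parts suggest a simple pipeline: the phone (or any host) fetches a reading, scales it to a single intensity byte, and pushes it over the Bluno Beetle’s BLE serial bridge, which drives the stimulation pads. A minimal host-side sketch – the port name, baud rate and one-byte protocol are all assumptions for illustration, not the project’s final design:

```python
import struct
import time

def reading_to_byte(value, lo, hi):
    """Scale a raw reading in [lo, hi] to a single 0-255 intensity byte.

    One byte keeps the protocol trivial on the Arduino side: read a
    byte, write it to the PWM pin driving the stimulation circuit.
    """
    value = max(lo, min(hi, value))              # clamp spikes
    level = round((value - lo) / (hi - lo) * 255)
    return struct.pack("B", level)

def stream(readings, port, lo, hi, interval=0.5):
    """Push each reading to the device at a fixed interval."""
    for value in readings:
        port.write(reading_to_byte(value, lo, hi))
        time.sleep(interval)

# Usage (hardware assumed; the port name is a placeholder):
#   import serial                                # pyserial
#   with serial.Serial("/dev/ttyUSB0", 115200) as port:
#       stream(magnitude_source(), port, lo=0.0, hi=9.0)
```

Any object with a `write()` method works as `port`, so the same code can be exercised without hardware before the Bluno and TENS circuit are wired in.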