
“I May Only See In Black & White, But I Can Hear Colour Using My Eyeborg”


Let’s start off with a backgrounder: why do you need the “eyeborg”?

I was born with a condition that makes me see the world in black and white – literally. It’s called achromatopsia, and it means that I can only see in greyscale. The media occasionally mistake this for colour-blindness, but the truth is that I can only see in shades of grey. During my childhood I was teased a lot because of this. Kids would give me a red pen saying it was blue, and I would write essays in the wrong colour. During my teenage years, I only wore black and white clothes. That was until I got my first “eyeborg”; it allowed me to sense colours.

What created the spark that led to you integrating an electronic device onto your body?

It started when I heard a lecture by a cybernetics expert named Adam Montandon while I was studying music composition at Dartington College of Arts. The idea of using digital inputs from an electronic device to augment my senses excited me, especially because it meant that I could sense colour. He helped me create the first model of the eyeborg. The prototype wasn’t implanted back then; it was more like a pair of headphones. Eventually it evolved into a cyborg-like extension of my sensory system – essentially a prosthesis that delivers input signals to my existing sense of sound.

Why use sound? Was it a calculated decision, or simply because you love music?

We used sound because Adam felt that it would give me a better approximation of the variations of colour, since I am a musician. Moreover, the natural occurrence of synaesthesia (Ed: A neurological phenomenon in which stimulation of one sensory or cognitive pathway leads to automatic, involuntary experiences in a second sensory or cognitive pathway) suggested that the visual and auditory senses could in some cases overlap. However, the challenge was figuring out how to convert colours into sounds.


How did the team manage to convert light into sound?

Since both light and sound are waves, we used a physical model that transposes light into sound. This let us create an experience similar to how we sense colours – as a continuous spectrum. Light waves have frequencies far too high to hear, but we were able to mathematically transpose them down until they sit within the audible range. We then implemented this in software running on a wearable device outfitted with a camera. Red is the lowest colour in the visible spectrum, and it is also the lowest note that I hear.
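(Ed: One way to picture this transposition is to shift every light frequency down by a fixed number of octaves until the visible band lands in the audible range. The sketch below assumes a 40-octave shift purely for illustration; the eyeborg’s actual calibration is not described in this interview.)

```python
# Illustrative sketch: drop light frequencies by a fixed number of octaves
# so that the visible band (~430-750 THz) lands in the audible range.
# The 40-octave shift is an assumption for this example, not the
# eyeborg's real calibration.

OCTAVE_SHIFT = 40  # halve the frequency 40 times

def light_to_sound_hz(light_hz: float, octaves: int = OCTAVE_SHIFT) -> float:
    """Transpose a light frequency down by a fixed number of octaves."""
    return light_hz / (2 ** octaves)

# Red sits at the low end of the visible spectrum, so it comes out as the
# lowest note; violet comes out highest.
for colour, freq_hz in [("red", 430e12), ("green", 560e12), ("violet", 750e12)]:
    print(f"{colour:6s} -> {light_to_sound_hz(freq_hz):,.0f} Hz")
```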

“Those Rows Of Rainbow-Coloured Bottles In A Supermarket’s Cleaning Product Aisle, They Sound Like A Symphony To Me”

How challenging was it to get this software to work properly?

A challenge that came up with the use of a digital camera was saturation. The camera tended to over-saturate or under-saturate what it saw depending on the environment, much like our smartphone cameras do in dark or bright surroundings. We know how irritating that is when it happens while snapping a picture, so imagine the same happening to me all the time! We solved this by tuning the system to detect 360 different hues and disregard brighter or darker versions of them. Saturation is used to adjust the volume of the sound.
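(Ed: The description above corresponds to an HSV-style decomposition: the hue is reduced to one of 360 steps, the brightness component is ignored, and saturation is passed through as volume. The sketch below assumes exactly that pipeline using Python’s standard colorsys module; the function name and example values are illustrative, not the eyeborg’s actual software.)

```python
import colorsys

def colour_to_tone(r: int, g: int, b: int) -> tuple[int, float]:
    """Return (hue index 0-359, volume 0.0-1.0) for an 8-bit RGB pixel.

    Assumes an HSV split: hue is quantised into 360 steps, brightness (v)
    is discarded so exposure swings don't change the note, and saturation
    drives loudness.
    """
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    hue_index = int(h * 360) % 360  # 360 discrete hues, brightness ignored
    volume = s                      # a washed-out colour plays quietly
    return hue_index, volume

print(colour_to_tone(220, 30, 30))    # saturated red: low hue index, loud
print(colour_to_tone(120, 105, 105))  # greyish red: same hue family, quiet
```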

How is the wearable/implantable experience working out for you?

The initial designs left me with cables coming out of my head and into a computer carried in my backpack. It looked awkward and made people feel uncomfortable around me. With modern technology, the eyeborg now uses a chip placed at the back of my skull to convert colour into sound. The converted sound is then transferred to me through the device pressed against my head using bone conduction technology. The device itself also lets me hear music or receive phone calls directly in my head.

Why did you use bone conduction?

Bone conduction technology allows me to sense the colours through a different “channel” – which is to say that I can hear people speak and hear the colour of their clothes at the same time. I’m also having it osseointegrated, which will place the device inside the bone so that the sound resonates much better.

Which technology are you most excited about for enhancing this device even further?

It’s 2015, and I still have to recharge the device from a power socket, standing near it while it charges. However, I’m working on ways to use my blood circulation instead.

What other cyborg projects are you working on?

The Fingerborg is a prosthetic finger we designed for a multimedia student who lost a finger in an accident. It has a camera inside, and we are working on using that camera to deliver feedback to his finger. The Speedborg consists of internal radars that allow you to perceive the exact speed of movements in front of you. While the prototypes were attached to the hand, later versions were attached to the earlobes. We are also working on a 360-degree sensory extension that lets people sense if someone is standing behind them.
