Earlier this week I was lucky enough to speak with Dr Aldo Faisal of Imperial College London, who talked to me in great depth about the pioneering work he is carrying out with eye-tracking software.
The project started five years ago when Dr Faisal’s lab was set up, and according to him its potential is enormous. “Those who could benefit include people with MS, arthritis, Muscular Dystrophy, those paralysed by accidents, Parkinson’s, etc. We could go on all night. It’s ideal basically for people affected by any form of movement disability you can think of”.
Because the eyes effectively have their own department all to themselves within the human brain (please forgive the totally unscientific term), they are spared by many of the illnesses that hamper mobility and movement. In many cases this makes the eyes the last remaining resource for communication.
Dr Faisal found that the state-of-the-art software he developed with his team could work just as well on simple (customised) hardware costing just £9.95 that is normally used for video games. The more specialised hardware would normally cost well over £20,000.
The hardware may seem smart enough, but for Faisal it is very much the junior partner in this project. “The hardware is fairly simple. It’s been around for a few years now. All our technology, because it relies on smart software, doesn’t need smart hardware”.
So how does it work?
The eyes might be windows to the soul but Dr Faisal is more interested in what they can tell us about the brain. “This science…” he explains, “…combines neuroscience and technology (neuro-technology). A lot of the research revolved around how the human eye interacts with the world”.
The software fitted to the camera is able not only to track but also to decode the movements of the eye and predict the user’s intentions. Building on decades of research into what eye movements mean, Faisal and his team have made the most in-depth study yet in this field, and we now have a better understanding than ever before of how our eyes work and behave.
Using this knowledge, Dr Faisal and the team at Imperial College have created the most sophisticated eye tracking software ever that can be used on affordable hardware.
I asked him whether it had a name or whether they were sticking with “eye tracking software” for now. He laughs, “No that would be very boring. It is called intention decoding”.
The most obvious way in which Intention Decoding can be used to help people is with wheelchair mobility. In fact, Dr Faisal’s team have already made a prototype wheelchair that uses Intention Decoding to allow the user to control it. Essentially the user simply looks at where they wish to go; the software reads their eye movements and sends a signal to the electric wheelchair, which carries them in that direction.
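The article doesn’t describe the internals of the prototype, but the basic look-then-go loop can be sketched in a few lines. Everything below is a hypothetical illustration (the function name, the coordinate convention and the speed rule are all my own assumptions, not Dr Faisal’s code):

```python
import math

def gaze_to_command(gaze_x, gaze_y):
    """Convert a gaze target on the ground plane (metres, relative to the
    chair, y pointing straight ahead) into a simple (heading_degrees, speed)
    command for the wheelchair. Purely illustrative -- a real system fuses
    many more signals and safety checks."""
    heading = math.degrees(math.atan2(gaze_x, gaze_y))  # 0 deg = straight ahead
    distance = math.hypot(gaze_x, gaze_y)
    speed = min(1.0, distance / 5.0)  # ease off for nearby targets
    return heading, speed

# Looking straight ahead at a point 5 m away -> heading 0, full speed
print(gaze_to_command(0.0, 5.0))  # (0.0, 1.0)
```

The point of the sketch is only that the command is derived from *where* the user is looking, not from any manual input.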
Probably one of the first questions that pops into your head is: “How does it know if you want to move to some place or if you’re just glancing in that direction?”
And it’s a fair point. Imagine a wheelchair user who wants to cross a busy road, and simply by gazing at the other side the wheelchair carries them straight into the traffic. “That’s a very good question” Dr Faisal says. The answer is that our eyes are far more complex than we give them credit for. The research into eye movement revealed that our eyes do not behave the same way in the two situations. In short, the software is sophisticated enough to tell the difference between a passive glance and a look made with intent just by watching your eyes.
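The article doesn’t reveal which eye-movement features the lab’s decoder actually uses, but one naive proxy for “intent” is how long and how steadily the eye dwells on a spot. The heuristic below is a stand-in of my own invention, just to make the glance-versus-intent distinction concrete:

```python
def looks_intentional(fixations, dwell_threshold_s=1.0, max_scatter=0.05):
    """Naive glance-vs-intent heuristic: treat a gaze as intentional only
    if the eye dwells on roughly the same spot for long enough.
    `fixations` is a list of (timestamp_s, x, y) gaze samples.
    Illustrative only -- not the lab's actual decoder."""
    if len(fixations) < 2:
        return False
    duration = fixations[-1][0] - fixations[0][0]
    xs = [f[1] for f in fixations]
    ys = [f[2] for f in fixations]
    scatter = max(max(xs) - min(xs), max(ys) - min(ys))
    return duration >= dwell_threshold_s and scatter <= max_scatter

# A quick 0.2 s dart of the eyes does not count as intent...
print(looks_intentional([(0.0, 0.50, 0.50), (0.2, 0.52, 0.49)]))  # False
# ...a steady 1.5 s dwell on the same spot does.
print(looks_intentional([(0.0, 0.50, 0.50), (0.7, 0.51, 0.50), (1.5, 0.50, 0.51)]))  # True
```

Dr Faisal’s point is that the real decoder goes far beyond a dwell timer, but even this toy version shows why the road-crossing scenario isn’t the trap it first appears to be.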
Intention Decoding may also prove advantageous for people who have lost the use of their limbs and currently move their chair using sip-and-puff technology. With Intention Decoding they can instead move their chair in a safer, more efficient and more hygienic way. Plus they will find it much easier to communicate whilst on the move.
Other uses
Wheelchairs are just one of the ways in which intention decoding can be applied to help people, and Dr Faisal and his team are now looking at various ways to take it further. “How can we use intention decoding? How can we make better wheelchairs? How can we use it in the house? How can we restore people’s independence? Perhaps we could even control prosthetic limbs. Could people use it to control orthotic limbs?” These are just some of the exciting questions being considered.
The possibilities are not limited to the world of disability either. “There is a trainee astronaut at the university who is interested in using it in space to fix robots; and people in the mining sector are interested in using our technology” Dr Faisal tells me, indicating the vast potential of Intention Decoding. “Imagine walking into a room and switching on the lights just by looking at them”.
When intention decoding was demonstrated at a large science fair, over 2,000 of the 5,000 people attending came forward to try out the software, using it to play the classic vintage video game Pong. 70% of those who came forward mastered it within 15 seconds. “You don’t have to learn to use the device; the device knows how you operate. You have the same brain architecture as anybody else”.
The software is clearly very easy to learn, and the affordability of the hardware opens up the possibility that it could one day be readily available commercially. “Ultimately you could see cheap devices that you can buy at Boots”, explains Dr Faisal. This, however, depends on a great many things and would take time, so don’t get too excited too quickly.
Dr Faisal nonetheless predicts that roughly “within 2 to 3 years eye tracking technology will be everywhere”. And with around 6 million people in the UK alone who either have a movement disability or care for someone who does, its potential to change lives is enormous.
I leave you with this TEDx Talk by Dr Faisal which is definitely worth watching.
© 2024 Created by Gordon White.