A TASMANIAN teenager's simple idea that would allow a complete quadriplegic to control a wheelchair by voice has earned her international recognition and a top science award for school students, handed out by Commonwealth science agency CSIRO.
Yaya Lu, 16, of Hobart, has her mind set on helping the disabled through technology.
She has just arrived home after presenting a research paper to the Biomedical Engineering International Conference in Bangkok.
She was able to present her findings to the conference in Thailand after Google sponsored her and her mother's trip.
Yaya was recently awarded the Gold CREST Award by the CSIRO, which is only given to a select few students each year and requires original ideas and more than 100 hours of work.
As part of her submission for the award she put together a 60-page report, demonstrated prototypes, submitted a video and sat a verbal examination.
The high school student, who said presenting her findings to about 30 academics in Bangkok was "nerve-racking" and "quite intimidating", used the Lego Mindstorms NXT robotics kit to demonstrate how her robot would work in reality.
The robot makes use of language-independent voice commands by using a combination of short and long sounds (like "dit" and "dah").
It responds to eight commands: it can move forwards and backwards, spin clockwise and anti-clockwise, move left and right, and stop, while the eighth command toggles it to rise and fall.
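The dit/dah scheme can be sketched in code. The following is a minimal illustration only, not Yaya's implementation: the command table, the duration threshold and all function names are assumptions made for the sake of the example.

```python
# Illustrative sketch: decode short/long sound pulses ("dit"/"dah")
# into movement commands. The pattern-to-command table below is an
# assumption; the article does not specify the actual mapping.
COMMANDS = {
    ".": "forward",
    "-": "backward",
    "..": "spin_clockwise",
    "--": "spin_anticlockwise",
    ".-": "left",
    "-.": "right",
    "...": "stop",
    "---": "toggle_rise_fall",
}

LONG_THRESHOLD_MS = 250  # assumed cutoff between a "dit" and a "dah"

def classify(duration_ms):
    """Classify one sound pulse as short ('.') or long ('-')."""
    return "." if duration_ms < LONG_THRESHOLD_MS else "-"

def decode(pulse_durations_ms):
    """Turn a burst of pulse durations into a command, or None."""
    pattern = "".join(classify(d) for d in pulse_durations_ms)
    return COMMANDS.get(pattern)

print(decode([100]))        # one short pulse -> "forward"
print(decode([400, 120]))   # long then short -> "right"
```

Because the decoder keys on sound length rather than words, it works the same regardless of what language, or what sounds, the user produces.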
Yaya said she decided to start on the project last year when she heard about a complete quadriplegic from Northern Tasmania who, like most quadriplegic people, couldn't move any part of his body below his neck.
"So I kind of thought I could develop a system that could help a quadriplegic like him gain more independence in their daily life," Yaya said.
Yaya investigated two control systems that she thought might help the man control a wheelchair.
The first involved using various parts of the face to control a wheelchair, such as movement of the ears, dilation of the nostrils and movement of the eyebrows. The second focused on language-independent voice commands, which Yaya said she created for people unable to move their facial features in the way the first system required.
The beauty of the language-independent voice controlled wheelchair design, Yaya said, was that it wasn't based on voice recognition, which takes a long time to get working correctly, costs lots of money and uses a lot of computing power.
Other control systems, such as one which uses a magnet glued to the tongue, were "very intrusive", Yaya said.
"You want to be able to eat, you want to be able to speak normally without that magnet glued to you."
Yaya's mother, Yin, an information systems lecturer at the University of Tasmania, said she was blown away by what her daughter had achieved.
Yaya's mentor, neighbour and a former university lecturer with a PhD in artificial intelligence, Graeme Faulkner, said what Yaya had achieved was remarkable.
"The paper Yaya was presenting was for an international conference, which normally accepts papers from postgraduate students or university lecturers," Dr Faulkner said.
"And I suppose it may have happened before but I've never heard of a year 10 student presenting one who has had to ask for leave from school to attend.
"It just doesn't happen, at least to my knowledge in my half-century of academic experience."