Question type: Reading comprehension  Question category: Mock exam  Difficulty: Moderate
Hangzhou, Zhejiang Province, Class of 2019 Senior-Three First-Semester English Mock Paper 8
Microsoft has developed a new smartphone app that interprets eye signals and translates them into letters, allowing people with motor neurone disease to communicate with others through a phone.
The GazeSpeak app combines a smartphone's camera with artificial intelligence to recognize eye movements in real time and convert them into letters, words and sentences. For people suffering from ALS, also known as motor neurone disease, eye movement can be the only way they are able to communicate.
"Current eye-tracking input systems for people with ALS or other motor impairments are expensive, not robust under sunlight, and require frequent re-calibration and substantial, relatively immobile setups," said Xiaoyi Zhang, a researcher at Microsoft who developed the technology.
"To mitigate the drawbacks…we created GazeSpeak, an eye-gesture communication system that runs on a smartphone, and is designed to be low-cost, robust, portable and easy to learn."
The listener uses the app by pointing their smartphone at the speaker. A chart that can be stuck to the back of the smartphone then shows the speaker which eye movements to make in order to communicate.
The sticker shows four grids of letters, each of which corresponds to a different eye movement. By looking up, down, left or right, the speaker indicates which grid contains the letter they want. The artificial intelligence algorithm is then able to predict the word or sentence they are trying to say.
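This grid-plus-prediction scheme works much like T9 texting: each gaze gesture only narrows a letter down to one of four groups, so a predictor must find the dictionary words consistent with the whole gesture sequence. Below is a minimal sketch in Python; the letter-to-grid assignment and the tiny dictionary are hypothetical, since the article specifies neither.

```python
# Hypothetical assignment of the alphabet to the four gaze directions.
# GazeSpeak's actual grid layout is not described in the article.
GRIDS = {
    "up":    set("abcdef"),
    "down":  set("ghijkl"),
    "left":  set("mnopqr"),
    "right": set("stuvwxyz"),
}

# Map each letter to the gaze direction whose grid contains it.
LETTER_TO_GESTURE = {
    letter: direction for direction, letters in GRIDS.items() for letter in letters
}

# Toy dictionary standing in for a real vocabulary.
DICTIONARY = ["hello", "help", "water", "yes", "no", "thanks"]


def gestures_for(word: str) -> list[str]:
    """Return the gaze-gesture sequence that spells out `word`."""
    return [LETTER_TO_GESTURE[ch] for ch in word.lower()]


def predict(gestures: list[str]) -> list[str]:
    """Return every dictionary word consistent with the gesture sequence.

    Each gesture only narrows a letter to one of four groups, so several
    words can share the same sequence.
    """
    return [w for w in DICTIONARY if gestures_for(w) == gestures]


if __name__ == "__main__":
    seq = gestures_for("help")   # ['down', 'up', 'down', 'left']
    print(seq)
    print(predict(seq))          # candidate words matching the sequence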
Zhang's research, Smartphone-Based Gaze Gesture Communication for People with Motor Disabilities, is set to be presented at the Conference on Human Factors in Computing Systems in May.