Microsoft has developed a new smartphone app that interprets eye signals and translates them into letters, allowing people with motor neurone disease to communicate with others from a phone.
The GazeSpeak app combines a smartphone’s camera with artificial intelligence to recognize eye movements in real time and convert them into letters, words and sentences. For people suffering from ALS(渐冻症), also known as motor neurone disease, eye movement can be the only way they are able to communicate.
“Current eye-tracking input systems for people with ALS or other motor impairments are expensive, not robust under sunlight, and require frequent re-calibration and substantial, relatively immobile setups,” said Xiaoyi Zhang, a researcher at Microsoft who developed the technology.
“To mitigate the drawbacks…we created GazeSpeak, an eye-gesture communication system that runs on a smartphone, and is designed to be low-cost, robust, portable and easy to learn.”
The listener uses the app by pointing their smartphone at the speaker. A chart that can be stuck to the back of the smartphone is then used by the speaker to determine which eye movements to make in order to communicate.
The sticker shows four grids of letters, each of which corresponds to a different eye movement. By looking up, down, left or right, the speaker selects the grid that the letter they want belongs to. The artificial intelligence algorithm is then able to predict the word or sentence they are trying to say.
Zhang’s research, Smartphone-Based Gaze Gesture Communication for People with Motor Disabilities, is set to be presented at the Conference on Human Factors in Computing Systems in May.
1.What should people with motor neurone disease do when they communicate with other people?
A. They need to speak to the smartphone loudly.
B. They have to predict the sentence they are trying to say.
C. They need to choose a grid on the back of the smartphone with eye movements.
D. They should work out the artificial intelligence algorithm in the smartphone.
2.How does the GazeSpeak app work?
A. The GazeSpeak app can hear what the speakers say.
B. The GazeSpeak app can convey speakers’ movements to the listeners’ smartphone.
C. The camera in the phone can record speakers’ gestures and predict what they want to say.
D. The artificial intelligent camera in the phone can recognize eye signals and translate them into words and sentences.
3.Which of the following statements is true?
A. GazeSpeak is designed for blind people.
B. GazeSpeak is low-cost, robust, portable and easy to learn.
C. Current eye-tracking input systems don’t need any re-calibration and setups.
D. Smartphones with the GazeSpeak app have been popular among people with motor diseases.
Senior-3 English reading comprehension (medium difficulty)
Microsoft has developed a new smartphone app that interprets eye signals and translates them into letters, allowing people with ALS(渐冻症), also known as motor neurone disease, to communicate with others from a phone.
The GazeSpeak app combines a smartphone’s camera with artificial intelligence to recognize eye movements in real time and transform them into letters, words and sentences.
For people suffering from ALS, eye movement can be the only way they are able to communicate.
“Current eye-tracking input systems for people with ALS are expensive, not effective under sunlight, and require frequent re-calibration(再校正) and abundant, relatively stable setups,” said Xiaoyi Zhang, a researcher at Microsoft who developed the technology.
“To mitigate the drawbacks…we created GazeSpeak, an eye-gesture communication system that runs on a smartphone, and is designed to be low-cost, effective, and easy to carry and learn.”
The listener uses the app by pointing their smartphone at the speaker. A chart that can be stuck to the back of the smartphone is then used by the speaker to determine which eye movements to make in order to communicate.
The chart shows four grids of letters, each of which corresponds to a different eye movement. By looking up, down, left or right, the speaker selects the grid that the letter they want belongs to. The artificial intelligence is then able to predict the word or sentence they are trying to say.
Zhang’s research, Smartphone-Based Gaze Gesture Communication for People with Motor Disabilities, is set to be presented at the Conference on Human Factors in Computing Systems in May.
1.The main purpose of the passage is to _________.
A. draw people’s attention to the shortcomings of current eye-tracking input systems.
B. introduce a new smartphone App for people suffering from ALS to communicate.
C. call for people’s awareness of helping people with ALS to communicate successfully.
D. compare the current eye-tracking input systems with the new GazeSpeak app.
2.The underlined word “mitigate” in paragraph 5 probably means _________.
A. ignore B. accept
C. strengthen D. weaken
3.According to the passage, which of the following sentences is TRUE?
A. There are many ways to communicate for people suffering from ALS.
B. The speaker points their smartphone at the listener when using the app.
C. The current eye-tracking input systems for people with ALS need improving.
D. The new smartphone app for people with ALS has been put on the market.
Senior-3 English reading comprehension (hard)
Microsoft has developed a new smartphone app that interprets eye signals and translates them into letters, allowing people with motor neurone disease to communicate with others from a phone.
The GazeSpeak app combines a smartphone’s camera with artificial intelligence to recognize eye movements in real time and convert them into letters, words and sentences.
For people suffering from ALS(渐冻症), also known as motor neurone disease, eye movement can be the only way they are able to communicate.
“Current eye-tracking input systems for people with ALS or other motor impairments are expensive, not robust under sunlight, and require frequent re-calibration and considerable, relatively immobile setups,” said Xiaoyi Zhang, a researcher at Microsoft who developed the technology. “To mitigate the drawbacks…we created GazeSpeak, an eye-gesture communication system that runs on a smartphone, and is designed to be low-cost, robust, portable and easy to learn.”
The listener uses the app by pointing their smartphone at the speaker. A chart that can be stuck to the back of the smartphone is then used by the speaker to determine which eye movements to make in order to communicate. The sticker shows four grids of letters, each of which corresponds to a different eye movement. By looking up, down, left or right, the speaker selects the grid that the letter they want belongs to. The artificial intelligence algorithm is then able to predict the word or sentence they are trying to say.
Zhang’s research, Smartphone-Based Gaze Gesture Communication for People with Motor Disabilities, is set to be presented at the Conference on Human Factors in Computing Systems in May.
1.The underlined word “mitigate” in paragraph 4 probably means __________________.
A. adjust B. correct
C. strengthen D. lessen
2.We can learn from the passage that _________________________.
A. the present eye-tracking input systems can hardly meet the demands of ALS sufferers
B. ALS sufferers can only be understood through the newly-developed app
C. speakers can make different eye movements to choose a preferable word or sentence
D. the GazeSpeak app is going to be released at the Conference on Human Factors in May
3.Which is the best title for the text?
A. Microsoft finds hope for patients with ALS
B. Motor neurone disease might be cured by a smartphone app
C. A smartphone app helps ALS sufferers speak with their eyes
D. ALS sufferers finally find a way to communicate
Senior-3 English reading comprehension (medium difficulty)
Microsoft has developed a new smartphone app that interprets eye signals and translates them into letters, allowing people with motor neurone disease to communicate with others from a phone.
The GazeSpeak app combines a smartphone’s camera with artificial intelligence to recognize eye movements in real time and convert them into letters, words and sentences.
For people suffering from ALS(渐冻症), also known as motor neurone disease, eye movement can be the only way they are able to communicate.
“Current eye-tracking input systems for people with ALS or other motor impairments are expensive, not robust under sunlight, and require frequent re-calibration (重新校对) and substantial, relatively immobile setups,” said Xiaoyi Zhang, a researcher at Microsoft who developed the technology.
“To mitigate the drawback,we created GazeSpeak, an eye-gesture communication system that runs on a smartphone, and is designed to be low-cost, robust, portable and easy to learn.”
The listener uses the app by pointing their smartphone at the speaker. A chart that can be stuck to the back of the smartphone is then used by the speaker to determine which eye movements to make in order to communicate.
The sticker shows four grids of letters, each of which corresponds to a different eye movement. By looking up, down, left or right, the speaker selects the grid that the letter they want belongs to. The artificial intelligence algorithm(计算程序) is then able to predict the word or sentence they are trying to say.
1.Who is the smartphone app designed for?
A. People with motor eye disease.
B. People with problems in eye communication.
C. People with problems in intelligence.
D. People suffering from motor neurone disease.
2.The underlined word “mitigate” in the fifth paragraph probably means ________.
A. relieve B. get rid of
C. remove D. find
3.According to the text, ________.
A. for people suffering from ALS, eye movement is one of the best ways to communicate.
B. a new communication system that runs on a smartphone is designed to be immobile.
C. the speaker could use the app by pointing their smartphone at the listener.
D. the artificial intelligence algorithm will translate sentences for speakers.
Senior-3 English reading comprehension (hard)
Microsoft has developed a new smartphone app that interprets eye signals and translates them into letters, allowing people with motor neurone disease to communicate with others from a phone.
The GazeSpeak app combines a smartphone’s camera with artificial intelligence to recognize eye movements in real time and convert(改变) them into letters, words and sentences.
For people suffering from ALS(渐冻症), also known as motor neurone disease, eye movement can be the only way they are able to communicate.
“Current eye-tracking input systems for people with ALS or other motor impairments are expensive, not robust under sunlight, and require frequent re-calibration and substantial, relatively immobile setups,” said Xiaoyi Zhang, a researcher at Microsoft who developed the technology.
“To mitigate the drawbacks…we created GazeSpeak, an eye-gesture communication system that runs on a smartphone, and is designed to be low-cost, robust, portable and easy to learn.”
The listener uses the app by pointing their smartphone at the speaker. A chart that can be stuck to the back of the smartphone is then used by the speaker to determine which eye movements to make in order to communicate.
The sticker shows four grids(方格) of letters, each of which corresponds to a different eye movement. By looking up, down, left or right, the speaker selects the grid that the letter they want belongs to. The artificial intelligence algorithm(程序) is then able to predict the word or sentence they are trying to say.
1.What does the underlined word “mitigate” in paragraph 5 probably mean?
A. ignore B. destroy
C. increase D. reduce
2.The passage mainly tells us ________.
A. the advantages of GazeSpeak over current eye-tracking input systems
B. a smartphone app helps ALS sufferers speak with their eye movements
C. the sticker plays an important role in GazeSpeak
D. the writer is making an advertisement for GazeSpeak
3.What’s the writer’s attitude towards the invention of GazeSpeak?
A. doubtful B. negative
C. favorable D. unclear
Senior-3 English reading comprehension (hard)
Microsoft has developed a new smartphone app that identifies eye signals and translates them into letters, allowing people with motor neurone disease (运动神经元症) to communicate with others from a phone.
The GazeSpeak app combines a smartphone’s camera with artificial intelligence to recognize eye movements immediately and change them into letters, words and sentences.
For people suffering from ALS (渐冻症) , also known as motor neurone disease, eye movement can be the only way they are able to communicate.
“Current eye-tracking input systems for people with ALS are expensive, not strong under sunlight, and require frequent re-adjustment and material, relatively steady setups,” said Xiaoyi Zhang, a researcher at Microsoft who developed the technology.
“To ease off the disadvantages, we created GazeSpeak, an eye-gesture communication system that runs on a smartphone, and is designed to be low-cost, strong, portable and easy to learn.”
The listener uses the app by pointing their smartphone at the speaker. A chart that can be stuck to the back of the smartphone is then used by the speaker to determine which eye movements to make in order to communicate.
The sticker shows four grids of letters, each of which is equal to a different eye movement. By looking up, down, left or right, the speaker selects the grid that the letter they want belongs to. The artificial intelligence is then able to predict the word or sentence they are trying to say.
Zhang’s research, Smartphone-Based Gaze Gesture Communication for People with Motor Disabilities, is set to be presented at the Conference on Human Factors in Computing Systems in May.
1.According to the passage, people with ALS can communicate with others by __________.
A. eye contact B. body language
C. hand shaking D. language expression
2.Which of the following is NOT an advantage of GazeSpeak?
A. cheap B. unsteady
C. accessible D. learnable
3.How do the speakers use the app?
A. They point at the letters they want on the phone.
B. They predict the word or sentence they try to say.
C. They turn to the listeners to speak out the letters for them.
D. They look in the four directions to choose the grid of letters.
4.Which of the following might be the best title for the text?
A. People Suffering From ALS Need Help
B. Smartphone App Helps ALS Sufferers Speak With Eyes
C. How Do People With ALS Communicate With Others
D. Researchers Develop The New Smartphone App
Senior-3 English reading comprehension (hard)
A technology company is developing a lie detector app for smartphones that could be used by parents, teachers—and even Internet daters.
The app measures blood flow in the face to assess whether or not you are telling the truth. Its developers say that it could be used for daters wanting to see if somebody really is interested in them. Parents could use it on their children to see if they are lying and teachers could work out which of their pupils are honest.
The app is being developed by Toronto startup NuraLogix and the software is called Transdermal Optical Imaging. The idea is that different human emotions create different facial blood flow patterns that we have no control over. These patterns change if we are telling the truth or telling a lie.
Using the footage(拍摄的片段) from the smartphone camera, the software will see the changes in skin colors and compare them to standardized results. A study last year found that anger was associated with more blood flow and redness while sadness was associated with less of both.
Developmental neuroscientist(神经病学家)Kang Lee, who has been researching the field for 20 years, said, “It could be very useful, for example, for teachers. A lot of our students have math anxiety but they do not want to tell us, because that’s embarrassing.” Lee added that the technology would not replace lie detectors used in a court of law. He said: “They want the accuracy to be extremely high, like genetic tests, so a one-in-a-million error rate. Our technique won’t be able to achieve an extremely high accuracy level, so because of that I don’t think it’s useful for the courts.” He added that it will be a few years before the app is available to consumers.
1.How does the app work to identify whether the person is lying or not?
A. By controlling the blood flow patterns in our face when people are speaking
B. By measuring blood flow patterns and comparing changes in skin colors
C. By taking footage to replace lie detectors used in a court of law
D. By creating different facial blood flow patterns people needed
2.Why can’t Transdermal Optical Imaging be applied in courts at present?
A. It is too complicated to standardize results in courts
B. It hasn’t reached the required accuracy yet
C. Genetic tests are enough for situations like this
D. Its use is forbidden by law
3.What can be inferred from Kang Lee’s remarks in the last paragraph?
A. Students with math anxiety are rarely ashamed of themselves
B. Lie detectors and Transdermal Optical Imaging are of the same function
C. The result of genetic tests is far more accurate than that of Transdermal Optical Imaging
D. Consumers will be able to download the app in the near future
Senior-3 English reading comprehension (hard)
New research has uncovered that culture is a determining factor when interpreting facial emotions (情感). The study reveals that in cultures where emotional control is the standard, such as Japan, focus is placed on the eyes to interpret emotions, whereas in cultures where emotion is openly expressed, such as the United States, the focus is on the mouth to interpret emotion.
"These findings go against the popular theory that the facial expressions of basic emotions can be universally recognized," said University of Alberta researcher Dr. Takahiko Masuda. "A person's culture plays a very strong role in determining how they will read emotions and needs to be considered when interpreting facial expression."
These cultural differences are even noticeable in computer emoticons (情感符号), which are used to convey a writer's emotions over email and text messaging. The Japanese emoticons for happiness and sadness vary in terms of how the eyes are drawn, while American emoticons vary with the direction of the mouth. In the United States the emoticons :) and :-) show a happy face, whereas the emoticons :( or :-( show a sad face. However, Japanese tend to use the symbol (^_^) to indicate a happy face, and (;_;) to indicate a sad face.
"We think it is quite interesting and appropriate that a culture tends to mask its emotions. The Japanese would focus on a person's eyes when determining emotion, as eyes tend to be quite subtle (微妙的)," said Masuda. "In the United States, where open emotion is quite common, it makes sense to focus on the mouth, which is the most expressive feature on a person's face."
1.The text mainly tells us that __________.
A.cultural differences are expressed in emotions
B.culture is the key to interpreting facial emotions
C.different emoticons are preferred in different cultures
D.people from different cultures express emotions differently
2.Which emoticon is used by Americans to show a happy face?
A.(;_;) B.:-) C.:-( D.: (
3.If a Japanese wants to detect whether a smile is true or false, he will probably_______.
A.read the whole face B.focus on the mouth
C.look into the eyes D.judge by the voice
4.People used to believe that _______.
A.some facial expressions of emotions were too complex to be recognized
B.people in the world interpreted basic emotions in different ways
C.people could only recognize the facial expressions of basic emotions
D.people all over the world understood basic emotions in the same way
Senior-3 English reading comprehension (easy)
A new app aims to help parents interpret what their baby wants based on the sound of their cry. The free app ChatterBaby, which was released last month, analyzes the acoustic (声学的) features of a baby’s cry, to help parents understand whether their child might be hungry, fussy or in pain. While critics say caregivers should not rely too much on their smartphone, others say it’s a helpful tool for new or tired parents.
Ariana Anderson, a mother of four, developed the app. She originally designed the technology to help deaf parents better understand why their baby was upset, but soon realized it could be a helpful tool for all new parents.
To build a database, Anderson and her team uploaded 2,000 audio samples of infant(婴儿) cries. She used cries recorded during ear piercings and vaccinations to distinguish pain cries. And to create a baseline for the other two categories, a group of moms had to agree on whether the cry was either hungry or fussy.
Anderson’s team continues to collect data and hopes to make the app more accurate by asking parents to get specific about what certain sounds mean.
Pediatrician Eric Ball pointed out that evaluating cries can never be an exact science. “I think that all of the apps and technology that new parents are using now can be helpful but need to be taken seriously,” Ball said. “I do worry that some parents will get stuck in big data and turn their parenting into basically a spreadsheet(电子表格), which I think will take away the love and caring that parents are supposed to be providing for the children.”
But Anderson said the aim of the app is to have parents interpret the results, not to provide a yes or no answer. The Bells, a couple using this app, say it’s a win-win. They believe they are not only helping their baby now but potentially others in the future.
1.How does the app judge what babies want?
A. By collecting data.
B. By recording all the sounds.
C. By analyzing the sound of their cries.
D. By asking parents about specific messages.
2.What was the app designed for in the beginning?
A. All new parents. B. Deaf parents.
C. Ariana Anderson. D. Crying babies.
3.What is Ball’s opinion about the app?
A. Parents should use the app wisely.
B. The app can create an accurate result.
C. Parents and babies are addicted to the app.
D. The app makes babies lose love and caring.
4.What is the text mainly about?
A. Parents should not rely too much on their smartphones.
B. A new app helps parents figure out why their babies are crying.
C. Parents can deal with babies’ hunger with the help of a new app.
D. A new app called ChatterBaby can prevent babies from crying.
Senior-3 English reading comprehension (medium difficulty)