Reading Comprehension
The new social robots, including Jibo, Cozmo, Kuri and Meccano M.A.X., bear some similarities to assistants like Apple's Siri, but these robots come with something more. They are designed to win us over not with their smarts but with their personality. They are sold as companions that do more than talk to us. Time magazine cheered for the robots that "could fundamentally reshape how we interact with machines." But is reshaping how we interact with machines a good thing, especially for children?

Some researchers in favor of the robots don't see a problem with this. People have relationships with many kinds of things. Some say robots are just another thing with which we can have relationships. To support their argument, roboticists sometimes point to how children deal with toy dolls. Children animate dolls and turn them into imaginary friends. Jibo, in a sense, will be one more imaginary friend, and arguably a more intelligent and fun one.

Getting attached to dolls and sociable machines is different, though. Today's robots tell children that they have emotions, friendships, even dreams to share. In reality, the whole goal of the robots is emotional trickery. For instance, Cozmo the robot needs to be fed, repaired and played with. Boris Sofman, the chief executive of Anki, the company behind Cozmo, says that the idea is to create "a deeper and deeper emotional connection ... And if you neglect him, you feel the pain of that." What is the point of this, exactly? What does it mean to feel the pain of neglecting something that feels no pain at being neglected, or to feel anger at being neglected by something that doesn't even know it is neglecting you?

This should not be our only concern. It is troubling that these robots claim to understand how children feel. Robots, however, have no emotions to share, and they cannot put themselves in our place. No matter what robotic creatures "say", they don't understand our emotional lives. They present themselves as empathy machines, but they are missing the essential equipment. They have not been born; they don't know pain, or death, or fear. Robot thinking may be thinking, but robot feeling is never feeling, and robot love is never love.

What is also troubling is that children take robots' behavior to indicate feelings. When the robots interact with them, children take this as evidence that the robots like them, and when robots fail to work when needed, children also take it personally. Their relationships with the robots affect their self-esteem. In one study, an 8-year-old boy concluded that the robot stopped talking to him because the robot liked his brothers better.

For so long, we dreamed of artificial intelligence offering us not only simple help but conversation and care. Now that our dream is becoming real, it is time to deal with the emotional downside of living with robots that "feel".