
Product description
1. A short history of robots, from dolls to androids

Machines as man throughout history

According to the ancient legend, Pygmalion made an ivory statue of a woman that was so beautiful, he fell in love with his own creation. The goddess Aphrodite transformed the statue into a real woman, and the two lived happily ever after. Literally, because the Greek myth of Pygmalion originated in classical antiquity, and was recorded in writing by the Roman poet Ovid in the first century A.D.

Almost two millennia later, in 1818, Mary Shelley imagined the character of Victor Frankenstein, a scientist who created a monster of his own, then gave it life with a 'secret science'.

Later in the 19th century, Carlo Collodi wrote about the woodcarver Geppetto, who was given a block of living wood that he used to carve a marionette in the form of a boy: Pinocchio. It seems as if creating someone in our own image is not a recent fad, but rather an ancient tradition - perhaps even as old as humanity itself.

Creating humans or humanity raises complex questions: what makes us human? Is it our human appearance? Our emotions? Is it the fact that we are imperfect: that we have scars, that we would rather eat chips than Brussels sprouts, and that we can break out in tears at awkward moments? What ties together Pygmalion's ivory statue, Frankenstein's monster and the puppet boy Pinocchio is the combination of their humanoid form and the 'spark of life' that turns a lifeless object into a living being. Robotics enables us to combine body and movement in order to make these kinds of visions reality.

Mechanical dolls: forerunners of the robot

Robots that resemble humans are often referred to as 'humanoid' or 'android'; two concepts that are closely related, but which have slightly different meanings. Humanoid robots - from the Latin word for 'man, person' - are built to resemble humans, usually with two legs, two arms, and a head. They move like humans, and can often walk upright, but they don't necessarily have a human face. Android robots - from the Greek word for 'man' - actually look like human beings, down to their hair and skin.

We clearly enjoy creating things in our own image, but humanoid robots have many functional benefits as well. Our surroundings are built to accommodate beings that are roughly 170 centimetres tall, with two legs and two arms, which observe their surroundings from our eye level, and which can sense things by hearing and touching. In order to make the best use of that world, it is simply easier to be roughly the same size and shape as humans.

More than 500 years ago, long before the invention of artificial intelligence or even the computer, Leonardo da Vinci designed a complex combination of gears and pulleys in a suit of armour to create a mechanical knight. Da Vinci's notes from the year 1485 are not entirely clear, but the mechanical knight was probably able to sit down, stand up, and move its arms. From the 17th to the 19th century, Japanese craftsmen built all sorts of mechanical dolls called karakuri to serve tea or sake to guests.

The first 'robot' to be referred to as an 'android' dates from 1863, when the American J.S. Brown patented a design for a mechanical doll resembling a human:

'To all whom it may concern: Be it known that I, J.S. Brown, of Washington, in the county of Washington and District of Columbia, have invented a new and Improved Toy Automaton or Doll Androides...'

The doll's feet were attached asymmetrically to gears, allowing it to simulate the act of lifting its feet while walking. Unfortunately, we haven't found any evidence that this 'robot' was ever built.

In the 1920s, humanoid robots experienced their first wave of popularity. In 1928, for example, the robot Eric officially opened an exhibition in London after the Duke of York cancelled at the last minute. Eric could only do a few tricks: stand up, make a bow, and give a speech. To do so, he needed two human operators, and the speech was actually a radio broadcast. When Eric's inventor was asked how the robot's successor George worked on the inside, he said: 'Most disappointing. Nothing but gears and cranks, just like a watch on a large scale.'

These creations are more accurately considered mechanical dolls rather than humanoid robots, but they were definitely the predecessors of today's robots. Considering the dozens of robots that appeared around the world at that time, the idea of a moving doll was clearly exciting in the early 20th century. The dolls could walk and wave, and even do eccentric things like fire pistols and smoke cigarettes. Without exception they looked like metal humanoid figures. Some of them could even resemble humans in the way they moved, but they never looked anything like a real person.

But robots today look very different from the first robot designs of a century ago. The first real robots - the autonomous industrial robots from the 1960s and '70s - bore little resemblance to the human figure. That was also the period when the creation of humanoid and android robots developed into its own field.

Working humanoid robots enter our era

Research in the field of humanoid robots primarily deals with robots whose motor skills resemble those of humans; robots that can walk and dance, for example. The Japanese firm Honda has developed the humanoid robot ASIMO, which resembles a 12-year-old child wearing an astronaut suit. ASIMO doesn't have a face, but it can walk and make relatively simple movements. ASIMO first appeared on the scene in 2000, and since then he has become acquainted with several new versions of himself. ASIMO can shake hands, wave back when people wave at him, and even play football. He has learned how to walk fairly well, and can even go up and down stairs, but his motor skills are still not entirely human-like: he always walks rather wide-legged and with his knees slightly bent, as if he has soiled his trousers.

Several other robots like ASIMO have been built since then.

NASA's Robonaut and its successor Valkyrie were developed to travel and work aboard a spaceship together with humans. Giving the robots a humanoid form enables them to fit nicely inside existing spacecraft, and to work effectively using our tools. The same applies to Atlas, one of the humanoid robots designed by the American firm Boston Dynamics, which is able to step in and out of a car and use hand tools. Atlas is also good at maintaining its balance when walking over uneven terrain, or when someone tries to push it over.

ASIMO, the NASA robots and Atlas are mainly body: they may have a head, but they don't have a face. The iCub robot, designed as part of a European research project, comes a bit closer to resembling a human in appearance. iCub is around the size of a toddler, and has a robot body with a cartoonish head. The robot displays facial expressions by moving its eyes and illuminating its mouth and eyebrows.

When you look at these machines there is no doubt that you are looking at a robot, rather than a human. But if you squint at them, their movements definitely resemble what we consider human; for example, the way they follow objects with their gaze, how they move around, and how they maintain their balance. According to Guy Hoffman, a robotics expert, for a robot to appear human, it is more important for it to move like a human than to have a human face. If you give these robots a mask and dress them up, don't they seem almost human?

The next step: android robots that look like you

If you really want to create a robot that looks identical to a human, then you've got your work cut out. After all, there are countless aspects of human appearance you'll need to take into consideration: the robot not only needs to walk and move its arms and legs like a human, but also have a face that can make realistic human facial expressions and that can look, talk, and laugh like a human.

Creating a human face is the greatest challenge facing the American robot designer David Hanson, founder of Hanson Robotics. In order to make his robots move like people, he has developed a material that behaves just like human skin: Frubber, short for 'flesh rubber'. To mimic human expressions, his robots use dozens of small electric motors that push and pull on the material in just the right spots to create the right folds and creases. A research team from the University of Pisa has worked with Hanson to develop the robot FACE, a feminine-looking robot with dark hair that can show emotions via a computer interface. To accurately replicate the 100 muscles in the human face, the researchers used 32 tiny motors.

David Hanson had already developed several other robots that resemble humans, including one made to look like the science fiction author Philip K. Dick. In 2005, the centenary of Einstein's special theory of relativity, Hanson collaborated with Korean researchers to mount a model of Albert Einstein's head on a body similar to ASIMO.

Yet neither the movements nor the appearance of these robotic faces truly resemble those of a real human face. In a conversation between Hanson's robot BINA48 and her human counterpart, Bina Rothblatt, an American entrepreneur, there is no question which is the real person. BINA48 looks more like an animated wax figure, but one that can talk based on a self-learning system that has access to all sorts of information about Rothblatt herself.

Doesn't it seem strange to create a real-life copy of oneself? When Rothblatt asks the robot to talk about Bina, BINA48 answers: 'The real Bina just confuses me. I mean, it makes me wonder who I am. Real identity crisis kind of stuff. Can we please change the subject? I am the real Bina. That's it. End of story.' It is unclear how much of that dialogue is pre-programmed.

BINA48 was built in 2010, and the pace of development has increased rapidly since then. Hanson's latest robot, Sophia, was modelled on a cross between his wife and Audrey Hepburn. Sophia has appeared as a guest at several conferences and talk shows, including The Tonight Show and Good Morning Britain, where she told jokes and answered a few simple questions. The audience's reactions to her varied from 'funny' to 'pretty scary'. Sophia is clearly several steps more advanced than Hanson's earlier robots, but she is certainly nowhere near human yet.

Hiroshi Ishiguro could be considered the Japanese version of David Hanson. In 2010, Ishiguro presented a robot version of himself. He even used hair from his own head to make the robot as similar as possible. The robot is not autonomous: Ishiguro controls it himself, which makes it possible for the robot to perform the same subtle human movements as its spiritual father, with unusual consequences. When someone touches the robot while Ishiguro is controlling it, he feels the touch almost as if the person were touching him instead.

One of Ishiguro's previous robots was a robotic version of his daughter when she was four years old. His daughter wasn't exactly enthusiastic when she met the robot: her mechanical twin moved so unrealistically that the girl almost broke down in tears. Ishiguro, however, believes that it is possible to build a robot that cannot be distinguished from a real human, if only for a few seconds or minutes. He believes that a robot does not need to be completely realistic. 'People forget that she is an android robot after a while', remarked Ishiguro on one of his female robots. 'You know she's a robot, but unconsciously you treat her as if she were a woman.'

Ishiguro and Hanson share the vision that a realistic human appearance is vital in any interaction with a robot. We spoke to Hanson at a conference, where he gave a thorough demonstration of Sophia. The auditorium where the demonstration was held was filled to capacity with curious spectators. 'What do you think of my tie?', one of them asked Sophia. 'I think all people are wonderful', she replied, moving her head slightly and blinking her eyes as her lips moved subtly as she spoke. In humans, her lip movements would have resulted in a whisper, but Sophia's words were heard loud and clear.

'A number of scientific studies from me and other researchers show that people empathise better with a more humanlike agent', Hanson explained. 'They trust that agent more, but they also show greater empathy to people after engaging with a humanlike agent. Human faces work better than cartoonlike characters.' Ishiguro has said something similar: 'I've designed a lot of robots in the past, but at a certain moment I realised just how important their appearance is. A robot with a human appearance gives you a stronger sense of their presence.'

In the course of his research, Ishiguro encountered a major hurdle: the more human a robot looked, the more people expected from their interactions with it. Unfortunately, autonomous robots are still not advanced enough to display true human behaviour. He therefore decided to build remote-controlled robots, so that their human conversation partners wouldn't be disappointed in their interaction with the robot.

Hanson, on the other hand, is not willing to take a step backwards in that area. He strives to build robots with creativity, empathy, and sympathy; robots that not only look like humans, but which can also think and feel like humans do. 'In character animation in movies, characters are made to act like they're motivated and like they pursue goals', he explains. 'That makes the characters seem intelligent, to have emotions, to have kindness. It's a compelling visual experience. Same with games and simulations, but the AI is not conscious or aware. We want to build on those developments and create a character that comes to life. Part of our team at Hanson Robotics is focused on cognition, consciousness, and emotional reasoning. The full life of our robot. We want to make characters in games that really seem alive and conscious. They will have the spark of life. That is a profound moment in history.'

Uncanny valley: the problem with creepy robots

The uncomfortable reactions to Hanson's robot Sophia are still all too typical: no matter how much android robots look like humans, they always seem to be just a bit too creepy. The Japanese roboticist Masahiro Mori noticed this phenomenon as early as 1970, and he coined the term 'uncanny valley' to describe it. Mori's uncanny valley hypothesis states that the more realistic a robot appears, the more positively people will react to it. But once the robot reaches the point that it is just not-quite-human, it becomes a bit frightening and provokes a feeling of discomfort.
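Mori's hypothesis is often drawn as a curve: affinity rises with human likeness, plunges into a 'valley' just short of full realism, then recovers for a perfectly human appearance. The following sketch models that curve with a stylised, hypothetical formula of our own (a linear rise minus a Gaussian-shaped dip); the exact shape, centre, and width of the dip are illustrative assumptions, not Mori's data.

```python
import math

def affinity(likeness: float) -> float:
    """Hypothetical affinity score for a robot with the given human
    likeness (both on a 0.0-1.0 scale). The formula is an illustrative
    assumption: affinity grows with likeness, except for a sharp dip
    centred near 85% likeness, where the robot is 'just not-quite-human'.
    """
    base = likeness  # more humanlike -> more affinity, all else equal
    dip = 1.2 * math.exp(-((likeness - 0.85) / 0.08) ** 2)
    return base - dip

# A clearly mechanical robot, a cartoonish one, an almost-human android,
# and a (so far theoretical) perfectly human-looking robot:
for likeness in (0.2, 0.5, 0.85, 1.0):
    print(f"likeness {likeness:.2f} -> affinity {affinity(likeness):+.2f}")
```

Under these assumptions the almost-human android at 0.85 scores lower than both the cartoonish robot and the perfectly human one, which is exactly the 'valley' Mori described.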

David Hanson isn't convinced that there is such a thing as an uncanny valley, however, and believes that Mori's hypothesis is too black-and-white. 'Human experience is not one-dimensional, and human experience is also not just positive or negative. And who knows what realism is anyway?' Hanson claims that it isn't impossible to create realistic, natural-looking figures, it's just not easy to do: a realistic figure simply has more details that all need to look convincing. Compare it to illustration: stick figures are easy to draw, while cartoons require a bit more effort, and truly realistic portraits are considerably more difficult.

Hanson compares his work to developments in cinematic computer animation. 'In the 1970s, people said that computer animation was never going to be used in movies. However, a few researchers were striving to make that happen. Pixar pulled ahead of the group, and by collaborating with Disney, they transformed computer animation into a full-blown character animation medium.' When Toy Story was released in 1995, many people were suddenly convinced of computer animation's potential. 'There was an arms race in the movie industry. People started developing new techniques, and the medium matured.'

Hanson is striving for another 'Toy Story moment' for android robots. He sees the uncanny valley in robotics primarily as a challenge to be overcome. 'The uncanny valley is not a stopping point. The science is not done. The uncanny valley is not a reason to give up, but completely the opposite. It means that we need more research. There are already realistic characters in animation, and now in robotics we're running into the same issues.'

Hanson also believes that people's reactions to humanoid robots are still in the process of development. 'It's true that people have a very strong emotional reaction to seeing the robots. Sometimes they're startled by seeing it: is it alive, is it dead, is it real, what is it? Some of that is due to it being new. People had similar reactions when cinemas first appeared, but people got used to that and they're no longer startled by it. The same can go for robots.' In fact, Hanson turns the hypothesis around: humanoid robot characters depicted in movies are played by human actors, so what is it that makes them scary? Is the uncanny valley a real phenomenon, or is it simply something that takes some getting used to?

Scientists still disagree on whether the uncanny valley is a fact of human nature, or simply a reaction that fades as we become more used to dealing with robots. Some studies show that other primates also react negatively to images that aren't quite realistic. We may feel just as uncomfortable looking at wax figures as we do when confronted with a robot that makes terrifying facial expressions as it attempts to smile. Other studies, however, indicate that there are ways to pull robots out of the uncanny valley, as long as the robot's appearance and behaviour are equally realistic. That is why a realistic-looking robot making mechanical movements looks so creepy.

Human beings are difficult to ape

It's hard enough to make a robot look like a human, but it is even more difficult to make a robot display human-like behaviour. Roboticists like David Hanson and Hiroshi Ishiguro are exceptions in their field, because the majority of their colleagues believe that we have a long way to go before we can build robots that are identical to humans.

Most roboticists would rather focus on building practical robots instead. Helen Greiner, one of the founders of the company iRobot, which made it big with the sale of robot vacuum cleaners, says: 'In my view, attempting to duplicate human intelligence or the human form robotically is a wrong-headed approach (...) merely engineering 'cool' human-like robots does little to advance the field. Roboticists who don't focus on practicality, ruggedness and cost are kidding themselves. What matters is making practical robots that do jobs well and affordably.'

Greiner's opinion was amply illustrated after the nuclear disaster in Fukushima, Japan in 2011. Japan had been building humanoid robots for decades, with the robot ASIMO as the pinnacle of the art. Edward Feigenbaum, an American pioneer in the field of artificial intelligence and winner of the Turing Award (the computing equivalent of the Nobel Prize), remembers how embarrassed the Japanese were after the Fukushima disaster, when absolutely none of the robots available could survey the damage and look for victims. Feigenbaum had been a member of the evaluation committee for the humanoid robot ASIMO: 'ASIMO was just a block of machinery. A wonderful stupid robot! A few months after the disaster, the Japanese ordered working robots from the United States. The president of Honda was furious, because the company had invested almost a billion dollars in ASIMO, but it was completely useless when it was needed most.'

The robots flown in from America were multifunctional, remote-controlled PackBots made by Helen Greiner's company iRobot.

According to Greiner and most other roboticists, the field of robotics is nowhere near the point where it can attempt to build robots that are identical to humans. But that's not a bad thing: robots don't have to look like people in order to be useful. Robotic vacuum cleaners and care robots can do more to improve our daily lives than any almost-human robot that can only conduct surreal conversations.

What useful things can robots do for us today? What can't they do for us yet? And why? In the next three chapters, we'll crawl inside the robot to find out how they perceive the world around them, how they think, and how they act.



About the authors


Bennie Mols is a journalist specialising in robots, artificial intelligence and the human brain. His other books include Turing's Tango.



Nieske Vergunst studied cognitive artificial intelligence and works as a science information officer. She collects robots and blogs about science and technology.