The emotional robot

Humans and computers are fundamentally different. Apart from the ungainly exterior of the latter, the main difference is the way in which they process information. Computers follow the instructions of programmed algorithms and user input, whereas the human brain processes information in a nonlinear way with often unexpected results—which explains much of human inventiveness and creativity. Yet, it might not be too long before computers 'evolve' to emulate human cognition—with some help from their human masters, of course.

Researchers from fields as diverse as computer science, mathematics, neuroscience, kinematics and cognitive science are getting closer to creating computers and robots that can reason, learn and recognize emotion. They might finally realize a dream that is as old as the golem of Jewish folklore and as current as blockbuster science fiction: robots that understand human emotions, and that can adapt to new environments and unexpected situations.

According to Stan Franklin, a Professor in the Cognitive Computing Research Group at the University of Memphis in Tennessee, USA—who describes himself as a mathematician turned computer scientist turning cognitive scientist—these types of cognitive computing project usually have aspects of both biology and engineering. To replicate the inner workings of the brain, scientists first need to understand how the brain processes information, creates emotions and achieves cognition—questions that are still far from being answered. Computer scientists then use this information—or as much of it as is available—to create algorithms that emulate cognition. Once the many cognitive and engineering problems are solved, it should be possible to programme computers that think, act and feel like humans.

Martin McGinnity, Professor of Intelligent Systems Engineering at the University of Ulster in Northern Ireland, has noted that a computer that could …