For years now, researchers have been exploring ways to create devices that understand the nonverbal cues we take for granted in human-human interaction. One of the more interesting projects we've seen of late is led by Professor Peter Robinson at the Computer Laboratory at the University of Cambridge, who is working on what he calls "mind-reading machines," which can infer people's mental states from their body language. The hope is that, by analyzing faces, gestures, and tone of voice, machines could be made more helpful (hell, we'd settle for "less frustrating"). Peep the video after the break to see Robinson using a traditional (and annoying) satnav device, versus one that features both the Cambridge "mind-reading" interface and a humanoid head modeled on that of Charles Babbage. "The way that Charles and I can communicate," Robinson says, "shows us the future of how people will interact with machines." Next stop: uncanny valley!
Cambridge developing 'mind reading' computer interface with the countenance of Charles Babbage (video) originally appeared on Engadget on Thu, 23 Dec 2010 21:01:00 EDT.