<p>The brain could be a remote unit with higher capacity, not unlike MIT's NEXI "social robot"...</p><p><img style="width:300px;height:447px" src="http://images.dailytech.com/nimage/7799_Sad%20Robot.jpg" border="0" alt="" /><br /><br />
MIT page<br /><br />
MIT technical video (WATCH THIS ONE!!) (MOV format)<br /><br />Using Save As and watching it in QuickTime at 2x size is recommended.<br /><br />
YouTube video (shows expressiveness etc.)<br /><br />
DailyTech story... <br /></p><div style="margin:5px 0px 0px"><div class="smallfont" style="margin-bottom:2px">Quote:</div><table border="0" cellspacing="0" cellpadding="4" width="100%"><tbody><tr><td class="alt2" style="border:1px"><strong>MIT Develops Advanced Humanlike Robot</strong><br /><br />Like something straight out of the movies, MIT's NEXI robot has human-like expressions and speech, which is either really cool or really creepy.<br /><br />Scientists continue to push the boundaries of artificial intelligence, deploying robots and computer AIs into increasingly complex and varied situations.<br /><br />Many observers of robotics and artificial intelligence, including Apple co-founder Steve Wozniak, remain skeptical that robots will ever be able to perform human-like tasks and interact with humans on a social basis.<br /><br />However, seeing is believing, and if MIT's startling new video is any indication, researchers at the MIT Media Lab are much closer to overcoming the latter obstacle than previously thought. The product of the Lab's team, directed by Dr. Cynthia Breazeal, is a human-like robot named Nexi that speaks and features complex hand movements and facial gestures.<br /><br />Nexi is a Mobile, Dexterous, Social robot, or MDS. The robot is mobile in that it can navigate via wheels. It features a self-balancing mobile base, akin to a mini-Segway, and it can travel at human walking speed.<br /><br />The robot is dexterous in that it has two highly agile arms. The arms have four degrees of freedom (DOF), are elastic, and are based on the DOMO/WAM-style arm design. They support position and force control via force sensors. Together, the arms can pick up a 10-pound object while fully extended, and several of the robots can "team up" to lift heavier objects. The shoulder chassis of the robot is mounted on a torso pivot, giving it full freedom of motion.<br /><br />A DSP and an FPGA control the motors, while balancing and force control are handled by an embedded PC running Linux mounted near the base. The Linux PC features wireless communication, and a laser is used to avoid obstacles.<br /><br />The hands are one of the robot's unique features. They have five degrees of freedom, and the forearm can roll and flex at the wrist much like a human forearm. Each hand has three fingers and an opposable thumb, with the index finger and thumb independently controlled and the other two fingers coupled together. The robot can grip objects and make hand gestures to convey emotions. The arms were developed by Meka, Inc. with help from MIT, and also feature protection against collisions and slips.<br /><br />The most interesting and perhaps most disturbing part of Nexi is its expressive face. The face, designed by Xitome Design with MIT, supports complex expressions. The four-DOF neck can bend low at the base, and the head supports pan, tilt, and yaw, allowing for human-like motions. It can nod, shake its head, or move its head as if orienting itself with its surroundings.<br /><br />The face has 15 DOF and features expressive eyebrows, gaze, eyelids, and mandible. Each eye has a color CCD camera, and the head also features an active indoor IR camera. Four separate microphones allow it to localize sounds, and another microphone is used to detect speech. 
It has a speaker that allows it to synthesize speech.<br /><br />For the robot's human-like behavior and interaction, MIT is focusing on a human-robot interaction approach, which seeks to identify what average citizens want in a robot. MIT will be deploying a team of four robots during a two-week pilot program at the Boston Museum of Science in the summer of 2009.<br /><br />The robot will interact with visitors within a "robot playroom." It will engage visitors in conversation and express emotions, and during these interactions it will try to learn conversation and new behaviors. At points the MIT operators can elect to tele-operate the robot, Wizard of Oz style, to give it more complex behavior or to keep conversations from getting too boring. The robot supports many emotions, including sadness, anger, confusion, excitement, and boredom.<br /><br />In the video, Nexi independently demonstrates its basic conversational skills, greeting the viewer and telling them, "But I hope you can see that I am very happy that I met you. Thank you for visiting me and I hope to see you again soon!"<br /><br />While the MIT researchers admit that human-level learning and more complex conversational skills remain unsolved challenges, Nexi certainly represents an amalgamation of exciting and exotic advances in robotics. With robots like Nexi that can learn and interact, the world may soon become a very different place.<br /><br />The MIT team's research is sponsored by an ONR DURIP Award "Mobile, Dexterous, Social Robots to Support Complex Human-Robot Teamwork in Uncertain Environments" and by a Microsoft grant.
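<br /><br />As an aside, here is a rough idea of how the sound localization mentioned above can work in principle: with two (or more) microphones, the time difference of arrival (TDOA) of a sound at each microphone indicates the direction of the source. The Python sketch below is purely illustrative; the microphone spacing, sample rate, and function name are assumptions of mine, not details of MIT's actual system.<br /><br /><pre>
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s at room temperature
MIC_SPACING = 0.10       # metres between the two microphones (assumed)
SAMPLE_RATE = 16000      # Hz (assumed)

def estimate_bearing(sig_left, sig_right):
    """Estimate the bearing (degrees) of a sound source from two mic signals."""
    # Cross-correlate the two signals; the lag of the peak is the TDOA in samples.
    corr = np.correlate(sig_left, sig_right, mode="full")
    lag = np.argmax(corr) - (len(sig_right) - 1)
    tdoa = lag / SAMPLE_RATE
    # Far-field approximation: path difference = spacing * sin(angle).
    sin_theta = np.clip(tdoa * SPEED_OF_SOUND / MIC_SPACING, -1.0, 1.0)
    return np.degrees(np.arcsin(sin_theta))
</pre>A real array with four microphones would combine several such pairwise estimates (plus a lot of filtering) to get a more robust direction, but the basic geometry is the same.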