iCub

An iCub robot mounted on a supporting frame. The robot is 104 cm high and weighs around 22 kg

iCub is a 1-metre-high humanoid robot testbed for research into human cognition and artificial intelligence.

It was designed by the RobotCub Consortium of several European universities and built by the Italian Institute of Technology, and is now supported by other projects such as ITALK.[1] The robot is open-source, with the hardware design, software and documentation all released under the GPL license. The name is a partial acronym, "cub" standing for Cognitive Universal Body. Initial funding for the project was €8.5 million from Unit E5 (Cognitive Systems and Robotics) of the European Commission's Seventh Framework Programme, and this ran for 65 months, from 1 September 2004 until 31 January 2010.

The motivation behind the strongly humanoid design is the embodied cognition hypothesis, that human-like manipulation plays a vital role in the development of human cognition. A baby learns many cognitive skills by interacting with its environment and other humans using its limbs and senses, and consequently its internal model of the world is largely determined by the form of the human body. The robot was designed to test this hypothesis by allowing cognitive learning scenarios to be acted out by an accurate reproduction of the perceptual system and articulation of a small child so that it could interact with the world in the same way that such a child does.[2]

Specifications

An iCub at a live demo making facial expressions

The dimensions of the iCub are similar to those of a 2.5-year-old child. The robot is controlled by an on-board PC/104 controller which communicates with the actuators and sensors over a CAN bus.
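To make the CAN-bus detail above concrete, the sketch below packs a joint-position command into a CAN data field. The frame layout (a command code, a joint ID, and a 16-bit setpoint) is purely hypothetical and is not the iCub's actual wire protocol; only the 8-byte CAN payload limit is a real constraint:

```python
import struct

def pack_joint_command(joint_id: int, position_centidegrees: int) -> bytes:
    """Pack a hypothetical joint-position command into a CAN data field.

    CAN 2.0 data frames carry at most 8 bytes of payload; here we use
    1 byte for a command code, 1 byte for the joint ID, and 2 bytes for
    a signed 16-bit setpoint in hundredths of a degree. The layout is
    illustrative only, not the iCub firmware's message format.
    """
    CMD_SET_POSITION = 0x01  # hypothetical command code
    payload = struct.pack("<BBh", CMD_SET_POSITION, joint_id, position_centidegrees)
    assert len(payload) <= 8  # CAN 2.0 payload limit
    return payload

# Example: ask joint 3 to move to 45.00 degrees (4500 centidegrees).
frame = pack_joint_command(3, 4500)
```

Packing setpoints into fixed-point integers like this is typical for small embedded CAN nodes, which often lack floating-point hardware.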

It utilises tendon-driven joints for the hand and shoulder, with the fingers flexed by Teflon-coated cable tendons running inside Teflon-coated tubes and pulling against spring returns. Joint angles are measured using custom-designed Hall-effect sensors, and the robot can be equipped with torque sensors. The fingertips can be equipped with tactile touch sensors, and a distributed capacitive sensor skin is being developed.
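As a rough illustration of the tendon-driven principle described above, a single such joint can be modelled as a cable wound over a pulley, with the spring providing the return force. The radii used here are illustrative, not measured iCub values:

```python
import math

def joint_angle_from_tendon(pull_mm: float, pulley_radius_mm: float) -> float:
    """Joint rotation (degrees) produced by pulling a tendon over a pulley.

    Pulling the cable by s mm rotates a joint pulley of radius r mm by
    theta = s / r radians (the arc-length relation); the spring return
    re-extends the finger when the tendon is released. This is a toy
    model, not the iCub's actual hand geometry.
    """
    theta_rad = pull_mm / pulley_radius_mm
    return math.degrees(theta_rad)

# Pulling 5 mm of cable over a 10 mm pulley flexes the joint 0.5 rad (about 28.6 degrees).
angle = joint_angle_from_tendon(5.0, 10.0)
```

One consequence of this design, visible in the formula, is that a small motor displacement far from the finger produces a large joint rotation, which is why the actuators can be housed in the forearm.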

The software library is largely written in C++ and uses YARP for communication via Gigabit Ethernet with off-board software implementing higher-level functionality, the development of which has been taken over by the RobotCub Consortium.[2] The robot was not designed for autonomous operation and is consequently not equipped with the on-board batteries or processors this would require; instead, an umbilical cable provides power and a network connection.[2]

In its final version, the robot has 53 actuated degrees of freedom: 6 in the head, 3 in the torso, 7 in each arm, 9 in each hand, and 6 in each leg.
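Using the per-body-part counts commonly reported for the final iCub design (6 head, 3 torso, 7 per arm, 9 per hand, 6 per leg — verify against current iCub documentation), the 53-DOF total can be tallied:

```python
# Actuated degrees of freedom per body part (counts as commonly reported
# for the final iCub design; treat as indicative, not authoritative).
dof = {
    "head": 6,   # neck and eyes
    "torso": 3,
    "arm": 7,    # per arm
    "hand": 9,   # per hand
    "leg": 6,    # per leg
}

paired = {"arm", "hand", "leg"}  # parts present on both left and right sides
total = sum(n * (2 if part in paired else 1) for part, n in dof.items())
print(total)  # 53
```

The hands alone account for 18 of the 53 degrees of freedom, reflecting the project's emphasis on dexterous manipulation as a driver of cognitive development.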

The head has stereo cameras in a swivel mounting where the eyes would be located on a human, and microphones on the sides. It also has lines of red LEDs, representing the mouth and eyebrows, mounted behind the face panel for making facial expressions.

Since the first robots were constructed, the design has undergone several revisions and improvements: for example, smaller and more dexterous hands,[3] and lighter, more robust legs with greater joint ranges, which permit walking rather than just crawling.[4]

Capabilities of iCub

The iCub has been successfully demonstrated performing, among others, the following tasks: crawling,[5] aiming and firing at targets,[6] solving mazes,[7] archery,[8][9] making facial expressions,[10] force control,[11] manipulating a variety of objects,[12] and collision-free reaching and motion planning.[13][14][15]

iCubs in the world

There are about thirty iCubs in various laboratories, mainly in the European Union but also in the United States, Turkey and Japan. These were built as part of the RobotCub project by the Istituto Italiano di Tecnologia (IIT) in Genoa and are used by a small but lively community of scientists who use the iCub to study embodied cognition in artificial systems. Most of the financial support comes from the European Commission's Unit E5 or from IIT, via the recently created iCub Facility department. The robots are constructed by IIT and cost about €250,000 (roughly US$266,000)[16] each, depending upon the version.[17] The development and construction of the iCub at the Italian Institute of Technology features in an independent documentary film, Plug & Pray, released in 2010.[18]

See also

References

  1. "An open source cognitive humanoid robotic platform". Official iCub website. Retrieved 2010-07-30.
  2. Metta, Giorgio; Sandini, Giulio; Vernon, David; Natale, Lorenzo; Nori, Francesco (2008). The iCub humanoid robot: an open platform for research in embodied cognition (PDF). PerMIS'08. Retrieved 2010-07-30.
  3. Laura June (12 March 2010). "iCub gets upgraded with tinier hands, better legs". Engadget. Retrieved 2010-07-30.
  4. Tsagarakis, N.G.; Vanderborght, Bram; Laffranchi, Matteo; Caldwell, D.G. The Mechanical Design of the New Lower Body for the Child Humanoid Robot 'iCub' (PDF). IEEE International Conference on Robotics and Automation (ICRA 2009). Retrieved 2010-07-30.
  5. iCub crawling video on YouTube: https://www.youtube.com/watch?v=JRqdIFCIZd8. Retrieved 2011-03-19.
  6. Nath, Vishnu; Stephen Levinson. Learning to Fire at Targets by an iCub Humanoid Robot. AAAI Spring Symposium 2013 : Designing Intelligent Robots : Reintegrating AI II. Retrieved 2013-09-29.
  7. iCub maze-solving video on YouTube: https://www.youtube.com/watch?v=78u8FkVc3Jc. Retrieved 2013-09-29.
  8. Kormushev, Petar; Calinon, Sylvain; Saegusa, Ryo; Metta, Giorgio. Learning the skill of archery by a humanoid robot iCub (PDF). IEEE International Conference on Humanoid Robots (Humanoids 2010). Retrieved 2011-03-19.
  9. iCub archery video on YouTube: https://www.youtube.com/watch?v=QCXvAqIDpIw. Retrieved 2011-03-19.
  10. iCub facial expressions video on YouTube: https://www.youtube.com/watch?v=qsrs0e_9iX8. Retrieved 2011-03-19.
  11. iCub force control video on YouTube: https://www.youtube.com/watch?v=sUErJodlPtQ. Retrieved 2011-03-19.
  12. "Toward Intelligent Humanoids". iCub manipulating a variety of objects. Retrieved 2013-07-22.
  13. Frank, Mikhail; Jürgen Leitner; Marijn Stollenga; Gregor Kaufmann; Simon Harding; Alexander Förster; Jürgen Schmidhuber. The Modular Behavioral Environment for Humanoids & other Robots (MoBeE) (PDF). 9th International Conference on Informatics in Control, Automation and Robotics (ICINCO).
  14. Leitner, Jürgen ‘Juxi’; Simon Harding; Mikhail Frank; Alexander Förster; Jürgen Schmidhuber. Transferring Spatial Perception Between Robots Operating In A Shared Workspace (PDF). IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2012).
  15. Stollenga, Marijn; Leo Pape; Mikhail Frank; Jürgen Leitner; Alexander Förster; Jürgen Schmidhuber. Task-Relevant Roadmaps: A Framework for Humanoid Motion Planning. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2013).
  16. "XE: (EUR/USD) Euro to US Dollar Rate". www.xe.com. Retrieved 2015-11-20.
  17. iCub website: http://www.icub.org/bazaar.php. Retrieved 2010-07-30.
  18. Plug & Pray, a documentary film about the social impact of robots and related ethical questions.
This article is issued from Wikipedia (version of 12/3/2016). The text is available under the Creative Commons Attribution/Share Alike licence, but additional terms may apply for the media files.