SecondHands project members present first robot prototype ARMAR-6

Today the SecondHands project is presenting the first prototype of its collaborative robot, which will act as the main platform for testing and developing new technologies for the maintenance and repair of automation equipment in Ocado’s highly automated warehouses.

The SecondHands robot prototype ARMAR-6 has been developed at the Karlsruhe Institute of Technology (KIT) by Tamim Asfour and his team at the High Performance Humanoid Technologies Lab (H²T) of the Institute for Anthropomatics and Robotics.

SecondHands is an EU-funded Horizon 2020 project aiming to design a collaborative robot (cobot) that can proactively offer support to maintenance technicians working in Ocado’s highly automated warehouses, also known as Customer Fulfilment Centers (CFCs). This robot will be a second pair of hands that assists technicians when they need help. The robot will learn through observation and will augment human capabilities by completing tasks that require a level of precision or physical strength not available to human workers.

The SecondHands robot prototype has been delivered to the Ocado Technology robotics research lab, where experiments to evaluate the integrated research components from all project partners are currently taking place. The video below presents the first instances of the robot interacting with its testing environment right after it was assembled:


The SecondHands project combines the skills of world class researchers focusing on a real-world industrial use case to deliver:

  • the design of a new robotic assistant
  • a knowledge base to facilitate proactive help
  • a high degree of human-robot interaction
  • advanced perception skills to function in a highly dynamic industrial environment

Together with its research partners École Polytechnique Fédérale de Lausanne (EPFL), Karlsruhe Institute of Technology (KIT), Sapienza Università di Roma, and University College London (UCL), Ocado Technology is working to advance the technology readiness of areas such as computer vision and cognition, human-robot interaction, mechatronics, and perception and ultimately demonstrate how versatile and productive human-robot collaboration can be in practice. Here is a summary of the research contributions for each of the project partners:

  • EPFL: human-robot physical interaction with bi-manipulation, including action skills learning
  • KIT (H²T): development of the ARMAR-6 robot, including its entire mechatronics, software operating system and control, as well as robot grasping and manipulation skills
  • KIT (Interactive Systems Lab, ISL): the spoken dialog management system
  • Sapienza University of Rome: visual scene perception with human action recognition, cognitive decision making, task planning and execution with continuous monitoring
  • UCL: computer vision techniques for 3D human pose estimation and semantic 3D reconstruction of dynamic scenes
  • Ocado Technology: integration of researched functionality on the robot platform and evaluation in real-world demonstrations

Collaborative robots represent a fast-growing segment of the industrial robots market. According to the World Robotics Report released earlier this year by the International Federation of Robotics (IFR), industrial robot installations are forecast to grow by 15% in 2018. This increased adoption of robots for a wide range of applications comes off the back of stronger-than-expected growth in the global economy, faster business cycles, greater variety in customer demand, and the scaling up of Industry 4.0 concepts.

As robots evolve from industrial machines performing repetitive tasks in isolated areas of large-scale factories to highly complex systems powered by deep neural networks, SecondHands has the ambitious goal to solve one of the greatest challenges facing the robotics field: developing collaborative robots that can safely and intelligently interact with their human counterparts in a real-world factory environment.


Graham Deacon was an invited speaker at Next Generation Robots for the Factory of the Future at the Royal Society in London on November 17, 2017

FourByThree is a European project that aims to design, build and test a new generation of modular industrial robotic solutions able to collaborate safely and efficiently with human operators in industrial manufacturing companies, while remaining easy for factory workers to use and program. This workshop will present the robotic advancements and scientific results achieved during the FourByThree project. In addition, we will present and discuss future trends in robotics.


Tamim Asfour was an invited speaker at the IEEE-RAS Humanoids 2017 conference in Birmingham, UK on November 15, 2017

Combining Model-based and Learning-based Approaches for Humanoid Grasping at the Towards Robust Grasping and Manipulation Skills for Humanoids workshop

The ability to grasp and manipulate objects provides an essential means of interacting with the environment. Recent years have seen a proliferation of research projects applying robotic manipulation to real-world applications such as human-robot collaboration and industrial tasks. Despite this promising progress, robotic grasping and manipulation has yet to demonstrate the robustness and dexterity needed to be fully exploited in various settings, such as everyday life contexts and industrial environments, and when dealing with novelty and uncertainty, e.g. in object shape, pose, weight, or friction at contacts, and in unstructured environments.

Studies on human grasping and manipulation have shown that sensory capabilities play a key role in the success of human manipulation, allowing better perception of the object and of the interaction with it, and revealing adaptation and control strategies, e.g. exploiting the environment and its constraints for more effective manipulation. Inspired by these findings, robotics research aiming to make object grasping and manipulation skills more robust shows the importance of the effective use of sensory data (visual, tactile, proprioceptive) from the planning stage through to task completion. Various kinds of approaches have been proposed, e.g. data-driven and empirical approaches such as learning from experience and from human demonstration, analytic approaches such as manually modelling physical and dynamical constraints, and hand-design approaches such as underactuated and soft hands.

In this workshop, we aim to bring together researchers and experts in key areas for grasping and manipulation such as perception, control, learning, design of hands and grippers, and studies analysing human manipulation skills. We aspire to identify recent developments in these research areas, both in theory and applications, discussing recent achievements, debating underlying assumptions, and identifying challenges for future progress.


Prof. Tamim Asfour was an invited speaker at the Humanoid Robots for Real Applications Use workshop, IEEE-RAS Humanoids 2017 in Birmingham, UK on November 15, 2017

ARMAR-6: Humanoid Robots for Maintenance Tasks in Warehouse Environments

While traditional robotic systems have paved the way in various applications and are reshaping the landscape of future industry and our society, the main use of humanoid robot platforms remains confined to laboratory research and development, despite the noticeable advances made in various aspects of humanoid technology. Yet several application fields have recently expressed needs and would be ready to adopt humanoid technology, should it fulfill their specifications in terms of functionality as well as reliability and safety.

The problem we witness now is that no humanoid platform is readily available to make such a jump, and, more importantly, no established robot-arm manufacturer seems strongly willing to build such systems as an effective business, with the exception of SoftBank Robotics in home automation and marketing.

The aim of this workshop is to gather speakers around a topic of high importance in humanoid research and development: how far are we from achieving humanoid robots, or from transferring humanoid technology, for real-use applications? What are the ingredients that make such an endeavor possible? What are the research and development bottlenecks that must be overcome to make such a perspective a viable one?

The workshop will therefore gather representative actors from current projects of well-identified end-users of humanoid technology, together with the industries that could potentially build it. The idea is to share opinions and raise discussions not only on research perspectives but also on the unspoken practical issues that remain fuzzy and unclear to researchers. For instance, what business plans are possible for humanoid robots? How can costs be reduced despite the complexity of the structure? What inhibits well-known robot-arm providers from building and commercializing humanoid robots? What prevents companies that already have the technology from pursuing further innovation plans? And from an industrial viewpoint, where should humanoid research and development focus?

Website: http://jrl-umi3218.github.io/workshop-humanoids-2017.html#content



Prof. Tamim Asfour was an invited speaker at the IEEE/RSJ IROS conference in Vancouver, Canada on September 24, 2017

One of the key skills for a robot is to physically interact with the environment in order to achieve basic tasks such as pick-and-place, sorting, etc. Physical interaction requires object grasping and manipulation capabilities along with dexterity (e.g. to use objects and tools successfully) and high-level reasoning (e.g. to decide how to fulfill task requirements). Typical applications of robots have been welding, assembly, and pick-and-place in industrial settings. However, traditional industrial robots perform their assignments in cages and are heavily dependent on hard automation that requires pre-specified fixtures and time-consuming programming.

This workshop focuses on human-in-the-loop robotic manipulation, which can involve different human roles, e.g. supervisory, cooperative, active or passive. It proposes to gather experts in human-in-the-loop robotic manipulation in order to detect synergies among the frameworks proposed to observe and model the human contribution to the task. We would also like to identify the critical challenges still to be addressed by the community, across the different approaches pertaining to the workshop topic, in order to reach the envisioned close and fluent human-robot collaboration.


Prof. Aude Billard was an invited speaker at the (Empirically) Data-Driven Robotic Manipulation workshop, in Boston, USA on July 16th, 2017

There is a great excitement surrounding data-driven techniques for perceptual classification, inference, and motor control. These techniques come to robotic manipulation with the promise of enabling behavior with greater robustness, performance, and adaptability, as well as suggesting new representations for physical interaction. Recent excitement in the lab, however, is tempered by significant challenges faced when building practical data-driven robots. This workshop sets the focus on those challenges involved in making the data-driven approach work for robotic manipulation.

Robot manipulation is a useful “petri dish” for studying data-driven systems with significant potential impact. Hands, or end-effectors, are where the “rubber hits the road”—where robots make and break contact with the world; and where visual, tactile, and proprioceptive feedback combine to explore, model, and control interaction with the environment. In the course of such interaction, the robot is exposed to a great deal of information, in the form of data that is challenging to collect, maintain, organize, and use. On one end, we can only start capturing data with an already functional robotic system, which over time is prone to degrade and/or break. On the other end, the dynamics and perceptual feedback from robotic manipulation systems yield multi-modal data that is complicated to make sense of. The goal of this workshop is to identify the challenges that are preventing data-driven robotic manipulation from experiencing the same performance jump as other fields that have embraced it, and to discuss what can be done to overcome them.

Website: https://ddm2017.mit.edu/


Prof. Lourdes Agapito was a keynote speaker at ICRA 2017 in Singapore on May 30, 2017

Capturing Vivid 3D Models of the World from Video

As humans we take the ability to perceive the dynamic world around us in three dimensions for granted. From an early age we can grasp an object by adapting our fingers to its 3D shape; we can understand our mother’s feelings by interpreting her facial expressions; or we can effortlessly navigate through a busy street. All of these tasks require some internal 3D representation of shape, deformations and motion. Building algorithms that can emulate this level of human 3D perception has proved to be a much harder task than initially anticipated.

In this lecture, Professor Lourdes Agapito will show progress from her early systems, which captured sparse 3D models with primitive representations of deformation, towards her most recent algorithms, which can capture every fold and detail of hands, faces and clothes in 3D using as input video sequences taken with a single consumer camera.


Mirko Wächter was an invited speaker at the Roboterkontrollarchitekturen workshop in Dagstuhl, Germany on June 9, 2017

ArmarX – A Full Stack Robot Software Framework: From Real-Time Control to Symbolic Planning

In recent years, very complex systems have emerged, above all in service and assistance robotics. Thanks to better sensor technology, new methods for perception, efficient methods for localization and map generation, as well as cognitive system components have been developed. Another aspect supporting this positive trend is the large number of open-source robotics libraries. For example, the Robot Operating System, developed by Willow Garage, allows researchers to build basic systems faster and has helped to improve the reusability and integrability of large collections of community-driven robotic components. However, the growing complexity of robotic systems still poses a major challenge: from a suitable embedded architecture, through powerful frameworks supporting control software development, to the control architecture itself, a system design must be found that enables an efficient development process while ensuring non-functional properties such as real-time capability, reliability, scalability and security. The workshop takes up this topic. Scientists working in these fields will present their latest findings during the workshop. In addition, the workshop aims to identify scientific questions that have so far been only inadequately addressed or not solved at all.


Dr Graham Deacon was an invited speaker at a Humanoids 2017 workshop

SecondHands will be attending Humanoids 2017 IEEE RAS International Conference on Humanoid Robots, November 15-17, 2017, Birmingham (UK).

Project coordinator Dr Graham Deacon, from Ocado, will be presenting at the workshop “Human-Humanoid collaboration: the next industrial revolution?”, giving an overview of the SecondHands project, including the challenges and research work involved in building an autonomous maintenance assistant that can work alongside a human maintenance technician in the Ocado warehouse.



Prof. Lourdes Agapito was a keynote speaker at the BMVA technical meeting on Dynamic Scene Reconstruction

The British Machine Vision Association and Society for Pattern Recognition

As keynote speaker at the BMVA technical meeting on Dynamic Scene Reconstruction, Prof. Lourdes Agapito will present Capturing Vivid 3D Models of the World from Video on June 21st 2017 in London, UK.

Workshop Theme: Reconstruction of general dynamic scenes is motivated by potential applications in film and broadcast production, together with the ultimate goal of automatic understanding of real-world scenes. With advances in hardware and the advent of virtual and augmented reality, dynamic scene reconstruction is being applied to ever more complex scenes. The workshop comprises oral presentations, posters and demos.