Robot Cooking – Transferring observations into a planning language

20/10/2023

Contact

Markus Schmitz
Oberingenieur
Phone: +49 241 80-99801
Email

The project focuses on capturing precise hand movements during cooking using state-of-the-art motion-capture technology and converting them into a machine-readable planning language. By applying classification and clustering algorithms, complex cooking actions are translated into the Planning Domain Definition Language (PDDL), enabling precise timing and action assignment while integrating innovative cooking methods.


In our project on automated robot cooking, we are exploring ways to capture motion data and translate it into a machine-readable planning language. By employing state-of-the-art motion-capture technology that enables detailed hand-pose detection, we are setting new standards in automating cooking processes.

Our motion-capture system, comprising seven cameras and a glove fitted with precise markers, allows accurate triangulation of hand movements at a recording rate of 120 frames per second. These data are continuously transformed into a sequence of timestamped poses, creating a comprehensive picture of the motion.
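The transformation from raw capture frames into timestamped poses can be sketched as follows. This is a minimal illustration under assumed data layouts (the `HandPose` record and the per-frame list of marker coordinates are hypothetical, not the project's actual data model); only the 120 Hz recording rate comes from the text.

```python
# Minimal sketch: attaching timestamps to triangulated marker frames.
# HandPose and the input layout are illustrative assumptions.
from dataclasses import dataclass

FRAME_RATE_HZ = 120  # recording rate stated above


@dataclass
class HandPose:
    timestamp_s: float   # seconds since the start of the recording
    markers: list        # triangulated marker positions [(x, y, z), ...]


def frames_to_poses(frames):
    """Turn consecutive capture frames into timestamped hand poses."""
    return [
        HandPose(timestamp_s=i / FRAME_RATE_HZ, markers=frame)
        for i, frame in enumerate(frames)
    ]


# Example: three consecutive frames of a single marker moving along z
poses = frames_to_poses([
    [(0.0, 0.0, 0.00)],
    [(0.0, 0.0, 0.01)],
    [(0.0, 0.0, 0.02)],
])
print(poses[1].timestamp_s)  # one frame period (1/120 s) after the start
```

At 120 frames per second, consecutive poses are spaced roughly 8.3 ms apart, which is the temporal resolution available for the later segmentation step.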

By applying classification and clustering algorithms, we can translate complex cooking actions into the machine-readable Planning Domain Definition Language (PDDL). This not only enables precise timing and assignment of actions but also supports the efficient handling of unknown or variable motion patterns. With a focus on dynamic preconditions and effects, we can integrate complex and innovative cooking methods.
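To make the last step concrete, the snippet below shows how a classified motion segment might be rendered as a PDDL durative action. The action name, parameters, and predicates are illustrative assumptions; only the general idea of mapping timed motion segments to PDDL actions comes from the text.

```python
# Hedged sketch: emitting a PDDL durative action for one recognized
# motion segment. Predicates and types here are hypothetical.

def segment_to_pddl(action_name, start_s, end_s):
    """Render one classified motion segment as a PDDL durative action.

    The segment's duration is taken from its timestamps, so the timing
    observed in the motion capture carries over into the plan.
    """
    duration = end_s - start_s
    return (
        f"(:durative-action {action_name}\n"
        f"  :parameters (?h - hand ?t - tool)\n"
        f"  :duration (= ?duration {duration:.2f})\n"
        f"  :condition (at start (holding ?h ?t))\n"
        f"  :effect (at end (used ?t)))"
    )


# Example: a stirring motion recognized between seconds 3.5 and 6.0
print(segment_to_pddl("stir", start_s=3.5, end_s=6.0))
```

Encoding segments as durative actions (PDDL 2.1) rather than instantaneous ones preserves the timing information recovered from the 120 Hz recordings.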

Our research outcomes have the potential to revolutionize automation in the food industry and enable a broad spectrum of applications in robotics and artificial intelligence. We look forward to further exploring the frontiers of robot cooking and creating innovative solutions for the future of cooking.

For further information, please refer to the following poster or the associated paper.