Real-world embodied AI through a morphologically adaptive quadruped robot

Abstract

Robots are traditionally bound to a fixed morphology during their operational lifetime and are therefore limited to adapting only their control strategies. Here we present the first quadrupedal robot that can morphologically adapt to different environmental conditions in outdoor, unstructured environments. Our solution is rooted in embodied AI and comprises two components: (1) a robot that permits in situ morphological adaptation and (2) an adaptation algorithm that transitions between the most energy-efficient morphologies on the basis of the currently sensed terrain. We first build a model that describes how the robot morphology affects performance on selected terrains. We then test continuous adaptation on realistic outdoor terrain while allowing the robot to constantly update its model. We show that the robot exploits its training to effectively transition between different morphological configurations, exhibiting substantial performance improvements over a non-adaptive approach. These benefits of real-world morphological adaptation point to the potential of a new, embodied way of incorporating adaptation into future robotic designs.
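
To make the adaptation scheme concrete, the following is a minimal Python sketch of the select-then-update loop described above: the robot senses the terrain, queries a learned model for the predicted energy efficiency of each leg-length configuration, switches to the most efficient one and updates the model with the measured outcome. All function and variable names are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the morphology-adaptation loop described in the abstract.
# All names and data structures are illustrative assumptions.

def adapt_step(model, robot, morphologies):
    """Select the morphology with the lowest predicted cost of transport (COT)
    for the currently sensed terrain, walk with it, and refine the model."""
    roughness, hardness = robot.sense_terrain()   # e.g. depth camera + foot force sensors

    # Predict COT for every available leg-length configuration on this terrain.
    predicted = {m: model.predict_cot(m, roughness, hardness) for m in morphologies}
    best = min(predicted, key=predicted.get)

    robot.set_leg_lengths(best)                   # in situ morphological change
    observed_cot = robot.walk_and_measure_cot()

    # Continuously update the model with the new real-world measurement.
    model.update(best, roughness, hardness, observed_cot)
    return best, observed_cot
```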

Fig. 1: The morphologically adaptive robot used in this study.
Fig. 2: Environments and results for indoor and outdoor experiments.
Fig. 3: The COT of different leg lengths on the three terrains.
Fig. 4: Diagram showing the two adaptation methods used.
Fig. 5: Analysis of the adaptation algorithm.
Fig. 6: Model transformation during adaptation.

Data availability

All JSON-formatted measurements from robot evaluations in this study have been deposited in Figshare (ref. 57). All other relevant data are available from the corresponding author on request.
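
As a convenience for readers retrieving the deposited measurements, a minimal loading sketch is shown below; the directory name and record handling are hypothetical assumptions, since the exact schema is documented with the dataset itself.

```python
import json
from pathlib import Path

# Hypothetical loader for the JSON-formatted evaluation logs.
# The directory name "dyret_measurements" is an assumption about where the
# downloaded Figshare files have been unpacked.
records = [json.loads(p.read_text()) for p in sorted(Path("dyret_measurements").glob("*.json"))]
print(f"Loaded {len(records)} evaluation records")
```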

Code availability

The code for running the robot is available on the DyRET GitHub page (ref. 51) and is released under a GPL v3 license. The code to set up the experiments and log the results is available in a separate GitHub repository (ref. 58). Author contributions are classified according to the CRediT taxonomy (ref. 59).

References

  1. Kawatsuma, S., Fukushima, M. & Okada, T. Emergency response by robots to Fukushima–Daiichi accident: summary and lessons learned. Ind. Robot 39, 428–435 (2012).

  2. Baines, R., Freeman, S., Fish, F. & Kramer, R. Variable stiffness morphing limb for amphibious legged robots inspired by chelonian environmental adaptations. Bioinspir. Biomim. 15, 025002 (2020).

  3. Paik, J. K., Byoungkwon, A., Rus, D. & Wood, R. J. Robotic origamis: self-morphing modular robot. In Proc. 2nd International Conference on Morphological Computation (EPFL, 2012).

  4. Pfeifer, R. & Bongard, J. How the Body Shapes the Way We Think: A New View of Intelligence (MIT Press, 2006).

  5. Wilson, A. D. & Golonka, S. Embodied cognition is not what you think it is. Front. Psychol. 4, 58 (2013).

  6. Zhang, T., Zhang, W. & Gupta, M. M. Resilient robots: concept, review, and future directions. Robotics 6, 22 (2017).

  7. Picardi, G., Hauser, H., Laschi, C. & Calisti, M. Morphologically induced stability on an underwater legged robot with a deformable body. Int. J. Robot. Res. https://doi.org/10.1177/0278364919840426 (2019).

  8. Nygaard, T. F., Martin, C. P., Samuelsen, E., Torresen, J. & Glette, K. Real-world evolution adapts robot morphology and control to hardware limitations. In Proc. Genetic and Evolutionary Computation Conference (ACM, 2018).

  9. Nygaard, T. F., Martin, C. P., Howard, D., Torresen, J. & Glette, K. Environmental adaptation of robot morphology and control through real-world evolution. Preprint at http://arxiv.org/abs/2003.13254 (2020).

  10. Heijnen, H., Howard, D. & Kottege, N. A testbed that evolves hexapod controllers in hardware. In 2017 IEEE International Conference on Robotics and Automation 1065–1071 (IEEE, 2017).

  11. Gong, D., Yan, J. & Zuo, G. A review of gait optimization based on evolutionary computation. Appl. Comput. Intell. Soft Comput. https://doi.org/10.1155/2010/413179 (2010).

  12. Ha, S., Xu, P., Tan, Z., Levine, S. & Tan, J. Learning to walk in the real world with minimal human effort. Preprint at http://arxiv.org/abs/2002.08550 (2020).

  13. Kober, J., Bagnell, J. A. & Peters, J. Reinforcement learning in robotics: a survey. Int. J. Robot. Res. 32, 1238–1274 (2013).

  14. Calandra, R., Seyfarth, A., Peters, J. & Deisenroth, M. P. Bayesian optimization for learning gaits under uncertainty. Ann. Math. AI 76, 5–23 (2016).

  15. Rodriguez, D., Brandenburger, A. & Behnke, S. Combining simulations and real-robot experiments for Bayesian optimization of bipedal gait stabilization. In RoboCup 2018: Robot World Cup XXII 70–82 (Springer, 2018).

  16. Lee, J., Hwangbo, J., Wellhausen, L., Koltun, V. & Hutter, M. Learning quadrupedal locomotion over challenging terrain. Sci. Robot. 5, eabc5986 (2020).

  17. Hwangbo, J. et al. Learning agile and dynamic motor skills for legged robots. Sci. Robot. 4, eaau5872 (2019).

  18. Kalakrishnan, M., Buchli, J., Pastor, P., Mistry, M. & Schaal, S. Learning, planning, and control for quadruped locomotion over challenging terrain. Int. J. Robot. Res. 30, 236–258 (2011).

  19. Kaushik, R., Anne, T. & Mouret, J.-B. Fast online adaptation in robotics through meta-learning embeddings of simulated priors. In 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems 5269–5276 (IEEE, 2020).

  20. Fahmi, S. et al. Stance: locomotion adaptation over soft terrain. IEEE Trans. Robot. 36, 443–457 (2020).

  21. Buchanan, R. et al. Walking posture adaptation for legged robot navigation in confined spaces. IEEE Robot. Autom. Lett. 4, 2148–2155 (2019).

  22. Long, J. Darwin’s Devices: What Evolving Robots can Teach Us about the History of Life and the Future of Technology (Basic Books, 2012).

  23. Eiben, A. E., Kernbach, S. & Haasdijk, E. Embodied artificial evolution: artificial evolutionary systems in the 21st century. Evol. Intell. 5, 261–272 (2012).

  24. Mouret, J.-B. & Chatzilygeroudis, K. 20 Years of reality gap: a few thoughts about simulators in evolutionary robotics. In Proc. Genetic and Evolutionary Computation Conference Companion 1121–1124 (Association for Computing Machinery, 2017).

  25. Cheney, N., MacCurdy, R., Clune, J. & Lipson, H. Unshackling evolution: evolving soft robots with multiple materials and a powerful generative encoding. In Proc. 15th Annual Conference on Genetic and Evolutionary Computation 167–174 (Association for Computing Machinery, 2013).

  26. Marbach, D. & Ijspeert, A. J. Online optimization of modular robot locomotion. In IEEE International Conference Mechatronics and Automation Vol. 1, 248–253 (IEEE, 2005).

  27. Passault, G., Rouxel, Q., Fabre, R., N’Guyen, S. & Ly, O. Optimizing morphology and locomotion on a corpus of parametric legged robots. In Conference on Biomimetic and Biohybrid Systems 227–238 (Springer, 2016).

  28. Spielberg, A. et al. Learning-in-the-loop optimization: end-to-end control and co-design of soft robots through learned deep latent representations. In Advances in Neural Information Processing Systems 8282–8292 (NeurIPS, 2019).

  29. Lipson, H. & Pollack, J. B. Automatic design and manufacture of robotic lifeforms. Nature 406, 974–978 (2000).

  30. Ha, S., Coros, S., Alspach, A., Kim, J. & Yamane, K. Joint optimization of robot design and motion parameters using the implicit function theorem. In Robotics: Science and Systems (Carnegie Mellon Univ., 2017).

  31. Collins, J., Geles, W., Howard, D. & Maire, F. Towards the targeted environment-specific evolution of robot components. In Proc. Genetic and Evolutionary Computation Conference 61–68 (2018).

  32. Hornby, G. S., Lipson, H. & Pollack, J. B. Generative representations for the automated design of modular physical robots. IEEE Trans. Robot. Autom. 19, 703–719 (2003).

  33. Auerbach, J. et al. Robogen: robot generation through artificial evolution. In Artificial Life Conference Proceedings 136–137 (MIT Press, 2014).

  34. Kriegman, S. et al. Scalable sim-to-real transfer of soft robot designs. In 2020 3rd IEEE International Conference on Soft Robotics 359–366 (IEEE, 2020).

  35. Jakobi, N., Husbands, P. & Harvey, I. Noise and the reality gap: the use of simulation in evolutionary robotics. In European Conference on Artificial Life 704–720 (Springer, 1995).

  36. Erez, T., Tassa, Y. & Todorov, E. Simulation tools for model-based robotics: comparison of Bullet, Havok, MuJoCo, ODE and PhysX. In 2015 IEEE International Conference on Robotics and Automation 4397–4404 (IEEE, 2015).

  37. Sun, Y., Chen, X., Yan, T. & Jia, W. Modules design of a reconfigurable multi-legged walking robot. In 2006 IEEE International Conference on Robotics and Biomimetics 1444–1449 (IEEE, 2006).

  38. Guan, Y., Jiang, L., Zhangy, X., Zhang, H. & Zhou, X. Development of novel robots with modular methodology. In 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems 2385–2390 (IEEE, 2009).

  39. Jelisavcic, M. et al. Real-world evolution of robot morphologies: a proof of concept. Artif. Life. 23, 206–235 (2017).

  40. Brodbeck, L., Hauser, S. & Iida, F. Morphological evolution of physical robots through model-free phenotype development. PLoS One 10, 1–17 (2015).

  41. Vujovic, V., Rosendo, A., Brodbeck, L. & Iida, F. Evolutionary developmental robotics: improving morphology and control of physical robots. Artif. Life 23, 169–185 (2017).

  42. Moreno, R. et al. Automated reconfiguration of modular robots using robot manipulators. In 2018 IEEE Symposium Series on Computational Intelligence 884–891 (IEEE, 2018).

  43. Nygaard, T. F. et al. Experiences from real-world evolution with DyRET: dynamic robot for embodied testing. In Symposium of the Norwegian AI Society 58–68 (Springer, 2019).

  44. Zhakypov, Z. & Paik, J. Design methodology for constructing multimaterial origami robots and machines. IEEE Trans. Robot. 34, 151–165 (2018).

  45. Riviere, V., Manecy, A. & Viollet, S. Agile robotic fliers: a morphing-based approach. Soft Robot. 5, 541–553 (2018).

  46. Bucki, N. & Mueller, M. W. Design and control of a passively morphing quadcopter. In 2019 International Conference on Robotics and Automation 9116–9122 (IEEE, 2019).

  47. Geilinger, M., Poranne, R., Desai, R., Thomaszewski, B. & Coros, S. Skaterbots: optimization-based design and motion synthesis for robotic creatures with legs and wheels. ACM Trans. Graphics 37, 1–12 (2018).

  48. Meiri, N. & Zarrouk, D. Flying STAR, a hybrid crawling and flying sprawl tuned robot. In 2019 International Conference on Robotics and Automation 5302–5308 (IEEE, 2019).

  49. Kriegman, S., Blackiston, D., Levin, M. & Bongard, J. A scalable pipeline for designing reconfigurable organisms. Proc. Natl Acad. Sci. USA 117, 1853–1859 (2020).

  50. Ritter, A. Shape-Changing Smart Materials 46–71 (Birkhäuser, 2007).

  51. Nygaard, T. F. & Nordmoen, J. DyRET Documentation (GitHub, 2021); https://github.com/dyret-robot/dyret_documentation

  52. Nygaard, T. F., Martin, C. P., Torresen, J. & Glette, K. Self-modifying morphology experiments with DyRET: dynamic robot for embodied testing. In 2019 IEEE International Conference on Robotics and Automation (IEEE, 2019).

  53. Seok, S. et al. Design principles for energy-efficient legged locomotion and implementation on the MIT cheetah robot. IEEE/ASME Trans. Mechatronics 20, 1117–1129 (2014).

  54. Xi, W., Yesilevskiy, Y. & Remy, C. D. Selecting gaits for economical locomotion of legged robots. Int. J. Robot. Res. 35, 1140–1154 (2016).

  55. Howard, A. & Seraji, H. Vision-based terrain characterization and traversability assessment. J. Robot. Syst. 18, 577–587 (2001).

  56. Nygaard, T. F., Martin, C. P., Torresen, J. & Glette, K. in Applications of Evolutionary Computation (Springer, 2019).

  57. Nygaard, T. F. Dataset Hosted on Figshare (Figshare, 2021); https://doi.org/10.6084/m9.figshare.12661619

  58. Nygaard, T. F. tonnesfn_experiments (GitHub, 2021); https://github.com/tonnesfn/tonnesfn_experiments

  59. Allen, L., O’Connell, A. & Kiermer, V. How can we ensure visibility and diversity in research contributions? How the contributor role taxonomy (CRediT) is helping the shift from authorship to contributorship. Learned Publishing 32, 71–74 (2019).

Acknowledgements

This work was partially supported by The Research Council of Norway under grant agreement no. 240862 and its Centres of Excellence scheme, project no. 262762.

Author information

Contributions

T.F.N. was responsible for conceptualization, methodology, software, validation, formal analysis, investigation, resources, data curation, visualization, and reviewing and editing the original draft. C.P.M. was responsible for conceptualization, methodology, formal analysis and supervision. J.T. was responsible for supervision and funding acquisition. K.G. was responsible for conceptualization, methodology and supervision. D.H. was responsible for conceptualization, methodology, formal analysis, resources, visualization, supervision, project administration and funding acquisition. C.P.M., J.T., K.G. and D.H. all reviewed and edited the manuscript.

Corresponding author

Correspondence to Tønnes F. Nygaard.

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Peer review information Nature Machine Intelligence thanks Agoston Eiben and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Extended data

Extended Data Fig. 1 A diagram of the prediction model used, instantiated with the baseline data set measurements from the indoor boxes.

a, The full model with the 25 sub-models, one for each leg-length combination. b, The sub-model for a 50 mm femur and 40 mm tibia, which shows the predicted energy efficiency (cost of transport, COT) of that leg-length combination for different terrains. The 15 points are the actual measurements from the indoor data collection (squares from concrete, circles from sand and diamonds from gravel). When the robot encounters a new terrain, each sub-model is queried at the given roughness/hardness values to form a predicted 5 × 5 COT map similar to the ones shown in Fig. 2 in the main text.
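
COT is the standard energy-efficiency measure for legged robots, commonly defined as the energy consumed per unit of weight and distance travelled. A minimal sketch of how such a model could be queried to produce the 5 × 5 predicted-COT map described above is given below; the leg-length values and the form of the fitted sub-models are assumptions made for illustration only.

```python
import numpy as np

# Illustrative 5-step femur/tibia extensions (mm); the actual values used by
# DyRET are assumptions here and should be taken from the paper.
FEMUR_LENGTHS = [0, 25, 50, 75, 100]
TIBIA_LENGTHS = [0, 20, 40, 60, 80]

def predicted_cot_map(sub_models, roughness, hardness):
    """Query every leg-length sub-model at the sensed (roughness, hardness)
    values to build the 5 x 5 predicted-COT map described in the caption.

    `sub_models[(femur, tibia)]` is assumed to be a callable fitted to the
    15 indoor measurements for that leg-length combination.
    """
    cot = np.zeros((len(FEMUR_LENGTHS), len(TIBIA_LENGTHS)))
    for i, femur in enumerate(FEMUR_LENGTHS):
        for j, tibia in enumerate(TIBIA_LENGTHS):
            cot[i, j] = sub_models[(femur, tibia)](roughness, hardness)
    return cot

# The lowest-COT entry then gives the most energy-efficient leg lengths:
# i, j = np.unravel_index(np.argmin(cot_map), cot_map.shape)
```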

Extended Data Fig. 2 Reconfigurable mechanism and workspace difference.

a, The components of the mechanical adaptation mechanism. b, An indication of the difference in workspace for the two extreme leg lengths.

Extended Data Fig. 3 Terrain features for indoor boxes.

Kernel density estimate of the terrain features for the three different surfaces in the indoor terrain boxes, measured during the data set collection. This includes all the different morphologies. Default seaborn.jointplot parameters are used for the estimate.
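
Since the caption notes that default seaborn.jointplot parameters were used, a roughly equivalent plot can be produced as follows; the file and DataFrame column names are assumptions about how the terrain features are stored.

```python
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt

# Hypothetical file and column names for the recorded terrain features.
df = pd.read_csv("indoor_terrain_features.csv")

# Kernel density estimate of the joint roughness/hardness distribution,
# using otherwise-default jointplot parameters.
sns.jointplot(data=df, x="roughness", y="hardness", kind="kde")
plt.show()
```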

Extended Data Fig. 4 Effect of speed on terrain measurements.

Terrain characteristics of the three surfaces in the indoor terrain boxes, shown for different walking speeds. The solid lines show the mean, and the shaded areas show the standard deviation. a, Perceived roughness from the depth camera in the front of the robot. b, Perceived hardness from the force sensors in the feet.

Extended Data Fig. 5 One walking sequence on the indoor terrains.

Generated from single 16-second walking sessions with the shortest possible leg length. Roughness, shown in the left plot, is calculated from the depth camera. The middle plot shows the absolute, filtered three-axis force measurements used to infer hardness, summed for the two front leg sensors (x axis, sideways, in blue; y axis, lengthwise, in orange; and z axis, up, in green). The right plot shows the total current reported by the servos.
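
A minimal sketch of how the middle plot's signal could be computed (absolute values of low-pass-filtered three-axis forces, summed over the two front feet) is shown below; the filter type, its parameters and the array layout are assumptions, as the caption does not specify them.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def summed_front_foot_forces(forces, cutoff=0.05):
    """`forces` is assumed to have shape (samples, 2 front feet, 3 axes).

    Returns an array of shape (samples, 3): the absolute, filtered x/y/z force
    measurements summed over the two front-foot sensors, matching the
    description of the middle plot."""
    b, a = butter(2, cutoff)               # assumed second-order low-pass filter
    filtered = filtfilt(b, a, forces, axis=0)
    return np.abs(filtered).sum(axis=1)
```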

Supplementary information

Supplementary Information

Supplementary Methods and Tables 1–5.

Supplementary Video 1

A video summary of the paper.

About this article

Cite this article

Nygaard, T.F., Martin, C.P., Torresen, J. et al. Real-world embodied AI through a morphologically adaptive quadruped robot. Nat Mach Intell 3, 410–419 (2021). https://doi.org/10.1038/s42256-021-00320-3
