{"id":8905,"date":"2024-01-05T12:35:47","date_gmt":"2024-01-05T12:35:47","guid":{"rendered":"https:\/\/dailyai.com\/?p=8905"},"modified":"2024-01-05T12:41:49","modified_gmt":"2024-01-05T12:41:49","slug":"google-releases-a-suite-of-advanced-robotic-tools","status":"publish","type":"post","link":"https:\/\/dailyai.com\/fr\/2024\/01\/google-releases-a-suite-of-advanced-robotic-tools\/","title":{"rendered":"Google lance une s\u00e9rie d'outils robotiques avanc\u00e9s"},"content":{"rendered":"<p><strong>Google DeepMind a publi\u00e9 une s\u00e9rie de nouveaux outils pour aider les robots \u00e0 apprendre de mani\u00e8re autonome, plus rapidement et plus efficacement dans des environnements nouveaux.<\/strong><\/p>\n<p>Apprendre \u00e0 un robot \u00e0 effectuer une t\u00e2che sp\u00e9cifique dans un environnement unique est une t\u00e2che d'ing\u00e9nierie relativement simple. Si les robots doivent nous \u00eatre vraiment utiles \u00e0 l'avenir, ils devront \u00eatre capables d'effectuer une s\u00e9rie de t\u00e2ches g\u00e9n\u00e9rales et d'apprendre \u00e0 les r\u00e9aliser dans des environnements qu'ils n'ont jamais connus auparavant.<\/p>\n<p>L'ann\u00e9e derni\u00e8re, DeepMind a publi\u00e9 son <a href=\"https:\/\/dailyai.com\/fr\/2023\/10\/open-x-embodiment-dataset-rt-x-model-a-leap-for-ai-robots\/\">Mod\u00e8le de contr\u00f4le robotique RT-2<\/a> et RT-X. 
RT-2 translates voice or text commands into robotic actions.<\/p>\n<p>The new tools DeepMind announced build on RT-2 and bring us closer to autonomous robots that can explore different environments and pick up new skills.<\/p>\n<blockquote class=\"twitter-tweet\">\n<p dir=\"ltr\" lang=\"en\">Over the past couple of years, large foundation models have proven capable of perceiving and reasoning about the world around us, opening up a key possibility for robotics at scale.<\/p>\n<p>Introducing AutoRT, a framework for orchestrating robotic agents in the wild using foundation models! <a href=\"https:\/\/t.co\/x3YdO10kqq\">pic.twitter.com\/x3YdO10kqq<\/a><\/p>\n<p>- Keerthana Gopalakrishnan (@keerthanpg) <a href=\"https:\/\/twitter.com\/keerthanpg\/status\/1742933208419938402?ref_src=twsrc%5Etfw\">January 4, 2024<\/a><\/p><\/blockquote>\n<p><script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/p>\n<h2>AutoRT<\/h2>\n<p>AutoRT combines a large language model (LLM) with a visual language model (VLM) and a robot control model such as RT-2.<\/p>\n<p>The VLM lets the robot assess the scene in front of it and pass a description to the LLM. 
The LLM evaluates the identified objects and the scene, then generates a list of potential tasks the robot could perform.<\/p>\n<p>The tasks are scored on their safety, the robot's capabilities, and whether they would add new skills or diversity to the AutoRT knowledge base.<\/p>\n<figure id=\"attachment_8913\" aria-describedby=\"caption-attachment-8913\" style=\"width: 1232px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" class=\"wp-image-8913 size-full\" src=\"https:\/\/dailyai.com\/wp-content\/uploads\/2024\/01\/AutoRT-example.webp\" alt=\"\" width=\"1232\" height=\"1386\" srcset=\"https:\/\/dailyai.com\/wp-content\/uploads\/2024\/01\/AutoRT-example.webp 1232w, https:\/\/dailyai.com\/wp-content\/uploads\/2024\/01\/AutoRT-example-267x300.webp 267w, https:\/\/dailyai.com\/wp-content\/uploads\/2024\/01\/AutoRT-example-910x1024.webp 910w, https:\/\/dailyai.com\/wp-content\/uploads\/2024\/01\/AutoRT-example-768x864.webp 768w, https:\/\/dailyai.com\/wp-content\/uploads\/2024\/01\/AutoRT-example-370x416.webp 370w, https:\/\/dailyai.com\/wp-content\/uploads\/2024\/01\/AutoRT-example-800x900.webp 800w, https:\/\/dailyai.com\/wp-content\/uploads\/2024\/01\/AutoRT-example-740x833.webp 740w, https:\/\/dailyai.com\/wp-content\/uploads\/2024\/01\/AutoRT-example-20x23.webp 20w, https:\/\/dailyai.com\/wp-content\/uploads\/2024\/01\/AutoRT-example-43x48.webp 43w\" sizes=\"auto, (max-width: 1232px) 100vw, 1232px\" \/><figcaption id=\"caption-attachment-8913\" class=\"wp-caption-text\">AutoRT's environment analysis and task selection process. 
Source: <a href=\"https:\/\/deepmind.google\/discover\/blog\/shaping-the-future-of-advanced-robotics\/\" target=\"_blank\" rel=\"noopener\">DeepMind<\/a><\/figcaption><\/figure>\n<p>DeepMind says that with AutoRT it \"safely orchestrated as many as 20 robots simultaneously, and up to 52 unique robots in total, across a variety of office buildings, gathering a diverse dataset comprising 77,000 robotic trials across 6,650 unique tasks\".<\/p>\n<h2>Robot Constitution<\/h2>\n<p>Sending a robot into a new environment means it will encounter potentially dangerous situations that cannot be planned for specifically. With a robot constitution as a guide, robots have a set of general guardrails.<\/p>\n<p>The robot constitution is inspired by Isaac Asimov's Three Laws of Robotics:<\/p>\n<ol>\n<li>A robot may not injure a human being.<\/li>\n<li>This robot shall not attempt tasks involving humans, animals, or other living things. This robot shall not interact with sharp objects, such as a knife.<\/li>\n<li>This robot only has one arm, and so cannot perform tasks that require two arms. For example, it cannot open a bottle.<\/li>\n<\/ol>\n<p>Following these guidelines prevents the robot from choosing a task from its list of options that could injure someone, damage itself, or damage something else.<\/p>\n<h2>SARA-RT<\/h2>\n<p>Self-Adaptive Robust Attention for Robotics Transformers (SARA-RT) takes models like RT-2 and makes them more efficient.<\/p>\n<p>RT-2's neural network architecture relies on attention modules with quadratic complexity. 
This means that if you double the input, by adding a new sensor or increasing the camera resolution, you need four times the computing resources.<\/p>\n<p>SARA-RT uses a linear attention model to fine-tune the robotics model, resulting in a 14% improvement in speed and a 10% improvement in accuracy.<\/p>\n<h2>RT-Trajectory<\/h2>\n<p>Converting a simple task like wiping down a table into instructions a robot can follow is complicated. The task has to be converted from natural language into a coded sequence of motor movements and rotations to drive the robot's moving parts.<\/p>\n<p>RT-Trajectory adds a 2D visual overlay to a training video so the robot can intuitively learn what kind of motion is needed to complete the task.<\/p>\n<p>So instead of simply being told to \"clean the table\", the demonstration plus the motion overlay give the robot a better chance of quickly learning the new skill.<\/p>\n<p>DeepMind says an arm controlled by RT-Trajectory \"achieved a task success rate of 63%, compared to 29% for RT-2\".<\/p>\n<blockquote class=\"twitter-tweet\">\n<p dir=\"ltr\" lang=\"en\">\ud83d\udd35 It can also create trajectories by watching human demonstrations, understanding sketches, and even drawings generated by the VLM.<\/p>\n<p>Tested on 41 tasks unseen in the training data, an arm controlled by RT-Trajectory achieved a 63% success rate. 
<a href=\"https:\/\/t.co\/rqOnzDDMDI\">https:\/\/t.co\/rqOnzDDMDI<\/a> <a href=\"https:\/\/t.co\/bdhi9W5TWi\">pic.twitter.com\/bdhi9W5TWi<\/a><\/p>\n<p>- Google DeepMind (@GoogleDeepMind) <a href=\"https:\/\/twitter.com\/GoogleDeepMind\/status\/1742932249371402519?ref_src=twsrc%5Etfw\">4 janvier 2024<\/a><\/p><\/blockquote>\n<p><script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/p>\n<p>DeepMind met ces mod\u00e8les et ces ensembles de donn\u00e9es \u00e0 la disposition d'autres d\u00e9veloppeurs. Il sera donc int\u00e9ressant de voir comment ces nouveaux outils acc\u00e9l\u00e8rent l'int\u00e9gration des robots dot\u00e9s d'IA dans la vie de tous les jours.<\/p>\n<p>&nbsp;<\/p>","protected":false},"excerpt":{"rendered":"<p>Google DeepMind a publi\u00e9 une s\u00e9rie de nouveaux outils pour aider les robots \u00e0 apprendre de mani\u00e8re autonome, plus rapidement et plus efficacement dans des environnements nouveaux. Apprendre \u00e0 un robot \u00e0 effectuer une t\u00e2che sp\u00e9cifique dans un environnement unique est une t\u00e2che d'ing\u00e9nierie relativement simple. Pour que les robots nous soient vraiment utiles \u00e0 l'avenir, ils devront \u00eatre capables d'effectuer une s\u00e9rie de t\u00e2ches g\u00e9n\u00e9rales et d'apprendre \u00e0 les r\u00e9aliser dans des environnements qu'ils n'ont jamais connus auparavant. L'ann\u00e9e derni\u00e8re, DeepMind a publi\u00e9 son mod\u00e8le de contr\u00f4le robotique RT-2 et ses ensembles de donn\u00e9es robotiques RT-X. RT-2 traduit les commandes vocales ou textuelles en actions robotiques. 
The new tools<\/p>","protected":false},"author":6,"featured_media":8908,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[84],"tags":[147,102,169],"class_list":["post-8905","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-industry","tag-deepmind","tag-google","tag-robotics"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.4 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Google releases a suite of advanced robotic tools | DailyAI<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/dailyai.com\/fr\/2024\/01\/google-releases-a-suite-of-advanced-robotic-tools\/\" \/>\n<meta property=\"og:locale\" content=\"fr_FR\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Google releases a suite of advanced robotic tools | DailyAI\" \/>\n<meta property=\"og:description\" content=\"Google DeepMind released a suite of new tools to help robots learn autonomously faster and more efficiently in novel environments. Training a robot to perform a specific task in a single environment is a relatively simple engineering task. If robots are going to be truly useful to us in the future they\u2019ll need to be able to perform a range of general tasks and learn to do them in environments that they\u2019ve not experienced before. Last year DeepMind released its RT-2 robotics control model and RT-X robotic datasets. RT-2 translates voice or text commands into robotic actions. 
The new tools\" \/>\n<meta property=\"og:url\" content=\"https:\/\/dailyai.com\/fr\/2024\/01\/google-releases-a-suite-of-advanced-robotic-tools\/\" \/>\n<meta property=\"og:site_name\" content=\"DailyAI\" \/>\n<meta property=\"article:published_time\" content=\"2024-01-05T12:35:47+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2024-01-05T12:41:49+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/dailyai.com\/wp-content\/uploads\/2024\/01\/Google-DeepMind.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"1000\" \/>\n\t<meta property=\"og:image:height\" content=\"667\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"Eugene van der Watt\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@DailyAIOfficial\" \/>\n<meta name=\"twitter:site\" content=\"@DailyAIOfficial\" \/>\n<meta name=\"twitter:label1\" content=\"\u00c9crit par\" \/>\n\t<meta name=\"twitter:data1\" content=\"Eugene van der Watt\" \/>\n\t<meta name=\"twitter:label2\" content=\"Dur\u00e9e de lecture estim\u00e9e\" \/>\n\t<meta name=\"twitter:data2\" content=\"3 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"NewsArticle\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/01\\\/google-releases-a-suite-of-advanced-robotic-tools\\\/#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/01\\\/google-releases-a-suite-of-advanced-robotic-tools\\\/\"},\"author\":{\"name\":\"Eugene van der Watt\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/person\\\/7ce525c6d0c79838b7cc7cde96993cfa\"},\"headline\":\"Google releases a suite of advanced robotic 
tools\",\"datePublished\":\"2024-01-05T12:35:47+00:00\",\"dateModified\":\"2024-01-05T12:41:49+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/01\\\/google-releases-a-suite-of-advanced-robotic-tools\\\/\"},\"wordCount\":730,\"publisher\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#organization\"},\"image\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/01\\\/google-releases-a-suite-of-advanced-robotic-tools\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2024\\\/01\\\/Google-DeepMind.jpg\",\"keywords\":[\"DeepMind\",\"Google\",\"Robotics\"],\"articleSection\":[\"Industry\"],\"inLanguage\":\"fr-FR\"},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/01\\\/google-releases-a-suite-of-advanced-robotic-tools\\\/\",\"url\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/01\\\/google-releases-a-suite-of-advanced-robotic-tools\\\/\",\"name\":\"Google releases a suite of advanced robotic tools | DailyAI\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/01\\\/google-releases-a-suite-of-advanced-robotic-tools\\\/#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/01\\\/google-releases-a-suite-of-advanced-robotic-tools\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2024\\\/01\\\/Google-DeepMind.jpg\",\"datePublished\":\"2024-01-05T12:35:47+00:00\",\"dateModified\":\"2024-01-05T12:41:49+00:00\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/01\\\/google-releases-a-suite-of-advanced-robotic-tools\\\/#breadcrumb\"},\"inLanguage\":\"fr-FR\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/dailyai.com\\\/2024\\\/01\\\/google-releases-a-suite-of-advanced-robotic-tools\\\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"fr-FR\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/01\\\/google-releases-a-suite-of
-advanced-robotic-tools\\\/#primaryimage\",\"url\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2024\\\/01\\\/Google-DeepMind.jpg\",\"contentUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2024\\\/01\\\/Google-DeepMind.jpg\",\"width\":1000,\"height\":667},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/01\\\/google-releases-a-suite-of-advanced-robotic-tools\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\\\/\\\/dailyai.com\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Google releases a suite of advanced robotic tools\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#website\",\"url\":\"https:\\\/\\\/dailyai.com\\\/\",\"name\":\"DailyAI\",\"description\":\"Your Daily Dose of AI News\",\"publisher\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/dailyai.com\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"fr-FR\"},{\"@type\":\"Organization\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#organization\",\"name\":\"DailyAI\",\"url\":\"https:\\\/\\\/dailyai.com\\\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"fr-FR\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/logo\\\/image\\\/\",\"url\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/06\\\/Daily-Ai_TL_colour.png\",\"contentUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/06\\\/Daily-Ai_TL_colour.png\",\"width\":4501,\"height\":934,\"caption\":\"DailyAI\"},\"image\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/logo\\\/image\\\/\"},\"sameAs\":[\"https:\\\/\\\/x.com\\\/DailyAIOfficial\",\"https:\\\/\\\/www.linkedin.com\\\/company\\\/dailyaiofficial\\\/\",\"https:\\\/\\\/www.yo
utube.com\\\/@DailyAIOfficial\"]},{\"@type\":\"Person\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/person\\\/7ce525c6d0c79838b7cc7cde96993cfa\",\"name\":\"Eugene van der Watt\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"fr-FR\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/07\\\/Eugine_Profile_Picture-96x96.png\",\"url\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/07\\\/Eugine_Profile_Picture-96x96.png\",\"contentUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/07\\\/Eugine_Profile_Picture-96x96.png\",\"caption\":\"Eugene van der Watt\"},\"description\":\"Eugene comes from an electronic engineering background and loves all things tech. When he takes a break from consuming AI news you'll find him at the snooker table.\",\"sameAs\":[\"www.linkedin.com\\\/in\\\/eugene-van-der-watt-16828119\"],\"url\":\"https:\\\/\\\/dailyai.com\\\/fr\\\/author\\\/eugene\\\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Google lance une s\u00e9rie d'outils robotiques avanc\u00e9s | DailyAI","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/dailyai.com\/fr\/2024\/01\/google-releases-a-suite-of-advanced-robotic-tools\/","og_locale":"fr_FR","og_type":"article","og_title":"Google releases a suite of advanced robotic tools | DailyAI","og_description":"Google DeepMind released a suite of new tools to help robots learn autonomously faster and more efficiently in novel environments. Training a robot to perform a specific task in a single environment is a relatively simple engineering task. If robots are going to be truly useful to us in the future they\u2019ll need to be able to perform a range of general tasks and learn to do them in environments that they\u2019ve not experienced before. 
Last year DeepMind released its RT-2 robotics control model and RT-X robotic datasets. RT-2 translates voice or text commands into robotic actions. The new tools","og_url":"https:\/\/dailyai.com\/fr\/2024\/01\/google-releases-a-suite-of-advanced-robotic-tools\/","og_site_name":"DailyAI","article_published_time":"2024-01-05T12:35:47+00:00","article_modified_time":"2024-01-05T12:41:49+00:00","og_image":[{"width":1000,"height":667,"url":"https:\/\/dailyai.com\/wp-content\/uploads\/2024\/01\/Google-DeepMind.jpg","type":"image\/jpeg"}],"author":"Eugene van der Watt","twitter_card":"summary_large_image","twitter_creator":"@DailyAIOfficial","twitter_site":"@DailyAIOfficial","twitter_misc":{"\u00c9crit par":"Eugene van der Watt","Dur\u00e9e de lecture estim\u00e9e":"3 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"NewsArticle","@id":"https:\/\/dailyai.com\/2024\/01\/google-releases-a-suite-of-advanced-robotic-tools\/#article","isPartOf":{"@id":"https:\/\/dailyai.com\/2024\/01\/google-releases-a-suite-of-advanced-robotic-tools\/"},"author":{"name":"Eugene van der Watt","@id":"https:\/\/dailyai.com\/#\/schema\/person\/7ce525c6d0c79838b7cc7cde96993cfa"},"headline":"Google releases a suite of advanced robotic 
tools","datePublished":"2024-01-05T12:35:47+00:00","dateModified":"2024-01-05T12:41:49+00:00","mainEntityOfPage":{"@id":"https:\/\/dailyai.com\/2024\/01\/google-releases-a-suite-of-advanced-robotic-tools\/"},"wordCount":730,"publisher":{"@id":"https:\/\/dailyai.com\/#organization"},"image":{"@id":"https:\/\/dailyai.com\/2024\/01\/google-releases-a-suite-of-advanced-robotic-tools\/#primaryimage"},"thumbnailUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2024\/01\/Google-DeepMind.jpg","keywords":["DeepMind","Google","Robotics"],"articleSection":["Industry"],"inLanguage":"fr-FR"},{"@type":"WebPage","@id":"https:\/\/dailyai.com\/2024\/01\/google-releases-a-suite-of-advanced-robotic-tools\/","url":"https:\/\/dailyai.com\/2024\/01\/google-releases-a-suite-of-advanced-robotic-tools\/","name":"Google lance une s\u00e9rie d'outils robotiques avanc\u00e9s | DailyAI","isPartOf":{"@id":"https:\/\/dailyai.com\/#website"},"primaryImageOfPage":{"@id":"https:\/\/dailyai.com\/2024\/01\/google-releases-a-suite-of-advanced-robotic-tools\/#primaryimage"},"image":{"@id":"https:\/\/dailyai.com\/2024\/01\/google-releases-a-suite-of-advanced-robotic-tools\/#primaryimage"},"thumbnailUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2024\/01\/Google-DeepMind.jpg","datePublished":"2024-01-05T12:35:47+00:00","dateModified":"2024-01-05T12:41:49+00:00","breadcrumb":{"@id":"https:\/\/dailyai.com\/2024\/01\/google-releases-a-suite-of-advanced-robotic-tools\/#breadcrumb"},"inLanguage":"fr-FR","potentialAction":[{"@type":"ReadAction","target":["https:\/\/dailyai.com\/2024\/01\/google-releases-a-suite-of-advanced-robotic-tools\/"]}]},{"@type":"ImageObject","inLanguage":"fr-FR","@id":"https:\/\/dailyai.com\/2024\/01\/google-releases-a-suite-of-advanced-robotic-tools\/#primaryimage","url":"https:\/\/dailyai.com\/wp-content\/uploads\/2024\/01\/Google-DeepMind.jpg","contentUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2024\/01\/Google-DeepMind.jpg","width":1000,"height":667},{"@type":"BreadcrumbLis
t","@id":"https:\/\/dailyai.com\/2024\/01\/google-releases-a-suite-of-advanced-robotic-tools\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/dailyai.com\/"},{"@type":"ListItem","position":2,"name":"Google releases a suite of advanced robotic tools"}]},{"@type":"WebSite","@id":"https:\/\/dailyai.com\/#website","url":"https:\/\/dailyai.com\/","name":"DailyAI","description":"Votre dose quotidienne de nouvelles sur l'IA","publisher":{"@id":"https:\/\/dailyai.com\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/dailyai.com\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"fr-FR"},{"@type":"Organization","@id":"https:\/\/dailyai.com\/#organization","name":"DailyAI","url":"https:\/\/dailyai.com\/","logo":{"@type":"ImageObject","inLanguage":"fr-FR","@id":"https:\/\/dailyai.com\/#\/schema\/logo\/image\/","url":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/06\/Daily-Ai_TL_colour.png","contentUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/06\/Daily-Ai_TL_colour.png","width":4501,"height":934,"caption":"DailyAI"},"image":{"@id":"https:\/\/dailyai.com\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/x.com\/DailyAIOfficial","https:\/\/www.linkedin.com\/company\/dailyaiofficial\/","https:\/\/www.youtube.com\/@DailyAIOfficial"]},{"@type":"Person","@id":"https:\/\/dailyai.com\/#\/schema\/person\/7ce525c6d0c79838b7cc7cde96993cfa","name":"Eug\u00e8ne van der Watt","image":{"@type":"ImageObject","inLanguage":"fr-FR","@id":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/07\/Eugine_Profile_Picture-96x96.png","url":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/07\/Eugine_Profile_Picture-96x96.png","contentUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/07\/Eugine_Profile_Picture-96x96.png","caption":"Eugene van der Watt"},"description":"Eugene a 
une formation d'ing\u00e9nieur en \u00e9lectronique et adore tout ce qui touche \u00e0 la technologie. Lorsqu'il fait une pause dans sa consommation d'informations sur l'IA, vous le trouverez \u00e0 la table de snooker.","sameAs":["www.linkedin.com\/in\/eugene-van-der-watt-16828119"],"url":"https:\/\/dailyai.com\/fr\/author\/eugene\/"}]}},"_links":{"self":[{"href":"https:\/\/dailyai.com\/fr\/wp-json\/wp\/v2\/posts\/8905","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/dailyai.com\/fr\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/dailyai.com\/fr\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/dailyai.com\/fr\/wp-json\/wp\/v2\/users\/6"}],"replies":[{"embeddable":true,"href":"https:\/\/dailyai.com\/fr\/wp-json\/wp\/v2\/comments?post=8905"}],"version-history":[{"count":6,"href":"https:\/\/dailyai.com\/fr\/wp-json\/wp\/v2\/posts\/8905\/revisions"}],"predecessor-version":[{"id":8914,"href":"https:\/\/dailyai.com\/fr\/wp-json\/wp\/v2\/posts\/8905\/revisions\/8914"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/dailyai.com\/fr\/wp-json\/wp\/v2\/media\/8908"}],"wp:attachment":[{"href":"https:\/\/dailyai.com\/fr\/wp-json\/wp\/v2\/media?parent=8905"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/dailyai.com\/fr\/wp-json\/wp\/v2\/categories?post=8905"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/dailyai.com\/fr\/wp-json\/wp\/v2\/tags?post=8905"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}