{"id":6133,"date":"2023-10-05T12:53:51","date_gmt":"2023-10-05T12:53:51","guid":{"rendered":"https:\/\/dailyai.com\/?p=6133"},"modified":"2023-10-05T12:53:51","modified_gmt":"2023-10-05T12:53:51","slug":"open-x-embodiment-dataset-rt-x-model-a-leap-for-ai-robots","status":"publish","type":"post","link":"https:\/\/dailyai.com\/fr\/2023\/10\/open-x-embodiment-dataset-rt-x-model-a-leap-for-ai-robots\/","title":{"rendered":"L'ensemble de donn\u00e9es Open X-Embodiment et le mod\u00e8le RT-X, un pas en avant pour les robots d'IA"},"content":{"rendered":"<p><strong>DeepMind de Google a collabor\u00e9 avec 33 laboratoires universitaires diff\u00e9rents pour cr\u00e9er un ensemble de donn\u00e9es d'entra\u00eenement \u00e0 l'IA bas\u00e9 sur 22 types de robots diff\u00e9rents.<\/strong><\/p>\n<p>Les robots sont tr\u00e8s dou\u00e9s pour faire une chose pr\u00e9cise. Si vous voulez qu'il fasse quelque chose de l\u00e9g\u00e8rement diff\u00e9rent, le robot doit \u00eatre form\u00e9 \u00e0 partir de z\u00e9ro. L'objectif ultime de la robotique est de disposer d'un robot capable de r\u00e9aliser un large \u00e9ventail d'actions et d'acqu\u00e9rir de nouvelles comp\u00e9tences par lui-m\u00eame.<\/p>\n<p>Pour entra\u00eener un mod\u00e8le d'intelligence artificielle, il faut disposer d'un vaste ensemble de donn\u00e9es en rapport avec l'objectif du mod\u00e8le. Les mod\u00e8les linguistiques tels que <a href=\"https:\/\/dailyai.com\/fr\/2023\/09\/openai-reveals-new-voice-and-image-features-for-chatgpt\/\">GPT-4<\/a> sont form\u00e9s sur de grandes quantit\u00e9s de donn\u00e9es \u00e9crites. Les g\u00e9n\u00e9rateurs d'images tels que <a href=\"https:\/\/dailyai.com\/fr\/2023\/10\/dall-e-3-ai-image-generator-available-free-on-bing-chat\/\">DALL-E 3<\/a> sont form\u00e9s sur de grandes quantit\u00e9s d'images.<\/p>\n<p>Avec X-Embodiment, DeepMind a cr\u00e9\u00e9 un ensemble de donn\u00e9es d'actions robotiques bas\u00e9es sur 22 types de robots diff\u00e9rents. 
Il a ensuite utilis\u00e9 cet ensemble de donn\u00e9es pour former de nouveaux mod\u00e8les bas\u00e9s sur ses mod\u00e8les robotiques RT-1 et RT-2.<\/p>\n<p>Les donn\u00e9es relatives \u00e0 X-Embodiment proviennent de \"22 incarnations de robots, d\u00e9montrant plus de 500 comp\u00e9tences et 150 000 t\u00e2ches \u00e0 travers plus d'un million d'\u00e9pisodes\". <a href=\"https:\/\/www.deepmind.com\/blog\/scaling-up-learning-across-many-different-robot-types\" target=\"_blank\" rel=\"noopener\">Message de DeepMind<\/a>.<\/p>\n<blockquote class=\"twitter-tweet\">\n<p dir=\"ltr\" lang=\"en\" style=\"text-align: center;\">Pr\u00e9sentation de \ud835\udde5\ud835\udde7-\ud835\uddeb : un mod\u00e8le d'IA g\u00e9n\u00e9raliste pour aider \u00e0 faire progresser la fa\u00e7on dont les robots peuvent apprendre de nouvelles comp\u00e9tences. \ud83e\udd16<\/p>\n<p>Pour l'entra\u00eener, nous nous sommes associ\u00e9s \u00e0 33 laboratoires universitaires du monde entier afin de constituer un nouvel ensemble de donn\u00e9es contenant les exp\u00e9riences acquises par 22 types de robots diff\u00e9rents.<\/p>\n<p>Pour en savoir plus : <a href=\"https:\/\/t.co\/k6tE62gQGP\">https:\/\/t.co\/k6tE62gQGP<\/a> <a href=\"https:\/\/t.co\/IXTy2g4Lty\">pic.twitter.com\/IXTy2g4Lty<\/a><\/p>\n<p style=\"text-align: center;\">- Google DeepMind (@GoogleDeepMind) <a href=\"https:\/\/twitter.com\/GoogleDeepMind\/status\/1709207886943965648?ref_src=twsrc%5Etfw\">3 octobre 2023<\/a><\/p>\n<\/blockquote>\n<p><script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/p>\n<p>Les r\u00e9sultats des tests ant\u00e9rieurs de la RT-1 et de la <a href=\"https:\/\/dailyai.com\/fr\/2023\/07\/googles-ai-turns-vision-language-into-robotic-actions\/\">Mod\u00e8les RT-2<\/a> \u00e9taient d\u00e9j\u00e0 impressionnantes, mais DeepMind a constat\u00e9 que les versions RT-X \u00e9taient nettement plus performantes en raison de la nature g\u00e9n\u00e9rale du nouvel 
ensemble de donn\u00e9es.<\/p>\n<p>Les tests ont consist\u00e9 \u00e0 comparer un robot contr\u00f4l\u00e9 par un mod\u00e8le entra\u00een\u00e9 pour une t\u00e2che sp\u00e9cifique avec ce m\u00eame robot contr\u00f4l\u00e9 par le mod\u00e8le RT-1-X. RT-1-X a r\u00e9alis\u00e9 en moyenne 50% de mieux que les mod\u00e8les con\u00e7us sp\u00e9cifiquement pour des t\u00e2ches telles que l'ouverture d'une porte ou l'acheminement d'un c\u00e2ble.<\/p>\n<p>RT-2, le mod\u00e8le robotique vision-langage-action (VLA) de Google, permet aux robots d'apprendre \u00e0 partir de donn\u00e9es web, verbales et visuelles, puis d'agir sans avoir \u00e9t\u00e9 form\u00e9s. Lorsque les ing\u00e9nieurs ont entra\u00een\u00e9 RT-2-X avec l'ensemble de donn\u00e9es X-Embodiment, ils ont constat\u00e9 que RT-2-X \u00e9tait trois fois plus performant que RT-2 en ce qui concerne les comp\u00e9tences \u00e9mergentes.<\/p>\n<figure id=\"attachment_6135\" aria-describedby=\"caption-attachment-6135\" style=\"width: 640px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" class=\"wp-image-6135 size-full\" src=\"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/RT-2-X-demonstration.gif\" alt=\"\" width=\"640\" height=\"360\" \/><figcaption id=\"caption-attachment-6135\" class=\"wp-caption-text\">RT-2-X d\u00e9montre une compr\u00e9hension des relations spatiales entre les objets. Source : <a href=\"https:\/\/www.deepmind.com\/blog\/scaling-up-learning-across-many-different-robot-types\" target=\"_blank\" rel=\"noopener\">DeepMind<\/a><\/figcaption><\/figure>\n<p>En d'autres termes, le robot apprenait de nouvelles comp\u00e9tences qu'il ne poss\u00e9dait pas auparavant, sur la base des capacit\u00e9s que d'autres robots avaient apport\u00e9es \u00e0 l'ensemble de donn\u00e9es. 
Le transfert de comp\u00e9tences entre diff\u00e9rents types de robots pourrait changer la donne en mati\u00e8re de d\u00e9veloppement rapide de la robotique.<\/p>\n<p>Ces r\u00e9sultats incitent \u00e0 l'optimisme : nous verrons bient\u00f4t des robots dot\u00e9s de comp\u00e9tences plus g\u00e9n\u00e9rales et capables d'en acqu\u00e9rir de nouvelles sans avoir \u00e9t\u00e9 sp\u00e9cifiquement form\u00e9s \u00e0 cet effet.<\/p>\n<p>DeepMind affirme que cette recherche pourrait \u00eatre appliqu\u00e9e \u00e0 la propri\u00e9t\u00e9 d'auto-am\u00e9lioration du <a href=\"https:\/\/www.deepmind.com\/blog\/robocat-a-self-improving-robotic-agent\" target=\"_blank\" rel=\"noopener\">RoboCat<\/a>, son agent d'intelligence artificielle auto-am\u00e9liorant pour la robotique.<\/p>\n<p>La perspective de disposer d'un robot qui ne cesse de s'am\u00e9liorer et d'acqu\u00e9rir de nouvelles comp\u00e9tences constituerait un avantage consid\u00e9rable dans des domaines tels que la fabrication, l'agriculture ou les soins de sant\u00e9. Ces nouvelles comp\u00e9tences pourraient \u00e9galement \u00eatre appliqu\u00e9es dans le <a href=\"https:\/\/dailyai.com\/fr\/2023\/10\/darpa-wants-to-use-ai-to-make-better-battlefield-decisions\/\">industrie de la d\u00e9fense<\/a> ce qui est peut-\u00eatre une perspective moins attrayante, bien qu'in\u00e9vitable.<\/p>","protected":false},"excerpt":{"rendered":"<p>DeepMind de Google a collabor\u00e9 avec 33 laboratoires universitaires diff\u00e9rents pour cr\u00e9er un ensemble de donn\u00e9es d'entra\u00eenement \u00e0 l'IA bas\u00e9 sur 22 types de robots diff\u00e9rents. Les robots sont tr\u00e8s dou\u00e9s pour faire une chose pr\u00e9cise. Si vous voulez qu'il fasse quelque chose de l\u00e9g\u00e8rement diff\u00e9rent, le robot doit \u00eatre entra\u00een\u00e9 \u00e0 partir de z\u00e9ro. 
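<p>The core idea behind the dataset is pooling trajectories from many robot embodiments into one training corpus, so a single policy sees skills no individual robot demonstrated. The sketch below is purely illustrative: the class, field names, and example embodiments are hypothetical and do not reflect DeepMind's actual data schema.</p>

```python
from dataclasses import dataclass


@dataclass
class Episode:
    """One recorded trajectory (hypothetical, simplified schema)."""
    embodiment: str  # which robot type produced it, e.g. "franka-panda"
    skill: str       # the demonstrated skill, e.g. "open-door"
    steps: list      # (observation, action) pairs along the trajectory


class PooledDataset:
    """Pools episodes across embodiments so one model can train on all of them."""

    def __init__(self):
        self.episodes = []

    def add(self, episode: Episode):
        self.episodes.append(episode)

    def skills(self):
        return {e.skill for e in self.episodes}

    def embodiments(self):
        return {e.embodiment for e in self.episodes}


# Two different robots contribute different skills to the same pool.
pool = PooledDataset()
pool.add(Episode("franka-panda", "open-door", steps=[("img0", "act0")]))
pool.add(Episode("widowx", "route-cable", steps=[("img0", "act0")]))

# A model trained on the pool is exposed to the union of skills,
# which is the mechanism behind the cross-embodiment transfer described above.
print(sorted(pool.skills()))       # → ['open-door', 'route-cable']
print(sorted(pool.embodiments()))  # → ['franka-panda', 'widowx']
```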