{"id":8006,"date":"2023-12-05T09:02:39","date_gmt":"2023-12-05T09:02:39","guid":{"rendered":"https:\/\/dailyai.com\/?p=8006"},"modified":"2023-12-05T13:15:00","modified_gmt":"2023-12-05T13:15:00","slug":"meta-releases-ego-exo4d-a-multimodal-perception-dataset","status":"publish","type":"post","link":"https:\/\/dailyai.com\/sv\/2023\/12\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\/","title":{"rendered":"Meta sl\u00e4pper Ego-Exo4D, ett multimodalt dataset f\u00f6r perception"},"content":{"rendered":"<p><strong>Tr\u00e4ning av AI-modeller som GPT-4 har mestadels f\u00f6rlitat sig p\u00e5 dataset som best\u00e5r av text och bilder. Metas Ego-Exo4D multimodala perceptionsdataset presenterar datavetare med en rik ny upps\u00e4ttning tr\u00e4ningsdata.<\/strong><\/p>\n<p>Du kan l\u00e4ra dig en ny f\u00e4rdighet genom att l\u00e4sa en bok, men det \u00e4r s\u00e5 mycket l\u00e4ttare n\u00e4r n\u00e5gon visar dig hur du g\u00f6r n\u00e5got medan du f\u00f6rklarar det f\u00f6r dig. Detta \u00e4r m\u00e5let Metas FAIR-team (Fundamental Artificial Intelligence Research) har f\u00f6r Ego-Exo4D.<\/p>\n<p>Datasetet best\u00e5r av videor i f\u00f6rstapersons- (Ego) och tredjepersons- (Exo) perspektiv av m\u00e4nniskor som utf\u00f6r olika kvalificerade m\u00e4nskliga aktiviteter. Det kan vara allt fr\u00e5n att laga mat, dansa, spela musik eller reparera en cykel. Uppgifterna samlades in i 13 st\u00e4der \u00f6ver hela v\u00e4rlden av 839 kamerab\u00e4rare, som spelade in 1422 timmar video.<\/p>\n<p>Videorna, som filmas samtidigt, kompletteras sedan med ytterligare datal\u00e4gen med hj\u00e4lp av Metas Project Aria-glas\u00f6gon.<\/p>\n<p>Project Aria-glas\u00f6gonen \u00e4r b\u00e4rbara datorer i glas\u00f6gonform. De f\u00e5ngar upp b\u00e4rarens video- och ljudinspelningar samt information om \u00f6gonstyrning och plats. 
Glas\u00f6gonen k\u00e4nner ocks\u00e5 av huvudpositioner och 3D-punktmoln av omgivningen.<\/p>\n<p>Resultatet \u00e4r ett dataset med samtidiga videor av en uppgift som utf\u00f6rs, med ber\u00e4ttelser i f\u00f6rsta person av kamerab\u00e4rarna som beskriver sina handlingar, och huvud- och \u00f6gonsp\u00e5rning av den person som utf\u00f6r uppgiften.<\/p>\n<blockquote class=\"twitter-tweet\" data-media-max-width=\"560\">\n<p dir=\"ltr\" lang=\"en\" style=\"text-align: center;\">Vi presenterar Ego-Exo4D - ett grundl\u00e4ggande dataset och en benchmark-svit som fokuserar p\u00e5 skickliga m\u00e4nskliga aktiviteter f\u00f6r att st\u00f6dja forskning om videoinl\u00e4rning och multimodal perception. Det \u00e4r den st\u00f6rsta offentliga dataupps\u00e4ttningen i sitt slag n\u00e5gonsin.<\/p>\n<p>Mer information \u27a1\ufe0f <a href=\"https:\/\/t.co\/82OR4msehv\">https:\/\/t.co\/82OR4msehv<\/a> <a href=\"https:\/\/t.co\/NTI1kdj1RN\">pic.twitter.com\/NTI1kdj1RN<\/a><\/p>\n<p style=\"text-align: center;\">- AI p\u00e5 Meta (@AIatMeta) <a href=\"https:\/\/twitter.com\/AIatMeta\/status\/1731739266856935796?ref_src=twsrc%5Etfw\">4 december 2023<\/a><\/p>\n<\/blockquote>\n<p><script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/p>\n<p>Meta lade sedan till tredjepersons play-by-play-beskrivningar av varje kamerab\u00e4rares handlingar. Meta anlitade ocks\u00e5 experter inom flera omr\u00e5den f\u00f6r att l\u00e4gga till tredjepersons talade expertkommentarer som kritiserade hur personen i videon utf\u00f6rde uppgiften.<\/p>\n<p>Genom att samla in b\u00e5de egocentriska och exocentriska vyer kan Ego-Exo4D-datasetet visa forskare hur aktiviteter ser ut fr\u00e5n olika perspektiv. 
Detta kan hj\u00e4lpa dem att s\u00e5 sm\u00e5ningom utveckla datorseendealgoritmer som kan k\u00e4nna igen vad en person g\u00f6r fr\u00e5n vilket perspektiv som helst.<\/p>\n<h2>Ego-Exo4D \u00f6ppnar nya m\u00f6jligheter till l\u00e4rande<\/h2>\n<p>Ett av de st\u00f6rsta hindren f\u00f6r att uppn\u00e5 AGI eller tr\u00e4na robotar mer effektivt \u00e4r den brist p\u00e5 sensorisk perception som datorer har. Som m\u00e4nniskor har vi s\u00e5 m\u00e5nga sinnesintryck fr\u00e5n v\u00e5r omgivning som vi ofta tar f\u00f6r givet n\u00e4r vi l\u00e4r oss nya f\u00e4rdigheter.<\/p>\n<p>Ego-Exo4D kommer att vara en extremt anv\u00e4ndbar resurs f\u00f6r att \u00f6verbrygga detta gap.<\/p>\n<p>Dr Gedas Bertasius, bitr\u00e4dande professor vid institutionen f\u00f6r datavetenskap vid University of North Carolina, s\u00e4ger: \"Ego-Exo4D handlar inte bara om att samla in data, utan om att f\u00f6r\u00e4ndra hur AI f\u00f6rst\u00e5r, uppfattar och l\u00e4r sig. Med m\u00e4nniskocentrerad inl\u00e4rning och perspektiv kan AI bli mer anv\u00e4ndbart i v\u00e5ra dagliga liv och hj\u00e4lpa oss p\u00e5 s\u00e4tt som vi bara har kunnat f\u00f6rest\u00e4lla oss.\"<\/p>\n<figure id=\"attachment_8008\" aria-describedby=\"caption-attachment-8008\" style=\"width: 1792px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" class=\"wp-image-8008 size-full\" src=\"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/Ego-Exo4D-training-data-snapshot.png\" alt=\"\" width=\"1792\" height=\"1072\" srcset=\"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/Ego-Exo4D-training-data-snapshot.png 1792w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/Ego-Exo4D-training-data-snapshot-300x179.png 300w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/Ego-Exo4D-training-data-snapshot-1024x613.png 1024w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/Ego-Exo4D-training-data-snapshot-768x459.png 768w, 
https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/Ego-Exo4D-training-data-snapshot-1536x919.png 1536w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/Ego-Exo4D-training-data-snapshot-370x221.png 370w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/Ego-Exo4D-training-data-snapshot-800x479.png 800w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/Ego-Exo4D-training-data-snapshot-20x12.png 20w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/Ego-Exo4D-training-data-snapshot-740x443.png 740w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/Ego-Exo4D-training-data-snapshot-1600x957.png 1600w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/Ego-Exo4D-training-data-snapshot-1320x790.png 1320w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/Ego-Exo4D-training-data-snapshot-80x48.png 80w\" sizes=\"auto, (max-width: 1792px) 100vw, 1792px\" \/><figcaption id=\"caption-attachment-8008\" class=\"wp-caption-text\">\u00d6gonblicksbild av data fr\u00e5n Ego-Exo4D-utbildning fr\u00e5n exempel p\u00e5 cykelreparation. K\u00e4lla: Meta Meta<\/figcaption><\/figure>\n<p>Meta s\u00e4ger att de hoppas att Ego-Exo4D kommer att \"m\u00f6jligg\u00f6ra framtidens robotar som f\u00e5r insikt om komplexa fingerf\u00e4rdiga manipuleringar genom att titta p\u00e5 skickliga m\u00e4nskliga experter i aktion.\"<\/p>\n<p>Detta dataset i kombination med Project Aria-glas\u00f6gonen kommer snart ocks\u00e5 att m\u00f6jligg\u00f6ra en verkligt uppslukande inl\u00e4rningsupplevelse f\u00f6r m\u00e4nniskor. F\u00f6rest\u00e4ll dig att du utf\u00f6r en uppgift medan dina glas\u00f6gon anv\u00e4nder f\u00f6rst\u00e4rkt verklighet (AR) f\u00f6r att \u00f6verlagra en instruktionsvideo eller prata med dig genom din uppgift.<\/p>\n<p>Du kan l\u00e4ra dig att spela piano och f\u00e5 en visuell \u00f6verlagring som visar hur h\u00e4nderna ska r\u00f6ra sig med ljud i realtid under tiden du spelar. 
Eller s\u00e5 kan du \u00f6ppna motorhuven p\u00e5 din bil och f\u00e5 hj\u00e4lp med att fels\u00f6ka och \u00e5tg\u00e4rda ett motorproblem.<\/p>\n<p>Det ska bli intressant att se om Metas <a href=\"https:\/\/ai.meta.com\/research\/ego-how-to\/\" target=\"_blank\" rel=\"noopener\">Ego How-To inl\u00e4rningskoncept<\/a> kommer att driva b\u00e4ttre antagande av Project Aria-glas\u00f6gon \u00e4n den misslyckade Google Glass-produkten upplevde. Det finns dock inget ord om n\u00e4r de kommer att finnas tillg\u00e4ngliga att k\u00f6pa \u00e4nnu.<\/p>\n<p>Meta kommer att g\u00f6ra Ego-Exo4D-datasetet <a href=\"https:\/\/ego-exo4d-data.org\/\" target=\"_blank\" rel=\"noopener\">tillg\u00e4nglig f\u00f6r nedladdning<\/a> f\u00f6re slutet av december.<\/p>","protected":false},"excerpt":{"rendered":"<p>Tr\u00e4ning av AI-modeller som GPT-4 har mestadels f\u00f6rlitat sig p\u00e5 dataset som best\u00e5r av text och bilder. Metas Ego-Exo4D multimodala perceptionsdataset presenterar datavetare med en rik ny upps\u00e4ttning tr\u00e4ningsdata. Du kan l\u00e4ra dig en ny f\u00e4rdighet genom att l\u00e4sa en bok, men det \u00e4r s\u00e5 mycket l\u00e4ttare n\u00e4r n\u00e5gon visar dig hur du g\u00f6r n\u00e5got medan du f\u00f6rklarar det f\u00f6r dig. Detta \u00e4r m\u00e5let Metas FAIR-team (Fundamental Artificial Intelligence Research) har f\u00f6r Ego-Exo4D. Datasetet best\u00e5r av f\u00f6rsta person (Ego) och tredje person (Exo) perspektivvideor av m\u00e4nniskor som utf\u00f6r olika skickliga m\u00e4nskliga aktiviteter. 
Det kan vara allt fr\u00e5n att laga mat, dansa eller spela musik,<\/p>","protected":false},"author":6,"featured_media":8009,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[84],"tags":[166,105,131],"class_list":["post-8006","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-industry","tag-computer-vision","tag-machine-learning","tag-meta"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.4 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Meta releases Ego-Exo4D, a multimodal perception dataset | DailyAI<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/dailyai.com\/sv\/2023\/12\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\/\" \/>\n<meta property=\"og:locale\" content=\"sv_SE\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Meta releases Ego-Exo4D, a multimodal perception dataset | DailyAI\" \/>\n<meta property=\"og:description\" content=\"Training AI models like GPT-4 has relied mostly on datasets consisting of text and images. Meta\u2019s Ego-Exo4D multimodal perception dataset presents data scientists with a rich new set of training data. You can learn a new skill by reading a book, but it\u2019s so much easier when someone shows you how to do something while explaining it to you. This is the goal Meta\u2019s FAIR (Fundamental Artificial Intelligence Research) team has for Ego-Exo4D. The dataset consists of first-person (Ego) and third-person (Exo) perspective videos of people performing different skilled human activities. 
These could be anything from cooking, dancing, playing music,\" \/>\n<meta property=\"og:url\" content=\"https:\/\/dailyai.com\/sv\/2023\/12\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\/\" \/>\n<meta property=\"og:site_name\" content=\"DailyAI\" \/>\n<meta property=\"article:published_time\" content=\"2023-12-05T09:02:39+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2023-12-05T13:15:00+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/augmented-reality-car-repair.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"1000\" \/>\n\t<meta property=\"og:image:height\" content=\"666\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"Eugene van der Watt\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@DailyAIOfficial\" \/>\n<meta name=\"twitter:site\" content=\"@DailyAIOfficial\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Eugene van der Watt\" \/>\n\t<meta name=\"twitter:label2\" content=\"Estimated reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"3 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"NewsArticle\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2023\\\/12\\\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\\\/#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2023\\\/12\\\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\\\/\"},\"author\":{\"name\":\"Eugene van der Watt\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/person\\\/7ce525c6d0c79838b7cc7cde96993cfa\"},\"headline\":\"Meta releases Ego-Exo4D, a multimodal perception 
dataset\",\"datePublished\":\"2023-12-05T09:02:39+00:00\",\"dateModified\":\"2023-12-05T13:15:00+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2023\\\/12\\\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\\\/\"},\"wordCount\":662,\"publisher\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#organization\"},\"image\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2023\\\/12\\\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/12\\\/augmented-reality-car-repair.jpg\",\"keywords\":[\"Computer vision\",\"machine learning\",\"Meta\"],\"articleSection\":[\"Industry\"],\"inLanguage\":\"sv-SE\"},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2023\\\/12\\\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\\\/\",\"url\":\"https:\\\/\\\/dailyai.com\\\/2023\\\/12\\\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\\\/\",\"name\":\"Meta releases Ego-Exo4D, a multimodal perception dataset | 
DailyAI\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2023\\\/12\\\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\\\/#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2023\\\/12\\\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/12\\\/augmented-reality-car-repair.jpg\",\"datePublished\":\"2023-12-05T09:02:39+00:00\",\"dateModified\":\"2023-12-05T13:15:00+00:00\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2023\\\/12\\\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\\\/#breadcrumb\"},\"inLanguage\":\"sv-SE\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/dailyai.com\\\/2023\\\/12\\\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\\\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"sv-SE\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2023\\\/12\\\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\\\/#primaryimage\",\"url\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/12\\\/augmented-reality-car-repair.jpg\",\"contentUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/12\\\/augmented-reality-car-repair.jpg\",\"width\":1000,\"height\":666},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2023\\\/12\\\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\\\/\\\/dailyai.com\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Meta releases Ego-Exo4D, a multimodal perception dataset\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#website\",\"url\":\"https:\\\/\\\/dailyai.com\\\/\",\"name\":\"DailyAI\",\"description\":\"Your Daily Dose of AI 
News\",\"publisher\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/dailyai.com\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"sv-SE\"},{\"@type\":\"Organization\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#organization\",\"name\":\"DailyAI\",\"url\":\"https:\\\/\\\/dailyai.com\\\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"sv-SE\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/logo\\\/image\\\/\",\"url\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/06\\\/Daily-Ai_TL_colour.png\",\"contentUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/06\\\/Daily-Ai_TL_colour.png\",\"width\":4501,\"height\":934,\"caption\":\"DailyAI\"},\"image\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/logo\\\/image\\\/\"},\"sameAs\":[\"https:\\\/\\\/x.com\\\/DailyAIOfficial\",\"https:\\\/\\\/www.linkedin.com\\\/company\\\/dailyaiofficial\\\/\",\"https:\\\/\\\/www.youtube.com\\\/@DailyAIOfficial\"]},{\"@type\":\"Person\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/person\\\/7ce525c6d0c79838b7cc7cde96993cfa\",\"name\":\"Eugene van der Watt\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"sv-SE\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/07\\\/Eugine_Profile_Picture-96x96.png\",\"url\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/07\\\/Eugine_Profile_Picture-96x96.png\",\"contentUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/07\\\/Eugine_Profile_Picture-96x96.png\",\"caption\":\"Eugene van der Watt\"},\"description\":\"Eugene comes from an electronic engineering background and loves all things tech. 
When he takes a break from consuming AI news you'll find him at the snooker table.\",\"sameAs\":[\"www.linkedin.com\\\/in\\\/eugene-van-der-watt-16828119\"],\"url\":\"https:\\\/\\\/dailyai.com\\\/sv\\\/author\\\/eugene\\\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Meta releases Ego-Exo4D, a multimodal perception dataset | DailyAI","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/dailyai.com\/sv\/2023\/12\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\/","og_locale":"sv_SE","og_type":"article","og_title":"Meta releases Ego-Exo4D, a multimodal perception dataset | DailyAI","og_description":"Training AI models like GPT-4 has relied mostly on datasets consisting of text and images. Meta\u2019s Ego-Exo4D multimodal perception dataset presents data scientists with a rich new set of training data. You can learn a new skill by reading a book, but it\u2019s so much easier when someone shows you how to do something while explaining it to you. This is the goal Meta\u2019s FAIR (Fundamental Artificial Intelligence Research) team has for Ego-Exo4D. The dataset consists of first-person (Ego) and third-person (Exo) perspective videos of people performing different skilled human activities. 
These could be anything from cooking, dancing, playing music,","og_url":"https:\/\/dailyai.com\/sv\/2023\/12\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\/","og_site_name":"DailyAI","article_published_time":"2023-12-05T09:02:39+00:00","article_modified_time":"2023-12-05T13:15:00+00:00","og_image":[{"width":1000,"height":666,"url":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/augmented-reality-car-repair.jpg","type":"image\/jpeg"}],"author":"Eugene van der Watt","twitter_card":"summary_large_image","twitter_creator":"@DailyAIOfficial","twitter_site":"@DailyAIOfficial","twitter_misc":{"Written by":"Eugene van der Watt","Estimated reading time":"3 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"NewsArticle","@id":"https:\/\/dailyai.com\/2023\/12\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\/#article","isPartOf":{"@id":"https:\/\/dailyai.com\/2023\/12\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\/"},"author":{"name":"Eugene van der Watt","@id":"https:\/\/dailyai.com\/#\/schema\/person\/7ce525c6d0c79838b7cc7cde96993cfa"},"headline":"Meta releases Ego-Exo4D, a multimodal perception dataset","datePublished":"2023-12-05T09:02:39+00:00","dateModified":"2023-12-05T13:15:00+00:00","mainEntityOfPage":{"@id":"https:\/\/dailyai.com\/2023\/12\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\/"},"wordCount":662,"publisher":{"@id":"https:\/\/dailyai.com\/#organization"},"image":{"@id":"https:\/\/dailyai.com\/2023\/12\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\/#primaryimage"},"thumbnailUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/augmented-reality-car-repair.jpg","keywords":["Computer vision","machine 
learning","Meta"],"articleSection":["Industry"],"inLanguage":"sv-SE"},{"@type":"WebPage","@id":"https:\/\/dailyai.com\/2023\/12\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\/","url":"https:\/\/dailyai.com\/2023\/12\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\/","name":"Meta sl\u00e4pper Ego-Exo4D, en multimodal uppfattningsdataset | DailyAI","isPartOf":{"@id":"https:\/\/dailyai.com\/#website"},"primaryImageOfPage":{"@id":"https:\/\/dailyai.com\/2023\/12\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\/#primaryimage"},"image":{"@id":"https:\/\/dailyai.com\/2023\/12\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\/#primaryimage"},"thumbnailUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/augmented-reality-car-repair.jpg","datePublished":"2023-12-05T09:02:39+00:00","dateModified":"2023-12-05T13:15:00+00:00","breadcrumb":{"@id":"https:\/\/dailyai.com\/2023\/12\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\/#breadcrumb"},"inLanguage":"sv-SE","potentialAction":[{"@type":"ReadAction","target":["https:\/\/dailyai.com\/2023\/12\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\/"]}]},{"@type":"ImageObject","inLanguage":"sv-SE","@id":"https:\/\/dailyai.com\/2023\/12\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\/#primaryimage","url":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/augmented-reality-car-repair.jpg","contentUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/augmented-reality-car-repair.jpg","width":1000,"height":666},{"@type":"BreadcrumbList","@id":"https:\/\/dailyai.com\/2023\/12\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/dailyai.com\/"},{"@type":"ListItem","position":2,"name":"Meta releases Ego-Exo4D, a multimodal perception 
dataset"}]},{"@type":"WebSite","@id":"https:\/\/dailyai.com\/#website","url":"https:\/\/dailyai.com\/","name":"DagligaAI","description":"Din dagliga dos av AI-nyheter","publisher":{"@id":"https:\/\/dailyai.com\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/dailyai.com\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"sv-SE"},{"@type":"Organization","@id":"https:\/\/dailyai.com\/#organization","name":"DagligaAI","url":"https:\/\/dailyai.com\/","logo":{"@type":"ImageObject","inLanguage":"sv-SE","@id":"https:\/\/dailyai.com\/#\/schema\/logo\/image\/","url":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/06\/Daily-Ai_TL_colour.png","contentUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/06\/Daily-Ai_TL_colour.png","width":4501,"height":934,"caption":"DailyAI"},"image":{"@id":"https:\/\/dailyai.com\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/x.com\/DailyAIOfficial","https:\/\/www.linkedin.com\/company\/dailyaiofficial\/","https:\/\/www.youtube.com\/@DailyAIOfficial"]},{"@type":"Person","@id":"https:\/\/dailyai.com\/#\/schema\/person\/7ce525c6d0c79838b7cc7cde96993cfa","name":"Eugene van der Watt","image":{"@type":"ImageObject","inLanguage":"sv-SE","@id":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/07\/Eugine_Profile_Picture-96x96.png","url":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/07\/Eugine_Profile_Picture-96x96.png","contentUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/07\/Eugine_Profile_Picture-96x96.png","caption":"Eugene van der Watt"},"description":"Eugene kommer fr\u00e5n en bakgrund som elektronikingenj\u00f6r och \u00e4lskar allt som har med teknik att g\u00f6ra. 
N\u00e4r han tar en paus fr\u00e5n att konsumera AI-nyheter hittar du honom vid snookerbordet.","sameAs":["www.linkedin.com\/in\/eugene-van-der-watt-16828119"],"url":"https:\/\/dailyai.com\/sv\/author\/eugene\/"}]}},"_links":{"self":[{"href":"https:\/\/dailyai.com\/sv\/wp-json\/wp\/v2\/posts\/8006","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/dailyai.com\/sv\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/dailyai.com\/sv\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/dailyai.com\/sv\/wp-json\/wp\/v2\/users\/6"}],"replies":[{"embeddable":true,"href":"https:\/\/dailyai.com\/sv\/wp-json\/wp\/v2\/comments?post=8006"}],"version-history":[{"count":4,"href":"https:\/\/dailyai.com\/sv\/wp-json\/wp\/v2\/posts\/8006\/revisions"}],"predecessor-version":[{"id":8021,"href":"https:\/\/dailyai.com\/sv\/wp-json\/wp\/v2\/posts\/8006\/revisions\/8021"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/dailyai.com\/sv\/wp-json\/wp\/v2\/media\/8009"}],"wp:attachment":[{"href":"https:\/\/dailyai.com\/sv\/wp-json\/wp\/v2\/media?parent=8006"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/dailyai.com\/sv\/wp-json\/wp\/v2\/categories?post=8006"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/dailyai.com\/sv\/wp-json\/wp\/v2\/tags?post=8006"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}