{"id":8006,"date":"2023-12-05T09:02:39","date_gmt":"2023-12-05T09:02:39","guid":{"rendered":"https:\/\/dailyai.com\/?p=8006"},"modified":"2023-12-05T13:15:00","modified_gmt":"2023-12-05T13:15:00","slug":"meta-releases-ego-exo4d-a-multimodal-perception-dataset","status":"publish","type":"post","link":"https:\/\/dailyai.com\/nb\/2023\/12\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\/","title":{"rendered":"Meta lanserer Ego-Exo4D, et multimodalt persepsjonsdatasett"},"content":{"rendered":"<p><strong>Oppl\u00e6ring av AI-modeller som GPT-4 har stort sett v\u00e6rt basert p\u00e5 datasett som best\u00e5r av tekst og bilder. Metas Ego-Exo4D multimodale persepsjonsdatasett gir dataforskere et rikt nytt sett med treningsdata.<\/strong><\/p>\n<p>Du kan l\u00e6re en ny ferdighet ved \u00e5 lese en bok, men det er s\u00e5 mye enklere n\u00e5r noen viser deg hvordan du gj\u00f8r noe mens de forklarer det for deg. Dette er m\u00e5let Metas FAIR-team (Fundamental Artificial Intelligence Research) har for Ego-Exo4D.<\/p>\n<p>Datasettet best\u00e5r av videoer i f\u00f8rstepersons- (Ego) og tredjepersons- (Exo) perspektiv av mennesker som utf\u00f8rer ulike menneskelige aktiviteter. Det kan v\u00e6re alt fra matlaging, dansing, musikk eller sykkelreparasjon. Dataene ble samlet inn i 13 byer over hele verden av 839 kamerabrukere, og det ble tatt opp 1422 timer med video.<\/p>\n<p>Videoene, som er filmet samtidig, blir deretter utvidet med flere typer data ved hjelp av Metas Project Aria-briller.<\/p>\n<p>Project Aria-brillene er b\u00e6rbare datamaskiner i brilleform. De fanger opp video- og lydopptak, samt \u00f8yesporing og lokasjonsinformasjon. 
The glasses also record head poses and 3D point clouds of the surroundings.<\/p>\n<p>The result is a dataset of simultaneous videos of a task being performed, with first-person narrations from the camera wearers describing their actions, along with head and eye tracking of the person performing the task.<\/p>\n<blockquote class=\"twitter-tweet\" data-media-max-width=\"560\">\n<p dir=\"ltr\" lang=\"en\" style=\"text-align: center;\">Introducing Ego-Exo4D - a foundational dataset and benchmark suite focused on skilled human activities to support research on video learning and multimodal perception. It is the largest public dataset of its kind ever.<\/p>\n<p>More information \u27a1\ufe0f <a href=\"https:\/\/t.co\/82OR4msehv\">https:\/\/t.co\/82OR4msehv<\/a> <a href=\"https:\/\/t.co\/NTI1kdj1RN\">pic.twitter.com\/NTI1kdj1RN<\/a><\/p>\n<p style=\"text-align: center;\">- AI at Meta (@AIatMeta) <a href=\"https:\/\/twitter.com\/AIatMeta\/status\/1731739266856935796?ref_src=twsrc%5Etfw\">December 4, 2023<\/a><\/p>\n<\/blockquote>\n<p><script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/p>\n<p>Meta then added third-person play-by-play descriptions of each camera wearer\u2019s actions. Meta also hired experts in several fields to add spoken third-person expert commentary critiquing how the person in the video performed the task.<\/p>\n<p>By collecting both egocentric and exocentric footage, the Ego-Exo4D dataset can show researchers how activities look from different perspectives. 
This could help them develop computer vision algorithms that recognize what a person is doing from any perspective.<\/p>\n<h2>Ego-Exo4D opens up new possibilities for learning<\/h2>\n<p>One of the biggest obstacles to achieving AGI, or to training robots more efficiently, is computers\u2019 lack of sensory perception. As humans, we take in so much sensory input from our surroundings that we often take it for granted when learning new skills.<\/p>\n<p>Ego-Exo4D will be a hugely useful resource in bridging this gap.<\/p>\n<p>Dr. Gedas Bertasius, assistant professor in the Department of Computer Science at the University of North Carolina, says: \"Ego-Exo4D is not just about collecting data; it is about changing how AI understands, perceives, and learns. With human-centric learning and perspective, AI can become more useful in our everyday lives, helping us in ways we can only imagine.\"<\/p>\n<figure id=\"attachment_8008\" aria-describedby=\"caption-attachment-8008\" style=\"width: 1792px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" class=\"wp-image-8008 size-full\" src=\"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/Ego-Exo4D-training-data-snapshot.png\" alt=\"\" width=\"1792\" height=\"1072\" srcset=\"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/Ego-Exo4D-training-data-snapshot.png 1792w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/Ego-Exo4D-training-data-snapshot-300x179.png 300w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/Ego-Exo4D-training-data-snapshot-1024x613.png 1024w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/Ego-Exo4D-training-data-snapshot-768x459.png 768w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/Ego-Exo4D-training-data-snapshot-1536x919.png 1536w, 
https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/Ego-Exo4D-training-data-snapshot-370x221.png 370w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/Ego-Exo4D-training-data-snapshot-800x479.png 800w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/Ego-Exo4D-training-data-snapshot-20x12.png 20w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/Ego-Exo4D-training-data-snapshot-740x443.png 740w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/Ego-Exo4D-training-data-snapshot-1600x957.png 1600w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/Ego-Exo4D-training-data-snapshot-1320x790.png 1320w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/Ego-Exo4D-training-data-snapshot-80x48.png 80w\" sizes=\"auto, (max-width: 1792px) 100vw, 1792px\" \/><figcaption id=\"caption-attachment-8008\" class=\"wp-caption-text\">Snapshot of Ego-Exo4D training data from a bike repair example. Source: Meta<\/figcaption><\/figure>\n<p>Meta says it hopes Ego-Exo4D will \"enable robots of the future to gain insight into complex dexterous manipulations by watching skilled human experts in action\".<\/p>\n<p>This dataset, combined with the Project Aria glasses, could soon also enable a truly immersive learning experience for humans. Imagine performing a task while your glasses use augmented reality (AR) to overlay a tutorial video or talk you through the steps.<\/p>\n<p>You could learn to play the piano with a visual overlay showing where your hands should move, paired with real-time audio instructions as you play. 
Or you could pop the hood of your car and be guided through troubleshooting and repairing an engine problem.<\/p>\n<p>It will be interesting to see whether Meta\u2019s <a href=\"https:\/\/ai.meta.com\/research\/ego-how-to\/\" target=\"_blank\" rel=\"noopener\">Ego How-To learning concept<\/a> drives better adoption of Project Aria glasses than the ill-fated Google Glass ever saw. There is no word yet on when they will be available for purchase, though.<\/p>\n<p>Meta will make the Ego-Exo4D dataset <a href=\"https:\/\/ego-exo4d-data.org\/\" target=\"_blank\" rel=\"noopener\">available for download<\/a> before the end of December.<\/p>","protected":false},"excerpt":{"rendered":"<p>Training AI models like GPT-4 has relied mostly on datasets consisting of text and images. Meta\u2019s Ego-Exo4D multimodal perception dataset presents data scientists with a rich new set of training data. You can learn a new skill by reading a book, but it\u2019s so much easier when someone shows you how to do something while explaining it to you. This is the goal Meta\u2019s FAIR (Fundamental Artificial Intelligence Research) team has for Ego-Exo4D. The dataset consists of first-person (Ego) and third-person (Exo) perspective videos of people performing different skilled human activities. 
These could be anything from cooking, dancing, playing music,<\/p>","protected":false},"author":6,"featured_media":8009,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[84],"tags":[166,105,131],"class_list":["post-8006","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-industry","tag-computer-vision","tag-machine-learning","tag-meta"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.4 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Meta releases Ego-Exo4D, a multimodal perception dataset | DailyAI<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/dailyai.com\/nb\/2023\/12\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\/\" \/>\n<meta property=\"og:locale\" content=\"nb_NO\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Meta releases Ego-Exo4D, a multimodal perception dataset | DailyAI\" \/>\n<meta property=\"og:description\" content=\"Training AI models like GPT-4 has relied mostly on datasets consisting of text and images. Meta\u2019s Ego-Exo4D multimodal perception dataset presents data scientists with a rich new set of training data. You can learn a new skill by reading a book, but it\u2019s so much easier when someone shows you how to do something while explaining it to you. This is the goal Meta\u2019s FAIR (Fundamental Artificial Intelligence Research) team has for Ego-Exo4D. The dataset consists of first-person (Ego) and third-person (Exo) perspective videos of people performing different skilled human activities. 
These could be anything from cooking, dancing, playing music,\" \/>\n<meta property=\"og:url\" content=\"https:\/\/dailyai.com\/nb\/2023\/12\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\/\" \/>\n<meta property=\"og:site_name\" content=\"DailyAI\" \/>\n<meta property=\"article:published_time\" content=\"2023-12-05T09:02:39+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2023-12-05T13:15:00+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/augmented-reality-car-repair.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"1000\" \/>\n\t<meta property=\"og:image:height\" content=\"666\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"Eugene van der Watt\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@DailyAIOfficial\" \/>\n<meta name=\"twitter:site\" content=\"@DailyAIOfficial\" \/>\n<meta name=\"twitter:label1\" content=\"Skrevet av\" \/>\n\t<meta name=\"twitter:data1\" content=\"Eugene van der Watt\" \/>\n\t<meta name=\"twitter:label2\" content=\"Ansl. 
lesetid\" \/>\n\t<meta name=\"twitter:data2\" content=\"3 minutter\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"NewsArticle\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2023\\\/12\\\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\\\/#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2023\\\/12\\\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\\\/\"},\"author\":{\"name\":\"Eugene van der Watt\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/person\\\/7ce525c6d0c79838b7cc7cde96993cfa\"},\"headline\":\"Meta releases Ego-Exo4D, a multimodal perception dataset\",\"datePublished\":\"2023-12-05T09:02:39+00:00\",\"dateModified\":\"2023-12-05T13:15:00+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2023\\\/12\\\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\\\/\"},\"wordCount\":662,\"publisher\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#organization\"},\"image\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2023\\\/12\\\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/12\\\/augmented-reality-car-repair.jpg\",\"keywords\":[\"Computer vision\",\"machine learning\",\"Meta\"],\"articleSection\":[\"Industry\"],\"inLanguage\":\"nb-NO\"},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2023\\\/12\\\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\\\/\",\"url\":\"https:\\\/\\\/dailyai.com\\\/2023\\\/12\\\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\\\/\",\"name\":\"Meta releases Ego-Exo4D, a multimodal perception dataset | 
DailyAI\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2023\\\/12\\\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\\\/#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2023\\\/12\\\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/12\\\/augmented-reality-car-repair.jpg\",\"datePublished\":\"2023-12-05T09:02:39+00:00\",\"dateModified\":\"2023-12-05T13:15:00+00:00\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2023\\\/12\\\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\\\/#breadcrumb\"},\"inLanguage\":\"nb-NO\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/dailyai.com\\\/2023\\\/12\\\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\\\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"nb-NO\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2023\\\/12\\\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\\\/#primaryimage\",\"url\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/12\\\/augmented-reality-car-repair.jpg\",\"contentUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/12\\\/augmented-reality-car-repair.jpg\",\"width\":1000,\"height\":666},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2023\\\/12\\\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\\\/\\\/dailyai.com\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Meta releases Ego-Exo4D, a multimodal perception dataset\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#website\",\"url\":\"https:\\\/\\\/dailyai.com\\\/\",\"name\":\"DailyAI\",\"description\":\"Your Daily Dose of AI 
News\",\"publisher\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/dailyai.com\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"nb-NO\"},{\"@type\":\"Organization\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#organization\",\"name\":\"DailyAI\",\"url\":\"https:\\\/\\\/dailyai.com\\\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"nb-NO\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/logo\\\/image\\\/\",\"url\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/06\\\/Daily-Ai_TL_colour.png\",\"contentUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/06\\\/Daily-Ai_TL_colour.png\",\"width\":4501,\"height\":934,\"caption\":\"DailyAI\"},\"image\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/logo\\\/image\\\/\"},\"sameAs\":[\"https:\\\/\\\/x.com\\\/DailyAIOfficial\",\"https:\\\/\\\/www.linkedin.com\\\/company\\\/dailyaiofficial\\\/\",\"https:\\\/\\\/www.youtube.com\\\/@DailyAIOfficial\"]},{\"@type\":\"Person\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/person\\\/7ce525c6d0c79838b7cc7cde96993cfa\",\"name\":\"Eugene van der Watt\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"nb-NO\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/07\\\/Eugine_Profile_Picture-96x96.png\",\"url\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/07\\\/Eugine_Profile_Picture-96x96.png\",\"contentUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/07\\\/Eugine_Profile_Picture-96x96.png\",\"caption\":\"Eugene van der Watt\"},\"description\":\"Eugene comes from an electronic engineering background and loves all things tech. 
When he takes a break from consuming AI news you'll find him at the snooker table.\",\"sameAs\":[\"www.linkedin.com\\\/in\\\/eugene-van-der-watt-16828119\"],\"url\":\"https:\\\/\\\/dailyai.com\\\/nb\\\/author\\\/eugene\\\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Meta lanserer Ego-Exo4D, et multimodalt persepsjonsdatasett | DailyAI","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/dailyai.com\/nb\/2023\/12\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\/","og_locale":"nb_NO","og_type":"article","og_title":"Meta releases Ego-Exo4D, a multimodal perception dataset | DailyAI","og_description":"Training AI models like GPT-4 has relied mostly on datasets consisting of text and images. Meta\u2019s Ego-Exo4D multimodal perception dataset presents data scientists with a rich new set of training data. You can learn a new skill by reading a book, but it\u2019s so much easier when someone shows you how to do something while explaining it to you. This is the goal Meta\u2019s FAIR (Fundamental Artificial Intelligence Research) team has for Ego-Exo4D. The dataset consists of first-person (Ego) and third-person (Exo) perspective videos of people performing different skilled human activities. 
These could be anything from cooking, dancing, playing music,","og_url":"https:\/\/dailyai.com\/nb\/2023\/12\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\/","og_site_name":"DailyAI","article_published_time":"2023-12-05T09:02:39+00:00","article_modified_time":"2023-12-05T13:15:00+00:00","og_image":[{"width":1000,"height":666,"url":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/augmented-reality-car-repair.jpg","type":"image\/jpeg"}],"author":"Eugene van der Watt","twitter_card":"summary_large_image","twitter_creator":"@DailyAIOfficial","twitter_site":"@DailyAIOfficial","twitter_misc":{"Skrevet av":"Eugene van der Watt","Ansl. lesetid":"3 minutter"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"NewsArticle","@id":"https:\/\/dailyai.com\/2023\/12\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\/#article","isPartOf":{"@id":"https:\/\/dailyai.com\/2023\/12\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\/"},"author":{"name":"Eugene van der Watt","@id":"https:\/\/dailyai.com\/#\/schema\/person\/7ce525c6d0c79838b7cc7cde96993cfa"},"headline":"Meta releases Ego-Exo4D, a multimodal perception dataset","datePublished":"2023-12-05T09:02:39+00:00","dateModified":"2023-12-05T13:15:00+00:00","mainEntityOfPage":{"@id":"https:\/\/dailyai.com\/2023\/12\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\/"},"wordCount":662,"publisher":{"@id":"https:\/\/dailyai.com\/#organization"},"image":{"@id":"https:\/\/dailyai.com\/2023\/12\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\/#primaryimage"},"thumbnailUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/augmented-reality-car-repair.jpg","keywords":["Computer vision","machine 
learning","Meta"],"articleSection":["Industry"],"inLanguage":"nb-NO"},{"@type":"WebPage","@id":"https:\/\/dailyai.com\/2023\/12\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\/","url":"https:\/\/dailyai.com\/2023\/12\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\/","name":"Meta lanserer Ego-Exo4D, et multimodalt persepsjonsdatasett | DailyAI","isPartOf":{"@id":"https:\/\/dailyai.com\/#website"},"primaryImageOfPage":{"@id":"https:\/\/dailyai.com\/2023\/12\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\/#primaryimage"},"image":{"@id":"https:\/\/dailyai.com\/2023\/12\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\/#primaryimage"},"thumbnailUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/augmented-reality-car-repair.jpg","datePublished":"2023-12-05T09:02:39+00:00","dateModified":"2023-12-05T13:15:00+00:00","breadcrumb":{"@id":"https:\/\/dailyai.com\/2023\/12\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\/#breadcrumb"},"inLanguage":"nb-NO","potentialAction":[{"@type":"ReadAction","target":["https:\/\/dailyai.com\/2023\/12\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\/"]}]},{"@type":"ImageObject","inLanguage":"nb-NO","@id":"https:\/\/dailyai.com\/2023\/12\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\/#primaryimage","url":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/augmented-reality-car-repair.jpg","contentUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/12\/augmented-reality-car-repair.jpg","width":1000,"height":666},{"@type":"BreadcrumbList","@id":"https:\/\/dailyai.com\/2023\/12\/meta-releases-ego-exo4d-a-multimodal-perception-dataset\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/dailyai.com\/"},{"@type":"ListItem","position":2,"name":"Meta releases Ego-Exo4D, a multimodal perception 
dataset"}]},{"@type":"WebSite","@id":"https:\/\/dailyai.com\/#website","url":"https:\/\/dailyai.com\/","name":"DagligAI","description":"Din daglige dose med AI-nyheter","publisher":{"@id":"https:\/\/dailyai.com\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/dailyai.com\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"nb-NO"},{"@type":"Organization","@id":"https:\/\/dailyai.com\/#organization","name":"DagligAI","url":"https:\/\/dailyai.com\/","logo":{"@type":"ImageObject","inLanguage":"nb-NO","@id":"https:\/\/dailyai.com\/#\/schema\/logo\/image\/","url":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/06\/Daily-Ai_TL_colour.png","contentUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/06\/Daily-Ai_TL_colour.png","width":4501,"height":934,"caption":"DailyAI"},"image":{"@id":"https:\/\/dailyai.com\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/x.com\/DailyAIOfficial","https:\/\/www.linkedin.com\/company\/dailyaiofficial\/","https:\/\/www.youtube.com\/@DailyAIOfficial"]},{"@type":"Person","@id":"https:\/\/dailyai.com\/#\/schema\/person\/7ce525c6d0c79838b7cc7cde96993cfa","name":"Eugene van der Watt","image":{"@type":"ImageObject","inLanguage":"nb-NO","@id":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/07\/Eugine_Profile_Picture-96x96.png","url":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/07\/Eugine_Profile_Picture-96x96.png","contentUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/07\/Eugine_Profile_Picture-96x96.png","caption":"Eugene van der Watt"},"description":"Eugene har bakgrunn som elektroingeni\u00f8r og elsker alt som har med teknologi \u00e5 gj\u00f8re. 
When he takes a break from consuming AI news you'll find him at the snooker table.","sameAs":["www.linkedin.com\/in\/eugene-van-der-watt-16828119"],"url":"https:\/\/dailyai.com\/nb\/author\/eugene\/"}]}},"_links":{"self":[{"href":"https:\/\/dailyai.com\/nb\/wp-json\/wp\/v2\/posts\/8006","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/dailyai.com\/nb\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/dailyai.com\/nb\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/dailyai.com\/nb\/wp-json\/wp\/v2\/users\/6"}],"replies":[{"embeddable":true,"href":"https:\/\/dailyai.com\/nb\/wp-json\/wp\/v2\/comments?post=8006"}],"version-history":[{"count":4,"href":"https:\/\/dailyai.com\/nb\/wp-json\/wp\/v2\/posts\/8006\/revisions"}],"predecessor-version":[{"id":8021,"href":"https:\/\/dailyai.com\/nb\/wp-json\/wp\/v2\/posts\/8006\/revisions\/8021"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/dailyai.com\/nb\/wp-json\/wp\/v2\/media\/8009"}],"wp:attachment":[{"href":"https:\/\/dailyai.com\/nb\/wp-json\/wp\/v2\/media?parent=8006"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/dailyai.com\/nb\/wp-json\/wp\/v2\/categories?post=8006"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/dailyai.com\/nb\/wp-json\/wp\/v2\/tags?post=8006"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}