{"id":9706,"date":"2024-02-02T20:21:45","date_gmt":"2024-02-02T20:21:45","guid":{"rendered":"https:\/\/dailyai.com\/?p=9706"},"modified":"2024-02-07T14:54:58","modified_gmt":"2024-02-07T14:54:58","slug":"new-york-university-researchers-build-ai-that-sees-through-a-childs-eyes","status":"publish","type":"post","link":"https:\/\/dailyai.com\/nb\/2024\/02\/new-york-university-researchers-build-ai-that-sees-through-a-childs-eyes\/","title":{"rendered":"Forskere ved New York University bygger kunstig intelligens som ser gjennom et barns \u00f8yne"},"content":{"rendered":"<p><b>Forskere fra New York University lot seg inspirere av barns l\u00e6ringsprosesser for \u00e5 trene opp et AI-system.\u00a0<\/b><\/p>\n<p><span style=\"font-weight: 400;\">Metoden, som er beskrevet i <\/span><a href=\"https:\/\/www.science.org\/doi\/10.1126\/science.adi1374\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">tidsskriftet Science<\/span><\/a><span style=\"font-weight: 400;\">gj\u00f8r det mulig for AI \u00e5 l\u00e6re av omgivelsene uten \u00e5 v\u00e6re avhengig av merkede data, noe som er n\u00f8kkelen til studiens design. <\/span><\/p>\n<p><span style=\"font-weight: 400;\">Det gjenspeiler hvordan barn l\u00e6rer ved \u00e5 absorbere store mengder informasjon fra omgivelsene og gradvis skape mening i verden rundt seg.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Teamet skapte et datasett med 60 timer med f\u00f8rstepersons videoopptak fra et hodemontert kamera som ble b\u00e5ret av barn i alderen seks m\u00e5neder til to \u00e5r, for \u00e5 gjenskape et barns perspektiv i AI-modellen.\u00a0<\/span><\/p>\n<blockquote class=\"twitter-tweet\">\n<p dir=\"ltr\" lang=\"en\">1\/ I dagens utgave av Science trener vi opp et nevralt nett fra bunnen av gjennom \u00f8ynene og \u00f8rene til ett barn. 
The model learns to link words to their visual referents, demonstrating that it is possible to learn language from just one child's perspective with today's AI tools. <a href=\"https:\/\/t.co\/hPZiiQt6Vv\">https:\/\/t.co\/hPZiiQt6Vv<\/a> <a href=\"https:\/\/t.co\/wa8jfn9b5Z\">pic.twitter.com\/wa8jfn9b5Z<\/a><\/p>\n<p>- Wai Keen Vong (@wkvong) <a href=\"https:\/\/twitter.com\/wkvong\/status\/1753132293491708027?ref_src=twsrc%5Etfw\">February 1, 2024<\/a><\/p><\/blockquote>\n<p><script src=\"https:\/\/platform.twitter.com\/widgets.js\" async=\"\" charset=\"utf-8\"><\/script><\/p>\n<p><span style=\"font-weight: 400;\">The researchers then trained a self-supervised learning (SSL) AI model on the video dataset to see whether the AI could grasp the concept of actions and changes by analyzing temporal, or time-related, information in the videos, as children do. <\/span><\/p>\n<p><span style=\"font-weight: 400;\">SSL approaches enable AI models to learn patterns and structures in data without explicit labels.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Study author Emin Orhan, who <\/span><a href=\"https:\/\/sites.google.com\/view\/eminorhan\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">writes on his research blog<\/span><\/a><span style=\"font-weight: 400;\">, has previously argued for a greater focus on SSL in AI research, which he considers essential for understanding complex learning processes.\u00a0<\/span><\/p>\n<p>Orhan wrote: \"It is often said that children learn the meanings of words very efficiently. For example, it is claimed that children in their second year of life learn, on average, a few words a day. 
This suggests that they are likely able to learn most of their words from just a handful of exposures (perhaps often from just a single exposure), a phenomenon also known as \"fast mapping\".\"<\/p>\n<blockquote class=\"twitter-tweet\">\n<p dir=\"ltr\" lang=\"en\">4\/ To test this, what better way than to train a neural network not on massive amounts of data from the web, but only on the input a single child receives? What would it learn then, if anything? <a href=\"https:\/\/t.co\/bQ9aVbXUlB\">pic.twitter.com\/bQ9aVbXUlB<\/a><\/p>\n<p>- Wai Keen Vong (@wkvong) <a href=\"https:\/\/twitter.com\/wkvong\/status\/1753132300802445542?ref_src=twsrc%5Etfw\">February 1, 2024<\/a><\/p><\/blockquote>\n<p><script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/p>\n<p><span style=\"font-weight: 400;\">The study also aimed to determine whether AI needs built-in biases or \"shortcuts\" to learn effectively, or whether it can develop an understanding of the world through general learning algorithms, much as a child does.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The results were intriguing. Despite the video covering only about 1% of the child's waking hours, the AI system was able to learn numerous words and concepts, demonstrating how effective it is to learn from limited but targeted data.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The findings include:<\/span><\/p>\n<ul>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Action recognition performance<\/b><span style=\"font-weight: 400;\">: The AI models trained on the SAYCam dataset were highly effective at recognizing actions from video. 
When tested on fine-grained action recognition tasks such as Kinetics-700 and Something-Something-V2 (SSV2), the models showed impressive performance, even with only a small number of labeled training examples.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Comparison with the Kinetics-700 dataset<\/b><span style=\"font-weight: 400;\">: The SAYCam-trained models were compared with models trained on Kinetics-700, a diverse dataset of short YouTube clips. The SAYCam models performed competitively, suggesting that the child-centered, developmentally realistic video data provided a rich learning environment for the AI, on par with or even better than the varied YouTube content.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Video interpolation skills<\/b><span style=\"font-weight: 400;\">: An interesting result was the models' ability to perform video interpolation - predicting missing segments in a video sequence. This demonstrated an understanding of temporal dynamics and continuity in visual scenes, mirroring the way humans perceive and anticipate actions.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Robust object representations<\/b><span style=\"font-weight: 400;\">: The study also showed that video-trained models developed more robust object representations than models trained on static images. This was evident in tasks requiring object recognition under varying conditions, underscoring the value of temporal information for learning more robust and versatile models.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Data scaling and model performance<\/b><span style=\"font-weight: 400;\">: The research examined how the models' performance improved with more video data from the SAYCam dataset. 
This suggests that access to more extensive, realistic data would further boost model performance.<\/span><\/li>\n<\/ul>\n<blockquote class=\"twitter-tweet\">\n<p dir=\"ltr\" lang=\"en\">6\/ Results: Even with limited data, we found that the model can acquire word-referent mappings from just tens to hundreds of examples, generalize zero-shot to new visual datasets, and achieve multimodal alignment. Once again, genuine language learning is possible from a child's ... <a href=\"https:\/\/t.co\/FCHfZCqftr\">pic.twitter.com\/FCHfZCqftr<\/a><\/p>\n<p>- Wai Keen Vong (@wkvong) <a href=\"https:\/\/twitter.com\/wkvong\/status\/1753132306682753462?ref_src=twsrc%5Etfw\">February 1, 2024<\/a><\/p><\/blockquote>\n<p><script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/p>\n<p><span style=\"font-weight: 400;\">Wai Keen Vong, a researcher at NYU's Center for Data Science, <\/span><a href=\"https:\/\/www.nyu.edu\/about\/news-publications\/news\/2024\/february\/ai-learns-through-the-eyes-and-ears-of-a-child.html\"><span style=\"font-weight: 400;\">discussed the novelty of this approach<\/span><\/a><span style=\"font-weight: 400;\">: \"We show, for the first time, that a neural network trained on this developmentally realistic input from a single child can learn to link words to their visual counterparts.\"\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Vong said the following about the problems facing modern generative AI models: \"Today's state-of-the-art AI systems are trained using astronomical amounts of data (often billions\/trillions of words), whereas humans manage to learn and use language with far less data (hundreds of millions of words), so the connection between these advances in machine learning and human language acquisition is not clear.\"<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Interest in new, \"lightweight\" machine learning approaches is growing. For one thing, colossal, monolithic models such as GPT-3 and GPT-4 <\/span><a href=\"https:\/\/dailyai.com\/nb\/2024\/02\/data-center-energy-demands-soar-because-of-ai-how-do-we-sustain-it\/\"><span style=\"font-weight: 400;\">have enormous power demands<\/span><\/a><span style=\"font-weight: 400;\"> that are not easily met.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">For another, creating <\/span><a href=\"https:\/\/dailyai.com\/nb\/2023\/08\/the-evolution-of-bio-inspired-ai-developments-and-future-direction\/\"><span style=\"font-weight: 400;\">bio-inspired AI systems<\/span><\/a><span style=\"font-weight: 400;\"> is key to designing models or robots that \"think\" and \"behave\" like us.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Vong also acknowledged the study's limitations, noting: \"One caveat is that the language input to the model is text, not the underlying speech signal that children receive.\"<\/span><\/p>\n<p><span style=\"font-weight: 400;\">This study challenged traditional AI training paradigms and contributed to the ongoing discourse on the most effective ways to mimic biological learning. <\/span><\/p>\n<p><span style=\"font-weight: 400;\">Interest in this topic will only grow as today's colossal AI models begin to show their limitations.\u00a0<\/span><\/p>","protected":false},"excerpt":{"rendered":"<p>Researchers from New York University took inspiration from children's learning processes to train an AI system.  The method, described in the journal Science, allows artificial intelligence to learn from its surroundings without relying on labeled data, which is key to the study's design. 
This mirrors how children learn by absorbing large amounts of information from their environment and gradually making sense of the world around them. To recreate a child's perspective in the AI model, the team created a dataset of 60 hours of first-person video footage from a head-mounted camera worn by children aged six months to two years.  1\/<\/p>","protected":false},"author":2,"featured_media":9707,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[84],"tags":[350,105,510,331],"class_list":["post-9706","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-industry","tag-bio-inspired-ai","tag-machine-learning","tag-neuromorphic-computing","tag-neuroscience"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.4 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>New York University researchers build AI that sees through a child&#039;s eyes | DailyAI<\/title>\n<meta name=\"description\" content=\"Researchers from New York University took inspiration from children&#039;s learning processes to train an AI system.\u00a0\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/dailyai.com\/nb\/2024\/02\/new-york-university-researchers-build-ai-that-sees-through-a-childs-eyes\/\" \/>\n<meta property=\"og:locale\" content=\"nb_NO\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"New York University researchers build AI that sees through a child&#039;s eyes | DailyAI\" \/>\n<meta property=\"og:description\" content=\"Researchers from New York University took inspiration from children&#039;s learning processes to train an AI system.\u00a0\" \/>\n<meta property=\"og:url\" 
content=\"https:\/\/dailyai.com\/nb\/2024\/02\/new-york-university-researchers-build-ai-that-sees-through-a-childs-eyes\/\" \/>\n<meta property=\"og:site_name\" content=\"DailyAI\" \/>\n<meta property=\"article:published_time\" content=\"2024-02-02T20:21:45+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2024-02-07T14:54:58+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/dailyai.com\/wp-content\/uploads\/2024\/02\/shutterstock_2116216982.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"1000\" \/>\n\t<meta property=\"og:image:height\" content=\"667\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"Sam Jeans\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@DailyAIOfficial\" \/>\n<meta name=\"twitter:site\" content=\"@DailyAIOfficial\" \/>\n<meta name=\"twitter:label1\" content=\"Skrevet av\" \/>\n\t<meta name=\"twitter:data1\" content=\"Sam Jeans\" \/>\n\t<meta name=\"twitter:label2\" content=\"Ansl. 
lesetid\" \/>\n\t<meta name=\"twitter:data2\" content=\"4 minutter\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"NewsArticle\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/02\\\/new-york-university-researchers-build-ai-that-sees-through-a-childs-eyes\\\/#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/02\\\/new-york-university-researchers-build-ai-that-sees-through-a-childs-eyes\\\/\"},\"author\":{\"name\":\"Sam Jeans\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/person\\\/711e81f945549438e8bbc579efdeb3c9\"},\"headline\":\"New York University researchers build AI that sees through a child&#8217;s eyes\",\"datePublished\":\"2024-02-02T20:21:45+00:00\",\"dateModified\":\"2024-02-07T14:54:58+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/02\\\/new-york-university-researchers-build-ai-that-sees-through-a-childs-eyes\\\/\"},\"wordCount\":944,\"publisher\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#organization\"},\"image\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/02\\\/new-york-university-researchers-build-ai-that-sees-through-a-childs-eyes\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2024\\\/02\\\/shutterstock_2116216982.jpg\",\"keywords\":[\"Bio-inspired AI\",\"machine learning\",\"Neuromorphic computing\",\"Neuroscience\"],\"articleSection\":[\"Industry\"],\"inLanguage\":\"nb-NO\"},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/02\\\/new-york-university-researchers-build-ai-that-sees-through-a-childs-eyes\\\/\",\"url\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/02\\\/new-york-university-researchers-build-ai-that-sees-through-a-childs-eyes\\\/\",\"name\":\"New York University researchers build AI that sees through a child's eyes | 
DailyAI\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/02\\\/new-york-university-researchers-build-ai-that-sees-through-a-childs-eyes\\\/#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/02\\\/new-york-university-researchers-build-ai-that-sees-through-a-childs-eyes\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2024\\\/02\\\/shutterstock_2116216982.jpg\",\"datePublished\":\"2024-02-02T20:21:45+00:00\",\"dateModified\":\"2024-02-07T14:54:58+00:00\",\"description\":\"Researchers from New York University took inspiration from children's learning processes to train an AI system.\u00a0\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/02\\\/new-york-university-researchers-build-ai-that-sees-through-a-childs-eyes\\\/#breadcrumb\"},\"inLanguage\":\"nb-NO\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/dailyai.com\\\/2024\\\/02\\\/new-york-university-researchers-build-ai-that-sees-through-a-childs-eyes\\\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"nb-NO\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/02\\\/new-york-university-researchers-build-ai-that-sees-through-a-childs-eyes\\\/#primaryimage\",\"url\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2024\\\/02\\\/shutterstock_2116216982.jpg\",\"contentUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2024\\\/02\\\/shutterstock_2116216982.jpg\",\"width\":1000,\"height\":667,\"caption\":\"Child eyes AI\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/02\\\/new-york-university-researchers-build-ai-that-sees-through-a-childs-eyes\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\\\/\\\/dailyai.com\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"New York University researchers build AI that 
sees through a child&#8217;s eyes\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#website\",\"url\":\"https:\\\/\\\/dailyai.com\\\/\",\"name\":\"DailyAI\",\"description\":\"Your Daily Dose of AI News\",\"publisher\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/dailyai.com\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"nb-NO\"},{\"@type\":\"Organization\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#organization\",\"name\":\"DailyAI\",\"url\":\"https:\\\/\\\/dailyai.com\\\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"nb-NO\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/logo\\\/image\\\/\",\"url\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/06\\\/Daily-Ai_TL_colour.png\",\"contentUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/06\\\/Daily-Ai_TL_colour.png\",\"width\":4501,\"height\":934,\"caption\":\"DailyAI\"},\"image\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/logo\\\/image\\\/\"},\"sameAs\":[\"https:\\\/\\\/x.com\\\/DailyAIOfficial\",\"https:\\\/\\\/www.linkedin.com\\\/company\\\/dailyaiofficial\\\/\",\"https:\\\/\\\/www.youtube.com\\\/@DailyAIOfficial\"]},{\"@type\":\"Person\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/person\\\/711e81f945549438e8bbc579efdeb3c9\",\"name\":\"Sam 
Jeans\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"nb-NO\",\"@id\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/a24a4a8f8e2a1a275b7491dc9c9f032c401eabf23c3206da4628dc84b6dac5c8?s=96&d=robohash&r=g\",\"url\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/a24a4a8f8e2a1a275b7491dc9c9f032c401eabf23c3206da4628dc84b6dac5c8?s=96&d=robohash&r=g\",\"contentUrl\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/a24a4a8f8e2a1a275b7491dc9c9f032c401eabf23c3206da4628dc84b6dac5c8?s=96&d=robohash&r=g\",\"caption\":\"Sam Jeans\"},\"description\":\"Sam is a science and technology writer who has worked in various AI startups. When he\u2019s not writing, he can be found reading medical journals or digging through boxes of vinyl records.\",\"sameAs\":[\"https:\\\/\\\/www.linkedin.com\\\/in\\\/sam-jeans-6746b9142\\\/\"],\"url\":\"https:\\\/\\\/dailyai.com\\\/nb\\\/author\\\/samjeans\\\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Forskere ved New York University bygger AI som ser gjennom et barns \u00f8yne | DailyAI","description":"Forskere fra New York University lot seg inspirere av barns l\u00e6ringsprosesser for \u00e5 trene opp et AI-system.\u00a0","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/dailyai.com\/nb\/2024\/02\/new-york-university-researchers-build-ai-that-sees-through-a-childs-eyes\/","og_locale":"nb_NO","og_type":"article","og_title":"New York University researchers build AI that sees through a child's eyes | DailyAI","og_description":"Researchers from New York University took inspiration from children's learning processes to train an AI 
system.\u00a0","og_url":"https:\/\/dailyai.com\/nb\/2024\/02\/new-york-university-researchers-build-ai-that-sees-through-a-childs-eyes\/","og_site_name":"DailyAI","article_published_time":"2024-02-02T20:21:45+00:00","article_modified_time":"2024-02-07T14:54:58+00:00","og_image":[{"width":1000,"height":667,"url":"https:\/\/dailyai.com\/wp-content\/uploads\/2024\/02\/shutterstock_2116216982.jpg","type":"image\/jpeg"}],"author":"Sam Jeans","twitter_card":"summary_large_image","twitter_creator":"@DailyAIOfficial","twitter_site":"@DailyAIOfficial","twitter_misc":{"Skrevet av":"Sam Jeans","Ansl. lesetid":"4 minutter"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"NewsArticle","@id":"https:\/\/dailyai.com\/2024\/02\/new-york-university-researchers-build-ai-that-sees-through-a-childs-eyes\/#article","isPartOf":{"@id":"https:\/\/dailyai.com\/2024\/02\/new-york-university-researchers-build-ai-that-sees-through-a-childs-eyes\/"},"author":{"name":"Sam Jeans","@id":"https:\/\/dailyai.com\/#\/schema\/person\/711e81f945549438e8bbc579efdeb3c9"},"headline":"New York University researchers build AI that sees through a child&#8217;s eyes","datePublished":"2024-02-02T20:21:45+00:00","dateModified":"2024-02-07T14:54:58+00:00","mainEntityOfPage":{"@id":"https:\/\/dailyai.com\/2024\/02\/new-york-university-researchers-build-ai-that-sees-through-a-childs-eyes\/"},"wordCount":944,"publisher":{"@id":"https:\/\/dailyai.com\/#organization"},"image":{"@id":"https:\/\/dailyai.com\/2024\/02\/new-york-university-researchers-build-ai-that-sees-through-a-childs-eyes\/#primaryimage"},"thumbnailUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2024\/02\/shutterstock_2116216982.jpg","keywords":["Bio-inspired AI","machine learning","Neuromorphic 
computing","Neuroscience"],"articleSection":["Industry"],"inLanguage":"nb-NO"},{"@type":"WebPage","@id":"https:\/\/dailyai.com\/2024\/02\/new-york-university-researchers-build-ai-that-sees-through-a-childs-eyes\/","url":"https:\/\/dailyai.com\/2024\/02\/new-york-university-researchers-build-ai-that-sees-through-a-childs-eyes\/","name":"Forskere ved New York University bygger AI som ser gjennom et barns \u00f8yne | DailyAI","isPartOf":{"@id":"https:\/\/dailyai.com\/#website"},"primaryImageOfPage":{"@id":"https:\/\/dailyai.com\/2024\/02\/new-york-university-researchers-build-ai-that-sees-through-a-childs-eyes\/#primaryimage"},"image":{"@id":"https:\/\/dailyai.com\/2024\/02\/new-york-university-researchers-build-ai-that-sees-through-a-childs-eyes\/#primaryimage"},"thumbnailUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2024\/02\/shutterstock_2116216982.jpg","datePublished":"2024-02-02T20:21:45+00:00","dateModified":"2024-02-07T14:54:58+00:00","description":"Forskere fra New York University lot seg inspirere av barns l\u00e6ringsprosesser for \u00e5 trene opp et AI-system.\u00a0","breadcrumb":{"@id":"https:\/\/dailyai.com\/2024\/02\/new-york-university-researchers-build-ai-that-sees-through-a-childs-eyes\/#breadcrumb"},"inLanguage":"nb-NO","potentialAction":[{"@type":"ReadAction","target":["https:\/\/dailyai.com\/2024\/02\/new-york-university-researchers-build-ai-that-sees-through-a-childs-eyes\/"]}]},{"@type":"ImageObject","inLanguage":"nb-NO","@id":"https:\/\/dailyai.com\/2024\/02\/new-york-university-researchers-build-ai-that-sees-through-a-childs-eyes\/#primaryimage","url":"https:\/\/dailyai.com\/wp-content\/uploads\/2024\/02\/shutterstock_2116216982.jpg","contentUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2024\/02\/shutterstock_2116216982.jpg","width":1000,"height":667,"caption":"Child eyes 
AI"},{"@type":"BreadcrumbList","@id":"https:\/\/dailyai.com\/2024\/02\/new-york-university-researchers-build-ai-that-sees-through-a-childs-eyes\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/dailyai.com\/"},{"@type":"ListItem","position":2,"name":"New York University researchers build AI that sees through a child&#8217;s eyes"}]},{"@type":"WebSite","@id":"https:\/\/dailyai.com\/#website","url":"https:\/\/dailyai.com\/","name":"DagligAI","description":"Din daglige dose med AI-nyheter","publisher":{"@id":"https:\/\/dailyai.com\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/dailyai.com\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"nb-NO"},{"@type":"Organization","@id":"https:\/\/dailyai.com\/#organization","name":"DagligAI","url":"https:\/\/dailyai.com\/","logo":{"@type":"ImageObject","inLanguage":"nb-NO","@id":"https:\/\/dailyai.com\/#\/schema\/logo\/image\/","url":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/06\/Daily-Ai_TL_colour.png","contentUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/06\/Daily-Ai_TL_colour.png","width":4501,"height":934,"caption":"DailyAI"},"image":{"@id":"https:\/\/dailyai.com\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/x.com\/DailyAIOfficial","https:\/\/www.linkedin.com\/company\/dailyaiofficial\/","https:\/\/www.youtube.com\/@DailyAIOfficial"]},{"@type":"Person","@id":"https:\/\/dailyai.com\/#\/schema\/person\/711e81f945549438e8bbc579efdeb3c9","name":"Sam 
Jeans","image":{"@type":"ImageObject","inLanguage":"nb-NO","@id":"https:\/\/secure.gravatar.com\/avatar\/a24a4a8f8e2a1a275b7491dc9c9f032c401eabf23c3206da4628dc84b6dac5c8?s=96&d=robohash&r=g","url":"https:\/\/secure.gravatar.com\/avatar\/a24a4a8f8e2a1a275b7491dc9c9f032c401eabf23c3206da4628dc84b6dac5c8?s=96&d=robohash&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/a24a4a8f8e2a1a275b7491dc9c9f032c401eabf23c3206da4628dc84b6dac5c8?s=96&d=robohash&r=g","caption":"Sam Jeans"},"description":"Sam er en vitenskaps- og teknologiskribent som har jobbet i ulike oppstartsbedrifter innen kunstig intelligens. N\u00e5r han ikke skriver, leser han medisinske tidsskrifter eller graver seg gjennom esker med vinylplater.","sameAs":["https:\/\/www.linkedin.com\/in\/sam-jeans-6746b9142\/"],"url":"https:\/\/dailyai.com\/nb\/author\/samjeans\/"}]}},"_links":{"self":[{"href":"https:\/\/dailyai.com\/nb\/wp-json\/wp\/v2\/posts\/9706","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/dailyai.com\/nb\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/dailyai.com\/nb\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/dailyai.com\/nb\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/dailyai.com\/nb\/wp-json\/wp\/v2\/comments?post=9706"}],"version-history":[{"count":7,"href":"https:\/\/dailyai.com\/nb\/wp-json\/wp\/v2\/posts\/9706\/revisions"}],"predecessor-version":[{"id":9793,"href":"https:\/\/dailyai.com\/nb\/wp-json\/wp\/v2\/posts\/9706\/revisions\/9793"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/dailyai.com\/nb\/wp-json\/wp\/v2\/media\/9707"}],"wp:attachment":[{"href":"https:\/\/dailyai.com\/nb\/wp-json\/wp\/v2\/media?parent=9706"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/dailyai.com\/nb\/wp-json\/wp\/v2\/categories?post=9706"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/dailyai.com\/nb\/wp-json\/wp\/v2\/tags?post=9706"}],"curies":[{"name":"wp","
href":"https:\/\/api.w.org\/{rel}","templated":true}]}}