{"id":6254,"date":"2023-10-09T07:07:00","date_gmt":"2023-10-09T07:07:00","guid":{"rendered":"https:\/\/dailyai.com\/?p=6254"},"modified":"2023-10-09T07:07:00","modified_gmt":"2023-10-09T07:07:00","slug":"ai-decodes-speech-from-non-invasive-brain-recordings","status":"publish","type":"post","link":"https:\/\/dailyai.com\/da\/2023\/10\/ai-decodes-speech-from-non-invasive-brain-recordings\/","title":{"rendered":"AI afkoder tale fra ikke-invasive hjerneoptagelser"},"content":{"rendered":"<p><strong>Pr\u00e6cis hvordan vores hjerner bearbejder og formulerer sprog er stadig et stort mysterium. Forskere ved Meta AI har fundet en ny m\u00e5de at m\u00e5le hjerneb\u00f8lger p\u00e5 og afkode de ord, der er forbundet med dem.<\/strong><\/p>\n<p>Mennesker, der har st\u00e6rkt begr\u00e6nsede motoriske f\u00e6rdigheder, som ALS-ramte, finder det s\u00e6rligt udfordrende at kommunikere. Det er sv\u00e6rt at forestille sig frustrationen hos en person som Stephen Hawking, der m\u00f8jsommeligt konstruerer en s\u00e6tning med \u00f8jenbev\u00e6gelser eller rykker i en kindmuskel.<\/p>\n<p>Der er lavet en masse forskning for at <a href=\"https:\/\/dailyai.com\/da\/2023\/08\/unbabels-ai-tool-makes-telepathic-communication-possible\/\">afkode tale ud fra hjerneaktivitet<\/a>, men de bedste resultater afh\u00e6nger af invasiv <a href=\"https:\/\/dailyai.com\/da\/2023\/08\/ai-replenishes-speech-and-facial-expressions-of-stroke-survivor\/\">hjerne-computer-implantater.<\/a><\/p>\n<p>Meta AI-forskere brugte magneto-encefalografi (MEG) og elektroencefalografi (EEG) til at registrere hjerneb\u00f8lgerne hos 175 frivillige, mens de lyttede til korte historier og isolerede s\u00e6tninger.<\/p>\n<p>De brugte en forudtr\u00e6net talemodel og kontrastiv l\u00e6ring til at identificere, hvilke hjerneb\u00f8lgem\u00f8nstre der var forbundet med specifikke ord, som fors\u00f8gspersonerne lyttede til.<\/p>\n<blockquote class=\"twitter-tweet\">\n<p dir=\"ltr\" lang=\"en\" style=\"text-align: 
center;\">\"Afkodning af taleopfattelse fra ikke-invasive hjerneoptagelser\",<br \/>\nledet af den eneste ene <a href=\"https:\/\/twitter.com\/honualx?ref_src=twsrc%5Etfw\">@honualx<\/a><br \/>\ner netop udkommet i det seneste nummer af Nature Machine Intelligence:<\/p>\n<p>- papir med \u00e5ben adgang: <a href=\"https:\/\/t.co\/1jtpTezQzM\">https:\/\/t.co\/1jtpTezQzM<\/a><br \/>\n- fuld tr\u00e6ningskode: <a href=\"https:\/\/t.co\/Al2alBxeUC\">https:\/\/t.co\/Al2alBxeUC<\/a> <a href=\"https:\/\/t.co\/imLxRjRQ6h\">pic.twitter.com\/imLxRjRQ6h<\/a><\/p>\n<p style=\"text-align: center;\">- Jean-R\u00e9mi King (@JeanRemiKing) <a href=\"https:\/\/twitter.com\/JeanRemiKing\/status\/1709957806030311798?ref_src=twsrc%5Etfw\">5. oktober 2023<\/a><\/p>\n<\/blockquote>\n<p><script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/p>\n<p>Forskerne delte lyden op i segmenter p\u00e5 3 sekunder og testede derefter deres model for at se, om den kunne identificere korrekt, hvilket af de 1.500 segmenter den frivillige lyttede til. 
The model predicted a kind of word cloud, with the most likely word weighted highest.<\/p>\n<p>They achieved 41% accuracy on average, and 95.9% accuracy with their best participants.<\/p>\n<figure id=\"attachment_6256\" aria-describedby=\"caption-attachment-6256\" style=\"width: 1024px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" class=\"wp-image-6256 size-large\" src=\"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Speech-predictions-from-brain-waves-1024x578.webp\" alt=\"Speech predictions from brain waves\" width=\"1024\" height=\"578\" srcset=\"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Speech-predictions-from-brain-waves-1024x578.webp 1024w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Speech-predictions-from-brain-waves-300x169.webp 300w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Speech-predictions-from-brain-waves-768x434.webp 768w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Speech-predictions-from-brain-waves-1536x867.webp 1536w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Speech-predictions-from-brain-waves-2048x1156.webp 2048w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Speech-predictions-from-brain-waves-370x209.webp 370w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Speech-predictions-from-brain-waves-800x452.webp 800w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Speech-predictions-from-brain-waves-20x11.webp 20w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Speech-predictions-from-brain-waves-740x418.webp 740w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Speech-predictions-from-brain-waves-1600x903.webp 1600w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Speech-predictions-from-brain-waves-1320x745.webp 1320w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Speech-predictions-from-brain-waves-85x48.webp 85w\" sizes=\"auto, (max-width: 1024px) 100vw, 
1024px\" \/><figcaption id=\"caption-attachment-6256\" class=\"wp-caption-text\">Forudsigelser p\u00e5 ordniveau, mens deltagerne lyttede til s\u00e6tningen \"Tak, fordi du kom, Ed\". Bl\u00e5 ord svarer til det korrekte ord, og sorte ord svarer til negative kandidater. Tekstst\u00f8rrelsen er proportional med modellens log-sandsynlighedsoutput. Kilde: <a href=\"https:\/\/www.nature.com\/articles\/s42256-023-00714-5\/figures\/3\" target=\"_blank\" rel=\"noopener\">Naturen<\/a><\/figcaption><\/figure>\n<p>Forskningen viser, at det er muligt at f\u00e5 en ret god id\u00e9 om, hvilken tale en person h\u00f8rer, men nu skal processen vendes om for at v\u00e6re brugbar. Vi er n\u00f8dt til at m\u00e5le deres hjerneb\u00f8lger og vide, hvilket ord de t\u00e6nker p\u00e5.<\/p>\n<p>Artiklen foresl\u00e5r, at man tr\u00e6ner et neuralt netv\u00e6rk, mens fors\u00f8gspersoner producerer ord ved at tale eller skrive. Den generelle model kunne s\u00e5 bruges til at forst\u00e5 hjerneb\u00f8lger og de tilh\u00f8rende ord, som en ALS-ramt t\u00e6nkte p\u00e5.<\/p>\n<p>Forskerne var i stand til at identificere talesegmenter fra et begr\u00e6nset, forudbestemt s\u00e6t. For at kunne kommunikere ordentligt skal man kunne identificere mange flere ord. At bruge en generativ AI til at forudsige det n\u00e6ste mest sandsynlige ord, som en person fors\u00f8ger at sige, kan hj\u00e6lpe med det.<\/p>\n<p>Selv om processen var ikke-invasiv, kr\u00e6ver den stadig, at man er tilsluttet en <a href=\"https:\/\/dailyai.com\/da\/2023\/08\/ai-mind-reading-medical-breakthrough-or-step-towards-dystopia\/\">MEG-enhed<\/a>. Desv\u00e6rre var resultaterne fra EEG-m\u00e5lingerne ikke gode.<\/p>\n<p>Forskningen viser, at AI i sidste ende kan bruges til at hj\u00e6lpe stemmel\u00f8se som ALS-ramte med at kommunikere. 
Using a pretrained model also avoided the need for more painstaking word-by-word training.<\/p>\n<p>Meta AI has made the model and the data public, so other researchers will hopefully build on their work.<\/p>","protected":false},"excerpt":{"rendered":"<p>Exactly how our brains process and formulate language is still largely a mystery. Researchers at Meta AI have found a new way to measure brain waves and decode the words associated with them. People with severely limited motor skills, like ALS sufferers, find it particularly challenging to communicate. The frustration of someone like Stephen Hawking painstakingly constructing a sentence with eye movements or the twitch of a cheek muscle is hard to imagine. A great deal of research has gone into decoding speech from brain activity, but the best results depend on invasive brain-computer implants. 
Meta AI researchers used magneto-encephalography (MEG)<\/p>","protected":false},"author":6,"featured_media":6257,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[84],"tags":[150,105,101,131],"class_list":["post-6254","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-industry","tag-ai-benefits","tag-machine-learning","tag-medtech","tag-meta"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.4 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>AI decodes speech from non-invasive brain recordings | DailyAI<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/dailyai.com\/da\/2023\/10\/ai-decodes-speech-from-non-invasive-brain-recordings\/\" \/>\n<meta property=\"og:locale\" content=\"da_DK\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"AI decodes speech from non-invasive brain recordings | DailyAI\" \/>\n<meta property=\"og:description\" content=\"Exactly how our brains process and formulate language is still largely a mystery. Researchers at Meta AI found a new way to measure brain waves and decode the words associated with them. People who have severely limited motor skills, like ALS sufferers, find it particularly challenging to communicate. The frustration of a person like Stephen Hawking painstakingly constructing a sentence with eye movements or twitching a cheek muscle is hard to imagine. A lot of research has been done to decode speech from brain activity, but the best results depend on invasive brain-computer implants. 
Meta AI researchers used magneto-encephalography (MEG)\" \/>\n<meta property=\"og:url\" content=\"https:\/\/dailyai.com\/da\/2023\/10\/ai-decodes-speech-from-non-invasive-brain-recordings\/\" \/>\n<meta property=\"og:site_name\" content=\"DailyAI\" \/>\n<meta property=\"article:published_time\" content=\"2023-10-09T07:07:00+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Brain-waves-to-speech.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"1000\" \/>\n\t<meta property=\"og:image:height\" content=\"864\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"Eugene van der Watt\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@DailyAIOfficial\" \/>\n<meta name=\"twitter:site\" content=\"@DailyAIOfficial\" \/>\n<meta name=\"twitter:label1\" content=\"Skrevet af\" \/>\n\t<meta name=\"twitter:data1\" content=\"Eugene van der Watt\" \/>\n\t<meta name=\"twitter:label2\" content=\"Estimeret l\u00e6setid\" \/>\n\t<meta name=\"twitter:data2\" content=\"3 minutter\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"NewsArticle\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2023\\\/10\\\/ai-decodes-speech-from-non-invasive-brain-recordings\\\/#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2023\\\/10\\\/ai-decodes-speech-from-non-invasive-brain-recordings\\\/\"},\"author\":{\"name\":\"Eugene van der Watt\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/person\\\/7ce525c6d0c79838b7cc7cde96993cfa\"},\"headline\":\"AI decodes speech from non-invasive brain 
recordings\",\"datePublished\":\"2023-10-09T07:07:00+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2023\\\/10\\\/ai-decodes-speech-from-non-invasive-brain-recordings\\\/\"},\"wordCount\":519,\"publisher\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#organization\"},\"image\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2023\\\/10\\\/ai-decodes-speech-from-non-invasive-brain-recordings\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/10\\\/Brain-waves-to-speech.jpg\",\"keywords\":[\"AI benefits\",\"machine learning\",\"MedTech\",\"Meta\"],\"articleSection\":[\"Industry\"],\"inLanguage\":\"da-DK\"},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2023\\\/10\\\/ai-decodes-speech-from-non-invasive-brain-recordings\\\/\",\"url\":\"https:\\\/\\\/dailyai.com\\\/2023\\\/10\\\/ai-decodes-speech-from-non-invasive-brain-recordings\\\/\",\"name\":\"AI decodes speech from non-invasive brain recordings | DailyAI\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2023\\\/10\\\/ai-decodes-speech-from-non-invasive-brain-recordings\\\/#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2023\\\/10\\\/ai-decodes-speech-from-non-invasive-brain-recordings\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/10\\\/Brain-waves-to-speech.jpg\",\"datePublished\":\"2023-10-09T07:07:00+00:00\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2023\\\/10\\\/ai-decodes-speech-from-non-invasive-brain-recordings\\\/#breadcrumb\"},\"inLanguage\":\"da-DK\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/dailyai.com\\\/2023\\\/10\\\/ai-decodes-speech-from-non-invasive-brain-recordings\\\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"da-DK\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2023\\\/10\\\/ai-decodes-speech-from-non-invasive-brain-recordings\\\
/#primaryimage\",\"url\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/10\\\/Brain-waves-to-speech.jpg\",\"contentUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/10\\\/Brain-waves-to-speech.jpg\",\"width\":1000,\"height\":864},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2023\\\/10\\\/ai-decodes-speech-from-non-invasive-brain-recordings\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\\\/\\\/dailyai.com\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"AI decodes speech from non-invasive brain recordings\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#website\",\"url\":\"https:\\\/\\\/dailyai.com\\\/\",\"name\":\"DailyAI\",\"description\":\"Your Daily Dose of AI News\",\"publisher\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/dailyai.com\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"da-DK\"},{\"@type\":\"Organization\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#organization\",\"name\":\"DailyAI\",\"url\":\"https:\\\/\\\/dailyai.com\\\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"da-DK\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/logo\\\/image\\\/\",\"url\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/06\\\/Daily-Ai_TL_colour.png\",\"contentUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/06\\\/Daily-Ai_TL_colour.png\",\"width\":4501,\"height\":934,\"caption\":\"DailyAI\"},\"image\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/logo\\\/image\\\/\"},\"sameAs\":[\"https:\\\/\\\/x.com\\\/DailyAIOfficial\",\"https:\\\/\\\/www.linkedin.com\\\/company\\\/dailyaiofficial\\\/\",\"https:\\\/\\\/www.youtube.co
m\\\/@DailyAIOfficial\"]},{\"@type\":\"Person\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/person\\\/7ce525c6d0c79838b7cc7cde96993cfa\",\"name\":\"Eugene van der Watt\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"da-DK\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/07\\\/Eugine_Profile_Picture-96x96.png\",\"url\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/07\\\/Eugine_Profile_Picture-96x96.png\",\"contentUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/07\\\/Eugine_Profile_Picture-96x96.png\",\"caption\":\"Eugene van der Watt\"},\"description\":\"Eugene comes from an electronic engineering background and loves all things tech. When he takes a break from consuming AI news you'll find him at the snooker table.\",\"sameAs\":[\"www.linkedin.com\\\/in\\\/eugene-van-der-watt-16828119\"],\"url\":\"https:\\\/\\\/dailyai.com\\\/da\\\/author\\\/eugene\\\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"AI afkoder tale fra ikke-invasive hjerneoptagelser | DailyAI","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/dailyai.com\/da\/2023\/10\/ai-decodes-speech-from-non-invasive-brain-recordings\/","og_locale":"da_DK","og_type":"article","og_title":"AI decodes speech from non-invasive brain recordings | DailyAI","og_description":"Exactly how our brains process and formulate language is still largely a mystery. Researchers at Meta AI found a new way to measure brain waves and decode the words associated with them. People who have severely limited motor skills, like ALS sufferers, find it particularly challenging to communicate. The frustration of a person like Stephen Hawking painstakingly constructing a sentence with eye movements or twitching a cheek muscle is hard to imagine. 
A lot of research has been done to decode speech from brain activity, but the best results depend on invasive brain-computer implants. Meta AI researchers used magneto-encephalography (MEG)","og_url":"https:\/\/dailyai.com\/da\/2023\/10\/ai-decodes-speech-from-non-invasive-brain-recordings\/","og_site_name":"DailyAI","article_published_time":"2023-10-09T07:07:00+00:00","og_image":[{"width":1000,"height":864,"url":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Brain-waves-to-speech.jpg","type":"image\/jpeg"}],"author":"Eugene van der Watt","twitter_card":"summary_large_image","twitter_creator":"@DailyAIOfficial","twitter_site":"@DailyAIOfficial","twitter_misc":{"Skrevet af":"Eugene van der Watt","Estimeret l\u00e6setid":"3 minutter"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"NewsArticle","@id":"https:\/\/dailyai.com\/2023\/10\/ai-decodes-speech-from-non-invasive-brain-recordings\/#article","isPartOf":{"@id":"https:\/\/dailyai.com\/2023\/10\/ai-decodes-speech-from-non-invasive-brain-recordings\/"},"author":{"name":"Eugene van der Watt","@id":"https:\/\/dailyai.com\/#\/schema\/person\/7ce525c6d0c79838b7cc7cde96993cfa"},"headline":"AI decodes speech from non-invasive brain recordings","datePublished":"2023-10-09T07:07:00+00:00","mainEntityOfPage":{"@id":"https:\/\/dailyai.com\/2023\/10\/ai-decodes-speech-from-non-invasive-brain-recordings\/"},"wordCount":519,"publisher":{"@id":"https:\/\/dailyai.com\/#organization"},"image":{"@id":"https:\/\/dailyai.com\/2023\/10\/ai-decodes-speech-from-non-invasive-brain-recordings\/#primaryimage"},"thumbnailUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Brain-waves-to-speech.jpg","keywords":["AI benefits","machine 
learning","MedTech","Meta"],"articleSection":["Industry"],"inLanguage":"da-DK"},{"@type":"WebPage","@id":"https:\/\/dailyai.com\/2023\/10\/ai-decodes-speech-from-non-invasive-brain-recordings\/","url":"https:\/\/dailyai.com\/2023\/10\/ai-decodes-speech-from-non-invasive-brain-recordings\/","name":"AI afkoder tale fra ikke-invasive hjerneoptagelser | DailyAI","isPartOf":{"@id":"https:\/\/dailyai.com\/#website"},"primaryImageOfPage":{"@id":"https:\/\/dailyai.com\/2023\/10\/ai-decodes-speech-from-non-invasive-brain-recordings\/#primaryimage"},"image":{"@id":"https:\/\/dailyai.com\/2023\/10\/ai-decodes-speech-from-non-invasive-brain-recordings\/#primaryimage"},"thumbnailUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Brain-waves-to-speech.jpg","datePublished":"2023-10-09T07:07:00+00:00","breadcrumb":{"@id":"https:\/\/dailyai.com\/2023\/10\/ai-decodes-speech-from-non-invasive-brain-recordings\/#breadcrumb"},"inLanguage":"da-DK","potentialAction":[{"@type":"ReadAction","target":["https:\/\/dailyai.com\/2023\/10\/ai-decodes-speech-from-non-invasive-brain-recordings\/"]}]},{"@type":"ImageObject","inLanguage":"da-DK","@id":"https:\/\/dailyai.com\/2023\/10\/ai-decodes-speech-from-non-invasive-brain-recordings\/#primaryimage","url":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Brain-waves-to-speech.jpg","contentUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Brain-waves-to-speech.jpg","width":1000,"height":864},{"@type":"BreadcrumbList","@id":"https:\/\/dailyai.com\/2023\/10\/ai-decodes-speech-from-non-invasive-brain-recordings\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/dailyai.com\/"},{"@type":"ListItem","position":2,"name":"AI decodes speech from non-invasive brain recordings"}]},{"@type":"WebSite","@id":"https:\/\/dailyai.com\/#website","url":"https:\/\/dailyai.com\/","name":"DailyAI","description":"Din daglige dosis af 
AI-nyheder","publisher":{"@id":"https:\/\/dailyai.com\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/dailyai.com\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"da-DK"},{"@type":"Organization","@id":"https:\/\/dailyai.com\/#organization","name":"DailyAI","url":"https:\/\/dailyai.com\/","logo":{"@type":"ImageObject","inLanguage":"da-DK","@id":"https:\/\/dailyai.com\/#\/schema\/logo\/image\/","url":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/06\/Daily-Ai_TL_colour.png","contentUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/06\/Daily-Ai_TL_colour.png","width":4501,"height":934,"caption":"DailyAI"},"image":{"@id":"https:\/\/dailyai.com\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/x.com\/DailyAIOfficial","https:\/\/www.linkedin.com\/company\/dailyaiofficial\/","https:\/\/www.youtube.com\/@DailyAIOfficial"]},{"@type":"Person","@id":"https:\/\/dailyai.com\/#\/schema\/person\/7ce525c6d0c79838b7cc7cde96993cfa","name":"Eugene van der Watt","image":{"@type":"ImageObject","inLanguage":"da-DK","@id":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/07\/Eugine_Profile_Picture-96x96.png","url":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/07\/Eugine_Profile_Picture-96x96.png","contentUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/07\/Eugine_Profile_Picture-96x96.png","caption":"Eugene van der Watt"},"description":"Eugene har en baggrund som elektronikingeni\u00f8r og elsker alt, hvad der har med teknologi at g\u00f8re. 
N\u00e5r han tager en pause fra at l\u00e6se AI-nyheder, kan du finde ham ved snookerbordet.","sameAs":["www.linkedin.com\/in\/eugene-van-der-watt-16828119"],"url":"https:\/\/dailyai.com\/da\/author\/eugene\/"}]}},"_links":{"self":[{"href":"https:\/\/dailyai.com\/da\/wp-json\/wp\/v2\/posts\/6254","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/dailyai.com\/da\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/dailyai.com\/da\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/dailyai.com\/da\/wp-json\/wp\/v2\/users\/6"}],"replies":[{"embeddable":true,"href":"https:\/\/dailyai.com\/da\/wp-json\/wp\/v2\/comments?post=6254"}],"version-history":[{"count":3,"href":"https:\/\/dailyai.com\/da\/wp-json\/wp\/v2\/posts\/6254\/revisions"}],"predecessor-version":[{"id":6259,"href":"https:\/\/dailyai.com\/da\/wp-json\/wp\/v2\/posts\/6254\/revisions\/6259"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/dailyai.com\/da\/wp-json\/wp\/v2\/media\/6257"}],"wp:attachment":[{"href":"https:\/\/dailyai.com\/da\/wp-json\/wp\/v2\/media?parent=6254"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/dailyai.com\/da\/wp-json\/wp\/v2\/categories?post=6254"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/dailyai.com\/da\/wp-json\/wp\/v2\/tags?post=6254"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}