{"id":6254,"date":"2023-10-09T07:07:00","date_gmt":"2023-10-09T07:07:00","guid":{"rendered":"https:\/\/dailyai.com\/?p=6254"},"modified":"2023-10-09T07:07:00","modified_gmt":"2023-10-09T07:07:00","slug":"ai-decodes-speech-from-non-invasive-brain-recordings","status":"publish","type":"post","link":"https:\/\/dailyai.com\/it\/2023\/10\/ai-decodes-speech-from-non-invasive-brain-recordings\/","title":{"rendered":"AI decodes speech from non-invasive brain recordings"},"content":{"rendered":"<p><strong>Exactly how our brains process and formulate language is still largely a mystery. Researchers at Meta AI have found a new way to measure brain waves and decode the words associated with them.<\/strong><\/p>\n<p>For people with severely limited motor skills, such as ALS sufferers, communicating is particularly challenging. It is hard to imagine the frustration of someone like Stephen Hawking painstakingly constructing a sentence with eye movements or by twitching a cheek muscle.<\/p>\n<p>A lot of research has been done to <a href=\"https:\/\/dailyai.com\/it\/2023\/08\/unbabels-ai-tool-makes-telepathic-communication-possible\/\">decode speech from brain activity<\/a>, but the best results have depended on invasive <a href=\"https:\/\/dailyai.com\/it\/2023\/08\/ai-replenishes-speech-and-facial-expressions-of-stroke-survivor\/\">brain-computer implants.<\/a><\/p>\n<p>Researchers at Meta AI used magnetoencephalography (MEG) and electroencephalography (EEG) to record the brain waves of 175 volunteers while they listened to short stories and isolated sentences.<\/p>\n<p>They used a pre-trained speech model and contrastive learning to identify which brain wave patterns were associated with the specific words the subjects were hearing.<\/p>\n<blockquote class=\"twitter-tweet\">\n<p 
dir=\"ltr\" lang=\"en\" style=\"text-align: center;\">Decoding speech perception from non-invasive brain recordings,<br \/>\nled by the one and only <a href=\"https:\/\/twitter.com\/honualx?ref_src=twsrc%5Etfw\">@honualx<\/a><br \/>\nhas just been published in the latest issue of Nature Machine Intelligence:<\/p>\n<p>- open-access paper: <a href=\"https:\/\/t.co\/1jtpTezQzM\">https:\/\/t.co\/1jtpTezQzM<\/a><br \/>\n- full training code: <a href=\"https:\/\/t.co\/Al2alBxeUC\">https:\/\/t.co\/Al2alBxeUC<\/a> <a href=\"https:\/\/t.co\/imLxRjRQ6h\">pic.twitter.com\/imLxRjRQ6h<\/a><\/p>\n<p style=\"text-align: center;\">- Jean-R\u00e9mi King (@JeanRemiKing) <a href=\"https:\/\/twitter.com\/JeanRemiKing\/status\/1709957806030311798?ref_src=twsrc%5Etfw\">October 5, 2023<\/a><\/p>\n<\/blockquote>\n<p><script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/p>\n<p>The researchers split the audio into 3-second segments and then tested their model to see whether it could correctly identify which of 1,500 segments the volunteer was listening to. 
The model predicted a kind of word cloud, with the most likely word given the greatest weight.<\/p>\n<p>They achieved an average accuracy of 41%, and an accuracy of 95.9% with the best-performing participants.<\/p>\n<figure id=\"attachment_6256\" aria-describedby=\"caption-attachment-6256\" style=\"width: 1024px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" class=\"wp-image-6256 size-large\" src=\"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Speech-predictions-from-brain-waves-1024x578.webp\" alt=\"Speech predictions from brain waves\" width=\"1024\" height=\"578\" srcset=\"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Speech-predictions-from-brain-waves-1024x578.webp 1024w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Speech-predictions-from-brain-waves-300x169.webp 300w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Speech-predictions-from-brain-waves-768x434.webp 768w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Speech-predictions-from-brain-waves-1536x867.webp 1536w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Speech-predictions-from-brain-waves-2048x1156.webp 2048w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Speech-predictions-from-brain-waves-370x209.webp 370w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Speech-predictions-from-brain-waves-800x452.webp 800w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Speech-predictions-from-brain-waves-20x11.webp 20w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Speech-predictions-from-brain-waves-740x418.webp 740w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Speech-predictions-from-brain-waves-1600x903.webp 1600w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Speech-predictions-from-brain-waves-1320x745.webp 1320w, https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Speech-predictions-from-brain-waves-85x48.webp 85w\" sizes=\"auto, (max-width: 1024px) 
100vw, 1024px\" \/><figcaption id=\"caption-attachment-6256\" class=\"wp-caption-text\">Word-level predictions while participants listened to the sentence \"Thank you for coming, Ed\". The blue words correspond to the correct word, and the black ones to negative candidates. Text size is proportional to the model's log-probability score. Source: <a href=\"https:\/\/www.nature.com\/articles\/s42256-023-00714-5\/figures\/3\" target=\"_blank\" rel=\"noopener\">Nature<\/a><\/figcaption><\/figure>\n<p>The research shows that it is possible to get a fairly accurate idea of the speech a person is hearing, but to be useful the process now needs to be reversed: we need to measure a person's brain waves and work out which word they are thinking of.<\/p>\n<p>The paper suggests training a neural network while subjects produce words by speaking or typing. This general model could then be used to make sense of the brain waves, and the associated words, that an ALS sufferer was thinking of.<\/p>\n<p>The researchers were able to identify speech segments from a limited, predetermined set. For proper communication, you would need to identify many more words. Using generative AI to predict the next most likely word a person is trying to say could help here.<\/p>\n<p>Even though the process is non-invasive, you still need to be hooked up to a <a href=\"https:\/\/dailyai.com\/it\/2023\/08\/ai-mind-reading-medical-breakthrough-or-step-towards-dystopia\/\">MEG device<\/a>. Unfortunately, the results from the EEG measurements were not great.<\/p>\n<p>The research holds out the promise that AI could be used to help people without a voice, such as ALS sufferers, communicate. 
Using a pre-trained model also avoided the need for more painstaking, word-by-word training.<\/p>\n<p>Meta AI has made the model and data publicly available, hoping that other researchers will build on their work.<\/p>","protected":false},"excerpt":{"rendered":"<p>Exactly how our brains process and formulate language is still largely a mystery. Researchers at Meta AI found a new way to measure brain waves and decode the words associated with them. People who have severely limited motor skills, like ALS sufferers, find it particularly challenging to communicate. The frustration of a person like Stephen Hawking painstakingly constructing a sentence with eye movements or twitching a cheek muscle is hard to imagine. A lot of research has been done to decode speech from brain activity, but the best results depend on invasive brain-computer implants. 
Meta AI researchers used magneto-encephalography (MEG)<\/p>","protected":false},"author":6,"featured_media":6257,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[84],"tags":[150,105,101,131],"class_list":["post-6254","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-industry","tag-ai-benefits","tag-machine-learning","tag-medtech","tag-meta"],"yoast_head_json":{"title":"AI decodes speech from non-invasive brain recordings | DailyAI","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/dailyai.com\/it\/2023\/10\/ai-decodes-speech-from-non-invasive-brain-recordings\/","og_locale":"it_IT","og_type":"article","og_title":"AI decodes speech from non-invasive brain recordings | DailyAI","og_description":"Exactly how our brains process and formulate language is still largely a mystery. Researchers at Meta AI found a new way to measure brain waves and decode the words associated with them. People who have severely limited motor skills, like ALS sufferers, find it particularly challenging to communicate. The frustration of a person like Stephen Hawking painstakingly constructing a sentence with eye movements or twitching a cheek muscle is hard to imagine. 
A lot of research has been done to decode speech from brain activity, but the best results depend on invasive brain-computer implants. Meta AI researchers used magneto-encephalography (MEG)","og_url":"https:\/\/dailyai.com\/it\/2023\/10\/ai-decodes-speech-from-non-invasive-brain-recordings\/","og_site_name":"DailyAI","article_published_time":"2023-10-09T07:07:00+00:00","og_image":[{"width":1000,"height":864,"url":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Brain-waves-to-speech.jpg","type":"image\/jpeg"}],"author":"Eugene van der Watt","twitter_card":"summary_large_image","twitter_creator":"@DailyAIOfficial","twitter_site":"@DailyAIOfficial","twitter_misc":{"Scritto da":"Eugene van der Watt","Tempo di lettura stimato":"3 minuti"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"NewsArticle","@id":"https:\/\/dailyai.com\/2023\/10\/ai-decodes-speech-from-non-invasive-brain-recordings\/#article","isPartOf":{"@id":"https:\/\/dailyai.com\/2023\/10\/ai-decodes-speech-from-non-invasive-brain-recordings\/"},"author":{"name":"Eugene van der Watt","@id":"https:\/\/dailyai.com\/#\/schema\/person\/7ce525c6d0c79838b7cc7cde96993cfa"},"headline":"AI decodes speech from non-invasive brain recordings","datePublished":"2023-10-09T07:07:00+00:00","mainEntityOfPage":{"@id":"https:\/\/dailyai.com\/2023\/10\/ai-decodes-speech-from-non-invasive-brain-recordings\/"},"wordCount":519,"publisher":{"@id":"https:\/\/dailyai.com\/#organization"},"image":{"@id":"https:\/\/dailyai.com\/2023\/10\/ai-decodes-speech-from-non-invasive-brain-recordings\/#primaryimage"},"thumbnailUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Brain-waves-to-speech.jpg","keywords":["AI benefits","machine 
learning","MedTech","Meta"],"articleSection":["Industry"],"inLanguage":"it-IT"},{"@type":"WebPage","@id":"https:\/\/dailyai.com\/2023\/10\/ai-decodes-speech-from-non-invasive-brain-recordings\/","url":"https:\/\/dailyai.com\/2023\/10\/ai-decodes-speech-from-non-invasive-brain-recordings\/","name":"AI decodes speech from non-invasive brain recordings | DailyAI","isPartOf":{"@id":"https:\/\/dailyai.com\/#website"},"primaryImageOfPage":{"@id":"https:\/\/dailyai.com\/2023\/10\/ai-decodes-speech-from-non-invasive-brain-recordings\/#primaryimage"},"image":{"@id":"https:\/\/dailyai.com\/2023\/10\/ai-decodes-speech-from-non-invasive-brain-recordings\/#primaryimage"},"thumbnailUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Brain-waves-to-speech.jpg","datePublished":"2023-10-09T07:07:00+00:00","breadcrumb":{"@id":"https:\/\/dailyai.com\/2023\/10\/ai-decodes-speech-from-non-invasive-brain-recordings\/#breadcrumb"},"inLanguage":"it-IT","potentialAction":[{"@type":"ReadAction","target":["https:\/\/dailyai.com\/2023\/10\/ai-decodes-speech-from-non-invasive-brain-recordings\/"]}]},{"@type":"ImageObject","inLanguage":"it-IT","@id":"https:\/\/dailyai.com\/2023\/10\/ai-decodes-speech-from-non-invasive-brain-recordings\/#primaryimage","url":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Brain-waves-to-speech.jpg","contentUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/10\/Brain-waves-to-speech.jpg","width":1000,"height":864},{"@type":"BreadcrumbList","@id":"https:\/\/dailyai.com\/2023\/10\/ai-decodes-speech-from-non-invasive-brain-recordings\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/dailyai.com\/"},{"@type":"ListItem","position":2,"name":"AI decodes speech from non-invasive brain recordings"}]},{"@type":"WebSite","@id":"https:\/\/dailyai.com\/#website","url":"https:\/\/dailyai.com\/","name":"DailyAI","description":"Your Daily Dose of AI News","publisher":{"@id":"https:\/\/dailyai.com\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/dailyai.com\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"it-IT"},{"@type":"Organization","@id":"https:\/\/dailyai.com\/#organization","name":"DailyAI","url":"https:\/\/dailyai.com\/","logo":{"@type":"ImageObject","inLanguage":"it-IT","@id":"https:\/\/dailyai.com\/#\/schema\/logo\/image\/","url":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/06\/Daily-Ai_TL_colour.png","contentUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/06\/Daily-Ai_TL_colour.png","width":4501,"height":934,"caption":"DailyAI"},"image":{"@id":"https:\/\/dailyai.com\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/x.com\/DailyAIOfficial","https:\/\/www.linkedin.com\/company\/dailyaiofficial\/","https:\/\/www.youtube.com\/@DailyAIOfficial"]},{"@type":"Person","@id":"https:\/\/dailyai.com\/#\/schema\/person\/7ce525c6d0c79838b7cc7cde96993cfa","name":"Eugene van der Watt","image":{"@type":"ImageObject","inLanguage":"it-IT","@id":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/07\/Eugine_Profile_Picture-96x96.png","url":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/07\/Eugine_Profile_Picture-96x96.png","contentUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/07\/Eugine_Profile_Picture-96x96.png","caption":"Eugene van der Watt"},"description":"Eugene comes from an electronic engineering background and loves all things tech. When he takes a break from consuming AI news you'll find him at the snooker table.","sameAs":["www.linkedin.com\/in\/eugene-van-der-watt-16828119"],"url":"https:\/\/dailyai.com\/it\/author\/eugene\/"}]}},"_links":{"self":[{"href":"https:\/\/dailyai.com\/it\/wp-json\/wp\/v2\/posts\/6254","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/dailyai.com\/it\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/dailyai.com\/it\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/dailyai.com\/it\/wp-json\/wp\/v2\/users\/6"}],"replies":[{"embeddable":true,"href":"https:\/\/dailyai.com\/it\/wp-json\/wp\/v2\/comments?post=6254"}],"version-history":[{"count":3,"href":"https:\/\/dailyai.com\/it\/wp-json\/wp\/v2\/posts\/6254\/revisions"}],"predecessor-version":[{"id":6259,"href":"https:\/\/dailyai.com\/it\/wp-json\/wp\/v2\/posts\/6254\/revisions\/6259"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/dailyai.com\/it\/wp-json\/wp\/v2\/media\/6257"}],"wp:attachment":[{"href":"https:\/\/dailyai.com\/it\/wp-json\/wp\/v2\/media?parent=6254"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/dailyai.com\/it\/wp-json\/wp\/v2\/categories?post=6254"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/dailyai.com\/it\/wp-json\/wp\/v2\/tags?post=6254"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}