{"id":11999,"date":"2024-05-07T15:02:38","date_gmt":"2024-05-07T15:02:38","guid":{"rendered":"https:\/\/dailyai.com\/?p=11999"},"modified":"2024-05-07T15:02:38","modified_gmt":"2024-05-07T15:02:38","slug":"microsoft-reportedly-building-a-500b-llm-called-mai-1","status":"publish","type":"post","link":"https:\/\/dailyai.com\/it\/2024\/05\/microsoft-reportedly-building-a-500b-llm-called-mai-1\/","title":{"rendered":"Microsoft reportedly building a 500B LLM called MAI-1"},"content":{"rendered":"<p><strong>According to a report by The Information, Microsoft is working on a 500B parameter LLM called MAI-1 that could take on GPT-4 and Google\u2019s <span class=\"noTranslate\" data-no-translation=\"\">Gemini<\/span> models.<\/strong><\/p>\n<p>We recently reported on Microsoft\u2019s <a href=\"https:\/\/dailyai.com\/it\/2024\/04\/microsoft-launches-phi-3-mini-a-tiny-but-powerful-lm\/\">Phi-3 Mini<\/a> family of small language models ranging from 3.8B to 14B parameters. At 500B parameters, MAI-1 is set to be the largest model Microsoft has deployed.<\/p>\n<p>Its size puts it in the same ballpark as GPT-4 and Google\u2019s bigger <span class=\"noTranslate\" data-no-translation=\"\">Gemini<\/span> models. GPT-4 is rumored to have 1.76T parameters, but it\u2019s a Mixture of Experts (MoE) model, so only around 280B parameters are in play during inference.<\/p>\n<p>No information about MAI-1\u2019s architecture is available, but if it\u2019s a dense model, as opposed to an MoE, it will be quite powerful. Meta\u2019s anticipated Llama 3 model is expected to have 400B parameters.<\/p>\n<p>MAI-1\u2019s development is being led by Mustafa Suleyman, co-founder and former head of applied AI at <span class=\"noTranslate\" data-no-translation=\"\">DeepMind<\/span>.<\/p>\n<p>Suleyman left <span class=\"noTranslate\" data-no-translation=\"\">DeepMind<\/span> to co-found the AI startup Inflection in 2022. In March this year, Microsoft hired most of Inflection\u2019s staff and paid $650 million for the rights to the company\u2019s intellectual property.<\/p>\n<p>MAI-1 is reportedly an entirely new Microsoft project, not a continuation of an existing Inflection project. There\u2019s no word on a release date, but we may see a preview of MAI-1 on May 16 at Microsoft\u2019s Build developer conference.<\/p>\n<p>Microsoft is <span class=\"noTranslate\" data-no-translation=\"\">OpenAI<\/span>\u2019s biggest investor, so the fact that it is developing its own LLMs to rival <span class=\"noTranslate\" data-no-translation=\"\">OpenAI<\/span>\u2019s comes as something of a surprise to some. Is Microsoft hedging its bets, pursuing multiple development strategies, or something else entirely?<\/p>\n<p>Microsoft CTO Kevin Scott tried to downplay the matter. In a LinkedIn post, Scott said: \"I don\u2019t know why this is news, but to summarize the obvious: we build big supercomputers to train AI models; our partner Open AI uses these supercomputers to train frontier-defining models; and then we both make these models available in products and services so that many people can benefit from them. We quite like this arrangement.\"<\/p>\n<p>Scott may be sincere in that statement, but when MAI-1 is released it could put Microsoft in competition with the company it has invested billions of dollars in.<\/p>\n<p>Will MAI-1 be released just in time for <span class=\"noTranslate\" data-no-translation=\"\">OpenAI<\/span> to overshadow it by releasing GPT-5? <span class=\"noTranslate\" data-no-translation=\"\">OpenAI<\/span> had scheduled an event for this coming Thursday where it was expected to share product updates and demos, but the event has been postponed.<\/p>\n<p>With <a href=\"https:\/\/dailyai.com\/it\/2024\/04\/\/\">mysterious GPT-2 chatbots<\/a> appearing, disappearing, and now reappearing, Microsoft building huge models, and <span class=\"noTranslate\" data-no-translation=\"\">OpenAI<\/span> keeping us guessing, the AI drama is relentless.<\/p>","protected":false},"excerpt":{"rendered":"<p>According to a report by The Information, Microsoft is working on a 500B parameter LLM called MAI-1 that could take on GPT-4 and Google\u2019s Gemini models. We recently reported on Microsoft\u2019s Phi-3 Mini family of small language models ranging from 3.8B to 14B parameters. At 500B parameters, MAI-1 is set to be the largest model Microsoft has deployed. Its size puts it in the same ballpark as GPT-4 and Google\u2019s bigger Gemini models. GPT-4 is rumored to have 1.76T parameters but it\u2019s a Mixture of Experts (MoE) model so only around 280B parameters are in play during inference. There isn\u2019t<\/p>","protected":false},"author":6,"featured_media":12007,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[84],"tags":[118,121],"class_list":["post-11999","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-industry","tag-llms","tag-microsoft"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.3 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Microsoft reportedly building a 500B LLM called MAI-1 | DailyAI<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/dailyai.com\/it\/2024\/05\/microsoft-reportedly-building-a-500b-llm-called-mai-1\/\" \/>\n<meta property=\"og:locale\" content=\"it_IT\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Microsoft reportedly building a 500B LLM called MAI-1 | DailyAI\" \/>\n<meta property=\"og:description\" content=\"According to a report by The Information, Microsoft is working on a 500B parameter LLM called MAI-1 that could take on GPT-4 and Google\u2019s Gemini models. We recently reported on Microsoft\u2019s Phi-3 Mini family of small language models ranging from 3.8B to 14B parameters. At 500B parameters, MAI-1 is set to be the largest model Microsoft has deployed. Its size puts it in the same ballpark as GPT-4 and Google\u2019s bigger Gemini models. GPT-4 is rumored to have 1.76T parameters but it\u2019s a Mixture of Experts (MoE) model so only around 280B parameters are in play during inference. 
There isn\u2019t\" \/>\n<meta property=\"og:url\" content=\"https:\/\/dailyai.com\/it\/2024\/05\/microsoft-reportedly-building-a-500b-llm-called-mai-1\/\" \/>\n<meta property=\"og:site_name\" content=\"DailyAI\" \/>\n<meta property=\"article:published_time\" content=\"2024-05-07T15:02:38+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/dailyai.com\/wp-content\/uploads\/2024\/05\/Microsoft-MAI-1.webp\" \/>\n\t<meta property=\"og:image:width\" content=\"1792\" \/>\n\t<meta property=\"og:image:height\" content=\"1024\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/webp\" \/>\n<meta name=\"author\" content=\"Eugene van der Watt\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@DailyAIOfficial\" \/>\n<meta name=\"twitter:site\" content=\"@DailyAIOfficial\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Eugene van der Watt\" \/>\n\t<meta name=\"twitter:label2\" content=\"Estimated reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"3 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"NewsArticle\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/05\\\/microsoft-reportedly-building-a-500b-llm-called-mai-1\\\/#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/05\\\/microsoft-reportedly-building-a-500b-llm-called-mai-1\\\/\"},\"author\":{\"name\":\"Eugene van der Watt\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/person\\\/7ce525c6d0c79838b7cc7cde96993cfa\"},\"headline\":\"Microsoft reportedly building a 500B LLM called 
MAI-1\",\"datePublished\":\"2024-05-07T15:02:38+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/05\\\/microsoft-reportedly-building-a-500b-llm-called-mai-1\\\/\"},\"wordCount\":455,\"publisher\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#organization\"},\"image\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/05\\\/microsoft-reportedly-building-a-500b-llm-called-mai-1\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2024\\\/05\\\/Microsoft-MAI-1.webp\",\"keywords\":[\"LLMS\",\"Microsoft\"],\"articleSection\":[\"Industry\"],\"inLanguage\":\"it-IT\"},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/05\\\/microsoft-reportedly-building-a-500b-llm-called-mai-1\\\/\",\"url\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/05\\\/microsoft-reportedly-building-a-500b-llm-called-mai-1\\\/\",\"name\":\"Microsoft reportedly building a 500B LLM called MAI-1 | DailyAI\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/05\\\/microsoft-reportedly-building-a-500b-llm-called-mai-1\\\/#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/05\\\/microsoft-reportedly-building-a-500b-llm-called-mai-1\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2024\\\/05\\\/Microsoft-MAI-1.webp\",\"datePublished\":\"2024-05-07T15:02:38+00:00\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/05\\\/microsoft-reportedly-building-a-500b-llm-called-mai-1\\\/#breadcrumb\"},\"inLanguage\":\"it-IT\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/dailyai.com\\\/2024\\\/05\\\/microsoft-reportedly-building-a-500b-llm-called-mai-1\\\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"it-IT\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/05\\\/microsoft-reportedly-building-a-500b-llm-called-mai-1\\\/#primaryimage\",\"url\":\"https:\\\/\\\
/dailyai.com\\\/wp-content\\\/uploads\\\/2024\\\/05\\\/Microsoft-MAI-1.webp\",\"contentUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2024\\\/05\\\/Microsoft-MAI-1.webp\",\"width\":1792,\"height\":1024},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/05\\\/microsoft-reportedly-building-a-500b-llm-called-mai-1\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\\\/\\\/dailyai.com\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Microsoft reportedly building a 500B LLM called MAI-1\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#website\",\"url\":\"https:\\\/\\\/dailyai.com\\\/\",\"name\":\"DailyAI\",\"description\":\"Your Daily Dose of AI News\",\"publisher\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/dailyai.com\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"it-IT\"},{\"@type\":\"Organization\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#organization\",\"name\":\"DailyAI\",\"url\":\"https:\\\/\\\/dailyai.com\\\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"it-IT\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/logo\\\/image\\\/\",\"url\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/06\\\/Daily-Ai_TL_colour.png\",\"contentUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/06\\\/Daily-Ai_TL_colour.png\",\"width\":4501,\"height\":934,\"caption\":\"DailyAI\"},\"image\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/logo\\\/image\\\/\"},\"sameAs\":[\"https:\\\/\\\/x.com\\\/DailyAIOfficial\",\"https:\\\/\\\/www.linkedin.com\\\/company\\\/dailyaiofficial\\\/\",\"https:\\\/\\\/www.youtube.com\\\/@DailyAIOfficial\"]},{\"@type\":\"Person\"
,\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/person\\\/7ce525c6d0c79838b7cc7cde96993cfa\",\"name\":\"Eugene van der Watt\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"it-IT\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/07\\\/Eugine_Profile_Picture-96x96.png\",\"url\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/07\\\/Eugine_Profile_Picture-96x96.png\",\"contentUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/07\\\/Eugine_Profile_Picture-96x96.png\",\"caption\":\"Eugene van der Watt\"},\"description\":\"Eugene comes from an electronic engineering background and loves all things tech. When he takes a break from consuming AI news you'll find him at the snooker table.\",\"sameAs\":[\"www.linkedin.com\\\/in\\\/eugene-van-der-watt-16828119\"],\"url\":\"https:\\\/\\\/dailyai.com\\\/it\\\/author\\\/eugene\\\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Microsoft reportedly building a 500B LLM called MAI-1 | DailyAI","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/dailyai.com\/it\/2024\/05\/microsoft-reportedly-building-a-500b-llm-called-mai-1\/","og_locale":"it_IT","og_type":"article","og_title":"Microsoft reportedly building a 500B LLM called MAI-1 | DailyAI","og_description":"According to a report by The Information, Microsoft is working on a 500B parameter LLM called MAI-1 that could take on GPT-4 and Google\u2019s Gemini models. We recently reported on Microsoft\u2019s Phi-3 Mini family of small language models ranging from 3.8B to 14B parameters. At 500B parameters, MAI-1 is set to be the largest model Microsoft has deployed. Its size puts it in the same ballpark as GPT-4 and Google\u2019s bigger Gemini models. 
GPT-4 is rumored to have 1.76T parameters but it\u2019s a Mixture of Experts (MoE) model so only around 280B parameters are in play during inference. There isn\u2019t","og_url":"https:\/\/dailyai.com\/it\/2024\/05\/microsoft-reportedly-building-a-500b-llm-called-mai-1\/","og_site_name":"DailyAI","article_published_time":"2024-05-07T15:02:38+00:00","og_image":[{"width":1792,"height":1024,"url":"https:\/\/dailyai.com\/wp-content\/uploads\/2024\/05\/Microsoft-MAI-1.webp","type":"image\/webp"}],"author":"Eugene van der Watt","twitter_card":"summary_large_image","twitter_creator":"@DailyAIOfficial","twitter_site":"@DailyAIOfficial","twitter_misc":{"Written by":"Eugene van der Watt","Estimated reading time":"3 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"NewsArticle","@id":"https:\/\/dailyai.com\/2024\/05\/microsoft-reportedly-building-a-500b-llm-called-mai-1\/#article","isPartOf":{"@id":"https:\/\/dailyai.com\/2024\/05\/microsoft-reportedly-building-a-500b-llm-called-mai-1\/"},"author":{"name":"Eugene van der Watt","@id":"https:\/\/dailyai.com\/#\/schema\/person\/7ce525c6d0c79838b7cc7cde96993cfa"},"headline":"Microsoft reportedly building a 500B LLM called MAI-1","datePublished":"2024-05-07T15:02:38+00:00","mainEntityOfPage":{"@id":"https:\/\/dailyai.com\/2024\/05\/microsoft-reportedly-building-a-500b-llm-called-mai-1\/"},"wordCount":455,"publisher":{"@id":"https:\/\/dailyai.com\/#organization"},"image":{"@id":"https:\/\/dailyai.com\/2024\/05\/microsoft-reportedly-building-a-500b-llm-called-mai-1\/#primaryimage"},"thumbnailUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2024\/05\/Microsoft-MAI-1.webp","keywords":["LLMS","Microsoft"],"articleSection":["Industry"],"inLanguage":"it-IT"},{"@type":"WebPage","@id":"https:\/\/dailyai.com\/2024\/05\/microsoft-reportedly-building-a-500b-llm-called-mai-1\/","url":"https:\/\/dailyai.com\/2024\/05\/microsoft-reportedly-building-a-500b-llm-called-mai-1\/","name":"Microsoft reportedly building a 500B LLM called MAI-1 | DailyAI","isPartOf":{"@id":"https:\/\/dailyai.com\/#website"},"primaryImageOfPage":{"@id":"https:\/\/dailyai.com\/2024\/05\/microsoft-reportedly-building-a-500b-llm-called-mai-1\/#primaryimage"},"image":{"@id":"https:\/\/dailyai.com\/2024\/05\/microsoft-reportedly-building-a-500b-llm-called-mai-1\/#primaryimage"},"thumbnailUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2024\/05\/Microsoft-MAI-1.webp","datePublished":"2024-05-07T15:02:38+00:00","breadcrumb":{"@id":"https:\/\/dailyai.com\/2024\/05\/microsoft-reportedly-building-a-500b-llm-called-mai-1\/#breadcrumb"},"inLanguage":"it-IT","potentialAction":[{"@type":"ReadAction","target":["https:\/\/dailyai.com\/2024\/05\/microsoft-reportedly-building-a-500b-llm-called-mai-1\/"]}]},{"@type":"ImageObject","inLanguage":"it-IT","@id":"https:\/\/dailyai.com\/2024\/05\/microsoft-reportedly-building-a-500b-llm-called-mai-1\/#primaryimage","url":"https:\/\/dailyai.com\/wp-content\/uploads\/2024\/05\/Microsoft-MAI-1.webp","contentUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2024\/05\/Microsoft-MAI-1.webp","width":1792,"height":1024},{"@type":"BreadcrumbList","@id":"https:\/\/dailyai.com\/2024\/05\/microsoft-reportedly-building-a-500b-llm-called-mai-1\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/dailyai.com\/"},{"@type":"ListItem","position":2,"name":"Microsoft reportedly building a 500B LLM called MAI-1"}]},{"@type":"WebSite","@id":"https:\/\/dailyai.com\/#website","url":"https:\/\/dailyai.com\/","name":"DailyAI","description":"Your Daily Dose of AI News","publisher":{"@id":"https:\/\/dailyai.com\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/dailyai.com\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"it-IT"},{"@type":"Organization","@id":"https:\/\/dailyai.com\/#organization","name":"DailyAI","url":"https:\/\/dailyai.com\/","logo":{"@type":"ImageObject","inLanguage":"it-IT","@id":"https:\/\/dailyai.com\/#\/schema\/logo\/image\/","url":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/06\/Daily-Ai_TL_colour.png","contentUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/06\/Daily-Ai_TL_colour.png","width":4501,"height":934,"caption":"DailyAI"},"image":{"@id":"https:\/\/dailyai.com\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/x.com\/DailyAIOfficial","https:\/\/www.linkedin.com\/company\/dailyaiofficial\/","https:\/\/www.youtube.com\/@DailyAIOfficial"]},{"@type":"Person","@id":"https:\/\/dailyai.com\/#\/schema\/person\/7ce525c6d0c79838b7cc7cde96993cfa","name":"Eugene van der Watt","image":{"@type":"ImageObject","inLanguage":"it-IT","@id":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/07\/Eugine_Profile_Picture-96x96.png","url":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/07\/Eugine_Profile_Picture-96x96.png","contentUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/07\/Eugine_Profile_Picture-96x96.png","caption":"Eugene van der Watt"},"description":"Eugene comes from an electronic engineering background and loves all things tech. When he takes a break from consuming AI news you'll find him at the snooker table.","sameAs":["www.linkedin.com\/in\/eugene-van-der-watt-16828119"],"url":"https:\/\/dailyai.com\/it\/author\/eugene\/"}]}},"_links":{"self":[{"href":"https:\/\/dailyai.com\/it\/wp-json\/wp\/v2\/posts\/11999","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/dailyai.com\/it\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/dailyai.com\/it\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/dailyai.com\/it\/wp-json\/wp\/v2\/users\/6"}],"replies":[{"embeddable":true,"href":"https:\/\/dailyai.com\/it\/wp-json\/wp\/v2\/comments?post=11999"}],"version-history":[{"count":3,"href":"https:\/\/dailyai.com\/it\/wp-json\/wp\/v2\/posts\/11999\/revisions"}],"predecessor-version":[{"id":12009,"href":"https:\/\/dailyai.com\/it\/wp-json\/wp\/v2\/posts\/11999\/revisions\/12009"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/dailyai.com\/it\/wp-json\/wp\/v2\/media\/12007"}],"wp:attachment":[{"href":"https:\/\/dailyai.com\/it\/wp-json\/wp\/v2\/media?parent=11999"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/dailyai.com\/it\/wp-json\/wp\/v2\/categories?post=11999"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/dailyai.com\/it\/wp-json\/wp\/v2\/tags?post=11999"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}