<h1>Former OpenAI employees publish "Right to Warn" open letter</h1>
<p>By Eugene van der Watt | DailyAI | June 5, 2024</p>
<p><strong>A group of former and current OpenAI and Google employees is calling out AI companies over what they say is a dangerous culture of secrecy surrounding AI risks.</strong></p>
<p>The letter, titled "A right to warn about advanced artificial intelligence," states that AI companies have strong financial incentives to avoid effective oversight of potential AI risks.</p>
<p>Besides accusing companies of recklessly prioritizing financial goals over safety, the letter says they use punitive confidentiality agreements to actively discourage employees from raising concerns.</p>
<p>The signatories are all former OpenAI and Google employees, with Neel Nanda the only one still working at Google. The letter has also been endorsed by Yoshua Bengio, Geoffrey Hinton, and Stuart Russell, leading minds in AI.</p>
<p>Underscoring how wary they are of calling out their former employers, six of the signatories declined to put their names to the letter.</p>
<p>Former OpenAI researchers Daniel Kokotajlo and William Saunders, who also signed the letter, left the company earlier this year.</p>
<p>Kokotajlo was part of the governance team, and Saunders worked on OpenAI's Superalignment team, which was dissolved last month when <a href="https://dailyai.com/it/2024/05/openais-superalignment-meltdown-can-the-company-salvage-any-trust/">Ilya Sutskever and Jan Leike also left over safety concerns</a>.</p>
<p>Kokotajlo explained his reason for leaving on a forum, saying he doesn't believe OpenAI will "behave responsibly around the time of AGI."</p>
<h2>A call to action</h2>
<p>The letter calls for a stronger commitment from AI companies in the absence of regulation covering AI risks the public is unaware of.</p>
<p>The letter reads: "Ordinary whistleblower protections are insufficient because they focus on illegal activity, whereas many of the risks we are concerned about are not yet regulated."</p>
<p>The letter asks AI companies to commit to four principles. In short, it asks companies to:</p>
<ul>
<li>Not enter into or enforce agreements that prohibit criticizing the company over safety concerns, or that withhold financial benefits owed to the employee. (ahem, OpenAI)</li>
<li>Facilitate an anonymous process for employees to raise risk-related concerns with the company's board or with other regulatory organizations.</li>
<li>Support a culture of open criticism that allows employees to make risk-related concerns public without revealing intellectual property.</li>
<li>Not retaliate against current and former employees who publicly share risk-related confidential information after other processes have failed.</li>
</ul>
<p>Many of the names on the list of signatories consider themselves effective altruists. From their posts and comments, it's clear that people like Daniel Kokotajlo (LessWrong) and William Saunders (AI Alignment Forum) believe things could end very badly if AI risks are not managed.</p>
<p>But these aren't doomsaying forum trolls sounding off from the sidelines. They are leading intellects that companies like OpenAI and Google saw fit to employ to build the very technology they now fear.</p>
<p>And now they're saying, "We've seen things that scare us. We want to warn people, but we're not allowed to."</p>
<p>You can read the <a href="https://righttowarn.ai/" target="_blank" rel="noopener">letter here</a>.</p>