{"id":12729,"date":"2024-06-05T09:50:37","date_gmt":"2024-06-05T09:50:37","guid":{"rendered":"https:\/\/dailyai.com\/?p=12729"},"modified":"2024-06-05T09:50:37","modified_gmt":"2024-06-05T09:50:37","slug":"former-openai-employees-publish-right-to-warn-open-letter","status":"publish","type":"post","link":"https:\/\/dailyai.com\/da\/2024\/06\/former-openai-employees-publish-right-to-warn-open-letter\/","title":{"rendered":"Former OpenAI employees publish 'Right to Warn' open letter"},"content":{"rendered":"<p><strong>A group of former and current OpenAI and Google employees are calling AI companies out on what they say is a dangerous culture of secrecy surrounding AI risks.<\/strong><\/p>\n<p>The letter, titled \"A right to warn about advanced artificial intelligence\", states that AI companies have strong financial incentives to avoid effective oversight of potential AI risks.<\/p>\n<p>Besides being reckless by focusing on financial objectives instead of safety, the letter says that companies use punitive confidentiality agreements to actively discourage employees from raising concerns.<\/p>\n<p>The signatories are all former OpenAI and Google employees, with Neel Nanda the only one still working at Google. 
The letter was also endorsed by leading AI researchers Yoshua Bengio, Geoffrey Hinton, and Stuart Russell.<\/p>\n<p>As a sign of how wary they are of calling out their former employers, six of the signatories were unwilling to put their names to the letter.<\/p>\n<p>Former OpenAI researchers Daniel Kokotajlo and William Saunders, who also signed the letter, left the company earlier this year.<\/p>\n<p>Kokotajlo was part of the governance team, and Saunders worked on OpenAI's Superalignment team, which was dissolved last month when <a href=\"https:\/\/dailyai.com\/da\/2024\/05\/openais-superalignment-meltdown-can-the-company-salvage-any-trust\/\">Ilya Sutskever and Jan Leike also left over safety concerns<\/a>.<\/p>\n<p>Kokotajlo explained his reasons for leaving on a forum, saying he doesn't believe OpenAI will \"behave responsibly around the time of AGI\".<\/p>\n<h2>A call to action<\/h2>\n<p>In the absence of regulation, the letter calls for a stronger commitment from AI companies regarding AI risks the public is unaware of.<\/p>\n<p>The letter states: \"Ordinary whistleblower protections are insufficient because they focus on illegal activity, whereas many of the risks we are concerned about are not yet regulated.\"<\/p>\n<p>The letter calls on AI companies to commit to four principles. In short, they want companies to:<\/p>\n<ul>\n<li>Not enter into or enforce agreements that prohibit criticism of the company over safety concerns, or that withhold financial benefits the employee is entitled to. 
(ahem, OpenAI)<\/li>\n<li>Facilitate an anonymous process through which employees can raise risk-related concerns with the company's board or other regulatory bodies.<\/li>\n<li>Support a culture of open criticism that allows employees to make risk-related concerns public without revealing intellectual property.<\/li>\n<li>Not retaliate against current and former employees who publicly share risk-related confidential information after other processes have failed.<\/li>\n<\/ul>\n<p>Several of the names on the list of signatories consider themselves effective altruists. From their posts and comments, it is clear that people like Daniel Kokotajlo (Less Wrong) and William Saunders (AI Alignment Forum) believe things could go very wrong if AI risks are not managed.<\/p>\n<p>But these are not doomsday trolls shouting from the sidelines of a forum. These are leading minds that companies like OpenAI and Google saw fit to hire to build the technology they now fear.<\/p>\n<p>And now they are saying: \"We have seen things that scare us. We want to warn people, but we are not allowed to.\"<\/p>\n<p>You can read the <a href=\"https:\/\/righttowarn.ai\/\" target=\"_blank\" rel=\"noopener\">letter here<\/a>.<\/p>","protected":false},"excerpt":{"rendered":"<p>A group of former and current OpenAI and Google employees are calling AI companies out on what they say is a dangerous culture of secrecy surrounding AI risks. The letter titled \"A right to warn about advanced artificial intelligence\" states that AI companies have strong financial incentives to avoid effective oversight of potential AI risks. 
Besides being reckless by focusing on financial objectives instead of safety, the letter says that companies use punitive confidentiality agreements to actively discourage employees from raising concerns. The signatories are all former OpenAI and Google employees, with Neel Nanda the only one still working at<\/p>","protected":false},"author":6,"featured_media":12731,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[88],"tags":[163,102,93],"class_list":["post-12729","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-ethics","tag-ai-risks","tag-google","tag-openai"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.4 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Former OpenAI employees publish \u2018Right to Warn\u2019 open letter | DailyAI<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/dailyai.com\/da\/2024\/06\/former-openai-employees-publish-right-to-warn-open-letter\/\" \/>\n<meta property=\"og:locale\" content=\"da_DK\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Former OpenAI employees publish \u2018Right to Warn\u2019 open letter | DailyAI\" \/>\n<meta property=\"og:description\" content=\"A group of former and current OpenAI and Google employees are calling AI companies out on what they say is a dangerous culture of secrecy surrounding AI risks. The letter titled \u201cA right to warn about advanced artificial intelligence\u201d states that AI companies have strong financial incentives to avoid effective oversight of potential AI risks. 
Besides being reckless by focusing on financial objectives instead of safety, the letter says that companies use punitive confidentiality agreements to actively discourage employees from raising concerns. The signatories are all former OpenAI and Google employees, with Neel Nanda, the only one still working at\" \/>\n<meta property=\"og:url\" content=\"https:\/\/dailyai.com\/da\/2024\/06\/former-openai-employees-publish-right-to-warn-open-letter\/\" \/>\n<meta property=\"og:site_name\" content=\"DailyAI\" \/>\n<meta property=\"article:published_time\" content=\"2024-06-05T09:50:37+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/dailyai.com\/wp-content\/uploads\/2024\/06\/Warning-letter.webp\" \/>\n\t<meta property=\"og:image:width\" content=\"1792\" \/>\n\t<meta property=\"og:image:height\" content=\"1024\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/webp\" \/>\n<meta name=\"author\" content=\"Eugene van der Watt\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@DailyAIOfficial\" \/>\n<meta name=\"twitter:site\" content=\"@DailyAIOfficial\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Eugene van der Watt\" \/>\n\t<meta name=\"twitter:label2\" content=\"Estimated reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"3 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"NewsArticle\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/06\\\/former-openai-employees-publish-right-to-warn-open-letter\\\/#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/06\\\/former-openai-employees-publish-right-to-warn-open-letter\\\/\"},\"author\":{\"name\":\"Eugene van der Watt\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/person\\\/7ce525c6d0c79838b7cc7cde96993cfa\"},\"headline\":\"Former OpenAI employees 
publish \u2018Right to Warn\u2019 open letter\",\"datePublished\":\"2024-06-05T09:50:37+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/06\\\/former-openai-employees-publish-right-to-warn-open-letter\\\/\"},\"wordCount\":487,\"publisher\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#organization\"},\"image\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/06\\\/former-openai-employees-publish-right-to-warn-open-letter\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2024\\\/06\\\/Warning-letter.webp\",\"keywords\":[\"AI risks\",\"Google\",\"OpenAI\"],\"articleSection\":[\"Ethics &amp; Society\"],\"inLanguage\":\"da-DK\"},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/06\\\/former-openai-employees-publish-right-to-warn-open-letter\\\/\",\"url\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/06\\\/former-openai-employees-publish-right-to-warn-open-letter\\\/\",\"name\":\"Former OpenAI employees publish \u2018Right to Warn\u2019 open letter | 
DailyAI\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/06\\\/former-openai-employees-publish-right-to-warn-open-letter\\\/#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/06\\\/former-openai-employees-publish-right-to-warn-open-letter\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2024\\\/06\\\/Warning-letter.webp\",\"datePublished\":\"2024-06-05T09:50:37+00:00\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/06\\\/former-openai-employees-publish-right-to-warn-open-letter\\\/#breadcrumb\"},\"inLanguage\":\"da-DK\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/dailyai.com\\\/2024\\\/06\\\/former-openai-employees-publish-right-to-warn-open-letter\\\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"da-DK\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/06\\\/former-openai-employees-publish-right-to-warn-open-letter\\\/#primaryimage\",\"url\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2024\\\/06\\\/Warning-letter.webp\",\"contentUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2024\\\/06\\\/Warning-letter.webp\",\"width\":1792,\"height\":1024},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/06\\\/former-openai-employees-publish-right-to-warn-open-letter\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\\\/\\\/dailyai.com\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Former OpenAI employees publish \u2018Right to Warn\u2019 open letter\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#website\",\"url\":\"https:\\\/\\\/dailyai.com\\\/\",\"name\":\"DailyAI\",\"description\":\"Your Daily Dose of AI 
News\",\"publisher\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/dailyai.com\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"da-DK\"},{\"@type\":\"Organization\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#organization\",\"name\":\"DailyAI\",\"url\":\"https:\\\/\\\/dailyai.com\\\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"da-DK\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/logo\\\/image\\\/\",\"url\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/06\\\/Daily-Ai_TL_colour.png\",\"contentUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/06\\\/Daily-Ai_TL_colour.png\",\"width\":4501,\"height\":934,\"caption\":\"DailyAI\"},\"image\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/logo\\\/image\\\/\"},\"sameAs\":[\"https:\\\/\\\/x.com\\\/DailyAIOfficial\",\"https:\\\/\\\/www.linkedin.com\\\/company\\\/dailyaiofficial\\\/\",\"https:\\\/\\\/www.youtube.com\\\/@DailyAIOfficial\"]},{\"@type\":\"Person\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/person\\\/7ce525c6d0c79838b7cc7cde96993cfa\",\"name\":\"Eugene van der Watt\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"da-DK\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/07\\\/Eugine_Profile_Picture-96x96.png\",\"url\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/07\\\/Eugine_Profile_Picture-96x96.png\",\"contentUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/07\\\/Eugine_Profile_Picture-96x96.png\",\"caption\":\"Eugene van der Watt\"},\"description\":\"Eugene comes from an electronic engineering background and loves all things tech. 
When he takes a break from consuming AI news you'll find him at the snooker table.\",\"sameAs\":[\"www.linkedin.com\\\/in\\\/eugene-van-der-watt-16828119\"],\"url\":\"https:\\\/\\\/dailyai.com\\\/da\\\/author\\\/eugene\\\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Former OpenAI employees publish \"Right to Warn\" open letter | DailyAI","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/dailyai.com\/da\/2024\/06\/former-openai-employees-publish-right-to-warn-open-letter\/","og_locale":"da_DK","og_type":"article","og_title":"Former OpenAI employees publish \u2018Right to Warn\u2019 open letter | DailyAI","og_description":"A group of former and current OpenAI and Google employees are calling AI companies out on what they say is a dangerous culture of secrecy surrounding AI risks. The letter titled \u201cA right to warn about advanced artificial intelligence\u201d states that AI companies have strong financial incentives to avoid effective oversight of potential AI risks. Besides being reckless by focusing on financial objectives instead of safety, the letter says that companies use punitive confidentiality agreements to actively discourage employees from raising concerns. 
The signatories are all former OpenAI and Google employees, with Neel Nanda, the only one still working at","og_url":"https:\/\/dailyai.com\/da\/2024\/06\/former-openai-employees-publish-right-to-warn-open-letter\/","og_site_name":"DailyAI","article_published_time":"2024-06-05T09:50:37+00:00","og_image":[{"width":1792,"height":1024,"url":"https:\/\/dailyai.com\/wp-content\/uploads\/2024\/06\/Warning-letter.webp","type":"image\/webp"}],"author":"Eugene van der Watt","twitter_card":"summary_large_image","twitter_creator":"@DailyAIOfficial","twitter_site":"@DailyAIOfficial","twitter_misc":{"Written by":"Eugene van der Watt","Estimated reading time":"3 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"NewsArticle","@id":"https:\/\/dailyai.com\/2024\/06\/former-openai-employees-publish-right-to-warn-open-letter\/#article","isPartOf":{"@id":"https:\/\/dailyai.com\/2024\/06\/former-openai-employees-publish-right-to-warn-open-letter\/"},"author":{"name":"Eugene van der Watt","@id":"https:\/\/dailyai.com\/#\/schema\/person\/7ce525c6d0c79838b7cc7cde96993cfa"},"headline":"Former OpenAI employees publish \u2018Right to Warn\u2019 open letter","datePublished":"2024-06-05T09:50:37+00:00","mainEntityOfPage":{"@id":"https:\/\/dailyai.com\/2024\/06\/former-openai-employees-publish-right-to-warn-open-letter\/"},"wordCount":487,"publisher":{"@id":"https:\/\/dailyai.com\/#organization"},"image":{"@id":"https:\/\/dailyai.com\/2024\/06\/former-openai-employees-publish-right-to-warn-open-letter\/#primaryimage"},"thumbnailUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2024\/06\/Warning-letter.webp","keywords":["AI risks","Google","OpenAI"],"articleSection":["Ethics &amp; Society"],"inLanguage":"da-DK"},{"@type":"WebPage","@id":"https:\/\/dailyai.com\/2024\/06\/former-openai-employees-publish-right-to-warn-open-letter\/","url":"https:\/\/dailyai.com\/2024\/06\/former-openai-employees-publish-right-to-warn-open-letter\/","name":"Former OpenAI employees publish \"Right to Warn\" open letter | DailyAI","isPartOf":{"@id":"https:\/\/dailyai.com\/#website"},"primaryImageOfPage":{"@id":"https:\/\/dailyai.com\/2024\/06\/former-openai-employees-publish-right-to-warn-open-letter\/#primaryimage"},"image":{"@id":"https:\/\/dailyai.com\/2024\/06\/former-openai-employees-publish-right-to-warn-open-letter\/#primaryimage"},"thumbnailUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2024\/06\/Warning-letter.webp","datePublished":"2024-06-05T09:50:37+00:00","breadcrumb":{"@id":"https:\/\/dailyai.com\/2024\/06\/former-openai-employees-publish-right-to-warn-open-letter\/#breadcrumb"},"inLanguage":"da-DK","potentialAction":[{"@type":"ReadAction","target":["https:\/\/dailyai.com\/2024\/06\/former-openai-employees-publish-right-to-warn-open-letter\/"]}]},{"@type":"ImageObject","inLanguage":"da-DK","@id":"https:\/\/dailyai.com\/2024\/06\/former-openai-employees-publish-right-to-warn-open-letter\/#primaryimage","url":"https:\/\/dailyai.com\/wp-content\/uploads\/2024\/06\/Warning-letter.webp","contentUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2024\/06\/Warning-letter.webp","width":1792,"height":1024},{"@type":"BreadcrumbList","@id":"https:\/\/dailyai.com\/2024\/06\/former-openai-employees-publish-right-to-warn-open-letter\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/dailyai.com\/"},{"@type":"ListItem","position":2,"name":"Former OpenAI employees publish \u2018Right to Warn\u2019 open letter"}]},{"@type":"WebSite","@id":"https:\/\/dailyai.com\/#website","url":"https:\/\/dailyai.com\/","name":"DailyAI","description":"Your Daily Dose of AI News","publisher":{"@id":"https:\/\/dailyai.com\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/dailyai.com\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"da-DK"},{"@type":"Organization","@id":"https:\/\/dailyai.com\/#organization","name":"DailyAI","url":"https:\/\/dailyai.com\/","logo":{"@type":"ImageObject","inLanguage":"da-DK","@id":"https:\/\/dailyai.com\/#\/schema\/logo\/image\/","url":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/06\/Daily-Ai_TL_colour.png","contentUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/06\/Daily-Ai_TL_colour.png","width":4501,"height":934,"caption":"DailyAI"},"image":{"@id":"https:\/\/dailyai.com\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/x.com\/DailyAIOfficial","https:\/\/www.linkedin.com\/company\/dailyaiofficial\/","https:\/\/www.youtube.com\/@DailyAIOfficial"]},{"@type":"Person","@id":"https:\/\/dailyai.com\/#\/schema\/person\/7ce525c6d0c79838b7cc7cde96993cfa","name":"Eugene van der Watt","image":{"@type":"ImageObject","inLanguage":"da-DK","@id":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/07\/Eugine_Profile_Picture-96x96.png","url":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/07\/Eugine_Profile_Picture-96x96.png","contentUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/07\/Eugine_Profile_Picture-96x96.png","caption":"Eugene van der Watt"},"description":"Eugene comes from an electronic engineering background and loves all things tech. When he takes a break from consuming AI news you'll find him at the snooker table.","sameAs":["www.linkedin.com\/in\/eugene-van-der-watt-16828119"],"url":"https:\/\/dailyai.com\/da\/author\/eugene\/"}]}},"_links":{"self":[{"href":"https:\/\/dailyai.com\/da\/wp-json\/wp\/v2\/posts\/12729","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/dailyai.com\/da\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/dailyai.com\/da\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/dailyai.com\/da\/wp-json\/wp\/v2\/users\/6"}],"replies":[{"embeddable":true,"href":"https:\/\/dailyai.com\/da\/wp-json\/wp\/v2\/comments?post=12729"}],"version-history":[{"count":3,"href":"https:\/\/dailyai.com\/da\/wp-json\/wp\/v2\/posts\/12729\/revisions"}],"predecessor-version":[{"id":12733,"href":"https:\/\/dailyai.com\/da\/wp-json\/wp\/v2\/posts\/12729\/revisions\/12733"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/dailyai.com\/da\/wp-json\/wp\/v2\/media\/12731"}],"wp:attachment":[{"href":"https:\/\/dailyai.com\/da\/wp-json\/wp\/v2\/media?parent=12729"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/dailyai.com\/da\/wp-json\/wp\/v2\/categories?post=12729"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/dailyai.com\/da\/wp-json\/wp\/v2\/tags?post=12729"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}