{"id":12729,"date":"2024-06-05T09:50:37","date_gmt":"2024-06-05T09:50:37","guid":{"rendered":"https:\/\/dailyai.com\/?p=12729"},"modified":"2024-06-05T09:50:37","modified_gmt":"2024-06-05T09:50:37","slug":"former-openai-employees-publish-right-to-warn-open-letter","status":"publish","type":"post","link":"https:\/\/dailyai.com\/sv\/2024\/06\/former-openai-employees-publish-right-to-warn-open-letter\/","title":{"rendered":"Former OpenAI employees publish \u2018Right to Warn\u2019 open letter"},"content":{"rendered":"<p><strong>A group of former and current OpenAI and Google employees are calling AI companies out on what they say is a dangerous culture of secrecy surrounding AI risks.<\/strong><\/p>\n<p>The letter, titled \u201cA right to warn about advanced artificial intelligence\u201d, states that AI companies have strong financial incentives to avoid effective oversight of potential AI risks.<\/p>\n<p>Besides being reckless by focusing on financial objectives instead of safety, the letter says that companies use punitive confidentiality agreements to actively discourage employees from raising concerns.<\/p>\n<p>The signatories are all former OpenAI and Google employees, with Neel Nanda the only one still working at Google.
The letter was also endorsed by leading AI thinkers Yoshua Bengio, Geoffrey Hinton, and Stuart Russell.<\/p>\n<p>In a sign of their concern over calling out their former employers, six of the signatories chose not to put their names to the letter.<\/p>\n<p>Former OpenAI researchers Daniel Kokotajlo and William Saunders, who also signed the letter, left the company earlier this year.<\/p>\n<p>Kokotajlo was part of the governance team and Saunders worked on OpenAI's Superalignment team, which was disbanded last month when <a href=\"https:\/\/dailyai.com\/sv\/2024\/05\/openais-superalignment-meltdown-can-the-company-salvage-any-trust\/\">Ilya Sutskever and Jan Leike also left over safety concerns<\/a>.<\/p>\n<p>Kokotajlo explained his departure on a forum, saying he doesn't believe OpenAI will \u201cbehave responsibly when AGI arrives\u201d.<\/p>\n<h2>A call to action<\/h2>\n<p>The letter calls for greater commitment from AI companies in the absence of regulation around AI risks that the public is unaware of.<\/p>\n<p>The letter states: \u201cOrdinary whistleblower protections are insufficient because they focus on illegal activity, whereas many of the risks we are concerned about are not yet regulated.\u201d<\/p>\n<p>The letter calls on AI companies to commit to four principles. In short, they want companies to:<\/p>\n<ul>\n<li>Not enter into or enforce agreements that prohibit criticism of the company over safety concerns, or withhold economic benefits the employee is entitled to.
(ahem, OpenAI)<\/li>\n<li>Facilitate an anonymous process for employees to raise risk-related concerns with the company's board or other oversight organizations.<\/li>\n<li>Support a culture of open criticism that allows employees to raise risk-related concerns publicly without revealing intellectual property.<\/li>\n<li>Not retaliate against current and former employees who publicly share risk-related confidential information after other processes have failed.<\/li>\n<\/ul>\n<p>Several of the names on the list of signatories regard themselves as effective altruists. From their posts and comments it's clear that people like Daniel Kokotajlo (Less Wrong) and William Saunders (AI Alignment Forum) believe things could end very badly if AI risks aren't managed.<\/p>\n<p>But these aren't doomsayers on a forum shouting from the sidelines. These are leading intellects that companies like OpenAI and Google deemed fit to hire to build the technology they now fear.<\/p>\n<p>And now they're saying: \u201cWe've seen things that scare us. We want to warn people, but we're not allowed to.\u201d<\/p>\n<p>You can read the <a href=\"https:\/\/righttowarn.ai\/\" target=\"_blank\" rel=\"noopener\">letter here<\/a>.<\/p>","protected":false},"excerpt":{"rendered":"<p>A group of former and current OpenAI and Google employees are calling AI companies out on what they say is a dangerous culture of secrecy surrounding AI risks. The letter titled \u201cA right to warn about advanced artificial intelligence\u201d states that AI companies have strong financial incentives to avoid effective oversight of potential AI risks.
Besides being reckless by focusing on financial objectives instead of safety, the letter says that companies use punitive confidentiality agreements to actively discourage employees from raising concerns. The signatories are all former OpenAI and Google employees, with Neel Nanda, the only one still working at<\/p>","protected":false},"author":6,"featured_media":12731,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[88],"tags":[163,102,93],"class_list":["post-12729","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-ethics","tag-ai-risks","tag-google","tag-openai"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.4 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Former OpenAI employees publish \u2018Right to Warn\u2019 open letter | DailyAI<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/dailyai.com\/sv\/2024\/06\/former-openai-employees-publish-right-to-warn-open-letter\/\" \/>\n<meta property=\"og:locale\" content=\"sv_SE\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Former OpenAI employees publish \u2018Right to Warn\u2019 open letter | DailyAI\" \/>\n<meta property=\"og:description\" content=\"A group of former and current OpenAI and Google employees are calling AI companies out on what they say is a dangerous culture of secrecy surrounding AI risks. The letter titled \u201cA right to warn about advanced artificial intelligence\u201d states that AI companies have strong financial incentives to avoid effective oversight of potential AI risks.
Besides being reckless by focusing on financial objectives instead of safety, the letter says that companies use punitive confidentiality agreements to actively discourage employees from raising concerns. The signatories are all former OpenAI and Google employees, with Neel Nanda, the only one still working at\" \/>\n<meta property=\"og:url\" content=\"https:\/\/dailyai.com\/sv\/2024\/06\/former-openai-employees-publish-right-to-warn-open-letter\/\" \/>\n<meta property=\"og:site_name\" content=\"DailyAI\" \/>\n<meta property=\"article:published_time\" content=\"2024-06-05T09:50:37+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/dailyai.com\/wp-content\/uploads\/2024\/06\/Warning-letter.webp\" \/>\n\t<meta property=\"og:image:width\" content=\"1792\" \/>\n\t<meta property=\"og:image:height\" content=\"1024\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/webp\" \/>\n<meta name=\"author\" content=\"Eugene van der Watt\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@DailyAIOfficial\" \/>\n<meta name=\"twitter:site\" content=\"@DailyAIOfficial\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Eugene van der Watt\" \/>\n\t<meta name=\"twitter:label2\" content=\"Estimated reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"3 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"NewsArticle\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/06\\\/former-openai-employees-publish-right-to-warn-open-letter\\\/#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/06\\\/former-openai-employees-publish-right-to-warn-open-letter\\\/\"},\"author\":{\"name\":\"Eugene van der Watt\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/person\\\/7ce525c6d0c79838b7cc7cde96993cfa\"},\"headline\":\"Former OpenAI
employees publish \u2018Right to Warn\u2019 open letter\",\"datePublished\":\"2024-06-05T09:50:37+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/06\\\/former-openai-employees-publish-right-to-warn-open-letter\\\/\"},\"wordCount\":487,\"publisher\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#organization\"},\"image\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/06\\\/former-openai-employees-publish-right-to-warn-open-letter\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2024\\\/06\\\/Warning-letter.webp\",\"keywords\":[\"AI risks\",\"Google\",\"OpenAI\"],\"articleSection\":[\"Ethics &amp; Society\"],\"inLanguage\":\"sv-SE\"},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/06\\\/former-openai-employees-publish-right-to-warn-open-letter\\\/\",\"url\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/06\\\/former-openai-employees-publish-right-to-warn-open-letter\\\/\",\"name\":\"Former OpenAI employees publish \u2018Right to Warn\u2019 open letter | 
DailyAI\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/06\\\/former-openai-employees-publish-right-to-warn-open-letter\\\/#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/06\\\/former-openai-employees-publish-right-to-warn-open-letter\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2024\\\/06\\\/Warning-letter.webp\",\"datePublished\":\"2024-06-05T09:50:37+00:00\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/06\\\/former-openai-employees-publish-right-to-warn-open-letter\\\/#breadcrumb\"},\"inLanguage\":\"sv-SE\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/dailyai.com\\\/2024\\\/06\\\/former-openai-employees-publish-right-to-warn-open-letter\\\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"sv-SE\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/06\\\/former-openai-employees-publish-right-to-warn-open-letter\\\/#primaryimage\",\"url\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2024\\\/06\\\/Warning-letter.webp\",\"contentUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2024\\\/06\\\/Warning-letter.webp\",\"width\":1792,\"height\":1024},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/06\\\/former-openai-employees-publish-right-to-warn-open-letter\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\\\/\\\/dailyai.com\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Former OpenAI employees publish \u2018Right to Warn\u2019 open letter\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#website\",\"url\":\"https:\\\/\\\/dailyai.com\\\/\",\"name\":\"DailyAI\",\"description\":\"Your Daily Dose of AI 
News\",\"publisher\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/dailyai.com\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"sv-SE\"},{\"@type\":\"Organization\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#organization\",\"name\":\"DailyAI\",\"url\":\"https:\\\/\\\/dailyai.com\\\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"sv-SE\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/logo\\\/image\\\/\",\"url\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/06\\\/Daily-Ai_TL_colour.png\",\"contentUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/06\\\/Daily-Ai_TL_colour.png\",\"width\":4501,\"height\":934,\"caption\":\"DailyAI\"},\"image\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/logo\\\/image\\\/\"},\"sameAs\":[\"https:\\\/\\\/x.com\\\/DailyAIOfficial\",\"https:\\\/\\\/www.linkedin.com\\\/company\\\/dailyaiofficial\\\/\",\"https:\\\/\\\/www.youtube.com\\\/@DailyAIOfficial\"]},{\"@type\":\"Person\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/person\\\/7ce525c6d0c79838b7cc7cde96993cfa\",\"name\":\"Eugene van der Watt\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"sv-SE\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/07\\\/Eugine_Profile_Picture-96x96.png\",\"url\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/07\\\/Eugine_Profile_Picture-96x96.png\",\"contentUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/07\\\/Eugine_Profile_Picture-96x96.png\",\"caption\":\"Eugene van der Watt\"},\"description\":\"Eugene comes from an electronic engineering background and loves all things tech. 
When he takes a break from consuming AI news you'll find him at the snooker table.\",\"sameAs\":[\"www.linkedin.com\\\/in\\\/eugene-van-der-watt-16828119\"],\"url\":\"https:\\\/\\\/dailyai.com\\\/sv\\\/author\\\/eugene\\\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Former OpenAI employees publish \u2018Right to Warn\u2019 open letter | DailyAI","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/dailyai.com\/sv\/2024\/06\/former-openai-employees-publish-right-to-warn-open-letter\/","og_locale":"sv_SE","og_type":"article","og_title":"Former OpenAI employees publish \u2018Right to Warn\u2019 open letter | DailyAI","og_description":"A group of former and current OpenAI and Google employees are calling AI companies out on what they say is a dangerous culture of secrecy surrounding AI risks. The letter titled \u201cA right to warn about advanced artificial intelligence\u201d states that AI companies have strong financial incentives to avoid effective oversight of potential AI risks. Besides being reckless by focusing on financial objectives instead of safety, the letter says that companies use punitive confidentiality agreements to actively discourage employees from raising concerns.
The signatories are all former OpenAI and Google employees, with Neel Nanda, the only one still working at","og_url":"https:\/\/dailyai.com\/sv\/2024\/06\/former-openai-employees-publish-right-to-warn-open-letter\/","og_site_name":"DailyAI","article_published_time":"2024-06-05T09:50:37+00:00","og_image":[{"width":1792,"height":1024,"url":"https:\/\/dailyai.com\/wp-content\/uploads\/2024\/06\/Warning-letter.webp","type":"image\/webp"}],"author":"Eugene van der Watt","twitter_card":"summary_large_image","twitter_creator":"@DailyAIOfficial","twitter_site":"@DailyAIOfficial","twitter_misc":{"Written by":"Eugene van der Watt","Estimated reading time":"3 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"NewsArticle","@id":"https:\/\/dailyai.com\/2024\/06\/former-openai-employees-publish-right-to-warn-open-letter\/#article","isPartOf":{"@id":"https:\/\/dailyai.com\/2024\/06\/former-openai-employees-publish-right-to-warn-open-letter\/"},"author":{"name":"Eugene van der Watt","@id":"https:\/\/dailyai.com\/#\/schema\/person\/7ce525c6d0c79838b7cc7cde96993cfa"},"headline":"Former OpenAI employees publish \u2018Right to Warn\u2019 open letter","datePublished":"2024-06-05T09:50:37+00:00","mainEntityOfPage":{"@id":"https:\/\/dailyai.com\/2024\/06\/former-openai-employees-publish-right-to-warn-open-letter\/"},"wordCount":487,"publisher":{"@id":"https:\/\/dailyai.com\/#organization"},"image":{"@id":"https:\/\/dailyai.com\/2024\/06\/former-openai-employees-publish-right-to-warn-open-letter\/#primaryimage"},"thumbnailUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2024\/06\/Warning-letter.webp","keywords":["AI risks","Google","OpenAI"],"articleSection":["Ethics &amp; Society"],"inLanguage":"sv-SE"},{"@type":"WebPage","@id":"https:\/\/dailyai.com\/2024\/06\/former-openai-employees-publish-right-to-warn-open-letter\/","url":"https:\/\/dailyai.com\/2024\/06\/former-openai-employees-publish-right-to-warn-open-letter\/","name":"Former OpenAI employees publish \u2018Right to Warn\u2019 open letter | DailyAI","isPartOf":{"@id":"https:\/\/dailyai.com\/#website"},"primaryImageOfPage":{"@id":"https:\/\/dailyai.com\/2024\/06\/former-openai-employees-publish-right-to-warn-open-letter\/#primaryimage"},"image":{"@id":"https:\/\/dailyai.com\/2024\/06\/former-openai-employees-publish-right-to-warn-open-letter\/#primaryimage"},"thumbnailUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2024\/06\/Warning-letter.webp","datePublished":"2024-06-05T09:50:37+00:00","breadcrumb":{"@id":"https:\/\/dailyai.com\/2024\/06\/former-openai-employees-publish-right-to-warn-open-letter\/#breadcrumb"},"inLanguage":"sv-SE","potentialAction":[{"@type":"ReadAction","target":["https:\/\/dailyai.com\/2024\/06\/former-openai-employees-publish-right-to-warn-open-letter\/"]}]},{"@type":"ImageObject","inLanguage":"sv-SE","@id":"https:\/\/dailyai.com\/2024\/06\/former-openai-employees-publish-right-to-warn-open-letter\/#primaryimage","url":"https:\/\/dailyai.com\/wp-content\/uploads\/2024\/06\/Warning-letter.webp","contentUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2024\/06\/Warning-letter.webp","width":1792,"height":1024},{"@type":"BreadcrumbList","@id":"https:\/\/dailyai.com\/2024\/06\/former-openai-employees-publish-right-to-warn-open-letter\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/dailyai.com\/"},{"@type":"ListItem","position":2,"name":"Former OpenAI employees publish \u2018Right to Warn\u2019 open letter"}]},{"@type":"WebSite","@id":"https:\/\/dailyai.com\/#website","url":"https:\/\/dailyai.com\/","name":"DailyAI","description":"Your Daily Dose of
AI News","publisher":{"@id":"https:\/\/dailyai.com\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/dailyai.com\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"sv-SE"},{"@type":"Organization","@id":"https:\/\/dailyai.com\/#organization","name":"DailyAI","url":"https:\/\/dailyai.com\/","logo":{"@type":"ImageObject","inLanguage":"sv-SE","@id":"https:\/\/dailyai.com\/#\/schema\/logo\/image\/","url":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/06\/Daily-Ai_TL_colour.png","contentUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/06\/Daily-Ai_TL_colour.png","width":4501,"height":934,"caption":"DailyAI"},"image":{"@id":"https:\/\/dailyai.com\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/x.com\/DailyAIOfficial","https:\/\/www.linkedin.com\/company\/dailyaiofficial\/","https:\/\/www.youtube.com\/@DailyAIOfficial"]},{"@type":"Person","@id":"https:\/\/dailyai.com\/#\/schema\/person\/7ce525c6d0c79838b7cc7cde96993cfa","name":"Eugene van der Watt","image":{"@type":"ImageObject","inLanguage":"sv-SE","@id":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/07\/Eugine_Profile_Picture-96x96.png","url":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/07\/Eugine_Profile_Picture-96x96.png","contentUrl":"https:\/\/dailyai.com\/wp-content\/uploads\/2023\/07\/Eugine_Profile_Picture-96x96.png","caption":"Eugene van der Watt"},"description":"Eugene comes from an electronic engineering background and loves all things tech.
When he takes a break from consuming AI news you'll find him at the snooker table.","sameAs":["www.linkedin.com\/in\/eugene-van-der-watt-16828119"],"url":"https:\/\/dailyai.com\/sv\/author\/eugene\/"}]}},"_links":{"self":[{"href":"https:\/\/dailyai.com\/sv\/wp-json\/wp\/v2\/posts\/12729","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/dailyai.com\/sv\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/dailyai.com\/sv\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/dailyai.com\/sv\/wp-json\/wp\/v2\/users\/6"}],"replies":[{"embeddable":true,"href":"https:\/\/dailyai.com\/sv\/wp-json\/wp\/v2\/comments?post=12729"}],"version-history":[{"count":3,"href":"https:\/\/dailyai.com\/sv\/wp-json\/wp\/v2\/posts\/12729\/revisions"}],"predecessor-version":[{"id":12733,"href":"https:\/\/dailyai.com\/sv\/wp-json\/wp\/v2\/posts\/12729\/revisions\/12733"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/dailyai.com\/sv\/wp-json\/wp\/v2\/media\/12731"}],"wp:attachment":[{"href":"https:\/\/dailyai.com\/sv\/wp-json\/wp\/v2\/media?parent=12729"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/dailyai.com\/sv\/wp-json\/wp\/v2\/categories?post=12729"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/dailyai.com\/sv\/wp-json\/wp\/v2\/tags?post=12729"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}