{"id":10283,"date":"2024-02-26T10:02:28","date_gmt":"2024-02-26T10:02:28","guid":{"rendered":"https:\/\/dailyai.com\/?p=10283"},"modified":"2024-02-26T10:05:57","modified_gmt":"2024-02-26T10:05:57","slug":"google-announces-gemma-its-open-llms-that-can-run-locally","status":"publish","type":"post","link":"https:\/\/dailyai.com\/de\/2024\/02\/google-announces-gemma-its-open-llms-that-can-run-locally\/","title":{"rendered":"Google announces Gemma, its open LLMs that can run locally"},"content":{"rendered":"<p>Google has released two models from its family of lightweight, open models called Gemma.<\/p>\n<p>While Google\u2019s Gemini models are proprietary, or closed, models, the Gemma models have been released as \"open models\" and made freely available to developers.<\/p>\n<p>Google released Gemma models in two sizes, 2B and 7B parameters, with pre-trained and instruction-tuned variants for each. Google is releasing the model weights as well as a suite of tools for developers to adapt the models to their needs.<\/p>\n<p>Google says the Gemma models were built using the same technology that powers its flagship Gemini model. 
Several companies have released 7B models in an effort to deliver an LLM that retains useful capabilities but can potentially run locally rather than in the cloud.<\/p>\n<p>Llama-2-7B and <a href=\"https:\/\/dailyai.com\/de\/2023\/12\/the-rise-of-the-french-ai-startup-mistral\/\">Mistral-7B<\/a> are notable contenders in this space, but Google says that \"Gemma surpasses significantly larger models on key benchmarks\", offering this benchmark comparison as evidence.<\/p>\n<figure id=\"attachment_10325\" aria-describedby=\"caption-attachment-10325\" style=\"width: 1000px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" class=\"size-full wp-image-10325\" src=\"https:\/\/dailyai.com\/wp-content\/uploads\/2024\/02\/Gemma-vs-Llama-2-benchmark.webp\" alt=\"\" width=\"1000\" height=\"615\" srcset=\"https:\/\/dailyai.com\/wp-content\/uploads\/2024\/02\/Gemma-vs-Llama-2-benchmark.webp 1000w, https:\/\/dailyai.com\/wp-content\/uploads\/2024\/02\/Gemma-vs-Llama-2-benchmark-300x185.webp 300w, https:\/\/dailyai.com\/wp-content\/uploads\/2024\/02\/Gemma-vs-Llama-2-benchmark-768x472.webp 768w, https:\/\/dailyai.com\/wp-content\/uploads\/2024\/02\/Gemma-vs-Llama-2-benchmark-370x228.webp 370w, https:\/\/dailyai.com\/wp-content\/uploads\/2024\/02\/Gemma-vs-Llama-2-benchmark-800x492.webp 800w, https:\/\/dailyai.com\/wp-content\/uploads\/2024\/02\/Gemma-vs-Llama-2-benchmark-20x12.webp 20w, https:\/\/dailyai.com\/wp-content\/uploads\/2024\/02\/Gemma-vs-Llama-2-benchmark-740x455.webp 740w, https:\/\/dailyai.com\/wp-content\/uploads\/2024\/02\/Gemma-vs-Llama-2-benchmark-78x48.webp 78w\" sizes=\"auto, (max-width: 1000px) 100vw, 1000px\" \/><figcaption id=\"caption-attachment-10325\" class=\"wp-caption-text\">Benchmark results for Gemma-7B versus Llama-2-7B and Llama-2-13B. 
Source: Google<\/figcaption><\/figure>\n<p>The benchmark results show Gemma beating even the larger 13B version of Llama 2 in all four areas.<\/p>\n<p>The really exciting thing about Gemma is the prospect of running it locally. Google has partnered with NVIDIA to optimize Gemma for NVIDIA GPUs. If you have a PC with one of NVIDIA\u2019s RTX GPUs, you can run Gemma on your device.<\/p>\n<p>NVIDIA says it has an installed base of over 100 million NVIDIA RTX GPUs. That makes Gemma an attractive option for developers deciding which lightweight model to use as the basis for their products.<\/p>\n<p>NVIDIA will also add support for Gemma to its <a href=\"https:\/\/dailyai.com\/de\/2024\/02\/nvidias-custom-chatbot-runs-locally-on-rtx-ai-pcs\/\">Chat with RTX<\/a> platform, which makes it easy to run LLMs on RTX PCs.<\/p>\n<p>While not technically open source, it is only the usage restrictions in the license agreement that keep Gemma models from carrying that label. <a href=\"https:\/\/dailyai.com\/de\/2023\/07\/meta-plays-down-the-potential-risks-of-its-new-open-source-model-llama-2\/\">Critics of open models<\/a> point to the risks of having their alignment fine-tuned away, but Google says it performed comprehensive red-teaming to ensure Gemma was safe.<\/p>\n<p>Google says it used \"extensive fine-tuning and reinforcement learning from human feedback (RLHF) to align our instruction-tuned models with responsible behaviors\". 
It also released a Responsible Generative AI Toolkit to help developers refine Gemma further after fine-tuning.<\/p>\n<p>Customizable, lightweight models like Gemma potentially offer developers more utility than larger models like GPT-4 or Gemini Pro. The ability to run LLMs locally, without incurring the cost of cloud computing or API calls, is becoming more accessible by the day.<\/p>\n<p>With Gemma openly available to developers, it will be interesting to see the range of AI-powered applications that could soon be running on our PCs.<\/p>","protected":false},"excerpt":{"rendered":"<p>Google has released two models from its family of lightweight, open models called Gemma. While Google\u2019s Gemini models are proprietary, or closed models, the Gemma models have been released as \u201copen models\u201d and made freely available to developers. Google released Gemma models in two sizes, 2B and 7B parameters, with pre-trained and instruction-tuned variants for each. Google is releasing the model weights as well as a suite of tools for developers to adapt the models to their needs. Google says the Gemma models were built using the same tech that powers its flagship Gemini model. 
Several companies have released 7B<\/p>","protected":false},"author":6,"featured_media":10326,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[83],"tags":[102,118],"class_list":["post-10283","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-product","tag-google","tag-llms"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.5 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Google announces Gemma, its open LLMs that can run locally | DailyAI<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/dailyai.com\/de\/2024\/02\/google-announces-gemma-its-open-llms-that-can-run-locally\/\" \/>\n<meta property=\"og:locale\" content=\"de_DE\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Google announces Gemma, its open LLMs that can run locally | DailyAI\" \/>\n<meta property=\"og:description\" content=\"Google has released two models from its family of lightweight, open models called Gemma. While Google\u2019s Gemini models are proprietary, or closed models, the Gemma models have been released as \u201copen models\u201d and made freely available to developers. Google released Gemma models in two sizes, 2B and 7B parameters, with pre-trained and instruction-tuned variants for each. Google is releasing the model weights as well as a suite of tools for developers to adapt the models to their needs. Google says the Gemma models were built using the same tech that powers its flagship Gemini model. 
Several companies have released 7B\" \/>\n<meta property=\"og:url\" content=\"https:\/\/dailyai.com\/de\/2024\/02\/google-announces-gemma-its-open-llms-that-can-run-locally\/\" \/>\n<meta property=\"og:site_name\" content=\"DailyAI\" \/>\n<meta property=\"article:published_time\" content=\"2024-02-26T10:02:28+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2024-02-26T10:05:57+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/dailyai.com\/wp-content\/uploads\/2024\/02\/Google-Gemma.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"1000\" \/>\n\t<meta property=\"og:image:height\" content=\"660\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"Eugene van der Watt\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@DailyAIOfficial\" \/>\n<meta name=\"twitter:site\" content=\"@DailyAIOfficial\" \/>\n<meta name=\"twitter:label1\" content=\"Verfasst von\" \/>\n\t<meta name=\"twitter:data1\" content=\"Eugene van der Watt\" \/>\n\t<meta name=\"twitter:label2\" content=\"Gesch\u00e4tzte Lesezeit\" \/>\n\t<meta name=\"twitter:data2\" content=\"3\u00a0Minuten\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"NewsArticle\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/02\\\/google-announces-gemma-its-open-llms-that-can-run-locally\\\/#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/02\\\/google-announces-gemma-its-open-llms-that-can-run-locally\\\/\"},\"author\":{\"name\":\"Eugene van der Watt\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/person\\\/7ce525c6d0c79838b7cc7cde96993cfa\"},\"headline\":\"\u200b\u200bGoogle announces Gemma, its open LLMs that can run 
locally\",\"datePublished\":\"2024-02-26T10:02:28+00:00\",\"dateModified\":\"2024-02-26T10:05:57+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/02\\\/google-announces-gemma-its-open-llms-that-can-run-locally\\\/\"},\"wordCount\":453,\"publisher\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#organization\"},\"image\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/02\\\/google-announces-gemma-its-open-llms-that-can-run-locally\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2024\\\/02\\\/Google-Gemma.jpg\",\"keywords\":[\"Google\",\"LLMS\"],\"articleSection\":[\"Product\"],\"inLanguage\":\"de\"},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/02\\\/google-announces-gemma-its-open-llms-that-can-run-locally\\\/\",\"url\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/02\\\/google-announces-gemma-its-open-llms-that-can-run-locally\\\/\",\"name\":\"\u200b\u200bGoogle announces Gemma, its open LLMs that can run locally | 
DailyAI\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/02\\\/google-announces-gemma-its-open-llms-that-can-run-locally\\\/#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/02\\\/google-announces-gemma-its-open-llms-that-can-run-locally\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2024\\\/02\\\/Google-Gemma.jpg\",\"datePublished\":\"2024-02-26T10:02:28+00:00\",\"dateModified\":\"2024-02-26T10:05:57+00:00\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/02\\\/google-announces-gemma-its-open-llms-that-can-run-locally\\\/#breadcrumb\"},\"inLanguage\":\"de\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/dailyai.com\\\/2024\\\/02\\\/google-announces-gemma-its-open-llms-that-can-run-locally\\\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"de\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/02\\\/google-announces-gemma-its-open-llms-that-can-run-locally\\\/#primaryimage\",\"url\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2024\\\/02\\\/Google-Gemma.jpg\",\"contentUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2024\\\/02\\\/Google-Gemma.jpg\",\"width\":1000,\"height\":660},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/2024\\\/02\\\/google-announces-gemma-its-open-llms-that-can-run-locally\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\\\/\\\/dailyai.com\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"\u200b\u200bGoogle announces Gemma, its open LLMs that can run locally\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#website\",\"url\":\"https:\\\/\\\/dailyai.com\\\/\",\"name\":\"DailyAI\",\"description\":\"Your Daily Dose of AI 
News\",\"publisher\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/dailyai.com\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"de\"},{\"@type\":\"Organization\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#organization\",\"name\":\"DailyAI\",\"url\":\"https:\\\/\\\/dailyai.com\\\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"de\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/logo\\\/image\\\/\",\"url\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/06\\\/Daily-Ai_TL_colour.png\",\"contentUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/06\\\/Daily-Ai_TL_colour.png\",\"width\":4501,\"height\":934,\"caption\":\"DailyAI\"},\"image\":{\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/logo\\\/image\\\/\"},\"sameAs\":[\"https:\\\/\\\/x.com\\\/DailyAIOfficial\",\"https:\\\/\\\/www.linkedin.com\\\/company\\\/dailyaiofficial\\\/\",\"https:\\\/\\\/www.youtube.com\\\/@DailyAIOfficial\"]},{\"@type\":\"Person\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/#\\\/schema\\\/person\\\/7ce525c6d0c79838b7cc7cde96993cfa\",\"name\":\"Eugene van der Watt\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"de\",\"@id\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/07\\\/Eugine_Profile_Picture-96x96.png\",\"url\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/07\\\/Eugine_Profile_Picture-96x96.png\",\"contentUrl\":\"https:\\\/\\\/dailyai.com\\\/wp-content\\\/uploads\\\/2023\\\/07\\\/Eugine_Profile_Picture-96x96.png\",\"caption\":\"Eugene van der Watt\"},\"description\":\"Eugene comes from an electronic engineering background and loves all things tech. 
When he takes a break from consuming AI news you'll find him at the snooker table.\",\"sameAs\":[\"www.linkedin.com\\\/in\\\/eugene-van-der-watt-16828119\"],\"url\":\"https:\\\/\\\/dailyai.com\\\/de\\\/author\\\/eugene\\\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","_links":{"self":[{"href":"https:\/\/dailyai.com\/de\/wp-json\/wp\/v2\/posts\/10283","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/dailyai.com\/de\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/dailyai.com\/de\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/dailyai.com\/de\/wp-json\/wp\/v2\/users\/6"}],"replies":[{"embeddable":true,"href":"https:\/\/dailyai.com\/de\/wp-json\/wp\/v2\/comments?post=10283"}],"version-history":[{"count":4,"href":"https:\/\/dailyai.com\/de\/wp-json\/wp\/v2\/posts\/10283\/revisions"}],"predecessor-version":[{"id":10329,"href":"https:\/\/dailyai.com\/de\/wp-json\/wp\/v2\/posts\/10283\/revisions\/10329"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/dailyai.com\/de\/wp-json\/wp\/v2\/media\/10326"}],"wp:attachment":[{"href":"https:\/\/dailyai.com\/de\/wp-json\/wp\/v2\/media?parent=10283"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/dailyai.com\/de\/wp-json\/wp\/v2\/categories?post=10283"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/dailyai.com\/de\/wp-json\/wp\/v2\/tags?post=10283"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}