{"id":1075341,"date":"2024-03-18T02:38:41","date_gmt":"2024-03-18T06:38:41","guid":{"rendered":"https:\/\/www.immortalitymedicine.tv\/elon-musks-xai-releases-grok-1-architecture-while-apple-advances-multimodal-ai-research-siliconangle-news\/"},"modified":"2024-08-18T12:49:09","modified_gmt":"2024-08-18T16:49:09","slug":"elon-musks-xai-releases-grok-1-architecture-while-apple-advances-multimodal-ai-research-siliconangle-news","status":"publish","type":"post","link":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/elon-musk\/elon-musks-xai-releases-grok-1-architecture-while-apple-advances-multimodal-ai-research-siliconangle-news.php","title":{"rendered":"Elon Musk&#8217;s xAI releases Grok-1 architecture, while Apple advances multimodal AI research &#8211; SiliconANGLE News"},"content":{"rendered":"<p><p>    The Elon Musk-run artificial intelligence startup xAI Corp.    today released the weights and architecture of its Grok-1 large    language model as open source code, shortly after Apple Inc.    published a paper describing its own work on multimode LLMs.  <\/p>\n<p>    Musk first said that xAI would release Grok as open source        on March 11, but the release today of the base model and    weights, fundamental components of how the model works makes    this the companys first open-source release.  <\/p>\n<p>    What has been released is part of the network architecture of    Groks structural design, including how layers and nodes are    arranged and interconnected to process data. Base model weights    are the parameters within a given models architecture that    have been adjusted during training, encoding the learned    information and determining how input data is transformed into    output.  <\/p>\n<p>    Grok-1 is a 314 billion parameter Mixture-of-Experts model    trained from scratch by xAI. 
A Mixture-of-Experts model is a machine learning approach that combines the outputs of multiple specialized sub-models, also known as experts, to make a final prediction, optimizing for diverse tasks or data subsets by leveraging the expertise of each individual model.<\/p>\n<p>The release is the raw base model checkpoint from the Grok-1 pre-training phase, which concluded in October 2023. According to the company, this means the model is not fine-tuned for any specific application, such as dialogue. No further information was provided in what was only a brief blog post.<\/p>\n<p>Musk revealed in July that he had founded xAI and that it would compete against AI services from companies such as Google LLC and OpenAI. The company's first model, Grok, was claimed by xAI to have been modeled after Douglas Adams' classic book \"The Hitchhiker's Guide to the Galaxy\" and is intended to \"answer almost anything and, far harder, even suggest what questions to ask!\"<\/p>\n<p>Meanwhile at Apple, the company Steve Jobs built quietly published a paper Thursday describing its work on MM1, a set of multimodal LLMs for captioning images, answering visual questions and natural language inference.<\/p>\n<p>Thurrott reported today that the paper describes MM1 as a family of multimodal models that support up to 30 billion parameters and achieve competitive performance after supervised fine-tuning on a range of established multimodal benchmarks. The researchers also claim that multimodal large language models have emerged as the next frontier in foundation models after traditional LLMs and that they achieve superior capabilities.<\/p>\n<p>A multimodal LLM is an AI system capable of understanding and generating responses across multiple types of data, such as text, images and audio, integrating diverse forms of information to perform complex tasks. 
The Apple researchers believe their model delivers a breakthrough that will help others scale these models to larger datasets with better performance and reliability.<\/p>\n<p>Apple's previous work on multimodal LLMs includes Ferret, a model that was quietly open-sourced in October before being noticed in December.<\/p>\n<p>Go here to read the rest: <\/p>\n<p><a target=\"_blank\" rel=\"nofollow noopener\" href=\"https:\/\/siliconangle.com\/2024\/03\/17\/elon-musks-xai-releases-grok-1-architecture-apple-advances-multimodal-ai-research\" title=\"Elon Musk's xAI releases Grok-1 architecture, while Apple advances multimodal AI research - SiliconANGLE News\">Elon Musk's xAI releases Grok-1 architecture, while Apple advances multimodal AI research - SiliconANGLE News<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>The Elon Musk-run artificial intelligence startup xAI Corp. today released the weights and architecture of its Grok-1 large language model as open-source code, shortly after Apple Inc. published a paper describing its own work on multimodal LLMs. 
<a href=\"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/elon-musk\/elon-musks-xai-releases-grok-1-architecture-while-apple-advances-multimodal-ai-research-siliconangle-news.php\">Continue reading <span class=\"meta-nav\">&rarr;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"limit_modified_date":"","last_modified_date":"","_lmt_disableupdate":"","_lmt_disable":"","footnotes":""},"categories":[612435],"tags":[],"class_list":["post-1075341","post","type-post","status-publish","format-standard","hentry","category-elon-musk"],"modified_by":null,"_links":{"self":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts\/1075341"}],"collection":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/comments?post=1075341"}],"version-history":[{"count":0,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts\/1075341\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/media?parent=1075341"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/categories?post=1075341"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/tags?post=1075341"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}