{"id":1122478,"date":"2024-02-26T00:16:58","date_gmt":"2024-02-26T05:16:58","guid":{"rendered":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/uncategorized\/ai-vendor-finds-opportunity-amid-ai-computing-problem-techtarget\/"},"modified":"2024-02-26T00:16:58","modified_gmt":"2024-02-26T05:16:58","slug":"ai-vendor-finds-opportunity-amid-ai-computing-problem-techtarget","status":"publish","type":"post","link":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/cloud-computing\/ai-vendor-finds-opportunity-amid-ai-computing-problem-techtarget\/","title":{"rendered":"AI vendor finds opportunity amid AI computing problem &#8211; TechTarget"},"content":{"rendered":"<p>    With the growth of generative AI, a big problem enterprises and vendors are concerned with is computing power.  <\/p>\n<p>    Generative AI systems such as ChatGPT suck up large amounts of compute to train and run, making them costly.  <\/p>\n<p>    One AI vendor trying to address the massive need for compute is Lambda.  <\/p>\n<p>      The GPU cloud vendor, which provides cloud services including GPU compute as well as hardware systems, revealed it had achieved a valuation of more than $1.5 billion after raising $320 million in a Series C funding round.    <\/p>\n<p>      The vendor was founded in 2012 and has focused on building AI infrastructure at scale.    <\/p>\n<p>      As a provider of cloud services based on H100 Tensor Core GPUs from its partner Nvidia, Lambda gives AI developers access to architectures for training, fine-tuning, and inferencing generative AI and large language models (LLMs).    <\/p>\n<p>      One of the early investors and a participant in the latest funding round is Gradient Ventures.    <\/p>\n<p>      Gradient Ventures first invested in the AI vendor in 2018 and then did so again in 2022.    
<\/p>\n<p>      The investment fund became interested in Lambda during a time when the vendor faced the challenge of trying to build AI models without having the workstation and infrastructure it needed. This led Lambda to start building AI hardware that researchers can use.    <\/p>\n<p>      \"That's why we were excited: we saw this sort of challenge to the development,\" said Zachary Bratun-Glennon, general partner at Gradient Ventures. \"Since then, we've been excited as the product has developed.\"    <\/p>\n<p>      Lambda grew from building workstations to hosting servers for customers with bigger compute needs and budgets, and then to offering a cloud service that users can access with a point and click from their own desktop, without needing to buy a specialized workstation.    <\/p>\n<p>      \"Our excitement is just seeing them meet the developer and the researcher where they are with what they need,\" Bratun-Glennon said.    <\/p>\n<p>      Lambda's current fundraising success comes as the vendor continues to take advantage of the demand for computing in the age of generative AI, Futurum Group research director Mark Beccue said.    <\/p>\n<p>      \"I really think the fundraise ... has got to do with that opportunistic idea that AI compute is in high demand, and they're going to jump on it,\" he said.    <\/p>\n<p>      As a vendor with experience building on-premises GPU hardware for data centers, Lambda appeals to investors because of the options it brings to enterprises, he added.    <\/p>\n<p>      Lambda also enables enterprises to get up and running quickly with their generative AI projects, Constellation Research founder R \"Ray\" Wang said.    <\/p>\n<p>      \"GenAI on demand is the best way to look at it,\" Wang said.      
\"Lambda Labs basically says, 'Hey, we've got the fastest, the best, and not necessarily the cheapest but a reasonably priced ability to actually get LLMs on demand.'\"    <\/p>\n<p>      \"What people are rushing to deliver is the ability to give you your compute power when you need it,\" he continued.    <\/p>\n<p>      However, as generative AI evolves, the compute problem could ease somewhat.    <\/p>\n<p>      Over the past year and a half, generative AI systems have evolved from large models with up to 40 billion parameters to smaller models with as few as 2 billion parameters, Beccue said.    <\/p>\n<p>      \"The smaller the language models are, the less compute you have to use,\" he said.    <\/p>\n<p>      Moreover, while Nvidia is known for providing powerful AI accelerators like GPUs, competitors including Intel and AMD have also released similar offerings in the last few months, Beccue added.    <\/p>\n<p>      For example, Intel's Gaudi2 is a deep-learning processor comparable to Nvidia's H100.    <\/p>\n<p>      In December, AMD introduced its MI300X accelerators. The chips are designed for generative AI workloads and rival Nvidia H100s in performance.    <\/p>\n<p>      \"The models are getting better, and the chips are getting better, and we're getting more of them,\" Beccue said. \"It's a short-term issue.\"    <\/p>\n<p>      For Lambda, the question will be how to extend beyond solving the current AI computing problem.    <\/p>\n<p>      \"They're not necessarily going to be competing head-to-head with the cloud compute people,\" Beccue said. He noted that the major cloud computing vendors -- the tech giants -- have vast financial resources. \"I'm sure what they're thinking about is, 'Okay, right now, there's kind of a capacity issue that we can fill. 
How do we extend over time?'\"    <\/p>\n<p>      As an investor in AI companies, Bratun-Glennon said he thinks generative AI will produce thousands of language models, requiring different amounts of compute.    <\/p>\n<p>      \"Even if there are models that have lower compute requirements, the more use cases people will find to apply them to, the lower the cost that creates, so the more ubiquitous that becomes,\" he said. \"Even as models get more efficient, and more companies can use them, that expands the amount of compute that is required.\"    <\/p>\n<p>      AI compute is also a big market, helping Lambda serve developers -- a different audience than the one other cloud providers target, he added. Hyperscale cloud providers focus on selling to large enterprises and getting large workloads.    <\/p>\n<p>      \"Lambda is the AI training and inference cloud,\" Bratun-Glennon said. \"The thing that has carried through the six years I've been working with them is the AI developer mindset.\"    <\/p>\n<p>      Lambda is not the only vendor working to meet the demand for AI compute.    <\/p>\n<p>      On February 20, AI inference vendor Recogni revealed it raised $102 million in Series C funding co-led by Celesta Capital and GreatPoint Ventures. Recogni develops AI inference systems to address the AI compute problem.    <\/p>\n<p>      The latest Lambda round was led by Thomas Tull's U.S. Innovative Technology fund, with participation from SK Telecom, Crescent Cove and Bloomberg Beta, in addition to Gradient Ventures.    <\/p>\n<p>      Esther Ajao is a TechTarget Editorial news writer covering artificial intelligence software and systems.    
<\/p>\n<p>See the original post here: <\/p>\n<p><a target=\"_blank\" rel=\"nofollow noopener\" href=\"https:\/\/www.techtarget.com\/searchenterpriseai\/news\/366571064\/AI-vendor-finds-opportunity-amid-AI-computing-problem\" title=\"AI vendor finds opportunity amid AI computing problem - TechTarget\">AI vendor finds opportunity amid AI computing problem - TechTarget<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p> With the growth of generative AI, a big problem enterprises and vendors are concerned with is computing power. Generative AI systems such as ChatGPT suck up large amounts of compute to train and run, making them costly <a href=\"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/cloud-computing\/ai-vendor-finds-opportunity-amid-ai-computing-problem-techtarget\/\">Continue reading <span class=\"meta-nav\">&rarr;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[257743],"tags":[],"class_list":["post-1122478","post","type-post","status-publish","format-standard","hentry","category-cloud-computing"],"_links":{"self":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/posts\/1122478"}],"collection":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/comments?post=1122478"}],"version-history":[{"count":0,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumani
sm\/wp-json\/wp\/v2\/posts\/1122478\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/media?parent=1122478"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/categories?post=1122478"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/tags?post=1122478"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}