
Nvidia reaches $20 Billion licensing deal with AI chip maker Groq

WHY THIS MATTERS IN BRIEF

Increasingly, the battle for AI compute is shifting from GPU-based training to inference, and that shift demands a different chip architecture – and new partnerships.

 


Nvidia has agreed a licensing deal with Artificial Intelligence (AI) chip startup Groq, which I talked about a while ago, in a bid to improve its AI inference capabilities. The deal furthers Nvidia's investments in companies connected to the AI boom and gives it the right to add a new type of technology to its products.

 


 

The world’s largest publicly traded company has paid for the right to use Groq’s technology and will integrate its chip design into future products. Some of the startup’s executives are leaving to join Nvidia to help with that effort, the companies said. Groq will continue as an independent company with a new chief executive, it said Wednesday in a post on its website.

Nvidia’s technology already dominates data centers that are at the heart of the explosion in spending on new computing needed for AI software and services. The popularity of its existing offerings has made Nvidia by far the richest company in the chip industry and it has said it will use some of that cash to advance the uptake of AI across the economy.

Groq is among the startups and established companies, such as Alphabet's Google, that are developing their own AI chips to rival Nvidia's. The startup, which was founded in 2016, raised $750 million at a post-money valuation of $6.9 billion in September. At the time, Groq said it would use the funds to expand its data center capacity. Its data center business, which offers outsourced computing, will continue, the company said in the post.

 


 

Groq Chief Executive Officer Jonathan Ross is a former Google chip executive who helped create that company’s Tensor Processing Unit, or TPU, which powers AI workloads. As part of the deal, he and other top executives will join Nvidia “to help advance and scale the licensed technology,” Groq said in the statement.

No financial details were released.

Groq’s low-latency chips are extremely responsive to inputs and will add new capabilities to Nvidia’s products while opening up new areas of the market, Nvidia said. Under Chief Executive Officer Jensen Huang, the chipmaker has added myriad new offerings aimed at cementing its position and speeding up the rate at which companies find uses for AI software. The company now sells networking, software and services as well as complete computers.

 


 

The licensing deal is similar in some ways to the partnership Meta Platforms reached with data labelling startup Scale AI, under which Meta made a sizable investment in the smaller firm, licensed its technology and hired its CEO.

Nvidia has been making investments in companies across the AI infrastructure ecosystem and is trying to keep a large lead in the market for inference — running models once they have been developed. The company’s leadership has already pledged billions to a wide variety of projects that it believes will further the overall AI industry. Nvidia agreed to invest as much as $100 billion in OpenAI and has even bought a stake in erstwhile nemesis Intel Corp.

 


 

By incorporating a new type of design into what it sells, Nvidia is showing a willingness to be flexible and add novel capabilities. That approach is likely aimed at keeping its biggest customers and new adopters focused on its technology at a time when in-house chip efforts from Google, Microsoft and Amazon are gaining momentum, and the industry is rushing to install as much computing capacity as quickly as it can.
