Neural Notes: Microsoft absorbs Inflection AI, Nvidia ups the AI chip game

Welcome back to Neural Notes: a weekly column where I condense some of the most interesting and important AI news of the week. In the latest edition, Microsoft strikes a deal to absorb much of AI startup Inflection, and Nvidia announces even more AI chips.

Microsoft strikes a US$650 million deal with Inflection AI

Microsoft has entered into a hefty US$650 million agreement with Inflection AI, less than a year after being a part of the startup’s US$1.3 billion raise.

The deal isn’t being positioned as an acquisition, but it does involve licensing Inflection’s AI technologies and bringing the company’s co-founders — Mustafa Suleyman and Karén Simonyan — along with a significant portion of their team, in-house.

They are joining to create what Microsoft is calling a new consumer-focused organisation — Microsoft AI — with Suleyman as EVP and CEO.

“As part of this transition, Mikhail Parakhin and his entire team, including Copilot, Bing, and Edge; and Misha Bilenko and the GenAI team will move to report to Mustafa,” Microsoft CEO Satya Nadella said in a blog post.

“These teams are at the vanguard of innovation at Microsoft, bringing a new entrant energy and ethos, to a changing consumer product landscape driven by the AI platform shift. These organisational changes will help us double down on this innovation.”

By licensing Inflection’s AI models and integrating key personnel into its operations, Microsoft aims to bolster its AI capabilities, especially around consumer products like Copilot, Bing, and Edge. The financial breakdown includes US$620 million for technology licensing and approximately US$30 million in relation to the employee transition.

While the move has been lauded for its potential to enhance Microsoft’s AI offerings, it also raises questions about the future direction of Inflection AI, now significantly scaled down in terms of its workforce. The deal reflects the continuing trend of major tech companies fortifying their AI arsenals through strategic partnerships, investments, and talent acquisition.

Nvidia is still going hard

Speaking of Inflection AI investors, Nvidia has been front and centre of the AI stage again this week.

Nvidia introduced its new Blackwell series of AI chips at its GTC developer conference this week.

“I hope you realise this is not a concert,” Nvidia president Jensen Huang said on the stage.

“You have arrived at a developers’ conference. There will be a lot of science describing algorithms, computer architecture, mathematics. I sense a very heavy weight in the room — all of a sudden, you’re in the wrong place.”

Nvidia has been at the forefront of AI hardware over the last 18 months. While it has been in the chip game for decades, it largely sat outside the mainstream limelight. It was known mostly to tech and gamer nerds such as myself until its GPUs began being bought up en masse during the crypto mining boom, and now the AI renaissance.

Now it seems like Nvidia is everywhere and has the stock price to prove it. In Nvidia’s latest earnings report in February, the company saw its revenue hit US$22.1 billion — a 265% year-on-year increase — and its share price jump by 16% to US$785.38. The stock has climbed further in the wake of this week’s conference. At the time of writing, it was sitting at around US$914 a share, giving Nvidia a market value of more than US$2 trillion.

So it’s unsurprising that GTC felt more like a concert. Or perhaps a passionate rally.

“There’s a new Industrial Revolution happening in these [server] rooms: I call them AI factories,” Huang said.

The Blackwell series represents Nvidia’s most advanced AI chips to date, boasting substantial improvements in speed and efficiency over their predecessors. According to Huang, Blackwell offers the power of two chips in one and is “pushing the limits of physics of how big a chip could be”.

“I’m holding around $10 billion worth of equipment here. The next one will cost $5 billion. Luckily for you all, it gets cheaper from there,” Huang said.

Blackwell is between two and 30 times faster than Nvidia’s previous generation. According to Huang, it took 8,000 GPUs, 15 megawatts of power and 90 days to train the GPT-MoE-1.8T model. With the new system, the same job could be done with just 2,000 GPUs and 25% of the power.
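
To put those figures in perspective, here is a rough back-of-the-envelope sketch of what they imply. It assumes, as Huang’s comparison seems to, the same 90-day training window for both setups and that “25% of the power” means a quarter of the 15 megawatts; the numbers are illustrative, not official Nvidia figures.

```python
# Back-of-the-envelope check of the Hopper-vs-Blackwell training claim.
# Assumptions (not from the article): both runs last the full 90 days,
# and "25% of the power" means 25% of the quoted 15 MW.

HOURS_PER_DAY = 24

def energy_mwh(power_mw: float, days: float) -> float:
    """Total energy in megawatt-hours for a constant power draw."""
    return power_mw * days * HOURS_PER_DAY

hopper_gpus, hopper_mw, days = 8_000, 15.0, 90
blackwell_gpus, blackwell_mw = 2_000, 15.0 * 0.25  # ~3.75 MW

hopper_energy = energy_mwh(hopper_mw, days)        # 32,400 MWh
blackwell_energy = energy_mwh(blackwell_mw, days)  # 8,100 MWh

print(f"Hopper-era run:    {hopper_gpus:,} GPUs, {hopper_energy:,.0f} MWh")
print(f"Blackwell-era run: {blackwell_gpus:,} GPUs, {blackwell_energy:,.0f} MWh")
print(f"GPU count:  {hopper_gpus / blackwell_gpus:.0f}x fewer")
print(f"Energy use: {hopper_energy / blackwell_energy:.0f}x lower (on these assumptions)")
```

On those assumptions, the claim works out to roughly a quarter of the GPUs and a quarter of the energy for the same training run.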

This leap in performance is expected to have far-reaching implications, from accelerating the pace of AI research and development to enabling more sophisticated AI applications in areas like autonomous vehicles, natural language processing and more.

Alongside the Blackwell series, Nvidia unveiled a slew of AI-focused innovations and partnerships aimed at bolstering its ecosystem. This includes enhancements to its software platforms and tools designed to streamline the deployment of AI models across different computing environments.

Nvidia’s announcements reflect a strategic push to make AI more accessible and efficient, addressing some of the key challenges in AI development, such as the escalating costs of computing power and energy consumption.

What else is going on in AI this week?
