Needle in an AI stack: Salesforce says your business should use more than one LLM


Salesforce's CEO of AI, Clara Shih. Image: Salesforce

Salesforce announced an overhaul of its platform this week, complete with a new name. Now known as Einstein 1, it has a focus on using AI for automation and integration.

It also comes with two new flagship features. The first is a generative AI assistant called Einstein Copilot that works across the entire platform.

This can be further tweaked with Einstein Copilot Studio — a dedicated space to customise prompts, skills and AI models for Copilot.

What are Einstein Copilot and Einstein Copilot Studio?

Einstein Copilot is an AI assistant that will be integrated into every Salesforce application. The company says users will be able to have their questions answered and their workflows optimised by Copilot, which draws relevant and secure information directly from the Salesforce Data Cloud.

Einstein Copilot Studio is a tool that companies can use to modify their Einstein Copilot to more precisely address their unique business challenges.

The three core features are a prompt builder, skills builder and model builder.

And that last feature is particularly interesting. Users are able to either use Salesforce's own AI model or integrate one of its trusted partner models.

Salesforce pushes towards literally open AI

As the AI arms race heats up amongst big tech companies, Salesforce is placing a fair amount of importance on being an open platform.

“This is a relaunch of our Salesforce platform to create a trusted AI platform for customer companies. The Einstein 1 platform is integrated, it’s intelligent, it’s automated,” Salesforce’s EVP of product, Patrick Stokes, said during a media briefing in San Francisco.

“It is and always has been low code and no code and, most importantly, it is open. It is an open platform meant to support many, many different data providers, large language model providers and independent software vendors.”

Of course, there are limits to this. Salesforce is working with what it calls “trusted partner models” as part of its Einstein Trust Layer.

The idea behind this is to ensure that customers can trust the AI being used on the platform (trust is still a big issue in the space, with good reason) from both data security and accurate information generation perspectives.

According to Salesforce, Einstein Copilot utilises secure data retrieval from Data Cloud. It also masks secure and proprietary information before it’s sent to the LLM, and offers zero data retention and AI monitoring.
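To make the masking idea concrete, here is a minimal sketch of the kind of redaction a trust layer might apply before a prompt leaves the platform. The patterns, placeholder labels and function name are assumptions for illustration, not Salesforce's actual implementation.

```python
import re

# Illustrative patterns only; a real trust layer would cover far more
# categories (names, account IDs, proprietary record fields, etc.).
MASK_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
}

def mask_prompt(prompt: str) -> str:
    """Replace sensitive tokens with placeholders before any LLM call."""
    for label, pattern in MASK_PATTERNS.items():
        prompt = pattern.sub(f"[{label}]", prompt)
    return prompt

masked = mask_prompt("Email jane.doe@example.com or call 555-010-1234.")
```

The point is that the external model only ever sees the placeholders, which pairs naturally with the zero-retention policy the company describes.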

We got a preview of Salesforce's intentions for AI earlier this year when Slack announced generative AI integration with its platform. While Slack does have its own Slack GPT offering for users, companies are also free to use the large language model (LLM) of their choice.

If you’re wondering about the connection here, Salesforce acquired Slack for a cool US$27.7 billion back in 2021. Slack soft-launched its own generative AI expansion just last week ahead of Salesforce's yearly flagship event, Dreamforce.

The benefits of not choosing just one LLM right now

During a Q&A at the end of the Einstein 1 reveal, Salesforce CEO of AI Clara Shih said that having multiple LLMs is a “tremendous advantage” for a company.

“It is so early in the development of AI that it’s unclear which models are going to win out or whether it’s going to be the foundation models only versus the combination of large models and small models, closed source models and open source models,” Shih said.

This is a fair point. Going beyond OpenAI and its viral ChatGPT chatbot, Google, Meta, Nvidia and Apple all have their own LLMs. And that’s just at the big end of town. There are a plethora of other choices from smaller vendors and startups.

This includes Anthropic — a company founded by former OpenAI employees. With a focus on AI research and safety, it’s developed its own LLM, Claude. It was also one of the beneficiaries of Salesforce's $369 million investment fund for AI startups earlier this year.

“I think it’s a very smart idea for companies not to lock into one vendor and actually take an open approach so that, as the models evolve, we can easily swap in and out and optimise the right model for the right task,” Shih said.

Shih went on to say that cost, ESG and carbon footprint should all factor into a company’s decision regarding which LLMs to use.

“You don’t want to use a large model for something that can be accomplished with a small model.”
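Shih's swap-in, swap-out approach is essentially a routing pattern. The sketch below shows one way it might look in code; the registry, the stand-in model functions and the routing rule are all hypothetical, stubbed out so the example runs without any vendor API.

```python
from typing import Callable, Dict

ModelFn = Callable[[str], str]

def small_model(prompt: str) -> str:
    # Stand-in for a cheap, low-footprint model (local or open source).
    return f"small:{prompt}"

def large_model(prompt: str) -> str:
    # Stand-in for a costlier frontier model behind a vendor API.
    return f"large:{prompt}"

# Swapping vendors means changing registry entries, not application code.
REGISTRY: Dict[str, ModelFn] = {"small": small_model, "large": large_model}

def route(prompt: str, complex_task: bool = False) -> str:
    """Send a prompt to a large model only when the task warrants it."""
    model = REGISTRY["large" if complex_task else "small"]
    return model(prompt)
```

A summarisation request could go to the small model while contract drafting goes to the large one, keeping both cost and carbon footprint down.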

Salesforce says it didn’t rush its AI

But that isn’t to say that Salesforce isn’t promoting its own model.

In the same Q&A, the panel was asked what sets Einstein Copilot apart from competitors, and Stokes didn’t shy away from talking it up in an era where “copilots and AI assistants have become common”.

“Some competitors are going after productivity. We’re really going after the core customer workflows — sales, service, commerce, marketing — where you have that direct interaction or exchange with your customer,” Stokes said, before touching on the issue of speed to market.

“It’s been a quick transition to AI in the last 18 months or so, but we really wanted to make sure that we didn’t rush it,” Stokes said.

“We spent a lot of time really deeply thinking through how we connect these applications to a large language model in a safe and trusted way so that customers can protect their most important asset, their data.”

The author travelled to San Francisco as a guest of Salesforce.
