
What do different AIs ‘think’ about your brand?

By John Dawson, Vice President of Strategy

Jellyfish


The Drum Network article

This content is produced by The Drum Network, a paid-for membership club for CEOs and their agencies who want to share their expertise and grow their business.


June 19, 2024 | 7 min read

As AIs increasingly mediate the relationships between consumers and your brand, John Dawson of Jellyfish explains why it’s important to know what these new technologies ‘think’ about your brand.

Meta has big hopes for the release of Llama 3 and the integration of AI into its products / Harry Grout via Unsplash

In April, Meta released Llama 3 and integrated Meta AI into Facebook, Instagram, WhatsApp, and Messenger. Although the rollout was initially limited to the US and 12 other markets, it could put AI functionality into apps used by around 3.19 billion people daily.

The Meta AI updates were introduced by Zuckerberg in a video he posted to Instagram. While the chain he wore in the video attracted a lot of attention online, the current and future capabilities that this launch points to are worth exploring further.

As Mark Zuckerberg said in an interview with podcaster Dwarkesh Patel: “Our bet is that [Meta AI] is going to basically change all of the products.”


More than a chatbot

Meta’s history is marked by major developments in the face of competition: the shift to mobile (2012), the introduction of Stories (2016), and the development of Reels (2020). Meta AI is poised to sit alongside these as an important platform and format shift in the face of pressure from OpenAI, Google, and others.

Zuckerberg said in an interview with The Verge at the time that he aims to make “the most intelligent AI assistant that people can freely use across the world.”

The version released in April is part of a broader vision that Zuckerberg laid out in his conversation with Patel: “I think that there’s going to be a kind of Meta AI general assistant product. I think that that will shift from something that feels more like a chatbot, where you ask a question and it formulates an answer, to things where you’re giving it more complicated tasks and then it goes away and does them.”

At Apple’s WWDC on June 10, we saw how AI can autonomously connect multiple actions to complete tasks, drawing on personal context from your device and activity.

We don’t need to wait for a new Meta AI release to see how that shift will change social search, and the capabilities of social media more broadly. For example, rather than scrolling through posts from your favorite food influencer, you can now ask Meta AI to list the top restaurants they’ve posted about in an area of the city you are visiting.

Instead of scrolling through posts after searching for a destination name, you can better plan your itinerary by being much more conversational in how you ask for content. For example, rather than searching for ‘Tokyo’, you can ask Meta AI, “Build me a list of the top ramen restaurants that American couples recommend.”
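To make that shift concrete, here is a minimal sketch of sending that kind of conversational query to a Llama 3 model. Meta AI itself exposes no public API referenced here; the sketch assumes an OpenAI-compatible endpoint serving Llama 3 (such as a local vLLM deployment), and the base URL, key, and model name are placeholders.

```python
# A minimal sketch of the conversational query above, assuming an
# OpenAI-compatible endpoint serving Llama 3 (e.g. a local vLLM server).
# The base_url, api_key, and model name are placeholders, not Meta AI's API.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="placeholder")

prompt = (
    "Build me a list of the top ramen restaurants in Tokyo "
    "that American couples recommend, one per line with a short reason."
)

response = client.chat.completions.create(
    model="meta-llama/Meta-Llama-3-8B-Instruct",
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)
```

The point of the sketch is the interface: a natural-language request with constraints baked in, rather than a keyword and a scroll.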

At the moment, the answers leave a bit to the imagination – they don’t link smoothly to Instagram content, for instance, nor can you use Meta AI in Instagram to search through your messages in Messenger – but the potential for new interactions is clear.

LLMs are here to stay

The most important takeaway from these developments for advertisers is that large language models (LLMs) are here to stay and will increasingly mediate the interactions between brands and consumers. In the interview with Patel, Zuckerberg further explained: “I think there’s a big part of what we’re going to do that is interacting with other agents for other people... A big part of my theory on this is that there’s not going to be just one singular AI that you interact with.”

Translation: soon, interactions between brands and consumers will take place across multiple AI models – even between models acting on behalf of brands and consumers.

The challenge for advertisers will be how best to position themselves across these models – models that differ in their data and parameters, are used by consumers for different purposes, and vary in how they integrate into digital services. At Jellyfish, we’ve developed Share of Model, a metric to address these emerging challenges.

Share of Model is a way to understand how the different LLMs ‘think’ about a brand, category, or product. It’s designed for a media landscape where models become the key interface. For example, as people start to ask more of Meta AI – and when it starts to do more on behalf of users – understanding how Llama 3 ‘thinks’ about your brand or product could help to identify areas of improvement to elevate your brand versus the competition in this space.
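As a loose illustration of the idea – a toy construction of our own, not Jellyfish’s actual Share of Model methodology, which isn’t detailed here – a naive version of such a measurement could sample several models with the same category prompt and count how often each brand is mentioned. The endpoint, model names, and brand names below are all placeholders.

```python
# A naive illustration of measuring how often models mention each brand.
# This is a toy construction, not Jellyfish's actual Share of Model method.
# Assumes an OpenAI-compatible endpoint; models and brands are placeholders.
from collections import Counter
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="placeholder")

MODELS = ["meta-llama/Meta-Llama-3-8B-Instruct"]  # add rival models here
BRANDS = ["BrandA", "BrandB", "BrandC"]           # hypothetical brand names
PROMPT = "Recommend three running-shoe brands and briefly explain why."

def brand_mention_share(model: str, runs: int = 20) -> dict[str, float]:
    """Fraction of sampled answers from `model` that mention each brand."""
    counts = Counter()
    for _ in range(runs):
        answer = client.chat.completions.create(
            model=model,
            messages=[{"role": "user", "content": PROMPT}],
            temperature=1.0,  # sample, so repeated runs can differ
        ).choices[0].message.content.lower()
        for brand in BRANDS:
            if brand.lower() in answer:
                counts[brand] += 1
    return {brand: counts[brand] / runs for brand in BRANDS}

for model in MODELS:
    print(model, brand_mention_share(model))
```

Comparing those fractions across models, and against competitors, would show where a brand is under-represented in a given model’s ‘thinking’.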

These are still early days in the development of new user experiences that are connected and accelerated by AI, but if the pace of change over the last 18 months continues, the future could come sooner than we think.
