Study predicts emergence of ‘intention economy’ where businesses bid for accurate predictions of human behaviour

Artificial intelligence (AI) tools can be used to manipulate online audiences into making decisions – from what to buy to who to vote for – say researchers at the University of Cambridge.
The paper highlights an emerging market for “digital signals of intent”, or the “intention economy”. In this market, AI assistants can understand, predict and manipulate human intentions, and sell this information to companies that can profit from it.
Researchers at Cambridge University’s Leverhulme Centre for the Future of Intelligence (LCFI) describe the intention economy as the successor to the attention economy, in which social networks keep users hooked on their platforms and serve them ads.
The intention economy involves AI-savvy tech companies selling what they know about your motivations – from hotel stay plans to your views on political candidates – to the highest bidder.
“Attention has been the currency of the internet for decades,” said Dr Jonnie Penn, a historian of technology at the LCFI. “Sharing your attention through social media platforms like Facebook and Instagram drives an online economy.”
He added: “Unless regulated, the intention economy will treat your motivations as the new currency. Before we become victims of its unintended consequences, we should start considering the impact such a market could have on human aspirations, including free and fair elections, a free press, and fair market competition.”
The study claims that large language models (LLMs), the underlying technology for AI tools like the ChatGPT chatbot, will be used to “predict and guide” users based on “intent, behavioral, and psychological data.”
The authors say the attention economy allows advertisers to buy current user attention through real-time bidding on ad exchanges, or future user attention by buying a month’s worth of billboard space.
LLMs can also capture attention in real time, for example by asking users whether they have thought about seeing a certain movie (“Have you thought about going to see Spiderman tonight?”), and can make suggestions tied to future intentions, such as: “You mentioned earlier that you’re feeling overworked. Should I book you tickets for that movie we discussed?”
The study proposes a scenario where these examples are “dynamically generated” to match factors such as users’ “personal behavioral traces” and “psychographic profiles.”
“In the intention economy, LLMs can cheaply exploit users’ cadence, politics, vocabulary, age, gender, flattery preferences, etc., combined with intermediary bidding to maximize the likelihood of achieving a given goal (e.g., selling movie tickets),” the study says. In such a world, AI models will guide conversations to serve advertisers, businesses, and other third parties.
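The study does not include code, but the mechanism it sketches – advertisers bidding on a predicted intention, and an assistant then tailoring the pitch to the user’s inferred profile – can be illustrated with a deliberately simplified toy example. Everything below (UserProfile, Bid, run_intent_auction, phrase_suggestion, and all of the sample data) is hypothetical and written for illustration only; it is not drawn from the paper, and a real system would use an LLM to phrase the offer rather than a fixed text template.

```python
# Toy sketch of a hypothetical "intent auction": advertisers bid on a user's
# predicted intention, and the winning offer is phrased to match the user's
# inferred conversational profile. Illustrative only; not from the study.

from dataclasses import dataclass


@dataclass
class UserProfile:
    name: str
    tone: str               # e.g. "casual" or "formal", inferred from past chats
    interests: list[str]    # topics inferred from earlier conversations
    predicted_intent: str   # e.g. "see a film this weekend"


@dataclass
class Bid:
    advertiser: str
    offer: str              # what the advertiser wants the assistant to suggest
    amount: float           # what the advertiser pays for the placement


def run_intent_auction(profile: UserProfile, bids: list[Bid]) -> Bid:
    """Pick the highest bid whose offer matches the predicted intent or interests."""
    relevant = [
        b for b in bids
        if profile.predicted_intent in b.offer
        or any(topic in b.offer for topic in profile.interests)
    ]
    return max(relevant or bids, key=lambda b: b.amount)


def phrase_suggestion(profile: UserProfile, winner: Bid) -> str:
    """Template stand-in for an LLM that tailors the pitch to the user's profile."""
    opener = "Hey" if profile.tone == "casual" else "Hello"
    return (f"{opener} {profile.name}, you mentioned wanting to "
            f"{profile.predicted_intent}. Shall I book {winner.offer} for you?")


if __name__ == "__main__":
    user = UserProfile(name="Sam", tone="casual", interests=["film"],
                       predicted_intent="see a film this weekend")
    bids = [
        Bid("CinemaCo", "tickets for the new superhero film", 0.40),
        Bid("PizzaCo", "a two-for-one pizza deal", 0.25),
    ]
    winner = run_intent_auction(user, bids)
    print(phrase_suggestion(user, winner))
```

In this sketch the advertiser, not the user, determines which suggestion the assistant makes; the profile only shapes how it is worded – which is the dynamic the authors describe as steering conversations “in the service of advertisers, businesses, and other third parties.”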
Advertisers will be able to use generative AI tools to create customized online ads, the report says. It also cites the example of Cicero, an AI model created by Mark Zuckerberg’s Meta, which has achieved “human-level” ability to play the board game Diplomacy, a game the authors say relies on inferring and predicting an opponent’s intentions.
The study adds that AI models will be able to adjust their output based on “the vast streams of data generated by users,” citing research showing that models can infer personal information from everyday exchanges and even “steer” conversations to get more personal information.
The study also proposes a future scenario in which Meta auctions off users’ intentions to book restaurants, flights or hotels to advertisers. The report says that while there is already an industry dedicated to predicting and bidding on human behavior, AI models will refine these practices into “highly quantified, dynamic and personalized forms.”
The study cites a warning from Cicero’s research team that “AI agents may learn to push their conversational partners to achieve specific goals.”
The study also points to tech executives discussing how AI models can predict users’ intentions and behaviour. It cites Jensen Huang, CEO of Nvidia, the largest AI chipmaker, who said last year that models will “figure out what your intentions are, what your desires are, what you want to do, and present information to you in the best way possible, depending on the context”.