Converting thoughts into currency
A potential marketplace, the ‘intention economy’, would involve the buying and selling of digital signals of intent to covertly influence everything from movie-ticket purchases to voting.
For years, companies have made money by fighting for your attention. Every scroll, swipe, and click helped them figure out what held your focus. This created what experts call the attention economy, and it changed everything from how news spreads to the way people spend.

But now, researchers say, the internet is moving beyond that. Instead of just grabbing your attention, companies want to know what you’re planning: not just what you want, but what you’re about to want. They want to understand, and even shape, your decisions before you make them.

At the centre of this shift are large language models, or LLMs, the brains behind chatbots and digital assistants. They’re trained on huge amounts of text and can sound very human. You’ve probably already chatted with one, whether through customer service or a virtual tutor. These AI systems don’t just understand language; they study how you speak, what you ask, and even how you respond to flattery. By doing this, they can guess what you’re thinking, and with every conversation they get better at knowing what you want, or what you might want soon.

Researchers at the University of Cambridge claim that Artificial Intelligence (AI) tools could soon be used to manipulate people at scale into making decisions they would not otherwise make. Their study introduces the concept of an “intention economy”, a marketplace where AI can predict, understand, and manipulate human intentions for profit. Powered by LLMs, AI tools such as ChatGPT, Gemini, and other chatbots will “anticipate and steer” users based on “intentional, behavioural and psychological data”. “Anthropomorphic AI agents, from chatbot assistants to digital tutors and girlfriends, will have access to vast quantities of intimate psychological and behavioural data, often gleaned via informal, conversational spoken dialogue,” the research stated.
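To make that inference step concrete, here is a minimal sketch of intent detection in Python, assuming a toy keyword-matching classifier rather than a real LLM; the intent categories, phrases, and scores are invented purely for illustration.

```python
# Toy sketch: guessing a user's likely intent from one conversational turn.
# Purely illustrative; a real system would use an LLM or trained classifier,
# not keyword overlap. All labels and keyword sets below are invented.

INTENT_KEYWORDS = {
    "book_travel":  {"flight", "hotel", "trip", "vacation", "holiday"},
    "buy_tickets":  {"movie", "cinema", "concert", "tickets", "showtimes"},
    "switch_phone": {"battery", "upgrade", "android", "iphone", "contract"},
}

def infer_intent(utterance: str) -> tuple[str, float]:
    """Return the best-matching intent label and a crude confidence score."""
    words = set(utterance.lower().split())
    scores = {
        intent: len(words & keywords) / len(keywords)
        for intent, keywords in INTENT_KEYWORDS.items()
    }
    best = max(scores, key=scores.get)
    return best, scores[best]

if __name__ == "__main__":
    label, score = infer_intent("My battery dies by noon, maybe time for an upgrade")
    print(label, round(score, 2))  # -> switch_phone 0.4
```

A production system would replace the keyword sets with a trained model, but the basic flow, raw conversation in, a labelled and scored intent out, is what the researchers describe being captured at scale.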
The study cited the example of an AI model created by Meta, called Cicero, which has achieved a human-like ability to play the board game Diplomacy, a game that requires participants to infer and predict the intent of opponents. Cicero’s success shows how AI may already have learned to “nudge” conversational partners towards specific objectives, which online can translate into pushing users towards a product that advertisers want to sell. The research claims that this level of personalisation would allow companies such as Meta to auction a user’s intent to advertisers, who would buy the right to influence that user’s decisions. Dr Yaqub Chaudhary of Cambridge’s Leverhulme Centre for the Future of Intelligence (LCFI) emphasised the need to question whose interests these AI assistants serve, especially as they gather intimate conversational data.

The intention economy builds on the foundations of the attention economy, which commodified users’ attention for platforms like Instagram and Facebook. In contrast, the intention economy commodifies human motivations and intent, treating them as valuable currency. Prior studies have explored the attention economy’s impact on consumer behaviour, yet the transition to the intention economy has been underexamined. Existing work has often framed this shift as empowering for consumers, but it lacks critical scrutiny of its societal implications.

Traditional philosophical definitions of intention focus on purposeful action and reasoning, while tech-driven conceptualisations treat intention as a computationally operationalisable phenomenon. The assumption that human choices can be shaped within structured digital environments underpins much of LLM research and development. The first OpenAI developer conference, in November 2023, marked a pivotal moment in the evolving role of LLMs.
Announcements such as customisable generative pre-trained transformers (GPTs), multimodal capabilities, and revenue-sharing programs underscored OpenAI’s strategy to harness user-generated data through developer innovation. Key partners such as Microsoft are investing heavily in infrastructure to accommodate LLM workloads, with Microsoft positioning Azure as the dominant cloud platform. NVIDIA and Meta also aim to integrate LLMs into core computing, leveraging them to infer human intent and enhance interactions across applications.

LLMs’ ability to extract and predict user intentions signals a shift in how data is collected and monetised. OpenAI, for example, seeks datasets that express human intent to refine future AI systems, while companies like Meta develop datasets and frameworks such as “Intentonomy” to categorise motivations. These advances are already being applied in commercial settings, such as ad targeting and recommendation systems, where behavioural insights drive hyper-personalised experiences.

These innovations raise ethical concerns, however. Techniques for eliciting intent through conversational AI could lead to manipulation, as demonstrated by projects like Meta’s Cicero, which blends strategic play with persuasive dialogue. Generative AI can also bypass traditional privacy safeguards by using conversational content as a proxy for inferring private attributes. Partnerships like OpenAI’s with Dotdash Meredith further illustrate how intent data is commodified for advertising.

The new research paper also warns that LLMs like ChatGPT aren’t just changing how we interact with technology; they are laying the groundwork for a new marketplace where our intentions could become commodities to be bought and sold.
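One way to picture that marketplace is as an auction over an inferred intent. The following sketch is illustrative only: it assumes a simple second-price auction in Python, with invented advertisers and prices, and does not depict any real ad exchange or any system named in the research.

```python
# Toy sketch of how an inferred intent might be auctioned to advertisers,
# modelled loosely on real-time bidding in the attention economy.
# All bidders and amounts are invented; no real ad-exchange API is shown.

from dataclasses import dataclass

@dataclass
class Bid:
    advertiser: str
    amount: float  # willingness to pay to steer this intent

def run_auction(intent: str, bids: list[Bid]) -> tuple[str, float]:
    """Second-price auction: the highest bidder wins, paying the runner-up's bid."""
    ranked = sorted(bids, key=lambda b: b.amount, reverse=True)
    winner, runner_up = ranked[0], ranked[1]
    return winner.advertiser, runner_up.amount

bids = [
    Bid("PhoneMaker A", 0.80),
    Bid("Carrier B", 0.65),
    Bid("Retailer C", 0.40),
]
winner, price = run_auction("switch_phone", bids)
print(f"{winner} buys the right to steer 'switch_phone' for ${price:.2f}")
```

The second-price design, in which the winner pays the runner-up’s bid, has long been used in online advertising; the novelty the researchers warn about is not the auction mechanism but the asset being sold.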
For decades, tech companies have profited from what’s known as the attention economy, in which our eyeballs and clicks are the currency. Social media platforms and websites compete for our limited attention spans, serving up endless streams of content and ads. According to researchers Chaudhary and Dr Jonnie Penn, however, we are witnessing the early signs of something potentially more invasive: an economic system that would treat our motivations and plans as valuable data to be captured and traded.

What makes this potential new economy particularly concerning is its intimate nature. “What people say when conversing, how they say it, and the type of inferences that can be made in real-time as a result, are far more intimate than just records of online interactions,” Chaudhary explains.

Early signs of this emerging marketplace are already visible. Apple’s new “App Intents” developer framework for Siri includes protocols to “predict actions someone might take in future” and suggest apps based on these predictions. OpenAI has openly called for “data that expresses human intention… across any language, topic, and format.”

Major tech companies are positioning themselves for this future. Microsoft has partnered with OpenAI in what the researchers describe as “the largest infrastructure buildout that humanity has ever seen”, investing over $50 billion annually from 2024 onward. Future AI assistants, the researchers suggest, could have unprecedented access to psychological and behavioural data, often collected through casual conversation.

Unless regulated, the researchers warn, this developing intention economy “will treat your motivations as the new currency” in what amounts to “a gold rush for those who target, steer, and sell human intentions”. This isn’t just about selling products; it could have implications for democracy itself, affecting everything from consumer choices to voting behaviour. An intention economy’s targets could extend far beyond vacation planning or shopping habits. The researchers argue that we must consider the likely impact on human aspirations, including free and fair elections, a free press, and fair market competition, before we become victims of unintended consequences.

Perhaps the most unsettling aspect of the intention economy isn’t its ability to predict our choices, but its potential to subtly guide them. As our AI assistants become more sophisticated at anticipating our needs, we must ask ourselves: in a world where our intentions are commodities, how many of our choices will truly be our own?
Views expressed are personal