The FCA has raised concerns that design features on trading apps, such as confetti animations, push notifications, and leaderboards, may be encouraging risky behaviour and leading to poorer investment outcomes, especially among younger and less experienced investors.
These features, known as Digital Engagement Practices (DEPs), are becoming more common across trading platforms. In a newly published paper, the regulator found that people using apps with a lot of DEPs were more likely to trade frequently, invest in risky products like crypto, and ultimately lose money.
The FCA analysed data from several UK trading apps, grouping them by how many DEPs they used. Users of high-DEP apps traded up to seven times more frequently and were more likely to hold high-risk products such as crypto. They showed behaviours such as excessive use and impulsive trading, and ultimately saw worse investment outcomes.
The report shows a link between app design and risky investor behaviour, backing the FCA’s concerns that gamification and nudges could lead to poor outcomes for consumers.
The FCA is calling on firms to design platforms that support smart, responsible decisions and protect people’s long-term financial health, as more than 21 million UK adults now invest, many of them through digital apps.
The research comes as Tisa calls for AI regulation to protect consumers from risks such as AI search engines offering misleading financial advice.
In its submission to the Treasury Select Committee’s call for evidence on Artificial Intelligence (AI) in Financial Services, Tisa argues that while AI can improve efficiency and deliver personalised financial advice, consumers are at risk of misinformation without regulation.
Tisa is concerned about consumer protection, especially for vulnerable users who could get harmful advice from unregulated AI. It’s urging the FCA and HM Treasury to regulate AI search engines to keep consumers safe.
Additionally, Tisa wants the FCA to remove barriers to responsible AI innovation and back initiatives like Open Finance to improve transparency.
Tisa also calls for regulators to be better equipped with the resources and expertise needed to manage AI risks and ensure systems are transparent and unbiased.
Tisa policy executive Phil Turnpenny says: “AI is already transforming the financial services landscape, and as it continues to evolve it will fundamentally reshape how consumers interact with their money and financial products. By encouraging innovation, the transformative power of AI can empower millions to make better financial decisions and have a bright financial future.
“Whilst this will provide significant value for both consumers and firms, as with all emerging technologies, there are new challenges that need to be addressed promptly to protect consumers. For example, not all AI is created equal. Search engine-based AI tools are increasingly being used as a source of financial advice — often without oversight, transparency, or accountability. This has created a wild west situation and poses serious risks, particularly for vulnerable consumers who may be misled by confident-sounding but inaccurate information.
“That’s why we’re calling on the FCA and HMT to regulate this space urgently. At the same time, our research shows that properly designed and regulated AI has the power to improve financial access and outcomes — especially for underserved groups. We must draw a clear line between risky, unregulated tools and those that are built with consumer protection at their core.”