Last month, the Ontario Securities Commission (OSC) released a report on research conducted with the Behavioural Insights Team (the BIT) into the role that artificial intelligence (AI) plays in supporting retail investor decision making. The OSC then published a follow-up report with the BIT, prepared with the assistance of the consultancy Behavioural Insights Team Canada, entitled Artificial Intelligence and Retail Investing: Scams and Effective Countermeasures. This newest report explores how malicious actors are increasingly exploiting AI capabilities to deceive investors and orchestrate fraudulent schemes.

The OSC collaborated with the BIT to provide a research-based overview of:

  • The use of AI to conduct financial scams and other fraudulent activities, including:
    • How scammers use AI to increase the efficacy of their financial scams;
    • How AI distorts information and promotes disinformation and/or misinformation;
    • How effectively people distinguish accurate information from AI-generated disinformation and/or misinformation;
    • How the promise of AI products and services is used to scam and defraud retail investors; and
  • The mitigation techniques, at both the system and individual levels, that can be used to inhibit financial scams and other fraudulent activities that use AI.

To achieve this, the teams employed two research streams:

  • A literature and environmental scan to understand current trends in AI-enabled online scams, and a review of mitigation strategies to protect consumers.
  • A behavioural science experiment to assess the effectiveness of two types of mitigation strategies in reducing susceptibility to AI-enhanced investment scams.

The report concluded that AI-enhanced scams pose significantly greater risk to investors, with study participants investing 22% more in these scams than in conventional ones. Malicious actors are effectively deceiving investors by:

  • Using generative AI technologies to ‘turbocharge common investment scams’ by increasing their reach, efficiency and effectiveness;
  • Developing new types of scams that would have otherwise been impossible without AI, like deepfakes and voice cloning; and
  • Exploiting the promise of AI through false claims of ‘AI investment opportunities.’

To address these heightened risks, the report identified promising strategies that could mitigate the harms associated with AI-related investment scams:

  • System-level mitigations, which limit the risk of scams across all (or a large pool of) investors; and
  • Individual-level mitigations, which help individual investors detect and avoid scams.

Some platforms already use mitigation techniques, including filtering content and disabling accounts that spread disinformation. The report found that preventative techniques, spanning both system-level and individual-level mitigations, may be needed for retail investor protection. At the individual level, techniques such as the inoculation technique and web-browser plug-ins that flag potential ‘high-risk’ opportunities proved effective in significantly reducing the impact of these scams. The ‘inoculation technique’ involves providing users with high-level guidance on scam awareness before they encounter specific investment opportunities. The web-browser plug-in reduced investment in fraudulent investment opportunities by a statistically significant 31%.

A related report, entitled Gamification Revisited: New Experimental Findings in Retail Investing, summarizes research conducted by the OSC with the BIT into the effects of digital engagement practices on investor behaviour. The experiment was designed to measure the influence of non-expert, social information on the trading behaviour of Canadians. Participants in the social interaction feed condition, which enabled platform users to interact with other users, made 12% more of the total volume of their trades in promoted stocks than the control group. Participants in the copy trading condition, where platform functionality allows users to copy the trades of other profiled users, made 18% more of the total volume of their trades in the promoted stocks. These findings suggest that socially based engagement techniques can influence behaviour by encouraging trading in specific assets.

The report recommended that regulators consider whether to restrict digital trading platforms' use of digital engagement practices that can compromise investor protection, potentially including points, top-traded lists, social interaction feeds and copy trading, and to continue gathering data to measure the impact of these practices.

Reports such as these may be good indicators of areas where future regulation, additional guidance or consultations may be forthcoming.

October 31, 2024