By Martijn Versteegen, CEO & Co-founder at IMAGIN.studio
Are you already using artificial intelligence (AI) to work more efficiently? For lenders and motor finance providers, AI offers real opportunities: from speeding up credit approvals and streamlining onboarding to automating customer communications and generating compliant vehicle listings. With rising pressure to improve margins and customer experience, the appeal of AI is obvious. But have you fully considered the risks?
One growing concern is around intellectual property. In the UK, the BBC recently threatened legal action against Perplexity AI, claiming it used BBC articles to train its AI models without permission. The broadcaster demanded deletion of the content and compensation. In the US, Anthropic is now facing what industry groups call the largest AI copyright class action ever certified, with millions of authors alleging the company trained its models on pirated books.
While this may seem distant from the day-to-day work of a lender, it raises a clear red flag. If an AI tool you rely on for content creation, such as customer emails, vehicle descriptions or marketing material, was trained on unlicensed data, you could face legal or reputational exposure. Legal disputes are likely to shape the future rules around data licensing for AI training, and businesses relying on AI-generated content may soon be expected to show more due diligence.
Another challenge is AI’s tendency to “hallucinate”: generating content that sounds plausible but is factually wrong. A recent example saw a Chicago Sun-Times journalist publish a list of summer reads that included several books that don’t exist; the AI tool he’d used had simply made them up. That might raise a smile in the context of arts coverage, but in motor finance it’s a different story. Incorrect details on a finance agreement, vehicle spec or interest rate could mislead consumers, breach FCA rules and damage your firm’s reputation. As AI gets embedded into back-office workflows, quality control becomes more, not less, important.
Then there’s the issue of bias. AI models learn from patterns in historical data, and if that data contains skewed outcomes, for example around gender, geography or credit score thresholds, those same patterns can influence automated decisions or content. This poses obvious risks for firms operating under the FCA’s Consumer Duty, particularly where transparency and fairness must be demonstrable in customer journeys.
Using AI in motor finance isn’t a risk in itself, but it does require oversight. You need to know what the model was trained on, what data it’s using, and where content is coming from. You’ll also need to factor in manual review, validation, and accountability. Working with providers that focus on licensed content and transparent tooling can reduce exposure and free up internal teams from compliance firefighting.
AI can improve speed and consistency. But it doesn’t eliminate responsibility. Human judgment is still required – especially in a regulated industry like motor finance.
