Has AI in L&D run out of steam?
Emma Jones thinks it might be time to change the tone…
Having worked in learning and development for more than 15 years, across internal, external, customer-facing, and business-to-business roles, I’ve experienced the changes, restructures, and challenges that L&D professionals have had to navigate.
Over that time I’ve become a firm believer in injecting as much personality into my writing as I can, which has helped create meaningful learning moments that spark joy and leave a lasting impression on people. The way we communicate with learners is arguably the most important part of the learning experience. Language that resonates, and stories that reflect how learners experience life in the real world, are the hooks that keep learners engaging with our content.
With the release of the most recent State of Digital Learning Report (linked below) and the focus on AI in Learning and Development (L&D), I wanted to share some of my musings on the fatigue that learners are undoubtedly experiencing. The report highlights that learners are having to fit their learning into already fractured, stop-start working days. With only a tiny minority saying their learning always feels relevant, our learners are coming to us tired, only to receive content presented in a tone of voice that doesn't resonate, creating further dissonance.
--
Humans are born ready to learn. Even after we leave formal education, our minds are still hungry for knowledge. We join the world of work and suddenly much of our learning is placed in the hands of the organisation we work within; more specifically, the beloved (?), but occasionally much maligned, L&D.
It feels strange to be writing this in 2026, but that is exactly what is happening. Learners are left digesting experiences featuring stories so poor they leave you rolling your eyes, with systems in place to stop speed-running… the content simply isn’t engaging enough, or as relevant as it needs to be.
But why is it (still) like this?
Meeting the expectations of learners while simultaneously overcoming business challenges is difficult. I’ve heard that L&D ‘needs to earn a seat at the table’ (ew, by the way), and much of that seems to be achieved by meeting tighter deadlines and ticking more boxes. It might be uncomfortable to read, but according to the report, in 2025 only 11% of L&D professionals seemed concerned with skills application and retention as outcomes of a learning experience. So why exactly are we producing that learning in the first place?
I am pleased to note that the figure is much higher this year, at 67%, but I think we need to be careful in how we go about teaching skills. The report also points to L&D departments overwhelmingly using AI to increase efficiency, with ‘faster content generation’ cited as the top reason to use AI as part of a learning strategy by 75.5% of respondents. This creates a disconnect between L&D departments and learners.
If L&D departments focus on using AI to speed up production, specifically content creation, but learners recoil when they read a piece of AI-generated material, we’re going to have a problem. CEdMA recently published a paper looking at the impact of AI content on learners within tech organisations, the TL;DR of which is that “learners are deeply selective about what they’ll accept from AI. Their attitudes vary dramatically depending on the type of content and how it’s used…and these attitudes matter because they directly shape engagement, trust, and learning outcomes.”
I am forced to consider whether the use of AI is less an intentional, calculated move by L&D professionals and more a comment on the culture we often find ourselves working within. Scale it faster; do more with less. You can probably get away with letting AI write that. ‘Earn your seat at the table.’
L&D creators clearly understand the issues, with the report citing that the top two concerns these departments have are a risk to learning integrity, and erosion of the human-centred learning we know gets the best out of people. I find it interesting, then, that the third concern is the future viability of L&D roles in business. Surely, with content in such a poor state, L&D professionals are needed now more than ever?
We need the skills within these departments to re-centre learning around the human experience, and that absolutely must start with how we communicate with our learners.
Poor-quality content creates mistrust and disengagement in learners very quickly. Rebuilding that trust and engagement is a much harder, longer road when our learners assume (in most cases quite rightly) that the learning ‘experience’ that just landed in their inbox is, in all likelihood, going to chronically underestimate them as individuals.
So, what’s the best way forward?
I am fortunate to work with some clients who are tackling this head on: rethinking their tone of voice to meet their learners in the human experience, speaking to them like people, and challenging a status quo that starts with the way we communicate content. They are in the minority, however, and my experience is more frequently that organisations are not brave enough to make those changes. All too often, when I inject a more human-centric, real, or funny tone of voice (because it is part of my style), I receive pushback: the status quo has remained unchanged, it’s too ‘risky’, and all of this is worsened by AI, because AI is learning FROM that status quo.
AI didn’t learn on its own. It was given examples humans created, which it then tries to recreate (often poorly), and the cycle continues. This results in an eventual collapse of the entire model as it slowly cannibalises itself: “We discover that indiscriminately learning from data produced by other models causes ‘model collapse’—a degenerative process whereby, over time, models forget the true underlying data distribution” (Shumailov, I., Shumaylov, Z., Zhao, Y., et al.).
We are at a crossroads, on the verge of AI doing more harm than good, particularly as we have collectively started to use it without the strategic direction it clearly needs.
The report cites that, where AI is concerned, “experimentation is high, but the overall strategic direction remains unclear”. It goes on to mention that just under half of L&D departments are prioritising selective AI adoption. I think that might be an interesting comment on how we know AI can’t produce the experiences we need it to, with ‘selective’ being the operative word. We want to be efficient, but we know that producing learning which doesn’t support real change, and doesn’t recognise our learners as individuals with their own thoughts and rhythms, is a mistake; the report acknowledges that the pressure is on to better support wellbeing and foster human connection. If we are really playing the corporate long game, not recognising our learners certainly won’t ‘earn us a seat at the table’.
In the immortal words of one Rob Gordon of High Fidelity fame (probably my favourite movie ever): “Do I listen to pop music because I am miserable? Or am I miserable because I listen to pop music?”
Are we using AI because culturally, we don’t have the time or the presence to create the high impact genuine engagements our learners need? Or are we lacking time and presence because we are using AI as a reason to create more, lower quality content in tighter timelines, while boxing L&D into a place where it is no longer given the time that it needs… a lack of time that is doing irreparable damage to the quality of our entire learning landscape?
Our learners are telling us that the AI hype-train is running out of steam for L&D far sooner than we thought. Nearly half state that the quality of their learning hasn’t improved over the last year, and AI-produced learning is rated only adequate or poor 41% of the time.
We need a strategy for certain, and I think that might need to start with this:
It’s time for L&D departments everywhere to challenge the status quo. Organisations need to stop underestimating learners, start talking to them like they are people and push the ‘boundaries’ of tone of voice.
We need to start looking beyond ‘completion rates’ as useful data (guys, if a learning experience is mandatory, the completion data is useless to me) and start to care about how our learners are experiencing the content we create: trying new modalities, talking in different ways, asking for feedback on that, and collating the data. (Hang on, I’ve just worked out how AI can be helpful here.) Then push forward with the information we find.
It may feel a little uncomfortable, and I don’t underestimate the sense of risk many organisations will contend with, but risk and reward tend to go hand in hand, right?
This is a tipping point. According to the report, L&D professionals are worried AI will replace their roles, but consider this: AI won’t make L&D redundant; L&D will make itself redundant by not challenging the way we meet our learners. We have a window of opportunity to do something different. Can we really let the chance to tell better stories, have greater impact, and better engage our learners pass us by? So next time you’re looking at content, under pressure to produce the same thing you’ve always produced, try something new.
There’s a human brain out there that will thank you for it…
And while we’re at it let’s give AI some decent examples to learn from.
Take a look at the State of Digital Learning 2026 report here.
CEdMA - The AI Content Explosion - What Your Learners Actually Think (And Why It Matters) (Dec 25)
Emma Jones is an innovative, creative training and learning development professional with a focus on strategic development and deployment of training materials. She is particularly passionate about technical learning and has many years' experience designing and delivering enterprise scale solutions that upskill and motivate learners.
Connect with Emma here: https://www.linkedin.com/in/emmagribble/