Expectations vs. reality: how L&D teams really use AI

By Rares Bratucu

AI promised to transform training. We looked at how L&D teams actually use it and found a few surprises.


Last updated on December 11, 2025

AI is becoming part of daily work for many L&D teams. Some teams have moved fast and others are still exploring, but one thing is clear: AI helps people create company-tailored training faster, improve engagement, and make it easier for employees to share their knowledge without needing instructional design skills. It also helps L&D teams understand skills gaps and shape learning paths that support real work. These are exciting possibilities, and for some teams, they are already real.

To understand how that’s playing out, we reviewed over 1,500 lines of feedback from conversations between our Customer Success team and L&D professionals.

These weren’t surveys or studies. They were real, everyday discussions.

What we found was a lot more nuanced than we expected. Some teams followed the intended path. Others built creative workarounds. And many were still figuring it out. Here’s what we learned.

The original plan: how we first expected AI to support L&D

When we launched EasyAI, the idea was simple: help users go from idea to course in just a few steps.

We thought L&D teams and employees would:

  • Upload a source document (like a manual or PowerPoint)
  • Let AI draft a course structure and content
  • Use AI-powered Quick Actions to refine the tone or wording
  • Publish company-tailored training faster than ever

The idea was to reduce the friction of course creation. Users could focus on what they do best, sharing knowledge, instead of getting bogged down in formatting, instructional design, and layout decisions.

And we weren’t alone in this thinking. According to Brandon Hall Group’s 2025 report on AI in corporate learning, 87% of organizations believe automated content creation is critical to the future of L&D.

For many of our users, that vision became a reality. Since its launch, over 75,000 courses have been created with EasyAI. L&D teams saw a 75% increase in course authors and built training up to 9x faster.

When AI works, it supports L&D teams in meaningful ways

Plenty of L&D professionals embraced the tools just as intended, and they saw results. For these teams, AI became more than a nice-to-have. It became a dependable part of their workflow.

“I used it to build an entire course with EasyAI, including generating the questions. It did a really great job.”

These users appreciated the speed, simplicity, and support AI provided:

  • One team used AI to convert course notes into interactive training “within minutes.”
  • Another praised EasyAI’s output, which was “incredibly helpful even for seasoned e-learning course builders.”

They followed the designed flow: upload content, generate a draft, review, and refine. These early adopters treated AI like a co-creator, using it to boost efficiency without expecting perfection.

Teams in manufacturing, tech, and logistics saw especially strong results. For example:

  • A manufacturing team used AI to democratize course creation, making it easier for frontline experts to contribute.
  • A tech company streamlined training localization, using AI to prepare content for multiple regions.
  • A logistics team cut down manual formatting and editing by combining Quick Actions with internal templates.

In these cases, AI didn’t replace their expertise. It amplified it.

The experimenters: using AI with cautious curiosity

The largest group we found in our analysis wasn’t confused or frustrated. They were experimenting. These users were actively exploring how AI could fit into their work. They asked questions to understand how to get real value from the tools.

“I’m wondering, EasyAI, where is it pulling data from?”

“Is there a way to change the tone of the AI output? Something more casual instead of something so professional?”

Rather than rejecting the tool, they asked questions. They experimented. And they often used AI for specific, low-risk tasks:

  • “I prefer to dump PowerPoint files into EasyAI.”
  • “Is there an AI guide we can share with the team?”
  • “Can I use [this] to draft questions only without doing the whole course?”

This group revealed something important: adoption isn’t just about having the best tools. It’s about having the right support to use them well.

Many users turned to our Customer Success team to learn how to get started, tweak AI outputs, or introduce AI to their colleagues. Whether it was a quick tip, a walkthrough, or a live call, that support helped them move from “just testing” to building real training.

We’ve learned that the right tools matter. But helping people feel confident using them matters even more.

Why a gap exists between AI expectations and reality

So why does the gap between expectation and reality persist? Our analysis revealed five key reasons:

1. Misunderstood capabilities

Some users didn’t realize AI could work from a simple prompt, not just a full document:

“How can EasyAI do anything if you don’t have any content?”

That surprise actually points to a strength: users can start from scratch. Even a few words are enough for AI to generate a draft, build a structure, or suggest quiz questions.

Another common example is users who thought they had to go through the full course builder flow when, in fact, they could apply Quick Actions to any text at any time, whether it’s inside a block or copied from a chat.

The tools worked. They just had more to offer than some teams initially realized.

2. Compliance and privacy blockers

Users who wanted to use AI couldn’t always get approval. Internal legal or security teams often delayed or denied access.

“Our compliance team made the checks, but in the end, they just told us we can’t use it.”

Even internal AI tools were treated with caution, especially in industries like healthcare, finance, and pharma. Users weren’t saying no; they were saying “not yet.”

3. Distrust in content quality

Some users were concerned about factual accuracy or copyright risks:

“If this could end up being a copyright infringement, what would be the solution from Easygenerator?”

They wanted assurance that using AI wouldn’t lead to legal trouble or misinformation. In response, some teams took extra care. One team, for example, created a checklist to verify AI-generated content against trusted internal sources, especially when the training touched on sensitive or regulated topics.

This is where employee input makes a big difference. Because employees understand the context deeply, they can quickly spot what sounds off, needs editing, or simply wouldn't land with learners.

4. The expectation gap

Some expected AI to do everything automatically, while others expected tailored results without much input. The reality is somewhere in between, and setting expectations is critical.

Without that clarity, even good results can feel disappointing.

5. Learning takes time

Many users were still getting used to the idea of working with AI. Even with tools ready to go, teams needed time to explore, experiment, and find the right use cases.

“Is there a step-by-step guide or webinar we can share with our team?”

Instead of diving in, some teams stuck to what they knew, especially if they weren’t sure where to start. This wasn’t a lack of support; it was a natural part of learning something new.

Unexpected uses and creative workarounds

Not every team followed the standard AI workflow, and that’s good. Some of the most creative uses came from users bending the tool to fit their needs.

“Dump and polish” from employees

Instead of asking employees to write polished content, some teams let them jot down bullet points. Then, they used Quick Actions to clean things up.

“Our engineers just type out bullet points, then we click improve, and AI rewrites it, so it sounds like actual training.”

PowerPoint as a knowledge primer

Some users uploaded PowerPoint slides not to create a course layout but to give EasyAI background info. Then, they rebuilt the structure manually, using AI to support rewriting.

“We just uploaded PPT files, so EasyAI knows what we’re talking about, but we ended up rebuilding the structure ourselves.”

Rewriting compliance content

One team uploaded raw compliance text, ran several Quick Actions in a row (simplify, reword, generate questions), and turned a policy doc into real training.

“We used AI to rewrite the GDPR policy in easier terms and generate knowledge checks at the end.”

Updating outdated content

Some teams used EasyAI not to create courses from scratch but to refresh existing ones. They uploaded outdated materials and used Quick Actions to polish everything up.

“We had old training files that needed a refresh, so we ran them through AI to clean up the language and rebuild the knowledge checks.”

What this means for L&D professionals

Whether you’re already using AI or still getting started, here’s what we’ve learned:

  • Start small: use AI to summarize, improve tone, or generate quiz questions. It doesn’t have to be all or nothing.
  • Expect to edit: think of AI as a first draft, not a final product. This mindset helps teams save time without sacrificing quality.
  • Train your team: even a short guide or quick demo can help teams get more value from AI. Some companies created internal “AI champions” to answer questions and run demos.
  • Use AI where it helps most: many users succeed with repetitive, low-risk tasks, not complex or critical content.
  • Stay aligned with compliance: if AI is off-limits for now, look for workarounds (like using Quick Actions on public content).
  • Experiment: we’ve seen companies find value in use cases their teams never anticipated. That takes an open mindset and a willingness to try new things, especially as new AI capabilities are released so quickly.

Most users we’ve heard from found real value in AI once they matched it to the right tasks. Some started small and scaled. Others saw a single high-impact use case and stuck with it. No two paths looked the same, and that’s okay.

Remember: adoption is a journey. Real impact comes when AI fits into your process, not vice versa.

AI in L&D is a work in progress, and that’s a good thing

Some teams are flying ahead with AI, while others are still trying it for the first time. All of this is valuable. It means we’re figuring out what works, what doesn’t, and where the real value lies. Users care enough to experiment, question, and offer feedback, and that’s progress.

As long as we keep listening, improving, and adapting, AI in L&D will move from promise to practice, one insight at a time.

About the author

Rares is a Content Specialist at Easygenerator. He spends his time researching and writing about the latest L&D trends and the e-learning sector. In his spare time, Rares loves plane spotting, so you’ll often find him at the nearest airport.

Frequently asked questions

How to use AI in learning and development?

AI supports everyday tasks that help L&D teams work faster. You can summarize long documents, rewrite complex text, or generate questions, then adjust the output to match your company’s voice. Easygenerator helps teams apply these steps inside a simple workflow.

What are the most common AI use cases for corporate training?

Many teams use AI to draft course structures, improve tone, prepare assessments, and refresh outdated learning materials. Easygenerator users also rely on AI to turn rough notes into clear training content that supports real work.

What problems does AI solve for L&D teams?

AI reduces bottlenecks in content creation and cuts time spent on rewriting, editing, and formatting. It gives teams a reliable starting point so they can focus on accuracy and context. Easygenerator makes this process accessible to more authors.

How does AI help L&D teams scale learning faster?

AI helps teams create first drafts in minutes, which speeds up the entire workflow. Easygenerator supports this by helping employees share their knowledge without waiting for long production cycles.

What types of AI tools are most useful for L&D?

Tools that summarize text, rewrite content, generate questions, or build draft structures bring the most immediate value. Easygenerator includes these capabilities so teams can move from ideas to training quickly.

Can employees create training with AI in Easygenerator even if they are not instructional designers?

Yes, employees can create training even without design experience. Easygenerator guides them through clear steps while AI supports structure, tone, and clarity. This helps more people share their expertise in a consistent way.

What types of content can L&D teams create with AI in Easygenerator?

Teams can create company-tailored courses, knowledge checks, refreshes of old materials, and simple on-the-job resources. AI helps shape rough inputs into clear and practical training that matches real needs.
