Real Learning in the Age of AI — Part 1
Over the past decade, the global learning landscape has transformed. From universities and think tanks to independent educators and consultancies, education has increasingly moved online. This transition accelerated during the pandemic — and has now entered a new phase shaped by artificial intelligence (AI).
Today, anyone opening their social media feed will encounter a flood of attractive offerings:
- “Become a certified expert in two hours.”
- “AI-powered leadership training — limited time only.”
- “Fast-track qualification — globally accredited.”
- “Communicate like a leader in just three modules.”
- “Certified holistic therapies and healing courses — no prior experience needed.”
- “Certified life coach and governance compliance expert in one programme.”
The visuals are polished. The claims are confident. The promises are sweeping.
AI has radically lowered the cost of producing professional-looking materials: entire curricula, slide decks, marketing campaigns, and even “testimonials” can be generated in seconds. In this saturated environment, distinguishing between genuine learning and decorative content has become a matter of both ethics and governance.
At the Centre for Transnational Development and Collaboration (CTDC), we increasingly meet organisations and practitioners who have invested time, money, and trust into courses that looked credible but delivered little depth, context, or accountability. The problem is not AI itself. The problem is how AI is being used, and what it enables when critical scrutiny is absent.
This blog sets the frame for our five-part series on “Real Learning in the Age of AI: How to Tell What’s Worth Your Time.” Here, we explore how the educational market has shifted, and why the rise of AI makes careful evaluation more important than ever.
🌱 The Illusion of Abundance: When More Courses Do Not Mean More Learning
The explosion of online courses has created the impression of abundance: more choice, more accessibility, more speed. Yet abundance does not equate to quality. In fact, it often conceals the opposite.
AI has drastically changed the surface of education without changing the substance required for real learning.
Today, AI can instantly produce:
- “comprehensive” toolkits
- step-by-step gender equality and EDI frameworks
- professional-looking monitoring and evaluation templates
- polished DEI glossaries, leadership models, and case studies
- “Communicating as a leader” packages, certified life coaching scripts, and governance compliance manuals
The problem is not that these materials exist. The problem is that they often lack:
- conceptual grounding
- ethical awareness
- structural and power analysis
- lived experience
- critical pedagogy
- rigorous methodology
AI makes it easy to replicate the aesthetics of expertise without depth. For areas such as safeguarding, DEI, gender justice, trauma-informed practice, organisational governance, holistic therapies, life coaching, and leadership development, this superficiality is not merely inadequate — it is actively harmful.
When technical gloss replaces rigorous learning, organisations absorb depoliticised, decontextualised frameworks that can reproduce harm rather than prevent it.
🧠 Content Is Not Learning. Automation Is Not Pedagogy.
Across CTDC’s organisational assessments and learning design work, one finding appears again and again:
Learning is not defined by the volume of materials presented, but by the depth of engagement facilitated.
AI-generated content can convincingly mimic:
- language of empowerment
- standardised frameworks
- trauma-sensitive principles
- methodological approaches
- safeguarding standards
But these outputs lack the relational, social, ethical, and political grounding that turns content into learning.
Content is what you receive — a video, a PDF, a set of tools.
Learning is what happens when you:
- examine assumptions
- understand power dynamics
- apply ideas to your context
- reflect ethically on your role
- engage with others
- build new practices
This distinction matters because AI accelerates content creation, but it cannot substitute for:
- methodological coherence
- contextual adaptation
- critical questioning
- reflexivity
- lived experience
- ethical responsibility
The risk is that learners begin to confuse “polished content” with “pedagogical and methodological rigour”.
🧩 AI as a Tool: Helpful When Guided, Harmful When Unexamined
AI is not the problem; uncritical, unguided use of AI is.
AI can meaningfully support education when:
- ✔ it assists drafting so educators can focus on design
- ✔ it improves accessibility through translation and simplification
- ✔ it generates examples that facilitators then contextualise
- ✔ it supports learners through structured prompts or reflective questions
- ✔ it is embedded within an intentional pedagogical approach
In these cases, AI is a tool within a human-led learning process.
AI becomes harmful when:
- ✘ it replaces educators entirely
- ✘ it generates entire courses without methodological grounding
- ✘ it copies mainstream frameworks uncritically
- ✘ it reproduces global hierarchies and narrow Western models
- ✘ it prioritises speed and aesthetics over accuracy and ethics
Here, AI is not expanding education — it is flattening it.
Superficial materials, when offered as expertise, can mislead practitioners and weaken organisational systems.
This is particularly dangerous in fields where careless framings have ethical consequences — such as safeguarding, gender justice, conflict, trauma, or social policy.
⚖️ The Stakes: Why Poor-Quality Learning Is Not Just a Waste of Time
Low-quality education has always existed, but AI increases both its volume and its credibility. The risks include:
- Misinformed practitioners: Learners adopt tools and frameworks that are untested, depoliticised, or inappropriate.
- Harmful organisational practices: Superficial safeguarding training, for instance, can lead to misclassification of harm, weak confidentiality systems, or survivor-silencing mechanisms.
- Extractive relationships: Courses marketed aggressively but grounded in minimal substance commodify sensitive fields and extract value without contributing to meaningful change.
- Loss of trust: When learners realise they have been misled, trust in the broader learning ecosystem erodes.
- Reinforcement of global inequalities: AI-generated content often draws on dominant knowledge systems, intensifying epistemic injustice and sidelining contextual, community-driven expertise.
Given these stakes, choosing courses requires the same level of critical scrutiny that we apply to governance, safeguarding, and organisational accountability.
🔍 Why We Need Educational Due Diligence
CTDC’s experience across sectors shows that people often rely on reputation, branding, and word of mouth when choosing learning programmes. In the age of AI, these cues alone are no longer sufficient.
We need educational due diligence — a structured approach for assessing:
- who created the course
- what informs the methodology
- how learning will occur
- what the provider’s history and governance look like
- how responsibly AI is used
- what ethical frameworks guide the work
Due diligence is the recognition that education shapes practice — and practice shapes people’s lives.
Over the coming weeks, this series will map the components of a responsible decision-making process for learners, institutions, and practitioners. From identifying genuine educators to spotting synthetic content, from evaluating accreditation claims to conducting a 15-minute audit, the series will offer a practical toolkit to navigate an increasingly complex learning market.
🌍 At the Centre for Transnational Development and Collaboration
CTDC designs and delivers learning rooted in:
- feminist, decolonial and interpretive methodologies
- structural analysis of power and harm
- lived practitioner experience
- rigorous pedagogy
- relational accountability
- context-sensitive approaches
We also use AI — but within a transparent, critical, human-led framework.
As we prepare to launch our upcoming practice camps, we invite learners to apply the same standards of scrutiny to our work as to any other offering. Accountability must be mutual.
In a world full of content, learners deserve depth.
In a world full of automation, learners deserve integrity.
In a world full of speed, learners deserve context.
Reach Out to Us
Have questions or want to collaborate? We'd love to hear from you.