Reflections from the AI-in-Mentoring SIG, MenTRnet | 22 March 2025
On 22 March 2025, participants in the MenTRnet AI-in-Mentoring Special Interest Group (SIG) came together again to explore the role of AI in teacher-research mentoring – this time focusing on data analysis. Building on our previous discussion about AI’s role in data collection, this session was about what happens after the data is gathered: How can AI tools support analysis? Where might they fall short? And how can we, as mentors and teacher-researchers, use them critically and ethically?
Why We Met: A Quick Snapshot
Our purpose was to understand the practical and pedagogical implications of integrating AI into data analysis in Exploratory Action Research (EAR) and EAR-mentoring. Mentors in our group shared hands-on experiences with tools like ChatGPT, Gemini, and DeepSeek, applying them to real teacher-research contexts.
After the data collection phase, AI was used to:
- Summarise interview transcripts
- Suggest initial codes and themes (one way of scripting this step is sketched after the list)
- Generate follow-up prompts and analytical questions
- Serve as a reflective partner during coding decisions
- Draft provisional findings to discuss with mentees and invite their responses
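For mentors who prefer to script this kind of step rather than work in a chat window, the sketch below shows one possible way to reproduce the "suggest initial codes and themes" task programmatically. It is a minimal illustration only: it assumes the OpenAI Python SDK, an illustrative model name, and a made-up transcript excerpt, and it is not a record of how participants in the session actually worked with ChatGPT, Gemini, or DeepSeek.

```python
# Illustrative sketch only (not the group's actual workflow): asking an LLM to
# propose tentative codes for an interview excerpt, which the teacher-researcher
# can then compare against their own manual coding.
# Assumes: the OpenAI Python SDK (pip install openai) and an OPENAI_API_KEY
# environment variable. The model name and excerpt below are placeholders.
from openai import OpenAI

client = OpenAI()

transcript_excerpt = """
Teacher: I noticed the quieter students participated more when they could
prepare their answers in pairs before speaking to the whole class...
"""  # replace with (anonymised) interview data

prompt = (
    "You are helping a teacher-researcher doing exploratory action research. "
    "Read the interview excerpt below and suggest 3-5 tentative codes, each "
    "with a one-sentence rationale and a short supporting quote. "
    "Flag anything ambiguous rather than guessing.\n\n"
    f"Excerpt:\n{transcript_excerpt}"
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative; any capable chat model would do
    messages=[{"role": "user", "content": prompt}],
)

# The output is a starting point for discussion, not a finished analysis:
# the teacher codes the excerpt independently first, then compares.
print(response.choices[0].message.content)
```

In keeping with the discussion that follows, the point of such a script would be to produce a comparison point for the teacher's own coding, not a replacement for it.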
Balancing AI and Human Thinking
Several key messages echoed throughout the session, above all that AI is not a replacement for human judgment. Participants appreciated AI's speed and its help in structuring data, but they also cautioned against using it blindly.
Key takeaway: AI should not be positioned as a “final answer” generator but rather as a thinking partner, something to bounce ideas off, compare findings with, or use to stimulate deeper reflection.
Some concerns included:
- Over-reliance on AI might dampen teacher autonomy
- AI tools often offer generic summaries that lack classroom context
- Biases and assumptions built into AI tools can distort findings
- The act of prompt design requires expertise, clarity, and contextual awareness
Key takeaway: Teachers and mentors should attempt their own analysis first, then use AI to compare, question, and critically reflect, not the other way around.
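To make that order of work concrete, a teacher might, for example, code a transcript excerpt by hand first, and only then paste both the excerpt and their codes into the tool with a prompt along the lines of: "Here are the codes I assigned to this excerpt. Tell me what you might have coded differently or missed, and explain why." The tool's response then becomes material for reflection and discussion with the mentor, rather than the analysis itself.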
The Role of Mediation: A New Direction for Reporting?
An inspiring idea emerged: if we are asking teachers to mediate between their own data analyses and those produced by AI, that is, to interpret, synthesise, and communicate findings, shouldn't the ways we invite them to report those findings reflect that?
Key takeaway: The group suggested that poster presentations could follow on from an oral discussion of findings, allowing teachers to explain their thinking, research process, findings and conclusions, and to showcase how they used AI tools to reach them.
This links closely to the concept of mediation in the Common European Framework of Reference for Languages (CEFR), where the goal is not just the reproduction of knowledge but the purposeful transformation of information.
Limitations and Ethical Considerations
While AI tools like Gemini and DeepSeek supported data interpretation, participants acknowledged several limits:
- AI can miss nuance in teacher narratives
- It may skip over contradictory or unexpected data
- There is a risk of reducing analysis to a checklist or summary of findings
- Without domain knowledge, AI outputs can mislead or oversimplify
Key takeaway: We agreed that AI must be guided, understood, and handled critically. It's not just about pressing "generate"; it's about engaging with the tool and understanding its process, not only its results.
Building Together: A Collaborative Guide for Mentoring with AI
An exciting next step is in motion. We’re planning to co-create a practical mentoring guide to support the use of AI in Exploratory Action Research. This resource will include:
- Example prompts for different research stages
- Reflections on what worked (and what didn’t)
- Guidance for combining AI with human mentoring
- Ethical considerations and role definitions
- Tips for responsible, critical use of AI in teacher-research
We aim to develop a provisional version by June 2025. Our hope is to offer both inspiration and structure for others navigating this evolving space.
Final Thoughts
Our session reminded us that AI can be a powerful ally in teacher-research, but only when used thoughtfully and critically. AI should never replace the mentor; rather, it should empower both mentor and mentee to notice more, question more, and reflect more deeply. Used poorly, however, it risks diminishing autonomy and weakening reflection.
We’ll continue exploring this balance together, testing ideas, challenging assumptions, and shaping practices that are both forward-thinking and grounded in classroom reality.
We welcome others to join us as we co-create resources and explore the ever-evolving role of AI in education.
If you’re mentoring teachers or conducting your own classroom-based research and want to explore how AI can support you, please reach out to us to get involved.
Stay tuned for our first AI-in-EAR Mentoring Handout in June!
Şirin Soyöz Yılmaz
This blog post summarises key insights from our AI-in-Mentoring SIG meeting and reflections shared by participants. AI tools, including Zoom AI Companion and ChatGPT, were used to synthesise and organise key themes from reflective written material and discussions, ensuring a comprehensive overview of our findings.
Image credit: www.freepik.com