User Research on AI Tools for Students

A comprehensive qualitative research and participatory design project to create human-centered AI learning assistants for university students.

UX Research · AI · Participatory Design

Image Credit: Samuele Mazzei during the workshop

Situation

University students face diverse challenges when preparing for exams, prompting a need to understand how AI-powered learning assistants can be designed to improve the learning experience while addressing ethical concerns such as over-dependence.

Task

I led a qualitative research and participatory design project to identify specific student needs and co-create human-centered AI tools that support academic success and personalized learning.

Action

  • Qualitative Research: Conducted 25 observations (in-person and online), 10 semi-structured interviews, and a focus group with 11 students to gather deep insights into study habits and frustrations.
  • Thematic Analysis: Used inductive coding and iterative refinement to categorize the data into three core design pillars: Personalized Learning, Response Approach, and Mindful AI Interaction (an illustrative sketch of this roll-up step follows after this list).
  • Participatory Design Workshop: Facilitated a 2.5-hour workshop with 12 participants using the “Tell, Make, Enact” framework.
  • Scenario-Based Ideation: Developed 6 real-world scenarios (e.g., Law and Math exams) and 10 specific “goals” (e.g., Speed up, Accuracy, Gamification) to ground the design tasks in concrete contexts.
  • Rapid Prototyping: Combined the accessibility of physical materials (paper sketching) with the structure of digital prototyping in FigJam and Figma.
  • Archetypal Persona Alignment: Organized participants into archetypal design teams (e.g., “The Innovators” for Google, “The Designers” for Apple) to explore diverse design philosophies and user experience priorities.
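
For readers who think in code, here is a toy sketch of the roll-up step in the thematic analysis: low-level inductive codes mapped onto the three design pillars. Everything in it (the code labels, excerpt IDs, and the mapping itself) is a hypothetical placeholder; the actual analysis was an iterative, qualitative coding process, not a script.

```python
from collections import defaultdict

# Purely illustrative: how low-level inductive codes might be rolled up into the
# three design pillars named above. Code labels and excerpt IDs are hypothetical
# placeholders; the real analysis was an iterative, qualitative process.
PILLAR_OF_CODE = {
    "exam_specific_examples": "Personalized Learning",
    "adapts_to_my_level": "Personalized Learning",
    "step_by_step_feedback": "Response Approach",
    "shows_its_reasoning": "Response Approach",
    "worry_about_over_reliance": "Mindful AI Interaction",
}

# (excerpt id, code assigned during inductive coding) — placeholder data only.
coded_excerpts = [
    ("P01-excerpt-3", "exam_specific_examples"),
    ("P04-excerpt-1", "step_by_step_feedback"),
    ("P07-excerpt-2", "worry_about_over_reliance"),
    ("P07-excerpt-5", "shows_its_reasoning"),
]

# Group excerpts under the pillar their code maps to.
themes = defaultdict(list)
for excerpt_id, code in coded_excerpts:
    themes[PILLAR_OF_CODE[code]].append(excerpt_id)

for pillar, excerpt_ids in sorted(themes.items()):
    print(f"{pillar}: {excerpt_ids}")
```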

Result

  • Impactful Prototypes: Generated three distinct AI-driven solutions, including “Mathemapple,” which uses a Socratic method for step-by-step guidance, and a law-focused interactive debate system that encourages critical thinking (a minimal sketch of this Socratic interaction pattern follows below).
  • Design Validation: Confirmed that students prioritize interactive feedback and transparency in AI response generation over simple “full answers”.
  • Actionable Insights: Found that generalist AI tools are less effective than niche-specific solutions, pointing to a clear need for distinct design approaches for STEM and Humanities students.
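
The Socratic behavior described for “Mathemapple” boils down to an interaction pattern: the assistant offers one guiding question or hint per turn instead of the full solution. The sketch below illustrates how such a loop might be prototyped in code; it is an assumption-laden illustration, not the project's implementation (the actual prototypes were design mockups in FigJam/Figma), and call_llm, the prompt wording, and the example problem are all hypothetical.

```python
# Purely illustrative sketch, assuming a generic chat-style LLM behind the
# placeholder call_llm(); the study's artifacts were FigJam/Figma prototypes,
# not code. All names and prompt wording here are hypothetical.

SOCRATIC_SYSTEM_PROMPT = (
    "You are a math study assistant. Never give the full solution. "
    "At each turn, ask one guiding question or give one small hint that moves "
    "the student a single step forward, and briefly explain why that step matters."
)

def call_llm(messages):
    """Placeholder: return a canned guiding question instead of calling a real model."""
    return "What quantity are we trying to find, and what do we already know?"

def socratic_session(problem: str, max_turns: int = 10) -> None:
    """Run a step-by-step tutoring dialogue on one exam problem."""
    messages = [
        {"role": "system", "content": SOCRATIC_SYSTEM_PROMPT},
        {"role": "user", "content": f"Help me work through this problem: {problem}"},
    ]
    for _ in range(max_turns):
        hint = call_llm(messages)    # one guiding question or hint, never a full answer
        print("Assistant:", hint)
        messages.append({"role": "assistant", "content": hint})

        reply = input("Student: ")   # the student's attempt at the current step
        if reply.strip().lower() in {"done", "quit"}:
            break
        messages.append({"role": "user", "content": reply})

if __name__ == "__main__":
    socratic_session("Solve for x: 2x + 6 = 14")
```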