The ever-automated future
AI continues creeping into nearly every corner of our lives. From our inboxes to our interfaces, automation is reshaping how we work, create, and connect. UX Research and Research Operations (ReOps) are no exception: AI is now part of how we research, recruit, and synthesize insights.
Some in the field are energized by the momentum. Others are skeptical. But most agree that AI is here to stay, provided it comes with clearly defined uses, firm guardrails, and human oversight (of course).
As AI embeds deeper into workflows, the question is no longer if it belongs but how it’s changing the work. Qiwen Zhao, UX Researcher at Cisco, sums up the sentiment perfectly:
I’ll admit, sometimes when I talk about AI, I feel like an older person skeptical of embracing new technology. But rather than resist it, I believe we need to adapt thoughtfully.

That’s exactly what we’re unpacking in our June 18 webinar, The Use of AI Tools in Daily Research & ReOps Work, featuring Pete Fleming, UX leader at YouTube Ads.
Spoiler alert: our team is even using AI right now to verify sign-ups aren’t bots. A small but telling example of how embedded this tech is becoming.
So if you’re a human thinking about AI’s role in your day-to-day, save your spot below:
Read on for the latest trends, terms, and perspectives from UXR and ReOps in the age of AI, as the community weighs what we gain, what we risk, and how much of research can (or should) be handed off to the thinking machines. No Dune puns included😉.
AI & automation use cases
Whether you're a UX researcher, ReOps pro, product manager, designer, engineer, or anyone connected to insights, you've probably felt the rising tension around AI, automation, and the human element of research.
Here’s a quick rundown of the biggest AI use cases stirring the pot—ordered from the most widely embraced to the ones sparking the most debate:
🌟Study & Screener Development
Using AI to draft research plans, generate interview questions, and create screeners.
⭐Data Synthesis
Automating the analysis of qualitative data: theming, coding, sentiment analysis, and pattern recognition.
⚠️Study Moderation
AI-assisted or AI-led moderation of live research sessions, including live transcription and note-taking.
🚨 Synthetic Participants
Using simulated "users" to generate feedback and insights.
Terms to know
AI jargon can feel like a foreign language, depending on who you ask. Here’s a quick glossary to keep us all fluent:
Artificial Intelligence (AI) is a broad term for computer systems designed to perform tasks that typically require human intelligence. This includes technologies like machine learning (ML) and natural language processing (NLP).
Here’s your map to connect it all back:
Machine learning (ML) enables systems to learn from data without explicit programming.
Natural language processing (NLP) helps computers understand and generate human language.
Generative AI is a type of ML that creates new content — text, images, or audio — using deep learning methods loosely inspired by the brain’s neural networks.
Large language models (LLMs) like GPT are a form of generative AI trained on massive text datasets. The “P” in GPT stands for pre-trained, meaning the model learned from internet content up to a certain cutoff. That’s why tools like ChatGPT might not reflect the most recent events.
ChatGPT is the chatbot interface that lets users interact with the GPT model in real time.
For simplicity, we’ll use these terms interchangeably throughout this article. Now, let’s dive into how all of this translates into practice: saving time, scaling insights, and keeping the human element in the loop across daily research and ReOps workflows.
Managing depth, scale, and complexity
The work of UXR and ReOps involves juggling multiple demands. So, what exactly is AI solving for in UXR and ReOps, and what does it mean for our teams?
Common challenges include:
Finding and recruiting the right participants, often under tight timelines
Managing and organizing growing volumes of data
Scheduling and coordinating sessions, including no-shows and reschedules
Synthesizing qualitative data into clear, actionable insights
Upholding ethical standards and privacy compliance
Minimizing bias and maintaining research rigor
None of these are new challenges, but AI and automation introduce speed and scale.
With that comes new questions: What speeds up? What slips through the cracks? How do team dynamics shift? Who owns the research now?
No surprise this is sparking real conversation. Lan Cheng, UX Research Manager at YouTube, nails it:
As a UX researcher, I'm genuinely excited by AI’s potential to revolutionize our field. But before using any AI tool in my workflow, I need a clear understanding of its strengths, limitations, and whether it meets the bar.
Done right, AI in UXR & ReOps delivers
If you’ve used tools like Qualtrics, transcription services, sentiment tagging, or automatic theme summaries, you’ve already interacted with AI. It’s increasingly embedded across the research stack as a practical engine for speed, scale, and accuracy.
Jessica Lewes, Associate Principal of Research Operations at Kaluza, brings both hands-on experience and an academic perspective. Working in a global software company with a democratized approach to research, she notes:
One of my responsibilities is working with the business to decide what research and design technology we should use to support our work. All of these technology providers offer AI-powered features.

When asked how she appraises AI’s role in UX research, she points to her earlier work at a user recruitment agency, where repetitive, error-prone tasks consumed time and resources. Salaries were the biggest overhead. If AI could reduce human error and speed up participant recruitment, the business would instantly become more efficient and more profitable.
In practice, AI in Research Ops should deliver:
Time Saved: Less busywork, more time for strategic thinking
More Research, Faster: Studies don’t stall on logistics
Richer Insights: Patterns surface that may go unnoticed
Scalable Processes: Run more studies without sacrificing quality
Stronger Collaboration: Everyone stays aligned and informed
Where the field stands today
It’s said that AI isn’t here to replace UX professionals. As Adam Malamis, UX pro with over 20 years of experience and CPACC-certified accessibility expert at Simalam, puts it:
For now, we definitely need an “adult in the room”. When used effectively by an experienced UX professional, AI can elevate research capabilities and drive more informed decision-making.

Used wisely, it supports every stage of research, from planning and recruiting to synthesis and sharing insights. Here’s where the field stands on use cases:
AI for Study and Screener Development
AI is becoming an unexpectedly effective planning partner. From drafting study outlines to generating interview guides and recruitment screeners, it helps researchers go from blank page to structured plan much faster.
Traditionally, teams manually drafted everything — objectives, target users, research questions — often under tight deadlines. AI changes that by pulling from past studies or behavioral data (like site analytics, NPS, or purchasing patterns) to suggest what to prioritize and how to frame the study.
🧠 Real-World Example:
A fintech UX team needed to plan a usability test for a new dashboard. Instead of starting from scratch, they used AI to scan prior research and generate a draft plan — including questions and suggested participant criteria. While the tool saved hours, the team still reviewed and refined the plan to ensure relevance and mitigate bias (for example, removing unnecessary tasks like card sorting that didn’t fit the study’s goals).
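One way teams operationalize this step is by templating the prompts they send to an LLM, so every study draft starts from the same structured brief. The sketch below is purely illustrative: the template wording, field names, and function are hypothetical, not taken from any specific tool, and the resulting string would still need to be sent to the LLM of your choice and reviewed by a human.

```python
# A minimal prompt-template sketch for screener drafting.
# Template wording and field names are illustrative assumptions.
SCREENER_PROMPT = """You are helping a UX researcher draft a recruitment screener.

Study goal: {goal}
Target audience: {audience}

Draft 5 screener questions that qualify participants matching the
audience without revealing the study goal, and state which answers
disqualify a respondent."""

def build_screener_prompt(goal: str, audience: str) -> str:
    """Fill the template before sending it to an LLM of your choice."""
    return SCREENER_PROMPT.format(goal=goal, audience=audience)

prompt = build_screener_prompt(
    goal="Usability test of a new fintech dashboard",
    audience="Finance managers who review dashboards weekly",
)
print(prompt)
```

Keeping the template in code (rather than retyping prompts ad hoc) makes drafts consistent across studies and easier to review for bias.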
🛑 Caution:
AI can’t intuit context. It might miss emerging issues or nuances “human” researchers would catch. Always review outputs for bias, bloat, and assumptions. Running a first pass with AI agents can provide context and refine your thinking, but it’s just one part of the process, confirms Ryan Smith, who helps leaders with productivity and organizational effectiveness through workshops and advisory services at Smith Horn Group:
In recent work across product strategy and innovation, I’ve found ChatGPT’s Deep Research and Claude’s Web Search to be powerful tools for accelerating discovery.
AI for Synthesis
Everyone’s using AI here. Whether sorting open-ended survey responses or mining usability feedback for patterns, AI helps reduce the manual burden of synthesis, from:
Detecting recurring themes in interview transcripts
Coding qualitative data into categories like “frustration,” “delight,” or “confusion”
Ranking insights by frequency or urgency
Spotting contradictions or blind spots humans may miss
🧠 Real-World Example:
Faced with 2,000 customer survey responses, a researcher batches them into an AI tool. The AI clusters common issues, surfaces insights (e.g., “users can’t find the pricing page”), and ranks problems by frequency. Then a human review confirms the patterns are meaningful.
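The cluster-and-rank step in that example can be illustrated with a deliberately simplified sketch. Real tools use ML models (embeddings, topic models) rather than keyword lists; the theme dictionary, keywords, and sample responses below are all hypothetical, standing in for what an AI tool would infer automatically.

```python
from collections import Counter

# Hypothetical theme keywords a researcher might define up front;
# production tools infer themes with ML rather than keyword lists.
THEMES = {
    "pricing": ["price", "pricing", "cost", "expensive"],
    "navigation": ["find", "menu", "navigate", "lost"],
    "performance": ["slow", "lag", "loading", "crash"],
}

def tag_response(text: str) -> set:
    """Return every theme whose keywords appear in a response."""
    lowered = text.lower()
    return {theme for theme, words in THEMES.items()
            if any(w in lowered for w in words)}

def rank_themes(responses):
    """Cluster responses by theme, then rank themes by frequency."""
    counts = Counter()
    for r in responses:
        counts.update(tag_response(r))
    return counts.most_common()

responses = [
    "I couldn't find the pricing page anywhere",
    "The app feels slow when loading reports",
    "Pricing is too expensive for small teams",
]
print(rank_themes(responses))
```

The final ranked list is exactly what a human reviewer then inspects: the frequency ordering suggests where to look first, but a researcher still confirms the patterns are meaningful.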
📌 Advanced Use Case:
For larger studies, AI auto-codes and tags themes to scale synthesis and make insights more shareable. It also helps tie insights back to business goals — e.g., onboarding confusion linking to retention issues.
🛑 Caution:
AI doesn’t replace deep reading or lived research expertise. Spend time “living in the data” before reporting, and always verify AI’s outputs — especially when it’s confidently wrong.
AI Moderation
AI moderation is a mixed bag. Some teams experiment with AI bots for live sessions; others lean on AI more for post-session tasks like transcription, tagging, and pattern detection. It transcribes interviews quickly, detects speakers, timestamps quotes, and highlights key themes, speeding researcher access to data without the logistical lift.

The jury’s still out on 🤖 AI Interview Bots. While some tools offer AI-led interviews or bots that ask users questions, most senior researchers remain skeptical. Bots often miss follow-ups, ignore nuance, and risk eroding trust.
🧠 Real-World Example:
A three-person team conducts eight interviews. Instead of transcribing by hand or outsourcing, they use an AI tool that delivers full transcripts with sentiment tagging and pull quotes within hours.
🛑 Caution:
Even transcription tools make mistakes — especially with accents, jargon, or background noise. Human review is essential.
Synthetic Participants
“Research without users” is a hot topic—and a no-go for many. Erika Hall says it bluntly:

Let's put this whole "synthetic users" thing to rest. It is unethical, indefensible, and also unnecessary, to create a product or service or policy that affects other people, without having conversations with representatives of those populations.

Synthetic users, trained on generalized data, promise speed. But they erase lived experience, reinforce bias, and promote assumptions over reality.
🛑 Caution:
Synthetic participants might seem efficient, but the consensus is clear: they’re a step backwards for ethical, inclusive, human-centered design.
Ethical considerations and data privacy concerns
As AI systems process increasingly sensitive data, privacy and security become critical. At the same time, algorithms trained on skewed datasets risk reinforcing systemic bias, leading to flawed conclusions and potentially harmful design — especially for vulnerable groups.

AI may be efficient, but a host of ethical concerns come to the surface. Jacqui Olkin, UX & Research consultant for over two decades, cuts to the heart of it:
One of the potential harms I'm concerned about is deceptive impersonation: an AI agent implicitly representing itself as human, and even a *specific human,* in conversation with an unknowing human user of a software product.

This raises broader concerns: Are we unintentionally removing the human layer from research? Can participants give meaningful consent when AI is interpreting their input? And are we truly surfacing unbiased insights — or simply scaling existing blind spots?
Transparency is equally essential. Stakeholders should understand which insights come from humans versus AI, and what trade-offs each approach carries. Informed consent processes must evolve to reflect not just how data is collected, but how it's analyzed, stored, and used.
Bottom line: AI doesn’t lower the bar. It raises it. And if we’re not careful, speed and scale will come at the cost of integrity, trust, and the people we’re supposed to be designing for.
Research democratization in the age of AI
Democratization is pushing user research to become a collective effort — extending beyond traditional research teams to include product managers, designers, and engineers. With the rise of AI tools, many are now accessing and interpreting insights independently.
Some view this as a natural evolution: a way to scale research and bring user understanding closer to decision-making. Others raise valid concerns around rigor, context, and the risk of misinterpretation when research happens without trained practitioners.
What’s emerging is less a debate over tools and more a redefinition of roles. AI can accelerate access to patterns and summaries, but it doesn’t guarantee nuance or strategic alignment. As these capabilities proliferate, the dynamics of ownership around research are shifting.
The result is a more fluid research landscape — one where traditional boundaries blur, and where tool access may outpace shared understanding of how to use them responsibly.
In this environment, questions around quality, interpretation, and impact become more complex — and more critical. The role of UX researchers is evolving from primary executors to strategic partners and educators. ReOps continues to play a foundational role in this shift. As James Lang puts it, ReOps is part of a hopeful future:
We design and build research infrastructure to maximize impact. We implement participant management systems, create repositories that surface insights, and develop democratization frameworks that empower non-researchers with appropriate tools, guidance, and training.
What’s shaping UXR & ReOps in 2025
Here’s what we’re taking away from our research into the trends fueling the user research industry:
Organizations continue to look to customer insights to drive business growth
The majority of research teams are now using artificial intelligence (AI)
User interviews and usability testing lead as the most common methods throughout the product lifecycle
The push to build the right products fast is fueling research demand
Time and bandwidth remain the top challenges for product teams
The number one use case for AI is automating manual research tasks
The UX researcher role is shifting from technical executor to educator
Democratization drives user research to be a collective effort across teams
Where Ethnio fits in your evolving tool stack
Built for curious, customer-centric teams, our user research CRM and panel management software make it ridiculously easy for anyone—from dedicated researchers to non-researchers juggling research alongside other priorities—to conduct qual or quant research.
By combining automation with thoughtful infrastructure, Ethnio streamlines every stage of the research lifecycle while ensuring rigor, governance, and compliance.
Recruitment becomes faster and more flexible: teams can intercept real users via web or mobile (iOS and Android) or share screeners through email, Slack, or social channels to reach targeted participants.
Incentives and scheduling are fully automated, allowing payments in any currency worldwide and coordination of 1:1 or group sessions using intuitive scheduling tools with smart defaults and templates.
Study builders and templates simplify setup, enabling even non-researchers to launch consistent, compliant research confidently. These frameworks eliminate guesswork and keep research aligned with organizational standards.
Governance features support ethical and secure research at scale, with controls like cool-down periods, participation limits, opt-out options, GDPR compliance, and single sign-on (SSO) to safeguard privacy and data integrity.
As research becomes increasingly cross-functional, Ethnio enables stakeholders to join sessions, provide feedback, and actively participate in the research process. This fosters a collaborative, embedded approach where research integrates seamlessly into product development.
This is how leading teams reduce rework, accelerate delivery, and make research a continuous, shared practice.
TL;DR
AI is here—and it’s already woven into daily research workflows.
Most adopted: Drafting studies, screeners, and interview guides; synthesizing qualitative data (theme detection, tagging, sentiment analysis); and automating transcription and post-session analysis.
More controversial: AI-led moderation of live sessions and synthetic participants—largely rejected for ethics and rigor.
Biggest benefits: Saves time by cutting manual, repetitive work; scales research without needing more headcount; and supports democratization by enabling non-researchers to engage more effectively.
Biggest risks: Losing context, nuance, and depth if AI outputs are accepted at face value; ethical and privacy concerns around consent and impersonation; and reinforcing bias without careful oversight.
Shifting roles: Researchers are evolving into strategic facilitators and educators—not just executors.
To sum up…
Innovation isn’t slowing down—and neither are the pressures on UXR and ReOps teams. Think of the research landscape like the shifting sands of Arrakis—constantly moving, reshaping the terrain beneath our feet. Okay, we promised no Dune quotes, but we wanted to surprise you at the end.
The temptation to stick with familiar methods or make only incremental changes is understandable. But in 2025, that’s not enough. We’re builders—crafting tools that streamline recruitment, automate busywork, and help teams scale impact without sacrificing rigor.
Still, tools alone aren’t the full answer. The real opportunity lies in rethinking how we work—balancing automation with human oversight, preserving nuance in a fast-paced world, and making research more accessible without losing depth.
We’d love to hear how you're approaching these shifts. Join us on June 18 for our webinar, The Use of AI Tools in Daily Research & ReOps Work, featuring Pete Fleming, UX leader at YouTube Ads.