A practitioner’s journal on navigating UX in the age of AI
A practical and personal look at how technology is reshaping the role — and responsibility — of designers.
It started with a late-night scroll — headlines about AI, layoffs, and a new kind of design future.
I sat there, somewhere between awe and anxiety, reflecting on my journey as a designer — where I started, what I’ve learned, and what still feels unknown. I’ve watched the confidence of seasoned peers collide with the hesitation of those just starting out. And somewhere in that tension, I felt the need to write.
This isn’t just an article — it’s a map of my thoughts. A journal entry from the edge of the change I feel will impact us all.
As the world grows more digital by the second, screens continue to shape nearly every interaction we have — whether we’re working, socializing, learning, or relaxing. In this shifting landscape, User Experience Design (UX) has become more than just a function of product development. It’s the connective tissue between humans, technology, and business.
But as the boundaries of design blur with the rise of AI, autonomous agents, and emerging modalities like voice and ambient computing — I find myself asking: What does the future hold for the UX profession?
Today: the current state of UX
According to the UX Trends 2025 report, the state of UX today reflects a maturing discipline that is increasingly foundational to how digital products are built, scaled, and evolved. UX has moved beyond screens and interfaces: it now guides (or at least should guide) strategy, informs systems thinking, and shapes how teams build for people in complex environments.
Personalized at Scale
Personalization today goes far beyond saving preferences or tailoring content. With machine learning models embedded in design systems, we’re delivering experiences that adapt in real time — based on behavior, context, and predictive signals.
From adaptive onboarding flows to intelligent feature delivery, UX is becoming deeply anticipatory. These systems aren’t just responding to users — they’re learning from them, identifying patterns, and proactively shaping interactions to feel more seamless and relevant. Take Spotify, for example. Its recommendation engine continuously adapts to listening habits, surfacing new music and curating playlists that evolve with the user. This is personalization at scale — powered by AI, refined through behavior, and executed in milliseconds.
Designing for these experiences means moving beyond static personas to dynamic behaviors. It’s UX informed by insight, powered by data, and driven by systems that evolve alongside the user.
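The shift from static personas to dynamic behaviors can be made concrete with a small sketch. This is a hypothetical illustration, not Spotify's actual system: items are re-ranked from a rolling window of recent behavioral signals rather than a fixed profile, so the ranking adapts as the user acts. All names and the window size are illustrative assumptions.

```python
# Illustrative sketch: re-ranking content from recent behavior instead of
# a static persona. Names and weights here are assumptions, not a real API.
from collections import Counter, deque

class AdaptiveRanker:
    def __init__(self, window=50):
        self.recent = deque(maxlen=window)  # rolling window of recent signals

    def observe(self, tag):
        """Record a behavioral signal, e.g. the genre of a played track."""
        self.recent.append(tag)

    def rank(self, items):
        """Order candidate items by affinity learned from recent behavior."""
        affinity = Counter(self.recent)
        return sorted(items, key=lambda item: affinity[item["tag"]], reverse=True)

ranker = AdaptiveRanker()
for tag in ["jazz", "jazz", "ambient"]:
    ranker.observe(tag)

items = [{"name": "Synth Mix", "tag": "ambient"},
         {"name": "Blue Note Picks", "tag": "jazz"}]
print(ranker.rank(items))  # the jazz-tagged item now ranks first
```

The point of the sketch is the design posture: the "persona" is recomputed from behavior on every request, which is what makes the experience anticipatory rather than merely configured.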
Deep Collaboration
Gone are the days of siloed design handoffs. UX today thrives in environments where designers, PMs, and engineers work together from the outset — not just to ship, but to discover. The UX Trends 2025 report identifies this as a sign of UX maturity: integrated, problem-solving teams aligned on user value and business impact.
This vision aligns with Marty Cagan’s idea of empowered product teams — cross-functional groups that own outcomes, not just deliverables. In his view, design isn’t just a support role — it’s central to shaping and solving meaningful product challenges.
But while the trend report highlights the presence of collaboration, Cagan pushes us to consider the quality of it. He cautions that collaboration isn’t consensus. Empowered teams are trusted to make decisions, take risks, and challenge each other with alignment, not uniformity.
In his book Inspired, he states that great products emerge when teams operate with clarity, context, and autonomy. UX isn’t just at the table; it’s helping to define the table’s purpose.
Ethical & Inclusive by Default
With greater influence comes greater responsibility — and yes, I’m paraphrasing Spider-Man’s Uncle Ben. Ethical design is now a core expectation, not just in theory but in practice. Teams are being asked to address everything from algorithmic bias and exclusionary defaults to manipulative patterns and misinformation. Accessibility, privacy, and psychological safety are no longer side considerations — they’re product requirements.
Inclusive design goes beyond compliance. It’s about designing for edge cases and historically underserved users — understanding how race, gender, ability, language, and socioeconomic status intersect with technology. The best teams prioritize co-creation, inviting diverse perspectives into research and testing processes to build more equitable experiences from the ground up.
As systems become more automated and autonomous, the ethical stakes are higher. Designers must now think in terms of unintended consequences, system-level harms, and long-term trust. Ethical frameworks are evolving into operational practices: checklists, red-team reviews, bias audits, and accountability rituals.
As highlighted in the UX Trends 2025 report, ethical and inclusive design isn’t a layer to add; it’s the lens through which the entire product experience must be shaped. It’s the benchmark for quality in today’s experience-driven world.
The shift happening now: toolchains, layoffs & the AI reset
While the future often feels theoretical, today’s designers are already navigating dramatic shifts in how work is done.
AI agents in the wild
Early AI agents are already reshaping workflows. An AI agent is a system that can reason, plan, and take action on its own, using the information it’s given to manage workflows, call external tools, and adapt as things change. Tools like n8n.io let teams automate multi-step processes with ease, from generating content and summarizing research to integrating APIs and triggering reminders. These aren’t science-fiction sidekicks. They’re real, available, and starting to augment creative work.
Let’s not confuse AI agents with automation. I recently watched a breakdown by Kevin Hutson from Futurepedia about AI agents and he mentioned, “Automation is more like predefined fixed steps whereas AI agents are a more dynamic and flexible system capable of reasoning.”
In his video he noted that agents rely on three key components:
- The Brain (LLM) — A Large Language Model like ChatGPT, Claude, Gemini, etc., handles the reasoning, planning, and language generation.
- Memory — This gives the agent the ability to remember past interactions and use that context to make better decisions. It might remember previous steps in a conversation or pull from stored memory sources like documents or a vector database.
- Tools — How the agent interacts with the world: searching the web or pulling information from a document; taking action by sending emails, updating databases, or creating calendar events; and orchestration, such as calling other agents, triggering workflows, or chaining actions together.
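The three components above can be sketched as a minimal loop. This is an illustrative toy, not a real framework: the "brain" here is a stand-in rule function where a production agent would call an LLM, and the tools are simple lambdas.

```python
# Minimal sketch of the brain / memory / tools anatomy described above.
# The "brain" is a rule-based stand-in for an LLM call; tools are stubs.

class Agent:
    def __init__(self, tools):
        self.memory = []    # Memory: past steps the agent can consult
        self.tools = tools  # Tools: named actions the agent can invoke

    def brain(self, goal):
        """Stand-in for the LLM: pick the next unused tool relevant to the goal."""
        used = [step[0] for step in self.memory]
        for name in self.tools:
            if name in goal and name not in used:
                return name
        return None  # nothing left to do

    def run(self, goal):
        while (tool := self.brain(goal)) is not None:
            result = self.tools[tool](goal)      # act through a tool
            self.memory.append((tool, result))   # remember what happened
        return self.memory

agent = Agent({
    "search": lambda g: f"found notes for: {g}",
    "email":  lambda g: f"drafted email about: {g}",
})
print(agent.run("search the web and email a summary"))  # each tool runs once
```

The loop is the essential shape: reason about the goal, act through a tool, store the outcome in memory, and repeat until the brain decides it is done.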
Designers today are experimenting with agents that run user tests, market research, flag accessibility issues, or build prototypes from written prompts. The question is no longer if agents will assist us, but how we design around them.
The continued rise of No-Code & Vibe Coding
No-code platforms like Webflow, Framer, and now Figma Make are enabling designers to bypass engineers and bring their visions to life faster. Combined with AI-assisted creation, “vibe coding” is emerging — a practice where intuition meets prompt engineering to generate layouts, content, and flows.
Vibe coding is more than just a new buzzword — it’s a shift in mindset. Other tools like Bolt and Lovable, which are more dedicated vibe coding tools, allow creators to work through feeling, tone, and energy. Designers can describe the vibe of an experience — playful, serious, minimal, bold — and use AI tools to manifest that intention instantly in visual form. It’s rapid, intuitive, and deeply aligned with how many creatives actually think.
This isn’t just about speed. It’s about shifting power. Designers can now experiment more freely, iterate faster, and own more of the production process. That freedom changes how we think, focusing us more on expression and taste: designers must articulate the language of design clearly enough to tell a machine how a product should feel, flow, and function.
The AI reset & the new team structure
According to Forbes, the “AI reset” is prompting mass layoffs across tech — not simply for cost savings, but to recalibrate for a leaner, more AI-native workforce. Organizations are cutting headcount, rethinking team structures, and investing more in toolchains than in hiring.
Design teams are being asked to do more with less. This shift demands designers who are not only visually skilled but operationally savvy — comfortable with automation, systems thinking, and rapid prototyping.
“We’re not just designing products anymore. We’re designing the way work itself happens.”
Tomorrow: the near future of UX
Google I/O 2025 revealed numerous innovations that serve as early indicators of upcoming UX trends. The new product releases signal a major evolution in how humans interact with AI systems that work across platforms and devices. Designers must read these signals and use them to build experiences that are usable and inclusive for everyone.
AI-powered personalization
Picture user interfaces that understand your behavior so well they can predict what you’ll do next. Embedding AI in everyday tools enables unprecedented levels of personalization. For instance, Google’s Gemini AI assistant delivers “Personalized Smart Replies” in Gmail that mirror a user’s writing style to speed up communication. Personalization at this level deepens engagement by making interactions feel instinctive and effortless.
From a UX standpoint, this raises deeper questions: At what point does personalization become manipulation? How can we build user experiences that evolve based on individual needs while ensuring they maintain both transparency and user control?
Voice, gesture & multimodal UX
The traditional screen-based interface is evolving. Devices can increasingly process and react to voice commands, gestures, and visual signals. The launch of “Gemini Live” demonstrates this shift: users can point their camera at their surroundings and have back-and-forth conversations with AI about the objects in view.
This ushers in new UX challenges: conversational and gestural feedback loops must match the clarity and assurance of a click. And how do we guarantee accessibility across every mode of interaction?
Systems over screens
Seamless integration across platforms and devices holds the key to the future of UX. Through its “AI Mode” search function, Google replaces conventional search result formats with interactive chatbot experiences that let users gather information more intuitively and quickly.
Google also launched “Stitch”, its vibe coding tool that turns natural-language and image prompts into working application designs, connecting the design and development processes. I’m curious how engineering teams will be affected by these developments.
These tools reflect a continuing industry shift toward AI-enhanced design methods that extend UX beyond individual UIs to multiple surfaces and contextual moments. With AI becoming the foundational layer of daily interactions, designers need to orchestrate systems rather than build screens, creating invisible architectures that deliver tangible results.
Google’s latest advancements hint at how the future may unfold. These announcements suggest that future UX design will extend beyond traditional clicks to create immersive experiences throughout users’ environments.
Beyond: UX in the next 5–10 years
Designers who thrive in the next decade will be those who lean into ambiguity, shape emerging tech with empathy, and hold the line on what makes technology human.
AI agents as experience partners
The rise of intelligent agents will mean users don’t just interact with interfaces — they partner with them. From personal schedulers to shopping assistants and creative collaborators, agents will carry out tasks with increasing autonomy.
In the next 5–10 years, we can expect a shift from isolated, task-specific agents to fully integrated, multi-agent ecosystems. These systems will allow AI agents to collaborate with one another — negotiating schedules, delegating tasks, and even learning from shared user behavior across platforms. Imagine a project management AI coordinating with your personal calendar agent, your team’s research agent, and a marketing analytics agent — all working together to keep you focused and productive. Wow! My brain hurts just thinking about this.
Mass adoption of AI agents will likely become a workforce norm. According to Salesforce, AI agent usage is expected to rise dramatically by 327%, with projected productivity gains of up to 30%. For professionals, this won’t just be a competitive advantage — it will be a basic expectation. Much like we’re expected to know how to use email or spreadsheets today, fluency in managing and co-working with AI agents will become standard.
For UX designers, this means creating not just user flows — but agent frameworks. We’ll need to shape how agents communicate with each other, how they surface decisions to users, and how trust, context, and boundaries are preserved across every interaction.
This challenges us to design experiences that are relational, not transactional. How do we build trust with a digital entity? What does a good “agent personality” look like? How do we give users control without overwhelming them?
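One way to make "agent frameworks" and preserved boundaries tangible is a sketch of an orchestration layer where agents propose actions but anything crossing a user-set boundary must surface for approval first. Everything here (agent names, the `needs_approval` flag, the approval callback) is invented for illustration.

```python
# Hypothetical sketch of a multi-agent framework with a user-control boundary.
# Agents propose actions; boundary-crossing actions surface to the user first.

def calendar_agent(task):
    # Scheduling changes the user's world, so it must be approved.
    return {"action": f"schedule: {task}", "needs_approval": True}

def research_agent(task):
    # Read-only summarization stays inside the boundary.
    return {"action": f"summarize: {task}", "needs_approval": False}

def orchestrate(tasks, agents, approve):
    """Route tasks to agents; surface boundary-crossing actions for approval."""
    done = []
    for task, agent in zip(tasks, agents):
        proposal = agent(task)
        if proposal["needs_approval"] and not approve(proposal):
            continue  # user retains control: the proposed action is dropped
        done.append(proposal["action"])
    return done

# Usage: the user declines every approval request, so only the
# no-approval research action goes through.
result = orchestrate(
    ["sprint review", "competitor pricing"],
    [calendar_agent, research_agent],
    approve=lambda proposal: False,
)
print(result)
```

The design question the sketch raises is exactly the one above: which actions count as boundary-crossing, and how the approval moment is presented without overwhelming the user.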
As economist Richard Baldwin said, “AI won’t take your job — but someone who knows how to use AI will.” This is where designers must thrive: crafting agents that build, collaborate, and accelerate workflows. The human advantage is you.
The non-interface future
Screens may disappear, but design will persist. As highlighted in this LinkedIn piece by Muhammad Zeeshan Asghar, the next decade of UX may be shaped more by how we feel and perceive than by what we tap and scroll. From neural interfaces to ambient computing, interactions will increasingly be mediated through space, sensation, and subtle context-aware cues.
This transition challenges designers to think beyond visible surfaces. We’ll need to become choreographers of the invisible — designing feedback loops that feel intuitive without relying on screens, buttons, or clicks. The future UX layer may live in our environments, wearables, or neural signals — but it will still require intention, clarity, and a human touch.
Designers as ethical orchestrators
As automation takes on the labor, the designer’s role will become increasingly strategic. We’ll be called to shape not just the user journey, but the moral architecture behind it. Our work will determine what AI does, who it serves, and where the boundaries are drawn.
As outlined in this article by Design Bootcamp, ethical UX design in the AI era isn’t just about preventing harm — it’s about anticipating unintended consequences, designing for edge cases, and embedding human values into algorithms that scale. It’s no longer enough to ask, “can we build this?” The more important question is: “should we?”
This requires deeper collaboration with data scientists, ethicists, and product leaders. It demands transparency in how AI decisions are made and fairness in who they impact. As AI grows more autonomous, designers must champion user trust, agency, and dignity in every experience.
Ethical design won’t be a slide in the deck — it will be the foundation.
The future won’t be led by those who simply adapt to AI, but by those who guide it — with intention, responsibility, and humanity.
OpenAI’s acquisition of io: a paradigm shift in UX
In a landmark move, OpenAI has acquired io, the AI hardware startup founded by former Apple design chief Jony Ive, for $6.5 billion. This acquisition signifies OpenAI’s strategic entry into the consumer hardware market, aiming to develop AI-native devices that transcend traditional screens and interfaces.
The collaboration between OpenAI and Ive’s design firm, LoveFrom, is set to redefine user interactions with technology. By integrating advanced AI capabilities with iconic industrial design, the partnership aspires to create devices that are contextually aware, ambient, and seamlessly integrated into users’ lives.
This move underscores a shift toward experiences that are not just user-friendly but profoundly user-centric — emphasizing empathy, ethical design, and utility that feels almost invisible. As reported by the New York Post, the deal hints at an entirely new category of interaction design that is screen-optional, multimodal, and shaped by intent rather than input.
For UX professionals, this development signals a broader transformation: we are not just designing software; we are designing intelligent, embodied experiences. This calls for a hybrid skill set that blends behavioral science, interaction design, and systems thinking in ways never before required.
UX is entering a new era — not defined by tools, but by tensions. Between speed and depth. Automation and intention. Power and empathy. The next frontier of design isn’t about control. It’s about conversation. Between human and machine, intent and outcome, possibility and principle.
My advice to designers: Learn something new every day and stay adaptable in this ever-changing landscape of technology and experience design. Be an early adopter. Embrace experimentation. And work toward becoming a Super IC, a title coined by Garron Engstrom, Director of Product Design at Meta — someone who bridges business goals, leadership, emerging tech, product and human-centered thinking in the age of agentic design.
The question for all of us isn’t just “what can we design?” but rather:
“What world are we designing toward?”
References
UX Trends 2025 Report — UX Collective: https://trends.uxdesign.cc/
Top UX Design Trends in 2024 — UX Design Institute: https://www.uxdesigninstitute.com/blog/the-top-ux-design-trends-in-2024
A Comprehensive Guide to Vibe Coding — Madhukar Kumar: https://madhukarkumar.medium.com/a-comprehensive-guide-to-vibe-coding-tools-2bd35e2d7b4f
AI Reset: Layoffs & The New Realities of Work — Forbes: https://www.forbes.com/sites/jasonsnyder/2025/02/12/ai-reset-layoffs-rto-and-the-new-realities-of-work/
n8n.io — AI Workflow Agents in Action (YouTube): https://www.youtube.com/watch?v=EH5jx5qPabU
Everything Announced at Google I/O 2025 — Wired: https://www.wired.com/story/everything-google-announced-at-io-2025
Google I/O 2025: Biggest Announcements — The Verge: https://www.theverge.com/news/669408/google-io-2025-biggest-announcements-ai-gemini
Future of UI/UX and AI Accessibility — LinkedIn (Asghar Jafri): https://www.linkedin.com/pulse/future-uiux-how-ai-accessibility-shape-next-decade-asghar-8epkf/
The Agentic Era of UX — UX Collective: https://uxdesign.cc/the-agentic-era-of-ux-4b58634e410b
Navigating the Ethical Landscape of UX Design in the Age of AI — Design Bootcamp: https://medium.com/design-bootcamp/navigating-the-ethical-landscape-of-ux-design-in-the-age-of-ai-25c12ad3ed6d
UX & AI: Designing the Future — Qubika: https://qubika.com/blog/ux-ai-designing-the-future
OpenAI Acquires io — Financial Times: https://www.ft.com/content/8ac40343-2fd1-4035-9664-47c77017d0d3
OpenAI + Jony Ive Collaboration — OpenAI: https://openai.com/sam-and-jony
OpenAI/io Acquisition — New York Post: https://nypost.com/2025/05/21/business/ex-apple-exec-jony-ive-joins-openai-in-6-5-billion-deal-for-ai-devices-startup/
The Impact of Agentic AI on the Workforce — Salesforce: https://www.salesforce.com/news/stories/agentic-ai-impact-on-workforce-research/
A practitioner’s journal on navigating UX in the age of AI was originally published in UX Collective on Medium, where people are continuing the conversation by highlighting and responding to this story.