The Problem
ARTIFACT-DRIVEN DESIGN IN AN AI WORLD
Our design process looked modern on the surface, but underneath it had all the classic failure modes:
Design was artifact-driven (Figma files, specs, tickets)
Engineering worked from interpretation, not shared systems
AI was used inconsistently, mostly as a productivity hack, not a foundation
Feedback loops were slow, subjective, and disconnected from real behavior
The result:
Rework
Misalignment
And a growing gap between what we designed and what actually shipped
This is a known failure mode of traditional design processes: handoff-heavy workflows create waste and slow iteration, while faster learning cycles and tighter collaboration improve outcomes.
Guiding Philosophy
Design behavior, not screens.
Work in live systems, prototypes over specs, always.
AI is part of the system, not a side tool.
Codify decisions so they scale (skills, patterns, rules).
If it can’t be tested, it’s not ready.
Evaluate outputs, not intent.
No handoffs: co-create with engineering from the start.
Default to small, fast loops over big, polished deliverables.
Use real data as early as possible.
Prioritize impact over completeness.
Align tightly so teams can move independently.
Share knowledge openly: no silos, no gatekeeping.
PERSONAL TRANSFORMATION
In the span of six months, I also completely rewired my own workflow, moving the center of gravity out of static design tools and into direct interaction with models using:
Cursor
Claude
This wasn’t theoretical adoption. It was operational.
What I Built
1. AI-Native Workflow
Designers and engineers now:
Prototype directly against LLMs
Test outputs using real data
Iterate in tight loops (hours, not days)
No waiting for “final designs.” The system is the design.
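The loop above can be sketched in a few lines. This is a minimal illustration, not the team's actual tooling: the `generate` function here is a stub standing in for whatever model client is in use (e.g. a Claude API call), and the sample inputs are hypothetical. The point is that real data flows through the system and designers review actual outputs, not imagined ones.

```python
# Minimal sketch of a tight prototype-evaluate loop.
# `generate` is a STUB standing in for a real LLM call; the sample
# inputs are illustrative, not real product data.

def generate(prompt: str) -> str:
    # Placeholder for a model client call (e.g. Claude via an API).
    return prompt.upper()

REAL_SAMPLES = [
    "reset my password",
    "cancel my subscription",
]

def prototype_loop(samples):
    """Run every real-data sample through the model and collect outputs
    so the team reviews actual behavior, not screenshots of intent."""
    results = []
    for sample in samples:
        output = generate(f"Rewrite as a support reply: {sample}")
        results.append({"input": sample, "output": output})
    return results

for r in prototype_loop(REAL_SAMPLES):
    print(r["input"], "->", r["output"])
```

Because the loop runs in minutes, a prompt tweak can be tested against every sample before the next design conversation, which is what compresses iteration from days to hours.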
2. “Skills” System (Codified AI Behavior)
I introduced a set of Markdown-based “skills” that define how AI is used across the team.
These include:
Prompt structure standards
Component fidelity rules (no hallucinated UI)
Evaluation criteria for outputs
Human-in-the-loop guardrails
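A skill file combining those four elements might look like the following. This is an illustrative sketch, not an actual file from the system; the skill name and rules are invented for the example.

```markdown
# Skill: Support-Reply Drafting (illustrative example)

## Prompt structure
- State the user's goal in the first line.
- Include the product surface (web, mobile) as context.

## Component fidelity
- Use only components from the design system; never invent UI.

## Evaluation criteria
- The output must address the user's stated goal in three sentences or fewer.

## Human-in-the-loop
- A designer reviews every output before it reaches a PR.
```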
This turned AI usage from: “everyone doing their own thing”
into
a shared, enforceable system.
3. Embedded Evaluation
We shifted from subjective design critique to observable output quality:
PRs require explanation of AI-generated code
Outputs are validated against real use cases
Edge cases are defined upfront, not discovered later
Design quality became measurable, not just aesthetic.
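A sketch of what "edge cases defined upfront" can look like in practice is below. It is a simplified illustration, assuming a hypothetical `render_reply` function under test (stubbed here) and hand-written cases; the real validation ran against actual use cases.

```python
# Sketch of upfront edge-case evaluation. `render_reply` is a STUB for
# the AI-backed feature being evaluated; the cases are illustrative.

def render_reply(user_message: str) -> str:
    # Stubbed feature behavior, standing in for a model-backed reply.
    if not user_message.strip():
        return "Could you tell me a bit more?"
    return f"Thanks! Here's help with: {user_message.strip()}"

# Edge cases written before building, not discovered after shipping.
EDGE_CASES = [
    {"input": "", "must_contain": "tell me"},        # empty input
    {"input": "   ", "must_contain": "tell me"},     # whitespace only
    {"input": "refund", "must_contain": "refund"},   # happy path
]

def run_eval(cases):
    """Count passes and failures so quality is observable, not subjective."""
    passed = failed = 0
    for case in cases:
        output = render_reply(case["input"])
        if case["must_contain"] in output:
            passed += 1
        else:
            failed += 1
    return passed, failed

passed, failed = run_eval(EDGE_CASES)
print(f"{passed} passed, {failed} failed")
```

Checks like these turn a design critique ("does this feel right?") into a pass/fail signal the whole team can see in a PR.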
4. Co-Authorship Collaboration Model
Design and engineering now:
Work in the same loop
Co-define behavior before building
Review outputs together before shipping
The “handoff” phase was eliminated.
What Changed
Speed
Idea → working output compressed from days/weeks to hours
Alignment
Fewer interpretation gaps
Less rework between design and engineering
Quality
Outputs tested against real conditions earlier
Fewer downstream surprises
Role of Design
Design moved from: creating deliverables
to
defining how the system behaves.
Key Insight
Creative thinking didn’t disappear; it moved.
It now lives in:
how prompts are structured
how systems are defined
how outputs are evaluated
how the user's voice is included in the process
Not in static files.
Lean UX Research
Drive research efforts using agile, micro-study approaches to constantly focus on the user experience while keeping pace with engineering velocity.
Embrace the "just enough research" principle to inform key design decisions and ensure consistent delivery.
CX Alignment
Foster seamless collaboration between design, engineering, and customer experience teams to deliver a unified, cohesive experience across all touchpoints.
Ensure alignment of customer insights, design decisions, and product strategy. This leads to a consistent, user-centric journey that drives satisfaction and loyalty.
Governance
Regular strategy reviews, design reviews with leadership, team retrospectives
While a design strategy sets the overall direction, empowering teams to determine how best to govern and apply it within their context is crucial.