
Design Critique at Duo

Designing a critique framework to nearly triple team connections


Why this mattered

Duo's 30-person design team had no systematic critique practice, and a shift to remote work intensified isolation, threatening both craft quality and team cohesion.

What I did

I treated this as a design problem: conducted user research with all 30 team members, prototyped a complete feedback framework, then tested and iterated based on quantitative data and qualitative feedback.

Impact delivered

  • Nearly tripled team connections (192% increase in high-satisfaction scores)
  • Improved team awareness by 60% across all squads
  • Designers gained visibility into teammates' "messy middle" design process
  • 8/10 average satisfaction for feedback quality and time-investment value
  • Created a scalable framework that integrated into natural workflows and project timelines

Something essential to our craft was missing

Duo's 30-person design team had no systematic critique practice. Some squads held informal design reviews, but for a team this size, the absence of structured peer feedback was holding us back. Critique and design review are considered table stakes for mature design teams—yet we had none.
A shift to remote work intensified the isolation. With no in-person activities, designers had limited visibility into each other's work, social connections were weakening, and opportunities for meaningful feedback were inconsistent.
When I transitioned into a DesignOps role, I immediately saw this as a systems design opportunity. Rather than add another meeting to calendars, I could research what designers actually needed and prototype a solution that prioritized both connection and craft elevation.

Discovery: Treating teammates as my users

I approached this like any design challenge: systematic user research with the design team as my users. Collaborating with two senior designers, I conducted listening tours and focus groups with all 30 team members—designers, researchers, and design leads.
The research revealed four core problems:
  • Lack of visibility: No insight into teammates' work-in-progress or design thinking
  • No knowledge sharing: Teams working in isolation, reinventing solutions
  • Weak interpersonal connections: Remote work had eliminated organic relationship building
  • Scale anxiety: Everyone knew these problems would worsen as we continued hiring
This wasn't about critique skills—it was about designing the right social and organizational infrastructure for feedback to thrive.
Focus group sessions revealed key themes to address

Prototyping a systematic framework

I designed and prototyped a complete system addressing each problem:
"Work with Me" sheets solved psychological safety by helping teammates understand individual communication styles and working preferences—pure user-centered design applied to team dynamics.
Cross-functional pods created diverse 6-8 person groups mixing designers, researchers, and domain expertise. This addressed isolation while bringing fresh perspectives to familiar problems.
Structured activities with clear roles included presenter, critiquers, and pod facilitators—removing the guesswork that kills volunteer participation.
Supporting artifacts in Slack and Mural provided both synchronous critique sessions and asynchronous work-sharing, creating multiple touchpoints for team connection.
"Work with Me" sheets were instrumental in helping peers get to know each other and strengthened psychological safety

Tracking data as the team felt the shift

I launched "Critique Beta" with quantitative baselines across three metrics: team awareness, social connection, and visibility into design process. After three months of testing with structured surveys, the data showed improvement across all measures—but qualitative feedback revealed a fundamental design flaw.
Survey results showed progress:
  • Team awareness: 3-6 range improved to 6-8 range
  • Social connection: 3-6 range improved to 6-8 range
  • Process visibility: Similar positive trajectory
But user feedback revealed a core problem: Artificial pods pulled people from their natural work context. In addition, setting context for pod members who didn't know each other's projects took longer than getting actual feedback.
Survey charts (leveling up, time investment, and happiness with the activity) showed improvement across all measures

Iteration based on user needs

The insight: I had optimized for social connection but ignored workflow integration. Critique needed to happen where real work happened—within squads, not artificial pods.
I completely redesigned the system:
  • Moved critique to squad-level activities integrated into project timelines
  • Included design managers and leads to provide accountability and governance
  • Made critique project-triggered, not calendar-driven, to ensure relevant timing
  • Maintained support materials while shifting ownership from DesignOps to team leads

Impact: Systematic improvements

After six months, follow-up surveys showed sustained improvement across all metrics. Team members reported getting valuable feedback that moved their work forward (8/10 average satisfaction), with strong sentiment about time investment value.
Survey results: 192% increase in high-satisfaction scores and an 84% reduction in low scores for how well teammates know each other socially and culturally

What I learned

This program taught me that organizational problems are design problems—they require user research, prototyping, testing, and iteration. Most designers focus on individual craft skills, but I focus on designing the systems that enable great craft to thrive at scale.
While other designers struggle to get their work implemented, my program management background lets me identify and solve the organizational barriers that kill good design. I don't just design—I design the conditions for design success.
This is how I approach all design challenges: diagnose the systemic barriers, prototype solutions, test with real users, iterate based on data, then scale what works.
"The activity has given me a much better idea of what folks are working on and how they approach problems."

Duo Product Designer, survey feedback
