Personal Productivity & AI-Augmented Work | Practitioner Story | Lenny's Newsletter

From Figma to Claude Code and back | Gui Seiz & Alex Kern (Figma)

Read original

Why I picked this

Visual design has been the holdout — the creative domain where AI felt most suspect. Designers could point to taste, spatial reasoning, the ineffable craft of making things feel right. But Figma's own team just demonstrated something more interesting than 'AI does design now.' They've shown how AI collapses the middle phase of design work entirely. Not the strategic thinking at the front (what should this do?) or the craft refinement at the end (does this feel right?), but the rushed execution phase where designers frantically translate concepts into pixels under deadline pressure. That's the part Claude Code now handles.

What makes this worth attention: it's Figma practitioners showing their actual workflow, not a vendor pitch. Gui and Alex walk through bidirectional sync between design files and production code via MCP — meaning the Figma file and the shipped product stay coupled, not divergent artifacts. The 90% AI-generated code metric matters less than the structural insight: when your codebase is organized for AI legibility, the tool becomes genuinely useful rather than a novelty that generates plausible-looking garbage.

The frontier isn't 'can AI design' but 'what does design work become when execution speed approaches zero?' Figma's answer: designers move upstream to planning and downstream to craft, spending time on the parts that actually require human judgment. The frantic middle — where most design work currently happens — compresses into minutes instead of days. That's a workflow transformation, not a feature add.

ai-coding-tools, cursor-vs-copilot, automation-stacks, ai-writing-workflows

Three lenses

Builder

The MCP integration is the real story here — bidirectional sync means I can iterate in code and have design files update automatically, or vice versa. That's not a workflow improvement, that's eliminating an entire class of version control problems that have plagued product teams for a decade.

Revenue Leader

Show me the designer who's actually shipping 90% AI-generated code to production at scale, not in a demo. The workflow looks elegant until you hit edge cases, accessibility requirements, or the reality that most codebases aren't 'properly structured' for AI legibility. I need to see this deployed across a 50-person product org before I believe the efficiency gains translate.

Contrarian

Notice what they're not saying: how many iterations it took to get Claude to generate usable code, how much prompt engineering expertise is required, or what happens when the AI hallucinates a component that looks right but breaks accessibility. The 90% metric is meaningless without knowing the denominator — 90% of what kind of code, for what kind of features, with how much human correction?

AI has shifted design work upstream to planning and downstream to craft, eliminating the rushed middle phase of execution

Key takeaways

  • Figma's MCP enables bidirectional sync between design files and production code, eliminating static handoffs and version drift
  • AI coding tools enable 90% code generation when the codebase is properly structured, with custom skills automating pre-flight checks and CI monitoring
  • Design workflow is bifurcating: AI handles rushed middle execution phase while humans focus on strategic planning (upstream) and quality craft (downstream)
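The episode doesn't publish code, but the version-drift problem the bidirectional sync eliminates is easy to make concrete. A minimal sketch: compare design-side and code-side token values and report where they've diverged. The token names and JSON shape here are invented for illustration and are not Figma's actual export format or MCP API.

```python
def token_drift(design_tokens: dict, code_tokens: dict) -> dict:
    """Report tokens whose design-side and code-side values disagree.

    Hypothetical illustration of the drift a live design<->code sync
    removes: with static handoffs, this check runs manually (if at all);
    with bidirectional sync, the two sides can't diverge in the first place.
    """
    all_keys = design_tokens.keys() | code_tokens.keys()
    drift = {}
    for key in sorted(all_keys):
        design_value = design_tokens.get(key)
        code_value = code_tokens.get(key)
        if design_value != code_value:
            drift[key] = {"design": design_value, "code": code_value}
    return drift


# Example: the design file moved to a new primary color; the code didn't.
design = {"color.primary": "#0D99FF", "radius.card": "8px"}
code = {"color.primary": "#18A0FB", "radius.card": "8px"}

print(token_drift(design, code))
# {'color.primary': {'design': '#0D99FF', 'code': '#18A0FB'}}
```

In a static-handoff workflow this diff is a recurring manual chore; the structural claim in the episode is that a live sync makes the whole class of check unnecessary.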

People mentioned

  • Gui Seiz, Designer @ Figma
  • Alex Kern, Engineer @ Figma

Companies

Figma, Anthropic

Key metrics

  • 90% of code written by AI

Why this matters for operators: Product teams rethinking design-engineering handoffs and evaluating whether their codebase structure enables or blocks AI coding tool adoption

I cover AI×GTM intelligence like this every Wednesday.

Get STEEPWORKS Weekly

More picks

Enterprise AI | MIT Technology Review AI

Rebuilding the data stack for AI

  • Enterprise AI adoption is bottlenecked by fragmented, ungoverned data infrastructure rather than AI model capabilities
  • Competitive differentiation comes from proprietary data combined with third-party enrichment, not just AI tools
  • Evolution from 'system of engagement' to 'system of action' represents shift toward autonomous AI agents managing workflows
data-infrastructure, enterprise-ai-readiness, ai-governance
Enterprise AI | Demand Gen Report

Gartner: Explainable AI Will Drive LLM Observability Investments

  • LLM observability adoption will jump from 15% to 50% of GenAI deployments by 2028, driven by explainability requirements for scaling beyond low-risk use cases
  • Traditional IT observability (latency, cost) is insufficient; new metrics are needed, including hallucination detection, factual accuracy, logical correctness, and sycophancy measurement
  • Gartner recommends XAI tracing for high-impact use cases, multidimensional observability platforms, and continuous evaluation frameworks with human-in-the-loop validation
ai-policy, regulatory-impact, market-consolidation
AI Development | Lenny's Newsletter

From a $6.90 newsletter to $3M API: How a non-coder built Memelord | Jason Levin

  • Non-technical founder scaled from a $6.90/month newsletter to $100K ARR using Bubble (no-code), then raised $3M to build an API-first product, validating no-code as a legitimate path to venture scale
  • Mandatory 'vibe-coding' rule for the marketing team: employees must build their own AI tools and automations, representing a shift from using AI to building with AI as a core marketing skill
  • Free AI tools as lead gen replacing traditional content - 'free tools are the new PDF downloads' generated hundreds of thousands of emails, signaling evolution in PLG motion
ai-coding-tools, automation-stacks, plg-to-sales

This analysis was produced using the STEEPWORKS system — the same agents, skills, and knowledge architecture available in the GrowthOS package.