Human-AI Intersection · Practitioner Story · The Verge AI

You Could Be Next


Why I picked this

Victor's instinct here is correct — this piece deserves to break through the paywall noise. It's a practitioner story that shows what most AI adoption discussions carefully avoid: the human supply chain behind the models. Katya (pseudonym) is a displaced content marketer now training AI for $45/hour through Mercor, with zero visibility into which model she's improving or how long the work will last. Her project evaporated in two days. The irony is structural, not accidental: white-collar workers automated out of stable roles are now the precarious labor force teaching models to do 'the worst version' of what they used to do well.

This matters for GTM leaders because it surfaces the labor dynamics you're inheriting when you deploy AI tools. Someone trained that model. Probably someone who used to do the job you're now automating. The opacity is deliberate — workers don't know 'the client,' can't assess what they're building, can't opt out of training their own replacements. It's gig economy mechanics applied to knowledge work, with none of the transparency.

The consulting relevance is immediate: if you're running AI pilots in content, copywriting, or customer support, you're participating in this system whether you know it or not. The ethical questions aren't abstract. They're embedded in your vendor contracts and your team's job security. This article is a mirror, not a warning. You're already here.

ai-training-labor · white-collar-displacement · gig-economy-ai · data-labeling-workforce

Three lenses

Builder

The opacity is a feature, not a bug — if workers knew they were training competitors to their own roles, the labor pool would collapse. But that fragility is your risk surface. Build assuming your training data pipeline has a half-life measured in months, not years.

Revenue Leader

If I'm deploying AI content tools across my org, I need to know: who trained this model, under what conditions, and what happens when that labor supply dries up? The $45/hour gig worker today is my vendor's existential risk tomorrow. Show me the sustainability model or I'm not buying.

Contrarian

Everyone celebrates AI efficiency gains. Nobody's pricing in the cost of burning through the knowledge worker class that makes training possible. When Katya and hundreds like her stop taking these jobs — and they will — your model quality degrades and you don't even know why. I've seen cost-cutting destroy vendor reliability before. This is that, but hidden three layers deep in the supply chain.

My job is gone because of ChatGPT, and I was being invited to train the model to do the worst version of it imaginable

Key takeaways

  • White-collar workers displaced by AI (content marketing, copywriting) are being recruited to train the very models that replaced them, creating a cruel economic feedback loop
  • AI training labor operates as precarious gig work with no job security: projects are canceled with zero notice despite workers planning their finances around the income ($45/hr, but it can disappear in two days)
  • The AI training supply chain is deliberately opaque: workers don't know which AI they're training ('the client'), what it's for, or how their work fits into the larger system, preventing informed consent about contributing to further automation

People mentioned

  • Katya (pseudonym), freelance journalist/content marketer @ Mercor (contractor; currently unemployed)
  • Melvin, AI interviewer @ Mercor

Companies

Mercor · Crossing Hurdles

Key metrics

  • $45 per hour
  • hundreds of people
  • two days
  • several hours per task

Why this matters for operators: GTM leaders deploying AI content/copywriting tools need to understand the precarious labor dynamics and ethical implications embedded in their vendor supply chains; opacity in training pipelines creates sustainability and quality risks

I cover AI×GTM intelligence like this every Wednesday.

Get STEEPWORKS Weekly

More picks

Personal Productivity & AI-Augmented Work · Practitioner Story · Lenny's Newsletter · Victor's pick

From Figma to Claude Code and back | Gui Seiz & Alex Kern (Figma)

Visual design is one of the last frontiers, so this is an interesting read.

  • Figma's MCP enables bidirectional sync between design files and production code, eliminating static handoffs and version drift
  • AI coding tools enable 90% code generation when codebase is properly structured, with custom skills automating pre-flight checks and CI monitoring
  • Design workflow is bifurcating: AI handles rushed middle execution phase while humans focus on strategic planning (upstream) and quality craft (downstream)
ai-coding-tools · cursor-vs-copilot · automation-stacks
AI×GTM · Thought Leadership · The Signal (Brendan Short) · Victor's pick

26 FAQs about GTM Engineering in 2026

A good, if high-level, overview.

  • Article is a FAQ format covering GTM Engineering predictions/practices for 2026
  • Published by Brendan Short's The Signal newsletter (6,595+ subscribers)
  • Content truncated - only intro/header visible, preventing full analysis
gtm-engineering · signal-infrastructure · enrichment
GTM Ops · Lenny's Newsletter

A guide to advanced B2B positioning

  • Content is a podcast/video episode announcement, not substantive article
  • Focuses on positioning fundamentals and cross-functional alignment challenges
  • No specific case studies, metrics, or implementation details provided in excerpt
back-to-basics-gtm · vibe-marketing

This analysis was produced using the STEEPWORKS system — the same agents, skills, and knowledge architecture available in the GrowthOS package.