There are 10x people and 1x people. This has always been true, but it was easy to hide. In a pre-AI world, the difference between a great employee and an average one might be 2x or 3x in output. The gap was manageable. You could staff around it. You could compensate with process and headcount.
AI changed that math. The tools available today amplify the ambitious, the curious, and the autonomous. A 10x person with Claude and ChatGPT is not just a little more productive than a 1x person — they are operating in a fundamentally different category. The gap is not 2x or 3x anymore. It is 10x or more. And it is getting wider every month.
A 10x person is not just someone who works harder or knows more. In the AI era, 10x people share a specific set of traits:
They embrace the tools. They do not need to be told to use AI. They are already using it. They have subscriptions. They have figured out prompting techniques. They have built things on their own time because they were curious. When a new model comes out, they try it the same day.
They have autonomy. Give them a problem and they attack it directly. They do not wait for more clarity, more people, more time, or more permission. They figure out what needs to happen and they start building. When they hit a wall, they find another way around it.
They think at the right level of abstraction. They do not hand AI a list of specs and wait for output. They describe problems, explain context, and let the AI help them figure out what should exist. They treat AI as a thinking partner, not a task executor. This single difference accounts for an enormous gap in output quality.
They iterate fast. The first version is not precious. It is a starting point. They ship something, get feedback, improve it, and ship again. They are comfortable with imperfection because they know that three iterations in a week beats one "perfect" attempt in a month.
They are naturally curious. They ask "what if" and "why not." They explore. They experiment. When they see a new capability, their first instinct is to try it, not to evaluate whether it fits into the current roadmap.
A 1x person is not necessarily bad at their job. Many 1x people were excellent in the pre-AI world. But they share traits that AI exposes:
They get bottlenecked. They wait — for approval, for access, for the "right" tool, for someone to tell them what to do. In a world where you can spin up almost any capability in minutes, waiting is a choice. And it is the wrong choice.
They are intimidated by the tools. They have heard about AI but have not deeply engaged with it. They might have tried ChatGPT once and not been impressed by the free tier. They have not subscribed to the professional tools. They have not experimented with extended thinking or code generation. They are judging a revolution based on a demo.
They default to old-world timelines. When asked how long something will take, they give pre-AI estimates. A quarter for a project that could be prototyped in a week. Six months for an initiative that could deliver measurable results in two weeks. They are not lying — they genuinely believe these timelines because they have not internalized what the tools make possible.
They optimize for process over output. They want the plan finalized before building starts. They want stakeholder alignment before experimenting. They want comprehensive requirements before a prototype. In the old world, this was reasonable. In the new world, it is a way to avoid building.
The gap between 10x and 1x people is not static. It is accelerating. Here is why:
AI tools compound. A 10x person who has been using Claude and ChatGPT for six months has built up prompting skills, mental models, and workflows that a newcomer does not have. They know which model to use for which task. They know how to structure problems. They know how to iterate. Every week they use the tools, they get better at using them. The starting advantage compounds.
Models are improving. Every few months, a new generation of models comes out that is meaningfully better than the last. 10x people adopt immediately and their capabilities jump. 1x people are still figuring out the previous generation — or have not started at all.
The work itself is changing. As AI transforms what is possible, the nature of valuable work shifts. The tasks that used to define competence — manual analysis, basic coding, report generation, data gathering — are being automated. What remains is judgment, creativity, problem framing, and the ability to leverage tools. These are 10x traits.
This month, evaluate every person on your team against these traits.
You do not need a large team to start. You need the right two or three people.
This is your builder and your guardrail. They do not need to be the world's best coder — raw coding ability matters less than it used to because AI handles much of the code-writing. What matters is that they understand cloud infrastructure: how to set up AWS, Azure, or GCP securely, how to configure firewalls, how to manage deployments, and how to think about cybersecurity. They are the person who takes a vibe-coded prototype and makes it production-ready.
Often the hardest part of AI transformation is not the technology — it is getting people to buy in. An ex-consultant who knows how to talk about change management in organizations is invaluable. Even better if they used to work at a software development firm or a data science practice. They bridge the gap between what the technology can do and what the organization is ready to adopt. They help you sequence the rollout, manage resistance, and communicate the vision.
When hiring for any role, weigh these same traits heavily in your evaluation.
The right question is not "which people are 10x?" but "which roles should be 10x roles?" Some positions benefit enormously from AI leverage. Others do not. Map your organization:
10x roles are positions where AI dramatically amplifies output: analysts, developers, marketers, operations managers, strategists, product managers. These need exceptional humans wielding AI tools. Pay more for fewer, better people.
Automation candidates are workflows that are repetitive, rules-based, high-volume, and low-judgment. These should not be staffed at all — they should be automated. Every reminder. Every repetitive outreach. Every predictable high-volume workflow.
Essential human roles are positions where human judgment, empathy, or physical presence is irreplaceable: client relationships, strategic decisions, creative direction, leadership. AI supports these roles but does not replace them.
Here is the uncomfortable truth: some people on your team will not make the transition. Not because they are unintelligent, but because the traits that made them successful in the pre-AI world — diligence, process adherence, deep specialization in now-automatable tasks — are not the traits that matter most anymore.
You owe these people honesty. You owe them the chance to adapt, the training to try, and the time to demonstrate they can evolve. But you also owe your organization the truth about what it needs to compete. The companies that win are not going to be the ones that protect every existing role. They are going to be the ones that redesign their operating model around AI-native speed and leverage.
The gap between 10x execution and 1x execution is now enormous. And every week you wait to address it, the gap gets wider.
We help leaders evaluate their organizations through the AI lens and redesign roles for 10x leverage.
Apply for a Strategy Call