
Modeling vs. Asking: The Shift That Unlocked Everything

The Practice Revolution
Written by Felix Lenhard

There’s a moment in every learning journey where the approach itself changes. Not a new technique. Not a new piece of information. A fundamental shift in how you go about acquiring skill. For me, that shift had a name: modeling.

I didn’t invent the term. “Art of Practice” borrowed it from the NLP community — Neuro-Linguistic Programming — where modeling is defined as “a process of capturing, encoding, replicating, and transferring knowledge and experience.” But the way the book applied it to practice methodology made something click that six months of interviewing performers had failed to produce.

The idea was deceptively simple: if you want to learn what makes someone exceptional, don’t ask them. Watch them. Decode their behavior. Extract the principles embedded in their actions. Then replicate those principles in your own practice.

Ask versus model. Interview versus observe. The difference sounds subtle. In practice, it changed everything.

Why Asking Failed

I’ve already written about why interviewing expert performers produced nothing useful. The short version: unconscious competence. The best practitioners have internalized their effective behaviors so deeply that they can no longer access or articulate them. When you ask what they do differently, they give you a conscious narrative that describes their self-perception rather than their actual behavior. And their self-perception is reliably wrong about the specific details that matter.

But the failure of asking wasn’t just about the unreliability of expert self-reports. There was a deeper structural problem.

Asking is a pull methodology. You’re pulling information from someone based on what you think you need to know. Your questions shape the answers. If you ask “How long do you practice?” you’ll get a number. If you ask “What do you warm up with?” you’ll get a description. But these questions assume that duration and warm-up content are the relevant variables — and they might not be. You might be asking the wrong questions entirely, and the expert has no way to redirect you because they don’t know what the right questions are either.

Modeling is a push methodology. You’re not asking questions. You’re observing behavior and letting the patterns emerge. You don’t decide in advance what’s important. You watch everything and let the data tell you what matters. The signal reveals itself — you don’t have to know where to look for it.

In consulting, this distinction was fundamental. When we analyzed struggling companies, we never started by asking the executives what was wrong. They didn’t know what was wrong — if they did, they would have fixed it. We started by observing: how do decisions actually get made? How does information actually flow? Where do projects actually stall? The observation revealed patterns that no interview could have surfaced, because the relevant problems were invisible to the people inside the system.

Modeling practice methodology works the same way. You don’t ask the natural what they do. You watch what they do. You note the patterns that emerge across multiple naturals. And then you compare those patterns to what non-naturals do, looking for the systematic differences that explain the performance gap.

The Four Steps of Modeling

“Art of Practice” broke the modeling process into four steps, and each one resonated with my consulting training.

First: capture. Observe the best performers and record what they actually do. Not what they say. What they do. Every detail — when they start, what they start with, how they respond to mistakes, when they take breaks, when they stop, what they work on next. Raw behavioral data, without interpretation.

Second: encode. Find the patterns in the captured data. What do the best performers have in common? What do they all do differently from average performers? Strip away the individual variations and identify the universal principles. This is pattern recognition — the core skill of strategic consulting applied to human performance.

Third: replicate. Take the encoded principles and translate them into specific, actionable practices that anyone can implement. “Start with the hardest material” is a principle. “In your first five minutes, work on the technique you’re currently worst at” is a replicable practice.

Fourth: transfer. Teach the replicated practices to others. This is where the modeling method proves its power — if the principles are truly universal, they should produce results for anyone who implements them, not just the original naturals.
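For the systematically minded, the first two steps are easy to picture as a data exercise. Here is a minimal sketch of capture and encode in Python — my own illustration, not anything from “Art of Practice.” The observation fields, the group labels, and the 0.75/0.25 cutoffs are all assumptions I made up for the example:

```python
from dataclasses import dataclass

@dataclass
class SessionObservation:
    """One captured practice session: raw behavior, no interpretation."""
    performer: str
    is_natural: bool            # label taken from results, not reputation
    starts_with_hard: bool      # opened the session with the hardest material?
    works_in_bursts: bool       # short focused blocks with deliberate breaks?
    curious_on_mistakes: bool   # slowed down at errors instead of speeding up?
    advances_at_90pct: bool     # moved on before reaching full consistency?

BEHAVIORS = ["starts_with_hard", "works_in_bursts",
             "curious_on_mistakes", "advances_at_90pct"]

def encode(observations):
    """Encode step: surface behaviors common to naturals but rare elsewhere.

    Assumes both groups appear in the data. The 0.75 / 0.25 cutoffs
    are arbitrary illustration, not thresholds from the book.
    """
    naturals = [o for o in observations if o.is_natural]
    others = [o for o in observations if not o.is_natural]
    patterns = {}
    for behavior in BEHAVIORS:
        rate_naturals = sum(getattr(o, behavior) for o in naturals) / len(naturals)
        rate_others = sum(getattr(o, behavior) for o in others) / len(others)
        if rate_naturals >= 0.75 and rate_others <= 0.25:
            patterns[behavior] = (rate_naturals, rate_others)
    return patterns
```

The point of the sketch is the shape of the reasoning: capture produces rows of raw behavior, and encode is nothing more exotic than looking for the behaviors that separate the two groups.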

Each of these steps was familiar from my consulting work. We captured data through observation and analysis. We encoded it into strategic frameworks. We replicated those frameworks as implementation plans. We transferred them to client organizations. The methodology was identical. Only the domain was different.

What Modeling Revealed

When I shifted from asking to modeling — from interviewing performers to systematically observing them — the insights came quickly.

The first thing I noticed was the session order. Naturals started with hard material. Non-naturals started with easy material. This pattern was so consistent across every performer I observed that it couldn’t be coincidence.

The second thing was the rhythm. Naturals practiced in focused bursts with deliberate breaks. Non-naturals practiced in long, continuous sessions with gradual degradation in quality. Again, consistent across every observation.

The third was the response to mistakes. Naturals slowed down and got curious. Non-naturals sped up and got frustrated. The emotional and behavioral response to failure was a reliable marker of which approach someone was using.

The fourth was the advancement threshold. Naturals moved to harder material before achieving full mastery of the current level. Non-naturals refused to advance until they’d reached one hundred percent consistency. The ninety percent rule, as “Art of Practice” called it, was visible in every natural I watched.

None of this was visible through interviews. In interviews, both naturals and non-naturals described similar approaches — because the naturals were describing what they thought they did (which matched conventional wisdom) rather than what they actually did (which contradicted it).

Only observation could surface these differences. Only modeling could extract them into teachable principles.

The Consulting Connection

The parallels between modeling in performance and analysis in consulting were so precise that I sometimes forgot which domain I was thinking about.

In consulting, the most valuable insights came from observing discrepancies — the gaps between what companies said they did and what they actually did, between what their strategy documents described and what their operational data revealed. Those gaps contained all the leverage.

In practice modeling, the same was true. The gaps between what experts said they did and what observation showed they did contained all the insight. The entire methodology hinged on trusting observation over testimony.

This is counterintuitive for most people. We’re trained to trust expert opinion. When someone who’s exceptional at something tells you how they do it, the instinct is to believe them. Modeling says: don’t believe them. Watch them. The data is in their behavior, not in their words.

In consulting, we had a principle: “Don’t listen to what the client says. Watch what the client does.” It sounds cynical, but it’s not. It’s not that clients lie. It’s that they genuinely don’t know what they’re doing that makes them successful or unsuccessful. The conscious mind constructs narratives. The unconscious mind drives behavior. And in most domains of performance, behavior is what matters.

Applying It to My Own Practice

Once I understood the modeling framework, applying it to my own practice was straightforward — conceptually. Emotionally, it was a different story.

The modeling data was clear: naturals start hard, practice in bursts, stop when focus fades, move on at ninety percent, and respond to mistakes with curiosity. I needed to do all of these things.

Each one contradicted my instincts. Each one felt wrong. Each one triggered a psychological resistance that was real and powerful.

But I had something that pure instinct couldn’t provide: data. I’d observed the patterns. I’d read the research. I’d seen the principles confirmed across multiple disciplines. This wasn’t opinion. This was evidence-based practice methodology, decoded from the behavior of people who got results.

So I implemented it. Not all at once — the changes were too numerous and too uncomfortable for a complete overnight switch. But systematically, over the course of about two months, I restructured my practice sessions to align with the modeled principles.

Hard material first. Short, focused bursts. Stop when concentration drops. Move to harder material at ninety percent. Treat mistakes as data.
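If you wanted to hold yourself to those rules mechanically, the structure is simple enough to write down. A minimal sketch, where every moving part — the burst, the focus check, the exact threshold — is a stand-in of my own invention rather than a prescription from the book:

```python
def run_session(techniques, focus_check, practice_burst):
    """Run one practice session under the modeled principles.

    techniques: (name, consistency) pairs ordered hardest-first, where
        consistency is the fraction of clean repetitions (0.0 to 1.0).
    focus_check: returns False once concentration has dropped.
    practice_burst: practices one short, focused block of the named
        technique and returns its updated consistency. Responding to
        mistakes with curiosity rather than speed lives inside the burst.

    All three arguments are hypothetical stand-ins for the real work.
    """
    ADVANCE_AT = 0.90  # the ninety percent rule: move on before full mastery

    for name, consistency in techniques:
        if consistency >= ADVANCE_AT:
            continue  # already past the threshold: this material is done
        while focus_check():
            consistency = practice_burst(name)
            if consistency >= ADVANCE_AT:
                break  # advance at ninety percent, not one hundred
        else:
            return  # concentration dropped: end the whole session
```

Nothing in that loop is clever, and that is the point. The rules are mechanical. The difficulty is emotional, not computational.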

The results were not subtle. Within three weeks, techniques that had been stalled for months showed measurable improvement. Within two months, my overall skill level had advanced more than it had in the previous six months combined. The modeling approach didn’t just work. It worked dramatically.

Why This Is the Real Breakthrough

I want to be clear about something: the modeling methodology is more important than any specific principle it revealed.

Yes, starting with hard material is valuable. Yes, practicing in bursts is effective. Yes, stopping when focus fades is smart. But these are individual tactics. They’re the output of the modeling process, not the process itself.

The process — capture, encode, replicate, transfer — is a universal tool. It works for practice methodology. It works for performance craft. It works for business strategy. It works for any domain where expertise exists but can’t be articulated.

The reason this matters is that learning never stops. There will always be a new skill to acquire, a new level to reach, a new domain to explore. Having a reliable method for decoding expertise — a method that doesn’t depend on experts being able to explain themselves — is a permanent advantage.

In my consulting career, the most valuable skill I developed wasn’t any specific analytical technique. It was the meta-skill of figuring out what works by observing what successful people and organizations actually do, rather than what they say they do. Modeling, as “Art of Practice” described it, was the same meta-skill applied to human performance.

The Bridge Forward

Understanding naturals versus non-naturals gave me the what: the specific behaviors that differentiate exceptional from average practitioners.

The modeling methodology gave me the how: a systematic process for observing, decoding, and replicating those behaviors.

What remained was the deeper layer — the inner game. The mindset, the motivation, the psychological architecture that makes someone capable of consistently overriding their instincts and doing the counterintuitive thing. Because knowing what to do and actually doing it are separated by an enormous chasm, and most people who know the right approach still default to the wrong one because the wrong one feels better.

“Art of Practice” had a framework for this too — the practice blueprint, the ninety-five/five split, the concept of working on your practice rather than in your practice. These ideas would become the next chapter of my journey, the shift from understanding what naturals do to understanding how to rewire my own psychology to do the same things consistently.

But that’s a story for the posts ahead. For now, the shift from asking to modeling — from trusting what experts say to trusting what observation reveals — was the single most important methodological change I made in my entire learning journey.

It didn’t make practice easier. It made practice effective. And those are two very different things.

Written by Felix Lenhard

Felix Lenhard is a strategy and innovation consultant turned card magician and co-founder of Vulpine Creations. He writes about what happens when you apply systematic thinking to learning a craft from scratch.