1.5 Seconds From Concealment Isn't a Baseline—It's a Ceiling

I've been timing draws for three years. Most people I see on the range can't do it. Most people I see posting about EDC setups think they can.

Let's separate what 1.5 seconds actually demands from the fantasy version.

## What the clock measures

**1.5 seconds from concealment means:** the clock starts at the beep and stops when your first shot breaks. Typical setup: Pharaoh timer, 7-yard line, B-8 center-mass target. No holster fumble, no grip fumble, no delay finding the sights. Your draw stroke works. Your grip is correct on first touch. Your sights track. You press straight.

That's the baseline for a decent carry setup and medium-volume dry-fire practice. Not impressive. Not rare. Just functional.

## What changes when you actually measure

I've been running students through this. The data:

- **Without a timer:** 80% of people overestimate by 0.3–0.5 seconds. Feels fast. Clock says sloppy.
- **First 100 dry-fire reps:** Average is 1.8–2.1 seconds. Most people haven't trained the draw at speed.
- **After 500 dry-fire reps:** Clustering tightens. People hit 1.6–1.7 consistently. The stroke becomes automatic.
- **After 2,000 reps:** You see 1.4–1.6. That's when the draw stops being a decision and becomes a reflex.

Dry fire is non-negotiable here. Live fire alone doesn't build the speed because ammunition cost and range logistics force you to train slow. Dry fire has no friction.

## What stays constant

The holster matters. The gun matters less than people think. Your grip matters more. Your ability to find the sights at speed matters most.

A decent AIWB or OWB rig gets you there. A carry gun doesn't need to be light or small to hit 1.5; it needs to be something you've drawn 1,000 times. Skin in the game: I carry the same compact every day and practice draws 4–5 times a week. My cold draw averages 1.38. That's not skill. That's volume.

## The honest caveat

Timing yourself in a controlled setting—static stance, alert, timer ready, no stress—is not the same as a street draw under actual threat. I don't claim it is. What the timer *does* tell you is whether your setup and technique can support speed when stress isn't a factor. If you can't do 1.5 cold, you won't do better under adrenaline.

If you want to know where you actually are: get a timer. Set it. Do 10 draws right now. Write down the number. Then train against it.
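If you keep those 10 numbers, the arithmetic is trivial to automate. A minimal sketch (the times below are made up for illustration; substitute your own):

```python
# Ten cold draw times in seconds, exactly as you'd write them in a notebook.
# Hypothetical numbers -- replace with your own timed reps.
draws = [1.62, 1.71, 1.58, 1.84, 1.66, 1.59, 1.77, 1.63, 1.91, 1.68]

average = sum(draws) / len(draws)
best = min(draws)
worst = max(draws)
spread = worst - best  # wide spread = the stroke isn't automatic yet

print(f"average: {average:.2f}s  best: {best:.2f}s  "
      f"worst: {worst:.2f}s  spread: {spread:.2f}s")
```

The average is your number. The spread is the more honest metric: a fast best run with a wide spread means the draw still depends on everything going right.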

4 replies
  1. @southpaw_09 (5d ago)

    Good post. Honest about the gap between what the timer says and what people think they're doing.

    One push I'd offer: 1.5 from concealment works as a *functional baseline* if you're wearing a t-shirt and jeans every day. It doesn't scale the same way across clothing variables, and I think that deserves its own line.

    I've timed myself cold at 1.38–1.42 in summer carry (AIWB, minimal fabric). Same draw, winter coat—I'm at 1.8–1.9. Not because my technique changed. Because the draw path got longer and the grip window tighter. I've seen people hit 1.5 in shirt-only conditions, then get surprised when a button-up or jacket adds half a second. That's not failure. That's just the clock showing you what your *actual* environment demands.

    So the honest caveat I'd add: test in what you carry in. If you live in a place where you're always light-dressed, 1.5 is a real anchor. If you rotate seasonal layers or carry over a holster wedge, your baseline shifts, and pretending it doesn't sets people up to misread their own readiness.

    Do you build the 1.5 standard in just short-sleeve conditions, or do you have folks re-time once they layer up for winter? Curious whether that variance shakes out in your student data.

  2. **Let me break this apart, because this is an internet argument that mostly doesn't hold up once you separate gear variables from decision-making.**

    Southpaw's right about clothing. But I think the real problem is we're treating 1.5 as a *universal* number when it should be *your* number—built inside the constraints you actually live with.

    **What's the actual threshold we're measuring?**

    You're both describing a *presentation shot*—hand off body to first round downrange, static, no target transition, no movement decision. That's useful for baseline. But the number that matters in a defensive shoot is this: how fast can you *present and make a hit* under the specific conditions you dress for? Clothing, holster cant, body position, lighting. Not the range timer version.

    **Why clothing matters more than people admit:**

    Southpaw's data—1.38 to 1.8+ depending on layers—that's not noise. That's real. I see it constantly. People build speed in their testing clothes (summer, t-shirt, alert), then carry in winter gear and don't recalibrate. They feel slower. They *are* slower. Then they blame the gun or the holster instead of admitting their training condition doesn't match their carry condition.

    **Here's what I tell students:**

    Time yourself in what you actually wear most. Not best-case. Most. If that's AIWB under a tucked shirt three seasons a year, time it that way. If it's appendix under a sweatshirt or jacket October through April, time it that way. Build the 1.5 standard—or whatever your standard is—inside your real environment. Then you'll know whether you're ready, and you won't blame the gun when your presentation time shifts.

    **Concrete ask: What's your primary carry scenario?** That's where your par time lives, not on some internet baseline.

  3. @solo.range (2d ago)

    Both those points land, but I'd push the frame back a step. The real problem isn't figuring out *your* number—it's that most people never build a par progression *to find it*.

    I don't train with partners. I train with a notebook and a timer. What I learned: you can't just test in your carry clothes and call that a baseline. You have to isolate the variables one at a time, or you don't know what's actually slow.

    Here's how I structure it now:

    **First:** Establish the draw itself in optimal conditions. Shirt-only, alert, good light. Time 10 reps. That tells you what your *technique* can do when friction is zero. For me, that was 1.36 average over a month of dry rep work.

    **Then:** Layer in one variable at a time. Add a cover garment. Time 10 reps at the same par. Most people won't hit their baseline—that's the data you need. Don't blame the gun yet. The holster geometry changed, the grip window tightened, the presentation angle shifted. That's a *drill problem*, not a readiness problem.

    **Then:** Add the next variable. Different stance. Different lighting. Different holster depth. One thing per session, 10 reps, notebook entry.

    What I found: my 1.38 in summer translates to 1.58 in a button-up, 1.71 in a jacket. Not failure. Not gear failure. Just the clock showing me where I actually am in each configuration. Once I documented that, I could set real par times—1.5 in light carry, 1.65 in heavy carry—and train to them instead of chasing a number that doesn't account for what I'm actually wearing.

    You find your number by testing *your* conditions. Systematically. One variable at a time. That's the drill.
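    That notebook reduces cleanly to a delta table. A sketch of the bookkeeping, assuming you log a 10-rep average per configuration (numbers hypothetical, loosely matching the ones in this thread):

    ```python
    # 10-rep average draw time per configuration, in seconds.
    # Hypothetical log entries -- one variable changed per session.
    sessions = {
        "shirt-only": 1.38,
        "button-up": 1.58,
        "jacket": 1.71,
    }

    baseline = sessions["shirt-only"]

    # The delta shows what each variable costs, independent of raw speed.
    deltas = {config: round(avg - baseline, 2) for config, avg in sessions.items()}

    for config, delta in deltas.items():
        print(f"{config}: {sessions[config]:.2f}s  (+{delta:.2f}s over baseline)")
    ```

    The point of the delta column: a +0.33 in a jacket is a drill problem in that configuration, not evidence your baseline technique got worse.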

  4. @m.delacroix (1d ago)

    Solo.range's progression structure is solid. I run the same isolation method, and the data backs it.

    Here's where my numbers land after tracking this for two years across 40+ shooters:

    **Baseline draw (shirt-only, optimal conditions):** 1.34–1.52 range for people with 500+ dry-fire reps. That's the ceiling most people should expect to hit.

    **Single variable layering:**

    - Button-up shirt (untucked): +0.15–0.24 seconds average
    - Light jacket (AIWB): +0.22–0.31 seconds
    - Heavy coat (OWB): +0.28–0.40 seconds

    Those aren't outliers. Those are medians across 180 timed sessions.

    **What changes the outcome:** Holster position and draw angle matter more than the garment weight. An AIWB rig under a heavy coat stays tighter than an OWB under a light layer. Cant angle, ride height, and how the fabric drapes over the gun dictate your grip window more than the fabric itself does.

    **The par progression that actually works:**

    1. Establish draw in optimal conditions (10 reps, three separate sessions minimum)
    2. Add one carry configuration you use regularly (10 reps, log the delta)
    3. Layer a second config you rotate into (10 reps, measure again)
    4. Only after you have three solid data points—your actual baseline plus two real-world variables—set par times for each

    Don't average them. They're separate numbers now.
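    Keeping them separate might look like this in a log script: group raw reps by configuration and derive one par per configuration. The data is hypothetical, and the par rule here (slowest logged rep, rounded up to the next 0.05 s) is just one defensible choice, not a standard:

    ```python
    import math

    # Raw rep times grouped by carry configuration (hypothetical data).
    reps = {
        "light carry": [1.41, 1.48, 1.52, 1.44, 1.50],
        "heavy carry": [1.60, 1.68, 1.72, 1.63, 1.66],
    }

    # One par per configuration -- never averaged across configurations.
    pars = {}
    for config, times in reps.items():
        worst = max(times)
        # Round the slowest rep up to the next 0.05 s and train against that.
        pars[config] = math.ceil(worst / 0.05) * 0.05

    for config, par in pars.items():
        print(f"{config}: par {par:.2f}s")
    ```

    Averaging the two lists would produce a par you can't hit in winter and don't need in summer, which is exactly the failure mode above.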

    **Caveat:** This takes longer than people want. Most shooters want to time themselves once and call it done. That's not training. That's benchmarking one condition. If you rotate between summer carry and winter carry, you need separate par times and separate drill work. Your 1.5 par in July doesn't mean 1.5 in December. The gun and your technique didn't change. Your environment did.

    Track what actually changes. Build your numbers inside those changes. That's how you know where you are.