One hundred years ago this week, Babe Ruth hit the first of his 714 home runs--a record that stood for a lifetime.
Also about a hundred years ago (in 1911, actually), a guy named Clarence DeMar won the first of his seven Boston Marathon victories. Everyone in America has heard of Babe Ruth. How many, other than us runners and a few very old Bostonians and Hopkintonians, have ever heard of Clarence DeMar?
I bring up this question because I've been doing a lot of cogitating lately about the largely forgotten role of human endurance in our culture of ever-greater speed and ever-quicker gratification in all things. We live now in what I call a "sprint culture." In the physiology of sport, of course, a sprint is an action that can last no more than 1 to 2 minutes. If you run as fast as you can, you can keep it up for maybe 440 yards (most of us not even that far) before having to stop, stooped over with your hands on your knees, gasping. You've gone anaerobic, and until you pay off the oxygen debt and get rid of all the metabolic waste you've accumulated in your system, you're done. Yet, a well-trained long-distance runner, using highly efficient aerobic metabolism, can go 2, 3, 4, or 5 hours nonstop (or in an ultra, 6, 12, or 24 hours or longer with only brief stops) and cross the finish line without gasping for breath.
Our civilization has a comparable choice, based on the same metabolic principles. Our industries burn carbon fuels and breathe out carbon dioxide at a rate that will catapult us into civilizational burnout within decades--or we can shift to sustainable industries and behaviors that will let humanity thrive for many generations to come. But that's another story, which I tell in a book that will be out in October. More about that later.
Back to Babe Ruth and Clarence DeMar: it strikes me that their relative fame (or lack thereof) reflects our culture's obsession with quickness and speed. We want faster airplanes, faster computers, faster TV thrills, faster relief from pain, faster returns on investment. For the Babe, who came along in the early years of this developing obsession, it took only a half-second swing of the bat to bring 60,000 people to their feet. A thrill in an instant.
For a very good marathon runner in those days (when much less was known about training methods, nutrition, etc.), a finishing time of 2 hours and 30 minutes was an effort that took 18,000 times as long as a swing of the baseball bat.
So, quite aside from the time spent training and laboring in now-forgotten competitions, how do the peak achievements of Babe Ruth's 714 home runs compare with Clarence DeMar's 7 Boston Marathon victories just in terms of the number of minutes of iconic athletic performance they gave us? It's a worthy comparison, because both were about equally unequaled by other men.
Discounting the ritual run around the bases after the ball flies over the fence, if we count each home run as a marvelous half-second, the Bambino's actual time hitting home runs adds up to 357 seconds, or about 6 minutes. DeMar's Boston Marathon wins, averaging somewhere around 2:25 each, add up to more than 1,000 minutes. So, who's the greater athlete?
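The arithmetic behind this comparison is easy to check. Here's a small sketch; the half-second swing and the roughly 2:25 average winning time are the estimates from the text above, not measured values:

```python
# Rough comparison of "minutes of iconic athletic performance,"
# using the estimates from the text (assumptions, not measurements).

SWING_SECONDS = 0.5    # estimated duration of one home-run swing
HOME_RUNS = 714        # Babe Ruth's career home-run total
DEMAR_WINS = 7         # Clarence DeMar's Boston Marathon victories
AVG_WIN_MINUTES = 145  # ~2:25 average winning time, in minutes

ruth_minutes = HOME_RUNS * SWING_SECONDS / 60
demar_minutes = DEMAR_WINS * AVG_WIN_MINUTES

print(f"Ruth:  {ruth_minutes:.1f} minutes of home-run swings")
print(f"DeMar: {demar_minutes} minutes of winning marathons")

# And a 2:30 marathon measured in half-second bat swings:
swings_per_marathon = 2.5 * 3600 / SWING_SECONDS
print(f"One 2:30 marathon = {swings_per_marathon:.0f} swings")
```

Run it and you get the numbers in the text: about 6 minutes of swings against more than 1,000 minutes of marathoning, and the 18,000-to-1 ratio mentioned earlier.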
OK, OK. If you're a rabid baseball fan or spectator-sports junkie in general, you'll have a heap of objections to this. I don't mind, and I'll even acknowledge in advance that in some respects this comparison may be considered absurd. But as many playwrights, novelists, and artists have found, contemplation of the absurd sometimes provokes us to new perspectives on the even greater absurdities of the culture in which we live. I think I need to get out for a run.
Josh about to pass his grandpa
Here, at age three, Josh regularly runs a mile or two with me--and I have to work hard to keep up!
Thursday, May 10, 2012
Sunday, May 6, 2012
"Best Trail Ultra" -- Who Wrote This?
I was in a Barnes & Noble looking at magazines a few days ago, and found a magazine called "Trail" (not the regular Trail Runner), in which there was a slick feature about "best" American trail races in all sorts of categories: "Best Partner Race," "Best Marathon You've Never Heard of," "Best Barefoot Event," "Best Race on Reclaimed Industrial Land," "Most Zany Fun Run" . . .
I noticed the item "Best Ultra Race," and out of curiosity took a moment to read it. I had read in Ultrarunning magazine that there are now more than 550 ultras in the U.S., so how does anyone decide which one is "best"?
The selection was "The Western States Endurance Run." This was no surprise. Western States is probably the most famous, most publicized, and--as I note in a forthcoming piece (August issue) in Running Times magazine--the most "legendary" ultra. And, while I have run around 50 ultras (along with around 600 other long-distance races) in my life so far, I wouldn't particularly argue about this one-shot magazine's designation of Western States as "the best."
What amazed me, though, was the writer's reason for selecting Western States. In the interests of journalistic accuracy (and what journalists call "fair use"), let me quote the three-sentence reason the writer gave:
"This is the granddaddy of them all, the 100 mile race that spurred the sport of ultradistance trail running in the U.S. It morphed out of a horse race; when Gordy Ainsleigh's horse came up lame in 1974, he decided to run the entire course through the Sierra Nevada mountains himself. Since then, Western States has become the Boston Marathon of trail running, attracting the best trail runners from around the world."
I have read a lot of "rewriting of history" over the years, but this one should get a prize. Hey, I have a few dusty trophies and medals from popular ultras I ran in the years before Western States was born, so maybe I could send the writer one of those. And maybe I could enclose a note asking how Western States got to be the "granddaddy" of major ultras that were older than it is.
For example: In 1973, the year before Gordy Ainsleigh became the lone first finisher of the first Western States, the JFK 50 Mile in Maryland had 673 finishers. Other ultras that thrived before Western States include the Sri Chinmoy hundred-mile and thousand-mile ultras in New York, the Two Bridges 36-Mile in Washington, DC, a series of 6-day races in New Jersey, and others I'd have to dig out my dusty old copies of the long-defunct Long-Distance Log to recall.
In one respect, the "Trail" blurb is right: ultrarunning boomed in the 1980s, and Western States certainly had a role in it. But the boom had nothing to do with the launch of that race in 1974; it was spurred by the ABC network TV show "Wide World of Sports," which featured Western States in 1984 and '85, and gave ultrarunning a kind of exposure to the general public it had never had before. In a very PR-savvy rewriting of history, Gordy Ainsleigh's heroic feat was recast as the launch of the ultra boom. But it wasn't. A TV special was, ten years later.
The real "granddaddy" of American ultrarunning, of course, is not Western States but the JFK 50, which had 1,311 finishers over the decade before Western States had its first, and which has had hundreds more finishers than Western States in every one of the 37 years Western States has been run.
Of course, there will be those who say Western States is the more "legendary" race because it's more competitive. Well, I won't at all disagree with the "Trail" special's observation that top runners come from all over the world to run Western States (although that has been true only in the past few years). But they come to JFK, too--just with much less fanfare. In 2010, when seven-time Western States winner Scott Jurek came to JFK, a lot of us assumed he'd win in a breeze. He finished 11th.
As for what race is "the best" trail ultra, again, I wouldn't argue with the "Trail" writer. Western States has a spectacular course through the Sierras--taking you through pristine snow and 100-degree heat just hours apart; it offers awesome canyons, vistas, and climbs; it has superb aid stations and organization. And the legacy of Scott Jurek's seven wins (and Ann Trason's 11 wins) there is truly remarkable. I also know that at least 20 other ultras around the country might make the case that they are "best," but just less publicized. The Lake Waramaug run in Connecticut, for example, started up the same year as Western States (actually three months earlier than WS), and became iconic to ultrarunners in the late 1970s but never became known to the public. "Best" is a subjective word. To paraphrase an old saw, "Best is in the eyes, heart, and feet of the beholder."
My point is just that in the world of long-distance running, "best" cannot be determined by TV producers, film-makers, and publicists. It can only be decided by runners.
Friday, May 4, 2012
The State of Running, 2012: Have We Gone Off Course?
My article by this title, in the May issue of Running Times magazine, questions whether some of us long-distance runners may be unconscionably failing to share, with the quick-gratification, tech-addicted culture that surrounds us, something essential we've learned about human nature and survival. Read it here: http://runningtimes.com/Article.aspx?ArticleID=25868
Sunday, April 22, 2012
Barefoot vs. Running Shoes: Can't We Talk?
A disturbing divide has appeared in what was once considered the long-distance-running "community." Maybe it was inevitable: The more than 50 million Americans who run regularly constitute too large a population to share similar views about anything at all, including running!
The divide, which really shouldn't divide us at all, is about barefoot running. Judging by what I've seen on Twitter, Facebook, running blogs, and other popular communications, barefoot running has become not just an important new phenomenon of 21st-century athletic culture; it has also become something close to a religion.
What's disturbing is that religions, while virtually always arising from the most essential and uplifting of human desires--to understand the meaning of our lives--can, if taken to extremes, lead to hardened views, intolerance, and alienation. And sadly, I see signs of that with barefoot running. There's a fervor among barefooters that, if I'm not mistaken, seems to say "If you wear traditional running shoes, you are not a true believer!"
Before I go any further, let me make two quick points that are very pertinent to what follows:
1. I am not "against" barefoot running. On the contrary, in my forthcoming book (coming in October), I include some fairly extensive discussion of the now very persuasive scientific evidence that modern humans evolved as long-distance-running persistence hunters who probably ran barefoot for tens of thousands of years before civilization began. I'll post some excerpts of that discussion in the coming days. There's reason to believe that running barefoot (and naked) profoundly affected many of the sensibilities, pleasures, and preferences we have today.
2. I train and race in traditional running shoes--not minimalist shoes, which I tried decades ago, long before the recent minimalist resurgence--but the kind of comfortable shoes popularized by Nike's Bill Bowerman, Adidas-clad Olympians, and others in the 1970s and '80s. The shoes I run in now are designed for lateral stability, cushioning, and protection of the toes from kicking rocks, and I'm glad I can still get them. I might sound like an old curmudgeon who doesn't know a good new thing when he sees it, but I have to say I'm not just any old curmudgeon. I've been running long-distance races for 54 years in a row, at distances from 2 miles to 100 miles and beyond, and my knees, back, hips, and feet are all still healthy and intact. Might that not say something about the possible value of having protective running shoes?
These two points (#1 and #2 above) might seem contradictory. How can I enthusiastically support the view that the human animal is by nature a barefoot runner, yet always run in shoes myself?
A simple answer is that modern humans left the wild world behind about 10,000 years ago, with the advent of civilization. Civilization arose via the invention of agriculture, which allowed us to settle in fixed locations thanks to the domestication of wild plants and animals. Crops replaced foraging for edible plants, and livestock replaced hunting. And along with the domesticating of animals and plants, we humans domesticated ourselves. That's the part our souls rebel against. We are no longer wild. And sadly, in many ways, it's no longer possible for us to live as if we are.
That's the simple answer. But of course, the reality is more complex. Civilization brought enormous changes to the hunter-gatherer life of our Paleolithic ancestors. Those changes included a now rapidly growing dependence on technological assists to our bodies and brains. We no longer have to lift heavy objects with our arms because we have forklifts, elevators, and airplanes. We no longer have to develop acute mental mapping (as our hunter-gatherer ancestors did) because we now have GPS. We don't even have to manually turn the faucet in a public restroom, because they've installed motion sensors.
The problem is that little by little, the techno-assists cause some of the native capabilities we once had as wild humans to weaken. As athletes, we all know that a muscle that isn't used will atrophy. That's also true of mental functions. A well-known University College London study found that London taxi drivers, who spend years navigating the city from memory, had an enlarged hippocampus (the part of the brain largely responsible for finding your way, among other functions) compared with ordinary drivers--and researchers have since cautioned that outsourcing navigation to GPS deprives the brain of exactly that exercise. And extensive research has indicated that people who don't read, think, or otherwise actively use their brains are more likely to slip into dementia or Alzheimer's disease.
Weak muscles or weakened mental functions may be what happens to individuals who don't get sufficient physical or mental exercise. But weakening can also be multi-generational, as a result of either biological or cultural evolution. Biological evolution is very slow, and we are pretty much the same animal now as we were in Paleo times. According to evolutionary biologists like Harvard's Edward O. Wilson, we have about the same genetics now as we did 100,000 years ago. That might explain the powerful hold that the feel of sun or breeze on bare skin, or the earth under bare feet, has on so many of us. But cultural evolution can be staggeringly fast, with substantial impacts on brain and body. And one of the parts of us most directly affected is our feet.
When we stopped chasing antelope as a way of life, and started wearing shoes, our feet became profoundly more passive than they'd been. Every foot has 26 bones (28, counting the sesamoids) and a huge complex of ligaments, tendons, muscles, blood vessels, and nerves, and that complex developed through many millennia of running for hours at a time over variegated landscape that necessitated skillful navigation and balance. Every bone, tendon, and muscle had a regular, active role.
But what happens when those roles are abandoned for thousands of years? The answer isn't simple, but a useful way of approaching it is to ask what happened--over many generations--as wild wolves were transformed into dogs. Some dogs are a lot like wolves; others are unrecognizable. Dog breeding is driven by human selection, not natural selection. A show-dog poodle wouldn't survive two weeks in the wild. A German shepherd or husky might be OK.
An analogous situation prevails with domesticated humans. Some of us may still have feet that can successfully re-adapt to that pre-domesticated, persistence-hunter mode; others may not. I'm afraid I'm one of those who don't. I wish I did. But I can still enjoy running, and still run well.
An educated assessment might be this: Genetically, all humans have the capability to feel--and even yearn for--the pleasure of barefoot running. But biomechanically, many and probably a majority of us can't actually do it with any long-run success, except perhaps on pine-needle paths or nicely groomed trails in parks (which are themselves domesticated wildlands). In true wilderness, I'd guess very few can. In most of us, our 28-bone wonders have gone unused--encased in coffin-like shoes or boots--for too many generations.
So, why can't we discuss this in a reasonable way? Instead of romanticizing barefoot running the way some 19th-century painters romanticized the state of nature as if it were a Garden of Eden, why can't we recognize that (1) we are all genetically capable of appreciating the feel of bare feet on earth, but (2) only a very few of us can do so for very long without injury?
To put this in perspective, I offer two recent observations:
1. At America's largest ultramarathon (the JFK 50-Mile) last November, where virtually all of the 1,000-plus starters were seasoned trail runners, I didn't see a single one of those runners attempting this event with bare feet. Why not?
2. In a recent issue of Ultrarunning magazine, I noticed a photo of an unusual sight in this sport--a barefoot runner. But oddly, this runner was also wearing a large assortment of equipment other than shoes: a CamelBak-style hydration pack, belt, bottles, shirt, hat, etc. And if I recall correctly, this runner finished next-to-last. If you're truly a minimalist aspiring to "run free," why all this stuff? If you truly wish to learn the way of the Paleo runner, and are willing to undergo the training needed to re-adapt your feet, wouldn't you want to do the same with your other needs as well--running as naked and unencumbered as possible, or relying on streams or wild berries (as I did one summer when I fell and my water bottle spilled) for some of your hydration? Wouldn't you want to develop some adaptation to heat and dehydration instead of carrying a ton of water on your back?
The barefoot boom has brought us a great opportunity to engage in a lively discussion of topics that are deeply meaningful to us all: how we evolved as humans, how civilization and domestication have changed us, how much we want to let our native capabilities be replaced by technology, and what makes us differ from each other as individuals while also having so much in common. And finally, how we can live most successfully with the internal tugs-of-war all modern humans must feel--between the bodies and brains we developed in a wild world long ago, and the vastly different conditions we live under now.
Monday, April 9, 2012
Irresponsible Media and the Death of an Ultrarunner
The death of a well-known ultrarunner, Micah True (aka Caballo Blanco), who'd been lost for four days after going out for a run in a New Mexico wilderness, sent a tremor through social media a few days ago. For millions of Americans, running trails has become a passionate avocation, and the story of Micah True, as recounted in the book Born to Run, has been an inspiration for many. His death recalls the similarly shocking demise of Jim Fixx, author of the bestselling Complete Book of Running, in 1984. Fixx's book had helped promote the belief that distance running is good for one's heart, overall health, and longevity--and his death at age 52 brought a hail of "I-told-you-so" scorn from skeptics who thought that belief was hokum. Micah True was 58 when he died--not much older than Fixx was.
Since Fixx's death, of course, the skeptics have been resoundingly refuted. Thousands of cardiologists, sports-medicine physicians, and other doctors are now avid distance runners--many of them ultrarunners like True. The evidence of enhanced cardiovascular health and longevity bestowed by aerobic running has only been strengthened. But with True's death, some of the skeptics are back. And once again, they are wrong. And now, with millions of Americans hitting the trails (surveys by the Sporting Goods Manufacturers Association indicate that more than 49 million Americans are active lifestyle runners or joggers), the damage that could be done by misinformed skeptics is much larger.
Here's the problem. Americans have fought a losing battle against obesity, passivity, and sloth over the past half-century--partly because our economic system pours an endless river of money into advertising and marketing junk food, junk science, and pop-culture pursuits of quick gratification. By comparison, only a minuscule amount of funding finds its way to enlightening the population about the kinds of lasting benefits to body, mind, and soul that come from activities like long-distance hiking, swimming, bicycling, mountain climbing, or long-distance running. That makes the public image of endurance sports very vulnerable to media distortion or marginalization. Sports like ultrarunning get no attention from the nightly sportscaster chatter on ESPN or NBC, or newspaper sports pages managed by editors who've never even heard of the Leadville 100-Mile or American River 50.
So, if a bizarre news item like the death of an ultrarunner does come to their attention, the resulting stories give the public a hugely distorted impression of the risks of wilderness trail running. One result is that the parents of young athletes might prefer to have their kids play football, under the watchful eyes of trained coaches, rather than let them go wandering off into the woods where who knows what might happen. Never mind that statistically, the danger of concussion and brain injury on the football field is vastly higher than the danger of serious injury or death on the trail.
Meanwhile, with such distorted impressions being promulgated, who gets hurt? Not we long-distance trail runners, who know perfectly well that someone dying while out for a run is in fact extremely rare, and who will not be in the least deterred. The people who get hurt are the general public, whose culturally inculcated and media-shaped reluctance to engage in physical exercise, not to mention endurance activities, will only be reinforced. The major-media sports reporters, ESPN talking torsos, and gladiator-sport groupies should be ashamed.
Wednesday, March 14, 2012
A False Alarm About Landing on Your Heels
Some recent research has revived a very old issue among runners--whether you should land on your heels (as in fact most of us distance runners do) or on the balls of your feet (forefeet) as most sprinters and top middle-distance runners do. The new research, reported by the Harvard professor of human evolutionary biology Daniel Lieberman and his colleagues, has raised a ruckus by suggesting that runners who land on their heels may be at greater risk of injury than those who land on their forefeet.
I have good reason to think that for the great majority of runners, that conclusion might be mistaken. I'm not challenging Lieberman's study at all, and in fact I think Lieberman is one of the most important figures in the science of human biomechanics. So, how could the injury-risk worry be a mistake? A simple answer is that unless you are a college-level cross-country runner or elite competitor at sprints or middle-distances, the study may not apply to you--and in fact its findings may be the opposite of what they'd be for you and me.
Here's why.
Of the approximately 50 million Americans (according to surveys by the Sporting Goods Manufacturers Association) who are active lifestyle runners or joggers, I'd bet that more than 95 percent are either (1) middle-of-the-pack road or trail runners (those at the back of the pack still get to call themselves "middle-of-the-pack") who run distances ranging from 5k to half-marathons, or (2) long-distance runners whose main interests range from half-marathons to marathons or ultras. People in those two categories are mostly runners who have a natural inclination to land on their heels. And for most, to land on their forefeet would feel so unnatural to them that few would even try it.
To illustrate this point, before going on to explain why heel-striking probably does not increase injury risk, consider the case of Clarence DeMar, the legendary Boston Marathon runner of a century ago. In his 1937 memoir, Marathon, DeMar tells of how he began his running career by going out for cross-country in his third year of college: "At that first trial . . . captain Stevens kept yelling at me, 'Run on your toes, on your toes!'" DeMar tried it, but "I couldn't get the idea of hitting the toes first.... Sometime within a year I learned how to run on my toes. Still, I have never done that any more than one third of the time." Staying mainly on his heels, DeMar went on to win the Boston Marathon seven times. And considering how little was known about exercise physiology, optimal training, shoes, nutrition, and endurance fueling at the time, his sub-2:20 marathons would probably rank him right up there with the top American marathoners of today.
When I read that "run-on-your-toes" passage in DeMar's memoir, I almost fell out of my chair in surprise: the same thing had happened to me, half a century after it happened to him! When I went out for cross-country as a high-school sophomore in 1956, my coach took one look at me and shouted, "Run on your toes! Run on your toes!" I, too, learned to do it, and as it turned out I was quite successful at cross-country; the following year, I broke the Westfield (N.J.) High School course record that had been set by Westfield's New Jersey state champion Edgar Hoos ten years earlier. I ran fairly well in college, too. But--keep this in mind--it was in cross-country that I received and followed that "run-on-your-toes" admonition, as it was for DeMar and, no doubt, thousands of others. I'll come back to this cross-country connection in a minute.
A few years after my graduation from college, in the mid-1960s, I decided to run a marathon. I entered the Cherry Tree Marathon in New York City (a predecessor of the New York Marathon), and at around 17 miles into the race found myself catching up with a man I knew to be a legend in the New York running community--Ted Corbitt. Ted had run the marathon for the U.S. in the 1952 Olympics, and had been the national champion in 1954. He'd also been the founding president of the New York Road Runners Club, and first president of the Road Runners Club of America. He was older now, and not as fast as he'd once been, but I could hardly believe I was catching him. As I pulled alongside, Ted glanced at my feet and smiled, and said, "You know, you might run easier if you let yourself land on your heels." It was as if I'd been spoken to by God. I took what he'd said to heart, and over the next several months I let myself gradually return to the way I'd landed on my feet before that first day of high-school cross-country. My feet, ankles, legs, and back all became more relaxed, and ever since, I've felt more in touch--well, I have been more in touch--with the ground I was running on. Years later, I would surmise that that quiet suggestion from Ted Corbitt had added 20 years to my running longevity (I'm now in my 55th consecutive year).
Then, last month, the study reported by Professor Lieberman and his colleagues at Harvard hit the news. What they found was that over a four-year span, Harvard cross-country runners who landed on their heels had higher rates of injury than those who landed on their forefeet--the very opposite of what my own experience had suggested. The result seemed counterintuitive, because in short events, from 100 meters to 5000 meters or so, where most top runners do land on their forefeet, the greater speed and longer strides required for success in those races require greater biomechanical force, which puts greater stress on the legs and feet. In the slower, longer, distance events, most of us land on our heels--exerting less force and therefore presumably making ourselves less vulnerable to injury than we'd be if we ran like sprinters or 5000-meter runners.
The Harvard study seemed to belie my Ted Corbitt epiphany, but I think I might have an explanation. That study's results were for cross-country, which is run at neither very-short (primarily forefoot) nor very-long (primarily heel-strike) distances. At the college level, cross-country races are run at in-between distances, around 10,000 meters, where both heel-strikers and forefoot strikers can be competitive--and where of course both were represented on the Harvard men's and women's teams. But--here's the rub--the training for cross-country requires a lot of interval work, or other high-speed running. The Harvard runners who were naturally inclined to run on their heels may therefore have been farther out of their element--pushing the envelope, biomechanically--than were the natural forefoot strikers. The stresses on feet and knees were therefore relatively greater for the heel strikers than for their forefoot-striking teammates, so their vulnerability to injury at those speeds was greater. I suspect that if the same group of runners had been training for marathons or ultras, with much less speedwork but greater mileage, it would have been the forefoot strikers who were more out of their element--and the injury results might well have been reversed.
That explanation raises the question of why long-distance runners shouldn't (or don't) take advantage of the greater power that can be deployed by running on their forefeet, just as sprinters or milers (or about a third of the Harvard cross-country runners) do. The answer is suggested by another study of the heel-vs-forefoot question (reported by David Carrier, the University of Utah persistence-running theorist), which found that while heel-striking is slower, it is about 55 percent more energy-efficient than running on the forefeet. In shorter-distance foot races, just as in short-distance car races, energy efficiency offers no competitive advantage. If a drag-race driver knew he had the fastest car, it wouldn't matter to him if the car were so energy-squandering that it got only 2 miles to the gallon. Only its power and speed would matter. Similarly, if a sprinter or 800-meter runner gets only two-thirds the distance per 100 calories of fuel that a heel-striker gets, it doesn't matter. But for a marathoner or ultrarunner, it matters hugely. Sprinters have to run on their forefeet to maximize their power. Most ultrarunners land on their heels to maximize their staying power. Clarence DeMar, it seems, knew this all along. Reflecting years later on that "run-on-your-toes" exhortation, he mused that forefoot running "seems to me a trifle faster, but it is more fatiguing."
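The economy argument above can be put in rough numbers. The sketch below uses the article's illustrative ratio (a forefoot striker covering about two-thirds the distance per 100 calories that a heel-striker gets); the baseline figure of 1.5 km per 100 kcal is a hypothetical placeholder, not measured data.

```python
# Back-of-envelope comparison of marathon fuel cost for the two
# footstrike styles, using the article's two-thirds economy ratio.

def distance_per_100_kcal(base_km: float, heel_strike: bool) -> float:
    """Distance covered per 100 kcal of fuel. Per the article's figure,
    a forefoot striker gets roughly two-thirds of a heel-striker's
    distance from the same fuel."""
    return base_km if heel_strike else base_km * (2 / 3)

HEEL_BASE_KM = 1.5   # hypothetical: km per 100 kcal for a heel-striker
MARATHON_KM = 42.195

heel_kcal = MARATHON_KM / distance_per_100_kcal(HEEL_BASE_KM, True) * 100
fore_kcal = MARATHON_KM / distance_per_100_kcal(HEEL_BASE_KM, False) * 100

print(f"Heel-strike marathon cost: {heel_kcal:.0f} kcal")
print(f"Forefoot marathon cost:    {fore_kcal:.0f} kcal")
```

Whatever the true baseline, the ratio is what matters: at two-thirds the economy, the forefoot striker burns half again as much fuel over the same distance, a penalty that is irrelevant over 800 meters but decisive over 26.2 miles.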
I had been a good cross-country runner in high school and college, but at the Cherry Tree Marathon I was a prime candidate to learn how much more easily and sustainably I could run this much longer distance--and eventually run even farther--by reducing the force of my footplant. Ted Corbitt, who a few years earlier had been the national marathon champ and the best ultrarunner in the country, knew that at a glance.
Sunday, January 1, 2012
How Do Runners Breathe? "Born to Run" Got It Backward. Why Won't the Author Own Up?
Chris McDougall, author of Born to Run, is wrong again. And this time, it doesn't seem to be either his relative lack of experience as a runner or his carelessness in checking the facts about what he writes. This time, it looks like outright denial that he made a serious mistake--a mistake that could cause a lot of discomfort to other people.
I don't like bringing this up, because I think McDougall did something quite admirable in writing a book that has evidently inspired many thousands of people to take up running. As I said in a previous post, his tale about the barefoot-running Tarahumara Indians is quite entertaining. However, I also pointed out that he was quite mistaken about a few things--such as his suggestion that Nike running shoes were the cause of innumerable running injuries, and that the University of Utah researcher Dennis Bramble had said that we human runners have an advantage over other (quadruped) animals because humans can take more than one breath per stride--as if that were somehow an advantage.
In my earlier post, I noted that when I read that purported quote, I knew something had to be wrong. As an experienced long-distance runner, I knew perfectly well that it has to be the other way around--humans normally take two or more strides per breath. Anyone who tries running the way McDougall implied, in his quoting of Bramble, would quickly hyperventilate. I e-mailed Bramble to ask if he'd been misquoted, and he confirmed that he had.
Then, a few days ago, I got an email from a sports- and exercise-medicine physician, Dr. Rajat Chauhan, who had seen my post and quoted it in a post of his own, on a Wall Street Journal blog. Dr. Chauhan, like Dr. Bramble, confirmed my point. A couple of days later, Dr. Chauhan sent another email saying that McDougall had commented on his post:
Rajat, you and Ed Ayres are incorrect. The passage you mention in "Born to Run" refer(s) to two steps per breath, not two breaths per step.
Unfortunately, that isn't what the passage says at all, as Dr. Chauhan pointed out in the following reply:
In hardcover copy of 37th printing, January 2011 edition of book Born to Run: a Hidden Tribe, Superathletes, and the Greatest Race Never Seen, on page 223, Christopher McDougall states the following:
"Whenever quadrupeds run, they get stuck in a one-breath-per-locomotion-cycle," Dr. Bramble said. "But the human runners we tested never went one-to-one. They could pick from a number of different ratios, and generally preferred two to one." The reason we're free to pant to our heart's content is the same reason we need to shower on a summer day: we are the only mammals that shed most of our heat by sweating.
Clearly, this passage quotes Bramble as saying humans prefer two to one breaths per locomotion cycle (stride). But if there's any doubt about that being a misquote that could cause a lot of beginning runners a lot of discomfort or worse, two other points settle it. First, McDougall's wording "we're free to pant to our heart's content"(!) is an appropriate description for two breaths per stride, but not for two strides per breath. "Pant" is exactly what you'd do if you took two breaths per stride: you'd be like a dog panting with its tongue out after a hard dash. And second, there's that point about sweat, which McDougall seems also to have misunderstood. The reason a dog pants is precisely because it does not have the human's bare skin and capacity to get evaporative cooling from sweat, except maybe on its tongue. And because the human runner has that unique cooling mechanism, he or she does NOT have to "pant!" Since when is it desirable or comfortable, or even sustainable for more than a minute or two, for us human runners to "pant to our heart's content"? Apparently, McDougall got quite mixed up about what Bramble was saying about how human runners breathe, and about how that is related to cooling. Our cooling system means we don't have to pant like dogs with their tongues hanging out. In other words, a big part of what makes humans outstanding long-distance runners, compared with other mammals, is that we get a lot of distance for each breath we take. I have no doubt that many of those Tarahumara McDougall admires can run up steep hills with two strides per breath and on easier terrain with four or five strides per breath. I do it myself, all the time. Beginning runners need to know that they can too.
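The arithmetic makes the hyperventilation point vivid. The cadence figure below (90 strides per minute, i.e., 180 footfalls) is a common ballpark for distance runners, assumed for illustration rather than taken from the article.

```python
# Why "two breaths per stride" can't be right: compare breathing
# rates implied by the two readings of Bramble's 2:1 ratio.

STRIDES_PER_MIN = 90  # assumed cadence; one stride = two footfalls

def breaths_per_min(strides_per_breath: float) -> float:
    """Breathing rate implied by a given stride-to-breath ratio."""
    return STRIDES_PER_MIN / strides_per_breath

# McDougall's reading: two breaths per stride = 0.5 strides per breath
panting = breaths_per_min(0.5)   # 180 breaths/min -- dog-like panting
# Bramble's finding: two strides per breath
aerobic = breaths_per_min(2.0)   # 45 breaths/min -- sustainable breathing

print(panting, aerobic)
```

Three breaths per second is panting; 45 breaths per minute is ordinary hard aerobic breathing, which is exactly the distinction the passage above draws.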
Being wrong is understandable, of course; it's only human. "What's more worrying," writes Dr. Chauhan about McDougall's flat denial, "is his attitude: is he going back on what he says, but can't publicly say he is wrong. That could help the running community in a far bigger way."