Purging the bad stuff of 2012 so the new year can start with a clean slate has become something of a tradition in New York City’s Times Square, where the Times Square Alliance hosted its sixth annual Good Riddance Day last week.
A Dumpster and giant shredder were hauled in so people could trash all their worst memories – tax records, insurance forms and painful reminders of Hurricane Sandy, unemployment and cheating ex-boyfriends.
Catharsis is supposed to be a good thing, so here’s some health-related baggage we’d like to see consigned to the shredder before the year is over:
- The politicizing of health.
- Public messages that hype or oversimplify health issues.
- Health scare tactics.
- The demonizing of obesity.
- The urge to pathologize a condition in order to sell a product or service (“You aren’t coming down with a cold; you have a pre-cold!”).
- Cluelessness by health organizations about the patient experience.
- Automated phone systems that strand patients in voice-mail no-man’s-land.
- Pill bottle caps that are complicated and difficult to open.
- The scale at the doctor’s office.
Into the Dumpster with all of them! Here’s to a fresh start in 2013.
First it was fats, then carbohydrates. Now sugar has joined the ranks of nutritional villainy.
With Christmas approaching on a tidal wave of candy canes and gingerbread, one can’t help wondering: Is it OK to indulge in a little sweetness, or is sugar entirely bad?
There’s no denying a certain amount of hysteria when it comes to sugar. Critics claim sugar causes everything from hyperactivity to premature aging. A common – and inaccurate – belief about cancer is that cancer cells feed on sugar.
Some of this is hyperbole, but there’s also a considerable amount of science examining the effects, both good and bad, of sugar consumption. Sugar has been linked, for instance, to increased risk of weight gain, diabetes and heart disease. Earlier this year, some U.S. health experts went so far as to declare that sugar is as addictive and dangerous as alcohol or tobacco and should be regulated accordingly.
Unfortunately it’s not always clear whether sugar itself is the culprit or whether something more complex is going on.
At least part of the link between higher sugar consumption and weight gain may simply be the extra calories. The issue with sugar-sweetened sodas and other beverages isn’t just that they contain lots of sugar; it’s that they often end up replacing water or milk in someone’s diet. Processed foods high in sugar also can be higher in fat and sodium, which carry negative health effects of their own.
A certain amount of sugar is necessary in order for the human body to function, but moderation seems to be called for here. Consuming large quantities of sugar each day also tends to be a marker for an overall diet that may not be optimal for health.
This brings us to the real vexation: the proverbial sweet tooth. Why do so many people love sugar and why can it be so hard to consume less of it?
I admit to not totally understanding the whole sweet tooth thing. If you were to invite me to your holiday buffet, I would go directly to the spinach dip and the shrimp cocktail. Cookies and candy, not so much. But this wouldn’t necessarily be the case for other guests.
There’s debate about whether so-called sugar addiction is real or imagined. Some studies have found clinical similarities between food cravings and drug dependence. A study published this year in the Journal of Psychoactive Drugs found, for instance, that when people binge on sugar-dense foods, the amount of extracellular dopamine in the brain increases, which has the potential to lead to addiction.
The authors wrote, “There appear to be several biological and psychological similarities between food addiction and drug dependence, including craving and loss of control.” They also note that for some people, consuming these foods is comforting and therefore might be regarded as a form of self-medication.
So far, however, the sugar addiction theory has mostly been tested on rats and mice, with implications for human behavior that are unclear at best. A 2010 review in the journal Clinical Nutrition examined the evidence and concluded there’s nothing yet in the literature suggesting that humans can become addicted to sugar or that sugar addiction plays a role in obesity or eating disorders.
The bottom line is that when it comes to “sugar addiction,” the jury still seems to be out.
In the meantime, here’s some guidance. The U.S. Department of Agriculture’s 2010 dietary guidelines recommend limiting consumption of added sugars and solid fats to somewhere between 5 and 15 percent of total daily calories. The American Heart Association suggests no more than 100 calories a day of added sugar for most women and no more than 150 calories a day for most men. That’s about 6 teaspoons for women and 9 for men.
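The teaspoon equivalents follow from simple arithmetic. Here’s a minimal sketch of the conversion; the figure of roughly 16 calories per teaspoon of granulated sugar is a standard approximation, not something stated by the American Heart Association:

```python
# Convert the American Heart Association's added-sugar limits from
# calories to teaspoons. Assumes ~16 calories per teaspoon of
# granulated sugar (4 calories/gram x ~4 grams/teaspoon) -- a
# standard approximation, not an AHA figure.
CAL_PER_TSP = 4 * 4  # ~16 calories per teaspoon

def calories_to_teaspoons(calories):
    return calories / CAL_PER_TSP

print(round(calories_to_teaspoons(100)))  # women's limit: ~6 teaspoons
print(round(calories_to_teaspoons(150)))  # men's limit: ~9 teaspoons
```

Which is how 100 and 150 calories of added sugar translate to the 6- and 9-teaspoon guidelines.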
Here’s something else to keep in mind. For most Americans, the main source of dietary sugar isn’t the spoonful they stir into their coffee, a homemade dessert or even a Christmas cookie. Most of it comes in the form of added sugar – sugars and sugar-based syrups added to food during processing. Although the tendency is to single out highly sugared products, such as sodas, added sugar can show up in a variety of foods that may not be readily recognized as sweet – chicken nuggets, for instance, or ketchup or children’s breakfast cereals, all of which are often surprisingly high in sugar.
So go ahead and have that reindeer-shaped holiday cookie if you want. If you’re worried about the sugar, just take one.
Take a look at the guys around you this month and count how many of them are displaying more facial hair than usual.
Chalk it up to Movember, a global charity event that invites men to grow mustaches during November to raise awareness and money for men’s health. According to the website, the initiative had more than 854,000 participants – they’re known as “Mo Bros” – worldwide last year and raised $126 million for prostate and testicular cancer causes.
Well, fair enough. After all, the entire month of October is devoted to breast cancer awareness and fundraising and all things pink. Maybe it’s time men had their own health month.
But the critics are cautioning: Don’t be too quick to get behind this health campaign without asking more questions about what’s really being accomplished.
What is the substance behind the “awareness” the Movember campaign says it promotes? Take a look at the list of Movember health tips, which include a recommendation to get an annual physical: “Getting annual checkups, preventative screening tests and immunizations are among the most important things you can do to stay healthy.” Nary a mention is made of the debate surrounding the value of the yearly physical exam. Nor is there discussion of the risks vs. the benefits of prostate cancer screening, an issue of considerable controversy in the medical and scientific communities, or of how men can weigh the evidence to make appropriate, informed decisions.
Another health checklist on the website advises men 40 and older to talk to their doctor about the use of aspirin and statins to lower their risk of heart disease, even though the preventive benefit of these two therapies has not been clearly established in people who don’t have existing heart disease.
Most would agree men are well served by education that gives them accurate, realistic information about their health. Are they served as well by information that’s overly simplified or that fails to adequately convey evidence-based pros and cons? Or by messages that confuse screening with prevention?
Perhaps the bigger issue is whether Movember, which started out with good intentions, is turning into a gimmick that allows people to feel good about a cause merely by growing a mustache and donating a few dollars.
Blogger Ashley Ashbee calls it “a type of slacktivism.”
“Does your moustache share information about the importance of screening, or where to get screened?” she wrote last year. “Does it tell you how you can prevent prostate cancer (if you even can)? Does it tell you the symptoms? Does it tell you who’s affected?”
Moreover, critics say one of the flaws of catchy public awareness campaigns, whether they’re exemplified by mustaches or by pink ribbons, is that they can skew the public’s perspective about risk and disease and lead to inaccurate or exaggerated beliefs that sometimes spill over into health-related behaviors. Although prostate cancer is far and away the most commonly diagnosed type of cancer among men in the United States, it’s actually lung cancer that is responsible for the most cancer deaths in men. Heart disease continues to be a significant health risk for men and, many would say, is the leading male health issue. Men also outnumber women when it comes to alcoholism, fatal traffic crashes and suicide.
To their credit, the Movember organizers added men’s mental health this year to their list of causes. But whether this helps improve the public’s understanding about male health remains to be seen.
Toronto’s Globe and Mail spoke last week to medical ethicist Kerry Bowman of the University of Toronto, who lamented, “There’s not a direct relationship between the diseases we hear most about and either their occurrence in society or the lethality and the amount of suffering they create.”
Ideally, there should be a form of “ethical triage” that helps the public be better informed about the most widespread and urgent health care needs before donating their money to a cause, Bowman said. But for most fundraising campaigns, this kind of analysis is “very much lost,” he said.
Writer Ragen Chastain can think of several things that would be more fun than being under holiday surveillance by what she calls “the Friends and Family Food Police”: a root canal, a fishhook in the eye… you get the picture.
Chastain, who blogs at “Dances With Fat,” tackled the subject last year of holiday eating and the well-meaning individuals who comment, nag or react in other ways to someone else’s food choices, particularly if that someone is overweight.
She clearly hit a nerve, because the comment section quickly filled with stories about people’s experiences at the holiday dinner table.
One woman was scolded by a cousin for eating high-carb carrots. Someone else was told “You don’t need that!” when she reached for the bread.
For others, the guilt tactics were more subtle – for instance, people asking them if they’d lost weight, or commenting, “I’m really being bad, I shouldn’t be eating this” while downing a sliver of pie.
Maybe it’s the food, maybe it’s the family dynamics, maybe it’s the emotional expectations we have for Thanksgiving and Christmas. Whatever the reason, there’s something about the holidays that can bring out the worst in people’s guilt and disordered attitudes about eating. When I Googled the words “holiday food and guilt,” there were 7.9 million results.
If you’re on the receiving end of the guilt tactics, how do you cope?
Chastain, who teaches workshops on self-esteem and the Health at Every Size approach and has written a book, “Fat! The Owner’s Manual,” advises deciding where the boundaries lie and what the consequences are for those who overstep them.
She writes, “I give people clear information, and several chances, but I don’t keep anybody in my life who consistently fails to treat me with the level of respect that I require.”
This might mean, for instance, simply saying “yes” or “no” if someone asks whether you really need that second helping of mashed potatoes – and then proceeding to eat it. Or it might mean giving a pointed response when someone gets too persistent: “I have absolutely no interest in discussing my food intake with you.”
Although much of the food guilt is aimed at obesity, it’s a minefield for other people as well. Thin people are just as likely to have their weight commented on at the dinner table or to be urged to eat more. And for those dealing with or recovering from eating disorders, holiday meals can be doubly difficult. Not only must they cope with food, and lots of it, but they may also be subjected to intense scrutiny over how much, or how little, they’re eating and whether they’re sticking to their prescribed meal plan.
This isn’t to say people shouldn’t try to eat sensibly for the holidays. The amped-up food choices can be difficult for those who have diabetes, need to limit their sodium or cholesterol intake, or simply want to watch calories.
Some tips from the Duke University Health System: Sample a little of everything but balance it with more fruits and vegetables. Stock up on healthy snacks for when temptation hits. Eat before a party to avoid overdoing it. Drink moderately. Don’t be afraid to say no if someone applies pressure to eat more.
The real question about food guilt is whether it actually works. According to a new study by the Rudd Center for Food Policy and Obesity at Yale University, the answer is no.
Researchers asked 1,000 study participants to evaluate several public health obesity campaigns by rating how positive or negative the campaign messages were and whether they were motivating or stigmatizing.
The best ratings went to campaigns that promoted specific health behaviors, such as eating more fruits and vegetables, and campaigns that encouraged people to become confident and empowered. Those that ranked the highest didn’t even mention the word “obesity.”
The least motivating? Messages that promoted shame, blame and stigmatizing.
Someone who truly cares about a friend’s or relative’s health should discuss it alone, at an appropriate time and in a way that invites dialogue, rather than shaming him or her at the dinner table, says Chastain. “Guilt is not good for your health. So I hope that if you choose to eat it, you also choose to enjoy it.”
A seemingly frivolous tweet about superstorm Sandy drew a sharp rebuke this week from the Michigan Nurses Association.
The offending tweet came from the Detroit News and actually was a retweet of what another Twitter user (a civilian, apparently) said about the storm: “BREAKING: Frankenstorm upgraded to Count Stormula.”
The nurses’ association tweeted back:
Would appreciate media being serious; people died RT @detnews: RT @zhailey: BREAKING: Frankenstorm upgraded to Count Stormula #sandy
The organization’s communications director, Dawn Kettinger, then contacted media uberblogger Jim Romenesko to further press home the point. “People are hungry for information and connection right now,” she wrote. “Moments of misjudgment are understandable, but perhaps it is worth another discussion of how media use their resources and power.”
Is it ever OK to use dark humor when confronted with tragedy, whether it’s death, life-threatening illness or injury, or natural disaster?
It’s a little ironic that the “Count Stormula” criticism would come from the health professions, a group well known for using black humor to cope with the illness, injury, tragedy and stress its members encounter on a daily basis.
There’s been little formal research on the frequency of black humor among health care professionals, but it seems to be most common in the higher-stress specialties: emergency medicine, critical care and surgery.
A small-scale study several years ago, involving 608 paramedics in New Hampshire and Vermont, as well as members of the National Flight Paramedics Association, found that almost 90 percent used gallows humor as a coping mechanism. In fact, it was their most frequent method of de-stressing, even more than venting with colleagues or spending time with family and friends.
The use of irreverent terms such as “acute lead poisoning” to refer to a gunshot wound or “celestial transfer” to describe patients who have died is so widespread that several online dictionaries have been compiled to list all of them.
Some studies even have found that black humor can benefit clinicians and patients by reducing tension and allowing the clinician to focus on the situation at hand.
But there seems to be a growing consensus that it’s not acceptable to resort to black humor within earshot of patients and families, or to frame the joke at someone else’s expense (and I daresay this applies to the media as well).
A group of Canadian researchers who explored the issue agreed that although laughter is often therapeutic, cynical humor that’s directed toward patients “can be seen as unprofessional, disrespectful and dehumanizing.”
Furthermore, the use of black humor to cope is frequently a learned behavior that medical educators need to address, they wrote:
“There is disturbing but compelling evidence that medical education and acculturation are partly to blame, by tolerating and even fostering a certain detachment and cynicism. Recent moves to encourage the development and evaluation of professionalism in medicine embrace concerns about this issue and the distinction between dark humor about the human condition and the particular observations of those who style themselves as healers.”
At its worst, black humor can stereotype or dehumanize the patient and make it harder to be objective or empathetic, wrote a nurse at Those Emergency Blues, a group blog of Toronto ER nurses. When this happens, doctors and nurses can end up providing poor care, she wrote. “The wisdom is having the insight to understand the sources of black humour in our own relative helplessness, and to recognize it, first, as an inevitable part of our practice, and secondly, as having a time and place.”
Readers, what do you think? When is black humor appropriate and when is it unacceptable?
Two days after the news of Lance Armstrong’s fall from grace, I’m still not sure what my reaction should be. Disappointment? Disgust? Outrage?
Probably all three.
Not because he cheated his way to the top of the competitive cycling world, all the while steadfastly denying the doping allegations that dogged him – first in whispers, then in shouts – for years. Bad enough as this was, the real moral offense lies elsewhere: in leading everyone to believe in his amazing recovery from cancer, in accomplishing victories that inspired so many of his fellow survivors, when apparently all along his feats were artificially enhanced.
Were those Tour de France performances the real Lance Armstrong, the guy who battled back from testicular cancer that had spread to his abdomen, lungs, lymph nodes and brain, the guy who at one point was given only a 40 percent chance of survival? Or was his incredible comeback helped along by performance-enhancing drugs?
We’ll never know for sure. And in all honesty, I feel sort of betrayed by one of my own, because Armstrong and I belong to the same tribe: young adults who were diagnosed with cancer (in fact Armstrong’s diagnosis came a year after I completed treatment for lymphoma), beat the disease and returned to normal life.
The difference is that I didn’t cheat to do it while pretending to the rest of the world that my recovery was entirely the product of my own motivation and endurance. If anything, my life – like that of all too many other cancer survivors – has never gone back to what it used to be.
Truth be told, Armstrong has been a bit polarizing among some within the cancer community – not because of the long-standing doubts about whether he was doping but because he set the recovery bar so high.
This isn’t meant to diminish his accomplishments. Even without winning seven Tour de France races in a row, Armstrong would have been inspiring for his post-cancer career as a world-class cyclist – not to mention the founding of Livestrong, his charitable foundation that has done so much to offer help, hope and community to people with cancer.
Many survivors don’t fare as well as Armstrong, however. That cancer and the treatments for it are often accompanied by long-lasting and even permanent effects is one of the dirty little secrets of CancerWorld. It has only been within the last decade or so that the health community has begun recognizing this and trying to address it.
How people recover seems to be, at least to some extent, an individual outcome. Armstrong may have been better equipped than most, due to his training, physical and mental conditioning and perhaps even a genetic edge, as some research has suggested. Other survivors might not; does it mean they’re doing it wrong or not trying hard enough if they fall short of Armstrong’s level?
Perhaps the real issue has nothing to do with alleged doping and everything to do with the hero narrative. Armstrong gave us a shining example of overcoming adversity that turned out to be flawed. Maybe the correct response here isn’t outrage but forgiveness for placing the unreasonably high expectation on him of being everyone’s perfect cancer survivor superhero.
Why can’t the heroes be the survivors quietly showing up every day, working, going to school, raising families and being part of their community? Why can’t it be heroic enough to simply persevere through chemo brain, fatigue, nerve damage and the rest of the long-term cancer baggage?
For what it’s worth, it comes as somewhat of a relief that Lance Armstrong may be fading as the image to aspire to. Maybe now we can go back to more realistic expectations that embrace the still-rich human potential of life after cancer while remaining honest about the limitations.
I’m rooting for Robin Roberts. As if breast cancer in her 40s wasn’t enough, the warm, sparkly co-anchor of Good Morning America is now dealing with myelodysplastic syndrome, a rare bone marrow disorder that may have been caused by the chemotherapy she underwent five years ago.
Robin, 51, underwent a bone marrow transplant last week, and her colleagues from ABC were there on the spot, cameras and all, capturing the moment. Described in news accounts as “visibly spent,” she nevertheless recorded a message from her hospital bed, telling viewers she could “feel the love” from her legions of fans.
She has been open about her medical situation and exceedingly gracious about sharing this challenging journey with the public. No one could possibly wish her anything but the best.
But there’s a troubling side to this story: When does going public with a disease become too much?
ABC has begun taking heat from critics who think the network has gone overboard, with some accusing it of exploiting the story for better ratings. “It’s a fine line between educating the audience and bringing them up to date, and crossing over and turning that into a ratings booster or an audience grabber,” Arthur Caplan, director of medical ethics at the New York University Langone Medical Center, told the Associated Press.
ABC has denied engaging in hype, saying viewers care about Robin and genuinely want to know how she’s doing. There’s value in the emotional support she’s receiving from her many fans.
Nevertheless it ought to make us feel a little uneasy, and not only because of the ethical issues it raises about newsroom decision-making.
At what point does our collective love of a feel-good story trump what’s best for the subject of that story?
It’s not hard for me to put myself in Robin’s place. Regular readers of this blog know I had non-Hodgkin’s lymphoma when I was in my 30s. I’ve done chemotherapy, I’ve done radiation, I’ve done time in an inpatient oncology unit. I know the physical and emotional toll exacted by cancer treatment. I’m all too aware of the potential for serious long-term and late complications such as the myelodysplastic syndrome Robin is being treated for, and the limited options for reducing the risk of late treatment-related toxicity – an angle of Robin’s story that most people unfortunately don’t seem to have picked up on.
So it was rather disconcerting to see Diane Sawyer, Sam Champion and presumably at least one television camera operator crowd into the hospital room with Robin and her family while new bone marrow cells were infused into her. Sure, Robin is a colleague and has many friends at ABC who wish her well. Sure, everyone was appropriately gloved and masked and (I hope) mindful of the infection risk.
But first and foremost this is a patient. Moreover, this is a patient with virtually no immune system, someone who’s highly vulnerable and undergoing a very challenging medical procedure. Even a seemingly minor infection can be a serious threat. Is it truly in her best interest to expose her to the extra risk for the sake of a heart-warming TV moment?
One can only hope it didn’t inadvertently send a message that the immune suppression that accompanies bone marrow transplants, not to mention many standard chemotherapy regimens, is not a big deal. People whose immune system is compromised depend on the community around them to exercise good judgment and avoid the unnecessary spread of germs.
Finally, there’s the emotional aspect to consider. ABC has said that everything being broadcast is with Robin’s permission, and I believe them. Even when there’s consent, though, media coverage of these kinds of stories can exert subtle pressure on the subjects to go along with it. Sometimes the story takes on a life of its own and it becomes difficult to tell the cameras and reporters, “Not now,” especially when someone is sick and perhaps not able to think clearly or be assertive. Sometimes the story stops being about the patient and becomes more about tugging at heartstrings or manipulating the emotions of the audience.
“At a certain point, Robin needs to heal,” Shelley Ross, former executive producer of Good Morning America, told the Associated Press.
Exactly. Although it’s generous for sick people to share their experience with the public, they don’t owe it to us. Rather, we have a responsibility to protect them at a vulnerable time. What’s in their best interest, physically and emotionally, ought to come first always.
If there was any doubt that New York City was serious about downsizing the giant sugary drinks sold at restaurants and concession stands, it was erased Thursday with the enactment of a new rule by the city’s Board of Health.
The rule places a 16-ounce limit on the size of non-diet sodas, sweetened teas and other sugar-laden drinks sold at restaurants, theaters, workplace cafeterias and other venues that offer prepared food.
Many public health experts have wrung their hands over the amount of sugared beverages consumed by the average American. Few entities, though, have gone so far as to impose an outright ban on super-sized drinks.
Those who support the measure see it as an important – and pioneering – step for public health. Here’s the take by the Associated Press:
They say the proposal strikes at a leading cause of obesity simply by giving people a built-in reason to stop at 16 ounces: 200 calories, if it’s a regular Coke, compared to 240 in a 20-ounce size. For someone who drinks a soda a day, the difference amounts to 14,600 calories a year, or the equivalent of 70 Hershey bars, enough to add about four pounds of fat to a person’s body.
Beyond the numbers, some doctors and nutrition experts say the proposal starts a conversation that could change attitudes toward overeating. While there are many factors in obesity, “ultimately it does come down to culture, and it comes down to taking some first steps,” said Dr. Jeffrey Mechanick, a Mount Sinai School of Medicine professor who has studied the effect of government regulation on the obesity epidemic.
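The AP’s figures above can be checked with basic arithmetic. A quick sketch, where the 3,500-calories-per-pound conversion and the roughly 210 calories per Hershey bar are conventional rules of thumb assumed here, not numbers taken from the article:

```python
# Check the Associated Press arithmetic on downsizing from a
# 20-ounce regular Coke (240 calories) to a 16-ounce one (200
# calories), one soda per day. The 3,500-calories-per-pound and
# 210-calories-per-bar figures are rule-of-thumb assumptions.
daily_saving = 240 - 200             # calories saved per soda
yearly_saving = daily_saving * 365   # calories saved per year
pounds_of_fat = yearly_saving / 3500
hershey_bars = yearly_saving / 210

print(yearly_saving)             # 14600
print(round(pounds_of_fat, 1))   # about 4 pounds
print(round(hershey_bars))       # about 70 bars
```

The numbers line up with the AP’s 14,600 calories, roughly four pounds of fat and roughly 70 Hershey bars.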
The ban goes into effect March 13, assuming it isn’t struck down before then.
Supporters of the measure have a point. Soft drinks are large and getting larger. Consider the 7-Eleven Big Gulp series: The Double Gulp contains 50 (!) ounces – more than the capacity of the average human stomach. We have become culturally accustomed to supersized portions of everything from soft drinks to french fries to bagels, with the result that it’s increasingly difficult to gauge what a normal-sized serving should be.
But here’s the big question: Will New York’s ban on the largest sugary drinks actually make a difference in people’s health? The answer is not at all clear.
For one thing, the rule contains a multitude of exceptions. It doesn’t apply to beverages sold in retail grocery stores, vending machines or most convenience stores, allowing people to continue to buy their favorite large sizes without restriction.
It exempts beverages that are 100 percent fruit or vegetable juices, even though these can be, ounce for ounce, almost as full of sugar as a soft drink. (For a comparison, check out this chart compiled by the federal government; a 12-ounce serving of grape juice contains 12 teaspoons of sugar – more than a same-sized serving of either cola or root beer.)
Nor is there anything in the rules that prohibits consumers from short-circuiting the intent by simply buying more smaller drinks to equal a large one. And New Yorkers can continue to drink as much soda in the privacy of their homes as they please (at least for now).
Although this is, strictly speaking, a New York City story, it matters to the rest of us as well. Indeed, the Board of Health’s action has captured wide interest across the United States. Seattle lies the width of the continent from Manhattan, but when the Seattle Times offered an online poll on what readers thought of a similar ban in their own city, folks were quick to weigh in.
There’s considerable – and valid – debate over whether regulation and government enforcement are an appropriate strategy for influencing health-related behavior.
Ethically speaking, it’s a murky area. Should government be making people’s food choices for them? Do consumers have the right to make their own decisions about what they buy and drink, or is this outweighed by the public health impact? If the target today is sugared drinks, what’s going to be next?
In the months before the New York City Board of Health voted on the mega-soda ban, a handful of studies attempted to quantify what the health results might be. In one study that analyzed the receipts of 1,600 fast-food customers on the East Coast, researchers concluded that if everyone who had been buying a large-sized drink cut back to a single 16-ounce beverage, they would consume 63 fewer calories per meal. But at least 40 percent of consumers would have to take that step, the study concluded; otherwise the impact would be negligible.
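That 40 percent threshold reflects how the population-level effect scales with compliance: the average saving across all large-drink buyers is the per-switcher saving multiplied by the fraction who actually switch. A minimal sketch of that reasoning, where the 63-calorie figure comes from the study but the compliance rates are purely illustrative:

```python
# Average per-meal calorie reduction across all large-drink buyers,
# given the study's 63 fewer calories per customer who downsizes.
# The compliance rates below are illustrative assumptions.
SAVING_PER_SWITCHER = 63  # calories per meal, from the study

def average_saving(compliance_fraction):
    return SAVING_PER_SWITCHER * compliance_fraction

for compliance in (0.1, 0.4, 1.0):
    print(f"{compliance:.0%} comply -> "
          f"{average_saving(compliance):.1f} cal/meal on average")
```

At low compliance the average saving shrinks toward nothing, which is the study’s point about a negligible impact below roughly 40 percent.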
Finally, banning giant-sized drinks at some commercial venues does little to address other areas of health-related behavior that may be just as important – physical activity, stress, alcohol use and timely access to appropriate medical care, to name just a few.
It’s going to be interesting to watch how the soda ban plays out. Perhaps this is what it takes to begin changing a community environment into one that fosters better health – the proverbial snowball that gathers speed and mass as it rolls downhill. On the other hand, this is still an experiment with unknown results. It’s to be hoped that the New York City Board of Health will watch this closely and collect some real evidence to help decide whether it was worthwhile or not.
I’ve noticed it for some time, in a vaguely-paying-attention sort of way. But it wasn’t until seeing a news article in the Star Tribune of Minneapolis this weekend that it really hit home: For better or worse, snacking has become a mainstream American way of eating.
From the article:
Two studies from 2010 by University of North Carolina researchers looked at snacking trends between 1977 and 2006 and found that children now eat three snacks a day and adults snack twice a day. That is one additional snack for each group compared with 30 years ago.
Three square meals don’t exist anymore, said Larry Finkel, director of food and beverage research at Packaged Facts, a publishing company that focuses on consumer product research. “Meals and snacks have become all blurred together.” The authors of the report concluded that “our children are moving toward constantly eating.”
What is up with all the nibbling between meals? More to the point, how did we get this way?
Finkel attributes it to the American on-the-run lifestyle, which has led to the decline of structured mealtimes. Instead of families sitting down together, dinner has become what you eat in the car on the way to work or soccer practice, he said.
Once you start noticing, the signs are everywhere that snacking is an eating behavior that’s widely accepted and even expected. Manufacturers have developed hundreds of “snack-sized” products meant to be eaten at your desk or on the run. Restaurants are open late into the night or, increasingly, 24 hours a day. Few kids’ events are complete anymore without a snack of some kind. Even diet plans now include recommendations for morning and afternoon snacks, on the assumption that this is part of people’s daily routine.
In an indication of how entrenched the snack habit has become, a study that appeared in Health Affairs in 2010 found that nearly one-third of the daily calories consumed by American children now come from snacks.
It all raises the question: Is this a trend we should be welcoming?
There in fact seems to be no universally agreed-upon definition of what constitutes a snack. Are snacks a replacement for a full meal or are they something that’s consumed in addition to a meal? How many calories can a snack contain before it stops being a snack and becomes a meal?
There’s some limited evidence that “grazing,” or consuming half a dozen small meals a day, may be more effective for losing weight, curbing hunger and controlling blood sugar levels than the traditional three squares a day. But the research findings so far have been somewhat contradictory. At least two studies have found that for those who want to lose weight, what ultimately matters is how many calories they consume, not how often they eat. According to other studies, however, grazing can be beneficial, especially in helping people feel more sated throughout the day and less likely to overeat.
Grazing might also benefit some groups more than others. Athletes and active children often need snacks to replenish the calories they burn. Some studies involving older adults, an age group at risk of malnutrition, have found that large meals can be unappetizing for them and that they fare better on smaller meals throughout the day.
What it seems to come down to is the quality and amount of snacking that takes place. There’s a difference, after all, between snacking on carrot sticks vs. a bag of chips. And there’s a difference between snacks that are part of an overall healthy eating plan vs. snacks that add to one’s daily calorie load. Munching on something a few extra times a day might not seem like much but the calories can add up in ways that might astonish many people.
Left unanswered in all of this are the social implications of replacing structured mealtimes with grazing and snacks eaten on the run. Are we losing something when we don’t sit down at the table together, or doesn’t this really matter? What happens when the whole world becomes our dining room?
I’m not sure we’ve pondered these questions, and in any case it’s too late. This particular train has already left the station. The challenge, it seems, is how to manage this cultural shift in eating in ways that are healthful rather than toxic.
Watching competitive eaters cram dozens of hot dogs down their gullets is both mesmerizing and nausea-inducing.
Apparently few do it better than Joey Chestnut, a 28-year-old from San Jose, Calif. (his nickname is “Jaws”), who won his sixth straight title this week in the Nathan’s Famous Fourth of July International Hot Dog Eating Contest by downing 68 hot dogs in 10 minutes. (The second-place finisher managed only a paltry 52 hot dogs.)
What is the deal with competitive eating? At one time it was a fringe activity most often found at local fairs and on college campuses. These days it has gone mainstream, with televised contests, cash prizes and competitors who actually train for a shot at stardom.
Somewhere in here is a commentary on America’s disordered cultural attitudes towards food. But that’s fodder for another day. What I want to know is: How on Earth do they do it? And how does the human body tolerate it?
The reaction of most in the health community seems to be: Speed-stuffing yourself with that many hot dogs (or anything else, for that matter; other competitive eating contests involve burritos, oysters, meatballs, jalapeno peppers or pie) is not good for you.
Health experts have been speaking out for some time against the so-called sport of competitive eating. The American Medical Association views it as an unhealthy practice akin to binge eating, with possible long-term consequences.
ABC’s Good Morning America tallied up the excess fat and calories consumed on July Fourth by the Nathan’s Famous hot dog-eating contestants (or “gurgitators,” as they’re often called) and came up with some whopping numbers. For instance, a Nathan’s hot dog contains about 300 calories; by this measure, Chestnut gorged himself on 20,400 calories in just 10 minutes – 10 times the number of daily calories recommended for adult men by the USDA.
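For readers who want to check the arithmetic behind those whopping numbers, here is a quick sketch. The 300-calorie figure per hot dog comes from the article; the roughly 2,000-calorie daily baseline for adult men is an assumption inferred from the “10 times” comparison:

```python
# Sanity check of the calorie math reported for the contest.
hot_dogs = 68
calories_per_dog = 300    # approximate figure cited for a Nathan's hot dog
daily_baseline = 2000     # assumed daily-calorie baseline for adult men

total = hot_dogs * calories_per_dog
print(total)                           # 20400 calories in 10 minutes
print(round(total / daily_baseline))   # roughly 10 times a day's calories
```

The multiplication confirms the 20,400-calorie total, and dividing by a 2,000-calorie day lands close to the tenfold figure cited.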
Keith-Thomas Ayoob of the Albert Einstein College of Medicine told Good Morning America, “I’m not sure if eating that many hot dogs can damage your blood, but it will probably raise your cholesterol level temporarily. And it puts a strain on your body’s organs to handle that amount of calories, fat and sodium all at once.”
Someone susceptible to high blood pressure who downs such massive amounts of sodium “is really rolling the dice and could end up in the hospital,” Ayoob said.
Here’s what many of us would really like to know: How do the contestants avoid, um, experiencing what’s known as a “reversal of fortune”? Answer: Often they don’t. In most contests, eaters are disqualified if they vomit during the actual competition. Sometimes this is extended through a set time after the contest has ended – say, two minutes. After that, gurgitators may reverse without penalty.
For the record, Chestnut told the media after his chow-down that he felt “good” and was “looking forward to next year already.”
Sonya Thomas, the 100-pound winner of the women’s competition (45 hot dogs), admitted she began feeling sick during the contest but kept going.
For all the criticism of the excess surrounding competitive eating, there’s been surprisingly little actual study of its effects on the human body. In one of the few investigations into how competitive eaters manage to consume so much at one time, a group of researchers at the University of Pennsylvania conducted imaging studies of a speed-eating champion matched with a control subject.
They wrote that the competitive eater’s stomach was extraordinarily adapted to his sport: “Unlike the control subject, the speed eater had markedly altered gastric physiology that enabled his stomach to rapidly accommodate an enormous quantity of ingested food by progressively expanding until it became a giant, flaccid sac occupying most of his upper abdomen.”
Interestingly, he was not overweight – and it seems many of the top-ranked eating contestants are quite trim, despite their sport. He told the researchers, however, that he had lost the ability to feel sated after a meal, and that he followed strict portion control in everyday eating.
It’s unclear what these findings mean for the long-term health of competitive eaters. This was a tiny study, involving just one competitive eater and a single control subject. Nevertheless, the researchers raise some key questions: What happens as competitive eaters get older, perhaps struggle with willpower and begin to engage in chronic binge eating? What’s the long-term impact of having a chronically dilated stomach?
The researchers conclude that speed eating “is a potentially self-destructive form of behavior that over time could lead to morbid obesity, intractable nausea and vomiting, and even the need for gastric surgery.”
So there you have it. Only one hot dog at a time for me, thanks very much.