Dinner at home: ideal vs. reality

According to common nutritional wisdom, families who sit down together for home-cooked meals tend to be both healthier and happier, and research largely bears this out.

But when a group of sociologists decided to study what it really takes to prepare a family dinner, they learned that all is not well in the kitchen. In fact, moms reported feeling pressured to live up to unrealistic ideals and many felt the benefits of home-prepared food weren’t worth the hassle.

Utopia, meet Real Life.

Food gurus may romanticize the love and skill that go into preparing a meal and the appreciation with which it’s eaten, but “they fail to see all of the invisible labor that goes into planning, making and coordinating family meals,” the researchers concluded. “Cooking is at times joyful, but it is also filled with time pressures, tradeoffs designed to save money, and the burden of pleasing others.”

For their study, aptly titled “The Joy of Cooking?”, they spent a year and a half conducting in-depth interviews with a social and economic cross-section of 150 black, white and Latina mothers. They also spent more than 250 hours observing poor and working-class families as they shopped for groceries, cooked and ate meals.

They found mothers were strapped for time, sometimes working two jobs at unpredictable hours to make ends meet, with little energy left over to plan a meal, prepare it and clean up the kitchen afterwards while their children clamored for attention. “If it was up to me, I wouldn’t cook,” one mother bluntly told the researchers.

They discovered that in most of the poorer households they observed, mothers routinely did their own cooking to save money. But these women often were disadvantaged by kitchens that were too small and inadequately equipped – not enough counter space, a shortage of pots and pans, lack of sharp knives and cutting boards, and so on. One family living in a motel room prepared all their meals in a microwave and washed their dishes in the bathroom sink.

A common barrier was the cost of fresh fruit, vegetables, whole grains and lean meats, typical ingredients of a healthy meal. Some of the mothers didn’t have reliable transportation so they only shopped for groceries once a month, which limited the amount of fresh produce they could buy. Even in the middle-class households, moms fretted that the cost of quality food forced them to buy more processed foods and less organic food than they wished.

The final straw: family members who fussed, picked at the food or refused to eat what was served. The researchers rarely observed a family meal during which no one complained. In many of the low-income homes, moms resorted to making the same foods over and over rather than try something new that might be rejected and go to waste. For middle-class moms, the pressure to put healthy, balanced meals on the table often led to considerable anxiety over what they cooked and served.

Despite all this, is it possible for families to consistently prepare good meals at home and actually enjoy the experience instead of viewing it as a chore? Of course it is. But for many households, getting there clearly will be a slog.

When the reality surrounding the home-cooked meal is so often at odds with the ideal, the researchers wonder, why do food system reformers insist that the revolution needs to happen in the household kitchen?

They call the emerging standard for the home-cooked meal “a tasty illusion, one that is moralistic, and rather elitist, instead of a realistic vision of cooking today. Intentionally or not, it places the burden of a healthy, home-cooked meal on women.”

Perhaps we need more options for feeding families, such as introducing healthy food trucks or monthly town suppers, or getting schools and workplaces involved in sharing healthy lunches, they suggest. “Without creative solutions like these, suggesting that we return to the kitchen en masse will do little more than increase the burden so many women already bear.”

Rich, poor, healthy, sick

Take two people, one with a college degree and earning $100,000 a year and the other with a high school diploma earning $15,000, and guess which one is likely to have better health.

Those who study population health have long known that, elitist as it sounds, income and education are two of the strongest predictors of overall health in the United States. Americans who are educated and financially secure tend to live longer. They’re more likely to be physically active and less likely to use tobacco. Often they have better health outcomes and better management of chronic conditions such as diabetes or high blood pressure.

Exactly why this would be the case is not clearly understood. One assumption is that it’s all about access – people with more money are able to afford better food, live in safe neighborhoods and receive adequate medical care. Another assumption is that it has to do with healthy or unhealthy community environments. But an interesting review from a few years back indicates it’s much more complex than this.

The authors identify some influences that are familiar. Having more money, for example, means that people have the resources to join a health club or buy fresh fruits and vegetables. Where people live can shape how they behave – whether they have access to parks and sidewalks or the amount of tobacco and fast-food advertising they’re exposed to.

But the authors also identify several factors that are psychological, social and harder to tease out.

- Education and efficacy. One of the functions of education is to teach critical thinking, problem-solving and self-discipline, which can better equip people to process health information and apply it to their lives. These skills can also make them feel more confident about their ability to successfully manage their health.

- Peer group identification. People tend to associate with their own socioeconomic group and usually will adopt similar norms that reinforce their place in the social ecosystem. If everyone else in your educated, professional social circle is a nonsmoker, chances are you won’t smoke either, or will make serious attempts to quit. Likewise, blue-collar individuals may smoke to show independence, toughness and solidarity with their social group.

- Optimism about the future. Lower lifetime earnings and a sense of limited opportunity can make lower-income people feel there’s less reason to invest in their long-term health. Their health decisions can be more focused on the here and now than on the future. The authors of the review also suggest that the higher levels of stress associated with being economically disadvantaged can lead people to use tobacco, alcohol and/or eating as a way of coping.

Who remembers seeing the headlines this past month about a study that found a link between saving for retirement and being in better health? The researchers may have been onto something, namely that planning for one’s retirement could be just one of many markers for the psychosocial factors that influence health – disposable income, self-efficacy, peer group norms, belief in the future and so on.

Money does indeed make a difference but it isn’t just about money, the authors explain in their review. Walking as a form of exercise costs nothing, while smoking can be an expensive habit. What motivates someone to choose one over the other?

This is only scratching the surface, of course. Many of these factors are interrelated – consider, for example, someone at a lower socioeconomic level who is motivated to adopt healthy habits but has difficulty doing so for lack of means. And it’s hard to outsmart your genetic background, regardless of your income, education or motivation to pursue a healthy lifestyle.

There’s a huge national conversation taking place about being healthy: what it is, how to achieve it and how to reduce some of the obvious income and racial disparities. Do we just keep urging everyone to “make better choices”? Do we legislate? It’s clear from all the research that the social and psychological factors surrounding health-related behavior are complex and not easy to untangle. If ever there was an area to resist simplistic solutions, this is it.

When you just can’t sleep

Reservations about the safety of prescription sleeping pills have been around for a long time. But new research has raised fresh concerns about when they’re appropriate and who’s most at risk.

To summarize: A study by the U.S. Centers for Disease Control and Prevention and Johns Hopkins University found that psychiatric medications – a category that includes sedatives – account for thousands of emergency room visits in the U.S. each year. One of the key findings, which may have come as something of a surprise to the public, was that zolpidem, or Ambien, was implicated in 90,000 emergency room visits annually for adverse drug reactions.

The majority of ER visits for drug reactions associated with sedatives and anti-anxiety medications were among adults in their 20s, 30s and 40s. But among older adults who were taking these medications and ended up in the ER, the consequences were often more severe and were more likely to result in hospitalization.

This could be an opportunity to address adverse drug events, or emergency room utilization, or prescription drug use, or medication use by older adults. But I’m not going there, at least this time.

If I ruled the world, we would have a long-overdue national conversation about sleep and insomnia.

We’d open with a discussion of the “sleep is for wimps” mindset. Where does this come from, and who do these people think they’re kidding?

We’d take a look at the science. What do we know about the human body’s need for sleep and the mechanisms of sleep? How many questions still lack good answers?

We’d involve the medical community. How often are patients queried about their sleep? Is there more than one option for helping them, or is the immediate response to hand out (or refuse) a prescription for a hypnotic or to assume the problem is related to stress or lifestyle?

Finally, we’d get real about insomnia. Although sleep difficulties can often be traced to how people live their lives, simply telling them to practice better “sleep hygiene” may not cut it for those whose insomnia is longstanding, complex and more challenging to treat.

Somewhere in the discussion we might talk about shift work and the impact it has on sleep and health. We could talk about sleep apnea and restless legs syndrome as specific causes of poor sleep, while at the same time recognizing that many people with insomnia don’t have either of these conditions.

We could probably talk about the punishing levels of daily stress experienced by many people and how it interferes with their sleep.

And yes, we’d have a serious discussion about where pills fit into this. We would acknowledge that sleep aids are sometimes prescribed to people who don’t really need them or whose safety might be compromised by taking them. But if we’re being fair, we’d also have to recognize that clamping down on sleeping pill prescriptions could consign many people to chronic, intractable insomnia – and as anyone with longstanding insomnia can attest, it’s a miserable and ultimately unhealthy place to be.

Who’s up for the conversation?

Competing our way to better health

Reactions to the finale of the latest season of “The Biggest Loser” seem to fall into two general categories:

1) Rachel Frederickson, who shed more than half of her body weight to win the competition and take home $250,000, looks fabulous and deserved the prize for her hard work.

2) Rachel lost too much weight too rapidly and is a prime example of everything that’s toxic about how Americans view food behavior and entertainment.

Amid all the chatter, not to mention the snark, that has swirled around the show for the past several days, there’s one point that seems worthy of extended, thoughtful public discussion: What are the ethics of creating competition around people’s health?

Treating individual health as fodder for a contest is by no means confined to “The Biggest Loser.” Corporate and community initiatives abound that pit individuals and teams against each other to see who can lose the most weight, walk the most steps, exercise the most hours and so on. The popularity of “The Biggest Loser” has spawned imitation contests in workplaces, neighborhoods and even churches.

What’s the harm, as long as it helps motivate people to change their behavior? Well, not so fast.

The competitive approach may be successful for some, and it may inspire many fence-sitters to become more engaged. For a story last year on corporate wellness trends, the Washington Post talked to an office worker from Maryland who joined a corporate fitness challenge and lost 42 pounds. “There’s sort of like a peer pressure and a competitiveness to it,” Sal Alvarez told the Post, adding that he found it very motivating.

Not everyone responds well to competition, however, and some people may even find it alienating, especially if they feel coerced to participate.

Although there are plenty of positive stories about how wellness challenges have helped people change their lives, there’s surprisingly little solid evidence of competition’s overall effectiveness as a health improvement strategy. More importantly, it’s not clear whether competition leads to behavior change that’s sustained after the contest is over.

Then there’s the matter of prizes. Are people motivated by the prospect of T-shirts, water bottles and gift cards? Do they need carrots to entice them to take action or is it better to emphasize good health as the reward? What if the carrot is really, really significant, such as the quarter of a million dollars dangled before the Biggest Loser contestants?

With that amount of money at stake, along with a chance to be on national TV and impress their family and friends, it shouldn’t be surprising that the participants in the show would become uber-competitive, possibly to the point of going overboard.

In a recent interview with People magazine, Rachel herself conceded that she “maybe” was “a little too enthusiastic in my training to get to the finale.”

Maybe competition is OK, as long as it helps people accomplish their health goals. Then again, maybe it exploits people’s vulnerabilities and propels them into doing something perhaps the wrong way or for the wrong reasons. Does the end justify the means? That’s the moral question that hasn’t really been answered.

The ‘shiver’ diet? Don’t we wish

We hardy inhabitants of the Snow Belt have joked for years about the calories we (theoretically) burn from shivering our way through winter.

But wait – could there be scientific evidence supporting shivering as a hot new form of winter exercise?

Apparently so, according to the New York Times, which reported today on a study suggesting that shivering boosts the metabolism in the same way that exercise does.

Study participants were brought into a lab on three different occasions. During the first two sessions, they were told to exercise on a stationary bike in an indoor temperature of 65 degrees, and samples of their blood, fat and skin cells were obtained. For the last session, the participants were instructed to lie down, lightly clad, for half an hour while the indoor temperature was reduced from 75 degrees to 53. Their skin and muscle reactions were measured and samples taken again to see what happened.

Lo and behold, the study subjects produced the same amount of irisin, a hormone involved in the conversion of white fat cells to more desirable brown fat, from shivering as they did from exercise.

From the article:

What seemed to matter, the researchers concluded, was not the exertion of the exercise, but the contraction of various muscles, which occurred during shivering as well as cycling.

Given that the temperature this morning on my way to work was 0 and most of Minnesota is under its umpteenth wind chill advisory of the season, readers will have to excuse me for not climbing enthusiastically onto the shivering-as-a-form-of-exercise bandwagon.

A variety of studies have shown that we do indeed burn more calories when we’re trying to stay warm. But whether this is an appropriate substitute for exercise is debatable, especially since the study described by the New York Times involved only 10 people – hardly enough to build a strong scientific case. And the article does in fact offer an important caveat: There seems to be no evidence that working out in the cold revs up the production of irisin any more than exercising in warmer temperatures does.

In a sentence that could only have come from someone blithely unaware of the dangers of frostbite, the author concludes that if you can’t get to the gym, “at least consider lingering outside at the bus stop and shivering.”

No, thanks. I’ll take indoor exercise over lingering at the curb with a minus 25 breeze in my face any day.

Dry skin blues

There’s a bottle of fish oil capsules sitting on our kitchen counter. From the veterinary clinic. For the cat, who’s been experiencing the same misery as a lot of humans during this long winter: dry, itchy, flaky skin.

Dry skin doesn’t get much respect. After all, it’s not as serious as, say, heart disease. But as anyone who’s ever had it can attest, it’s certainly uncomfortable to live with.

Dry skin (the medical name for it is xerosis) can have many causes, some more difficult to address than others. Most of the time and for most people, though, the cause is environmental – and if it happens to be wintertime, you can usually blame it on constant exposure to dry air, both indoors and out.

The consensus seems to be that we function best with an indoor relative humidity of 40 to 60 percent. But during the winter, indoor humidity is usually much lower – around 20 percent in most buildings (and in fact it shouldn’t be much higher than this if you want to avoid problems with condensation). Unfortunately, the lower the outdoor temperature drops, the lower the relative humidity inside your home, workplace or school and the greater the chance that your hide will feel dry and uncomfortable.
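For the numerically inclined, here’s a back-of-the-envelope sketch of why that happens – a minimal illustration using the Magnus approximation for saturation vapor pressure (the formula and the example numbers are my own illustration, not from any study cited here). Cold air holds very little moisture, so when it’s warmed to room temperature without adding any water, its relative humidity plummets:

```python
import math

def saturation_vapor_pressure(temp_c: float) -> float:
    """Approximate saturation vapor pressure in hPa (Magnus formula)."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

def indoor_rh_after_heating(outdoor_temp_c: float, outdoor_rh: float,
                            indoor_temp_c: float) -> float:
    """Relative humidity (%) of outdoor air once it's warmed indoors,
    assuming no moisture is added along the way."""
    return outdoor_rh * (saturation_vapor_pressure(outdoor_temp_c)
                         / saturation_vapor_pressure(indoor_temp_c))

# Outdoor air at 0 F (-18 C) and 70% relative humidity, heated to 70 F (21 C):
print(round(indoor_rh_after_heating(-18, 70, 21), 1))  # ~4.2% RH
# Real homes sit closer to 20 percent because cooking, showers and
# the occupants themselves add moisture back into the air.
```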

Signs of dry skin can range from tightening, roughness and flakiness to fine lines and cracks. In more severe cases, deep painful fissures can develop that may become infected if bacteria are introduced – no small matter for health care workers and others who deal with chronically dry hands from washing them many, many times in a single day.

What to do? Here’s some advice from the Mayo Clinic:

- Choose an emollient-rich moisturizer and apply it several times a day. Products with petroleum jelly also can help but they tend to be greasy so you might want to limit them to nighttime use.

- Avoid harsh soaps. Cleansing creams, bath gels and soaps containing added moisture or fat are better choices.

- A long hot bath or shower may sound tempting after being outdoors in subzero weather but beware – hot water can strip valuable moisture from your skin. Stick to warm water and limit your time in the shower. After bathing, lightly pat your skin dry, then immediately apply a moisturizer; doing so will help trap water in the surface cells of your skin.

- Use a home humidifier during the dry winter months (but be sure to follow the manufacturer’s instructions for proper cleaning and maintenance).

So what about fish oil? Although consumption of omega-3 fatty acids has some benefit for heart health, there’s little proof it does much for dry skin, at least in the human species. One small study found that fish oil supplements helped reduce itching among patients undergoing hemodialysis. Another rather interesting study on orange roughy oil found that it lessened dry skin – but again, this was a very small study and the fish oil was applied directly to the skin, not consumed in a capsule.

Thankfully the cat has stopped biting at herself and pulling out tufts of fur to try to relieve the itching, so we’re taking that as a sign the omega-3 supplement is helping. But we’ll let her keep the fish oil all to herself.

A health scare, but not enough to change their ways

A heart attack or a cancer diagnosis is usually life-changing, yet many people do little afterwards to alter their lifestyle or behavior in ways that might reduce their future risk.

Various studies have been cropping up lately, all with the same conclusion. One can’t help connecting the dots and wondering what it bodes for the long-term health picture.

The bigger question here, though, isn’t “what.” It’s “why.”

The latest study comes from Canada, where researchers found that even when people had a history of coronary heart disease or stroke that put them in a higher risk group, they weren’t much more likely than the general population to adopt three key changes associated with reducing their risk of a second heart attack or stroke: smoking cessation, regular physical activity and a healthy diet.

The study used epidemiological data on more than 154,000 individuals from 17 countries. Of the 7,500 participants who reported a previous history of heart attack or stroke, about 18 percent continued to smoke and 60 percent didn’t follow the recommendations for a healthier diet.

Not surprisingly, those who lived in higher-income countries fared better on all three measures.

Here’s another study, this time from the cancer front: Researchers who looked at survivors of melanoma, the most serious form of skin cancer, found that about one in four skipped the use of sunscreen and 2 percent continued to visit tanning salons.

The study results “blew my mind,” Dr. Anees Chagpar, the study’s author, told CBS News.

Other studies have found that cancer survivors are just as likely as everyone else to be overweight and inactive, even though these two factors are tied to a higher risk of recurrence for certain forms of cancer.

Is this a huge collective failure of patients to heed the so-called teachable moment in health care? Or does it signal something deeper?

I suspect it’s the latter. As anyone who has attempted to adopt a healthier lifestyle can attest, changing your ways is often very difficult. It takes a high degree of motivation and support to persevere, and the stress of a serious health event can add complicating factors that might not be addressed or even recognized.

Depression, for example, is common among heart attack survivors, yet the possibility of post-heart attack depression is rarely discussed with these patients. Multiple studies have found that among those who develop depression after their heart attack, the majority are undiagnosed and untreated. That they may struggle and fail to adopt healthier lifestyle habits should not be surprising.

One survivor, responding to a frank entry on the Heart Sisters blog about depression and heart attack survival, put it this way: “Physically I am not the same person and don’t think I ever will be. Everyday life details are not important to me anymore. I see myself stepping further and further behind and no one understands.”

Ditto for the physical and mental toll of cancer treatment, which can leave survivors with long-lasting fatigue, cognitive impairment, nerve damage and more. Although efforts are underway to improve survivorship care in the U.S., progress is slow and uneven, leaving many survivors – perhaps the majority – still under the radar.

The health care system itself hasn’t completely figured out who should handle the “teachable moment.” Should it be the cardiologist? The oncologist? The primary care doctor? A rehab nurse? In the meantime, opportunities to talk to patients about making behavior changes are being missed.

Then there’s the question of who pays to help people change their habits after a major health event – and I’m assuming here that many will need some support, even if it’s only minimal.

It takes staff and resources to provide the education that may be necessary, and reimbursement is often low. Although many health insurance plans include coverage for smoking cessation, there’s considerable variation in what they offer, and some states don’t cover tobacco cessation at all for their Medicaid enrollees.

We could ask people to pay out of pocket for their patient education, nutritional counseling, depression screening and tobacco quit services, but this doesn’t mean they can afford it or that they would make it a financial priority – or, indeed, that they would recognize they might need all of these.

Maybe the whole notion that a health scare should be enough to make people change their ways is flawed. It might be motivation only for some. For others, the motivating factor may be something very different. If we hold off on the tsk-tsk’ing long enough, we might start to figure out what really lies behind the seeming lack of lifestyle change and what can be done to have a more positive impact.

The hard life of kids

Alec Fischer’s documentary about bullying in Minnesota’s schools is only 45 minutes long but it clearly packs an emotional punch, as a local audience saw for itself last week.

During a showing of the film at Ridgewater College, the room grew hushed as photos of students who had killed themselves after prolonged bullying by their peers flashed across the screen.

One of the filmmaker’s messages: Kids are singled out by bullies for many different reasons, and it won’t stop unless more people, both kids and adults, speak up about it.

From the vantage point of adults, childhood and adolescence can seem like a golden time. But there’s mounting evidence that this no longer is the case for all too many kids (if indeed it ever was).

Take the Minnesota Student Survey. Conducted every three years for students in grades 6, 9 and 12, it tracks them as they progress through middle school and high school. It’s seen as a good barometer for risky behaviors such as alcohol and tobacco use and early sexual activity.

But look at what students are reporting about their state of mind. In 2010, the most recent year in which the survey was administered, 8 percent of sixth-grade boys and 9 percent of sixth-grade girls reported feeling almost more pressure than they could handle at some time within the past 30 days. Among 12th-graders, 11 percent of boys and 19 percent of girls said they felt this way. Many kids also reported distressing symptoms such as frequent headaches and stomachaches, sleep difficulties and feeling unhappy or sad.

To be sure, these weren’t the majority. Most kids in fact seemed to be doing OK, and the vast majority said they liked school and that they had parents and other adult relatives who cared greatly about them.

It’s disturbing, however, that so many young people are experiencing high levels of stress and anxiety. Bullying, which seems to have become much more pervasive than a generation ago, is only one of the problems that students encounter.

A recent Associated Press story, which explored what some high schools are doing to reduce kids’ anxiety, noted that adolescents have a lot on their shoulders these days. School officials pointed to hectic schedules, academic overload and pressure to achieve, and kids spoke of days packed with nonstop activity. Here’s a typical day for Abbie Kaplan, a student at Boston Latin School:

On a scale of 1 to 10, she places her stress level at a pretty steady 9. She regularly has four hours of homework a night, some done before swim practice. She eats dinner around 9:30 p.m., then finishes the rest of her homework and generally goes to bed at 11:30. Then she’s up at 6 a.m. so she can be at school by 7:45.

She calls her hectic schedule “the new normal.”

“You keep telling yourself that it will prepare you for the future,” Kaplan says. “It’s just sort of how it is.”

She, too, has had anxiety attacks related to her workload, she says.

And a rising tide of stress among the younger generation was highlighted in yet another recent survey, this one by the American Psychological Association, which found stress levels have increased for Americans of all ages but are being felt most keenly by young adults. The survey also found that younger adults seem to have more difficulty managing their stress and that their stress has increased in the past year.

Some of this may simply be how the human psyche matures and ages. Other studies have found that the years past middle age, when people tend to have accumulated life experiences and learned to cope with them, are often the happiest.

But as many observers have pointed out, the stresses kids face today are different from those of their parents’ and grandparents’ generations. The world is a more complex place than it once was, the economy is more difficult and the future more uncertain – and it’s all intensified by the pervasive presence of social media.

When kids are stressed and not managing it well, does it put them on the path to becoming tomorrow’s stressed adults, with all the unhealthy and potentially destructive behaviors this entails? While adults can’t always make the world an easier place for kids, acknowledging that there’s far more pressure on kids than there used to be seems like the first step toward taking this issue seriously and helping them develop coping mechanisms that can carry them into a healthier adulthood.

Parents, overweight kids and a minefield of blame and judgment

When Dara-Lynn Weiss wrote an article for Vogue magazine last year about putting her then 7-year-old daughter on a diet, she created a firestorm of controversy. “One of the most (bleeped) up, selfish women to ever grace the magazine’s pages” is how Jezebel summed it up.

Weiss described policing everything that went into her daughter’s mouth, depriving her of meals as a punishment for overeating, and humiliating her in public.

“I stopped letting her enjoy Pizza Fridays when she admitted to adding a corn salad as a side dish one week,” she wrote. “I dressed down a Starbucks barista when he professed ignorance of the nutrition content of the kids’ hot chocolate whose calories are listed as ‘120-210’ on the menu board. Well, which is it? When he couldn’t provide an answer, I dramatically grabbed the drink out of my daughter’s hands, poured it into the garbage, and stormed out.”

Ultimately her daughter lost 16 pounds, and mother and daughter celebrated with a shopping spree for new clothes.

If all of this sounds over the top, brace yourself, because Weiss recently published a full-length book, “The Heavy,” that chronicles in much more detail her efforts to help her daughter lose weight. And I’m beginning to think she doesn’t deserve the vitriol that’s been heaped upon her, because it’s often incredibly difficult for parents to address a child’s weight in ways that are constructive rather than shaming or bullying, without falling prey to America’s collective horror of excess pounds.

Full disclosure: I didn’t read Weiss’s article in Vogue last April and the magazine’s website doesn’t appear to have it available for nonsubscribers. So I’m relying on excerpts, secondhand accounts and published interviews with Weiss herself.

What comes across is a mother who’s trying to make her way through an emotional minefield of food, obesity, other people’s judgments and expectations, and her own desire to help her daughter without being either too permissive or too rigid.

She tells New York magazine that she accepts much of the criticism that erupted in the wake of her essay in Vogue: “I am strict. I was abrasive at times. I made a million mistakes.”

She talks about the awkward position parents find themselves in when they have a child who’s overweight: “Parents of obese children have this extra standard that’s very uncomfortable: If you tell a child he can’t have a piece of cake you’re embarrassing him by drawing attention to his problem; the same limit-setting would be considered fine for parents of normal-weight children.”

She points out how hard it can be to tackle a child’s weight issues: “It’s so awkward to talk to a child about food and weight, that’s why so many parents don’t do it.”

She’s candid about her own issues with food and body image and her fear of passing them on to her daughter.

Elsewhere, she discusses the “darned if you do, darned if you don’t” dilemma foisted on the parents of children who are overweight. “You can’t get it right, you can’t be perfect – you’re going to make some people feel like you’re denying your kid her childhood, while others are standing there staring at every cupcake she eats,” she tells Motherlode, the New York Times parenting blog.

She expresses frustration with school lunches and children’s party menus that often undermine parents’ efforts to help a child adopt healthier eating habits. She describes how some of the common advice – “make better choices,” “stop when you’re full” – was too unstructured to be helpful.

And she challenges people to examine their assumptions about overweight children. Her daughter wasn’t lazy and didn’t eat unhealthy food, she said in a recent interview with USA Today: “She was a child with an enormous appetite… She has a brother a year younger, same parents, same food, who doesn’t want to eat sweets.”

Public opinion polls and research seem to back up Weiss’s observations about how we view children and weight.

A study published last year in PLOS One, for instance, found that obese children are more likely to be stigmatized than middle-aged and older adults who are obese – and that obesity among children is more likely to be blamed on external factors such as parenting style or environment.

There has been public debate about whether extremely obese children should be removed from the custody of their parents. And a couple of years ago a legislator in Illinois went so far as to suggest that parents of fat children should lose their state income tax deduction (although he later backed down, saying he was only kidding).

At the same time, a billboard campaign in Georgia that used pictures of real children and messages such as “It’s hard to be a little girl when you aren’t” was widely criticized for its fat-shaming approach and the lack of evidence that this strategy even works; the billboards ultimately were taken down.

And a poignant story that appeared last month in the New York Times revealed that children often feel bullied by adults about their weight, sometimes at the cost of developing eating disorders and obsession with body image.

As Weiss told the Motherlode blog, “People are so critical of childhood obesity, and then you try to do something about it – to help your child – and they’re critical of that, too. Parents can’t win.”

Whether you applaud Weiss’s story or deplore it, it seems to have launched a conversation about parents, children and weight. Perhaps it can lead in a more rational direction that reduces the judging and the bullying in favor of approaches that are actually helpful, both for kids and their parents.

How sweet it isn’t?

First it was fats, then carbohydrates. Now sugar has joined the ranks of nutritional villainy.

With Christmas approaching on a tidal wave of candy canes and gingerbread, one can’t help wondering: Is it OK to indulge in a little sweetness, or is sugar entirely bad?

There’s no denying a certain amount of hysteria when it comes to sugar. Critics claim sugar causes everything from hyperactivity to premature aging. A common – and inaccurate – belief about cancer is that cancer cells feed on sugar.

Some of this is hyperbole but there’s also a considerable amount of science that has examined the effects, both good and bad, of sugar consumption. Sugar has been linked, for instance, to increased risk of weight gain, diabetes and heart disease. Earlier this year, some U.S. health experts went so far as to declare that sugar is as addictive and dangerous as alcohol or tobacco and should be regulated accordingly.

Unfortunately it’s not always clear whether sugar itself is the culprit or whether something more complex is going on.

At least part of the reason why higher sugar consumption is linked to weight gain may simply be the extra calories. One of the issues with sugar-sweetened sodas and other beverages isn’t just that they contain lots of sugar, it’s that they often end up replacing water or milk in someone’s diet. Processed foods high in sugar also can be higher in fat and sodium, which are associated with negative health effects of their own.

A certain amount of sugar is necessary for the human body to function, but moderation seems to be called for here. Consuming large quantities of sugar each day also tends to be a marker for an overall diet that may not be optimal for health.

This brings us to the real vexation: the proverbial sweet tooth. Why do so many people love sugar and why can it be so hard to consume less of it?

I admit to not totally understanding the whole sweet tooth thing. If you were to invite me to your holiday buffet, I would go directly to the spinach dip and the shrimp cocktail. Cookies and candy, not so much. But this wouldn’t necessarily be the case for other guests.

There’s debate about whether so-called sugar addiction is real or imagined. Some studies have found clinical similarities between food cravings and drug dependence. A study published this year in the Journal of Psychoactive Drugs found, for instance, that when people binge on sugar-dense foods, the amount of extracellular dopamine in their brain increases, which has the potential to lead to addiction.

The authors wrote, “There appear to be several biological and psychological similarities between food addiction and drug dependence, including craving and loss of control.” They also note that for some people, consuming these foods is comforting and therefore might be regarded as a form of self-medication.

So far, however, the sugar addiction theory has mostly been tested on rats and mice, with implications for human behavior that are unclear at best. A 2010 review in the journal Clinical Nutrition examined the evidence and concluded there’s nothing yet in the literature suggesting that humans can become addicted to sugar or that sugar addiction plays a role in obesity or eating disorders.

The bottom line is that when it comes to “sugar addiction,” the jury still seems to be out.

In the meantime, here’s some guidance. The U.S. Department of Agriculture’s 2010 dietary guidelines recommend limiting consumption of added sugars and solid fats to somewhere between 5 and 15 percent of total daily calories. The American Heart Association suggests no more than 100 calories a day of added sugar for most women and no more than 150 calories a day for most men. That’s about 6 teaspoons for women and 9 for men.
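Curious where those teaspoon figures come from? The arithmetic is simple, assuming the usual rough conversions of about 4 calories per gram of sugar and roughly 4 grams per teaspoon – my illustration, not the American Heart Association’s own math:

```python
# Convert the AHA's daily calorie limits for added sugar into teaspoons.
# Assumes ~4 kcal per gram of sugar and ~4.2 g per teaspoon (~17 kcal/tsp).
KCAL_PER_GRAM = 4.0
GRAMS_PER_TEASPOON = 4.2
KCAL_PER_TEASPOON = KCAL_PER_GRAM * GRAMS_PER_TEASPOON

for label, kcal_limit in [("women", 100), ("men", 150)]:
    teaspoons = kcal_limit / KCAL_PER_TEASPOON
    print(f"{label}: {kcal_limit} kcal is about {teaspoons:.0f} teaspoons")
# women: 100 kcal is about 6 teaspoons
# men: 150 kcal is about 9 teaspoons
```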

Here’s something else to keep in mind. For most Americans, the main source of the sugar they consume isn’t the spoonful they dump into their coffee, a homemade dessert or even a Christmas cookie. Most of our dietary sugar comes in the form of added sugar – sugars and sugar-based syrups that are added to food during processing. Although the tendency is to single out highly sugared products, such as sodas, as the problem, added sugar can show up in a variety of foods that may not be readily recognized as sweet – chicken nuggets, for instance, or ketchup or children’s breakfast cereals, all of which are often surprisingly high in sugar.

So go ahead and have that reindeer-shaped holiday cookie if you want. If you’re worried about the sugar, just take one.