Rx for primary care

When primary care doctors openly admit they wouldn’t advise medical students to follow in their footsteps, it does not bode well for the future of patient care.

A couple of weeks ago I attended a local meeting on the future of primary care. It was one of several that were hosted around Minnesota this past month to gather ideas and perspectives on the issues surrounding primary care.

On many levels, the discussion was truly depressing. Among the concerns I heard: There’s too much paperwork. Aging and chronic disease have made patient care much more complicated. Doctors are overburdened and dissatisfied. The primary care workforce is shrinking.

There was some talk of potential solutions, such as bringing in scribes to reduce the burden of medical charting or finding ways for physicians’ work time to be more flexible.

But what I didn’t hear was where patients might fit into all of this.

For better or worse, there’s a lot of frustration in health care these days. Consumer Reports recently published the results of an online survey of 660 doctors and the findings are revealing:

- 70 percent of the respondents said they were getting less respect and appreciation from their patients.

- The top complaint was failure by patients to follow advice or treatment recommendations.

- The volume of insurance paperwork was the No. 1 barrier to providing optimal care, followed by financial pressures that force doctors to see more patients for shorter visits.

If health care providers are frustrated, so are patients. Consumer Reports also surveyed 49,000 of its subscribers and found that although the majority were very satisfied with their doctor, they were less so if they felt their doctor rushed them through a visit, dismissed their symptoms or was too quick to whip out the prescription pad.

By now you’ve probably connected some of the dots. No one’s happy with short visits, unappreciative patients, harried doctors or tons of paperwork. Whether we realize it or not, patients are deeply enmeshed in what ails primary care, and if the ship is going down, patients are going down with it.

It’s all the more unfortunate, then, that many of the so-called solutions to primary care often are formulated and implemented with little, if any, input from patients. Take medical scribes. On the surface, it sounds like a great idea: someone who can take over the burden of charting and documentation so the doctor can concentrate his or her attention on the patient. But how do patients feel about this? Will it change the dynamics of the encounter? It’s one thing, after all, to have someone transcribe the doctor’s notes after the visit; it’s quite another thing to have a third person in the room, in real time, taking notes on the visit.

Evaluations of demonstration projects to implement the medical home concept found that patient satisfaction actually eroded. Assumptions that patients don’t mind having their care turned over to mid-level professionals also can be mistaken.

For what it’s worth, I think patients and doctors still value the relationship-building that lies at the heart of primary care. In fact, when the Consumer Reports survey asked doctors what patients could do to obtain better medical care, establishing an ongoing relationship with a primary care doctor was at the top of the list.

Unfortunately the ability to form and sustain those relationships is being seriously fractured by multiple pressures from the outside. There’s a very real danger that in the rush to come up with solutions, we overlook or devalue what makes primary care unique, what draws physicians to primary care in the first place. Who benefits from that? Not physicians, and certainly not patients.

This is a discussion in which patients need to participate. I hope someone out there is listening.

Photo: Wikimedia Commons

A balanced life

For kids, a snow day is like a gift that falls unexpectedly from the sky: a whole day to stay home and do whatever you want.

For adults? Not so much. In fact it’s often a giant inconvenience, and thanks to the wonders of telecommuting, many people are expected to continue working from home.

“Snow day, schmo day. Get to work!” groused Time magazine in a pithy little commentary last year:

Blizzard or no blizzard, it’s business as usual for today’s wired workers. Forget building a snowman with your kids. You’ve got conference calls and e-mails to attend to. And also, since the kids are home and you must work, you’ve got some extra work to do: You’ve got to find someone to watch them.

Whatever happened to slowing down for a day or two?

The statistics about American leisure time aren’t encouraging. Workers in the United States have some of the lowest vacation benefits of any industrialized nation. Whereas time off is guaranteed in countries such as France and Finland, U.S. employers aren’t even required to offer paid vacation time.

Expedia, the travel company, has been tracking vacation trends since 2000, and its data are showing an increase in the number of Americans who have vacation time but don’t use it. Some of the key statistics from its most recent study of 1,530 working-age adults in the United States are rather enlightening:

- About one-third of those who participated in the 2009 poll didn’t take all their vacation time. About one in five also reported canceling or postponing vacation plans because of work.

- One-fourth of the respondents said they continued to check their voice mail and e-mail while they were vacationing.

- Forty percent of the women in the survey said they felt guilty about taking time off; only 29 percent of the men felt this way.

- On average, employed adults in the United States leave three unused vacation days on the table each year.

All of this appears to be a symptom of a deeper issue: an increasing struggle to achieve a balance between work and home, between fostering one’s career and nurturing the self.

One of the most compelling studies on the impact of U.S. work habits and expectations was carried out in 2005 by the Families and Work Institute. Titled “Overwork in America: When the Way We Work Becomes Too Much,” it reaches this conclusion: “For a significant group of Americans, the way we work today appears to be negatively affecting their health and effectiveness at work.”

Among the highlights from the report: Employees who reported feeling overworked were more likely to experience stress, show signs of depression and rate their health as poorer. And here’s a real clincher: Fewer than half (41 percent) of those who reported high levels of overwork said they were successful at taking good care of themselves.

What will it take to change this? It’s not clear whether there’s a universal answer. In some quarters, however, people are trying.

Dr. Patricia Lindholm, a family physician from Fergus Falls and president of the Minnesota Medical Association, has made it one of her priorities to focus on physician well-being. A task force has been formed to come up with ideas and recommendations, and back in January the MMA devoted an entire issue of its monthly magazine to physician well-being.

A survey conducted by the task force identified some of the strategies doctors use for coping: making family time a priority, taking vacations, reducing their work hours, practicing yoga or meditation, being involved in hobbies or activities outside of medicine.

Earlier this month a new national partnership was announced between the Families and Work Institute and the Society for Human Resource Management to redesign workplaces for the 21st century. Among the ingredients: job autonomy, supportive relationships with supervisors, a culture of respect and, above all, flexibility that allows workers time off so they can nurture other aspects of their lives.

It’s no accident, says Ellen Galinsky of the FWI, that many of us get our best ideas while we’re in the shower; it’s the one place where we can be alone with our thoughts. Human beings need space in which to think, she writes in the Families and Work Institute blog:

We need to give ourselves time for rest and recovery. Ask anyone who is really proficient at anything – from intellectual to artistic to physical pursuits. They need time for full engagement and time for rest and recovery, as well as time for plugging in and unplugging from technology. Yet, our images of working hard at school or at work revolve around running non-stop, squeezing more and more in.

Many workplaces can’t (or won’t) change overnight. That leaves it up to individuals to change what they can on their own. Where to start? Here’s an online self-assessment that asks, “Are you overscheduled?” and helps identify areas in need of improvement. Maybe you skimp on sleep because you’re too busy, or have stopped taking time for friendships, or find yourself overcommitted because you can’t bring yourself to say no.

More strategies and ideas can be found here and here. Among them: Build downtime into your schedule; drop activities that sap your time and energy; make small changes, such as leaving work earlier one evening a week.

Is there such a thing as the perfect balance between work and the rest of life? Probably not. Priorities shift; life changes. The ideal seems to lie in being able to accomplish things in the workplace while still having enough time to enjoy the rest of life.

Photo: Wikimedia Commons. Logo: Troy Murphy, West Central Tribune

Sleeping with the dog? Bad human!

Sharing your bed with the family pet? Yeah, I’ve done it.

But according to a new study published this month in the CDC’s Emerging Infectious Diseases journal, doggy and kitty bunkmates aren’t necessarily the best thing for human health. Among the risks: bacterial and viral infections, parasites, plague and cat scratch disease.

The study was carried out by researchers with the University of California-Davis and the California Department of Public Health, who reviewed the scientific literature for cases of zoonotic disease transmitted from pets to human companions.

Here’s a sampling of what they found: Several cases of plague that occurred in New Mexico over the course of a decade and were apparently transmitted by fleas found on cats and dogs who slept with their owners. A handful of documented cases of cat scratch disease linked to sleeping with or being licked by an infected cat. A case of meningitis in a 60-year-old British homemaker who “admitted to regularly kissing the family dog.” An Australian woman who developed septicemia and multiple organ failure after her puppy licked a wound on top of her foot. Instances of methicillin-resistant staph, hookworm and Chagas disease, all apparently transmitted through close contact with pets.

Whoa. Who knew?

Something tells me pet lovers aren’t going to take this news lying down. After the story was posted Tuesday on the ASPCA’s page on Facebook, the comment section exploded with reactions. At last count, more than 1,000 people had spoken their minds about their bond with their cats and dogs.

Some typical comments:

- “Now that I have Princess finally sleeping with me, I should kick her out? NO WAY!”

- “If I didn’t get to sleep with my little Yorky, I’d DIE!!!!”

- “The only health risk is the possibility of falling off the edge of the bed because they hog it.”

- “I don’t care what risks there may or may not be. My babies sleep with me, period!”

Talk about hitting a nerve. But I don’t think the response is surprising. Here in the U.S., we love our animal companions. The authors of the study cited statistics from several surveys indicating that more than 60 percent of American households are home to at least one pet. Many of these folks view their pets as members of the family (I know I certainly do), and the practice of allowing Rover and Fluffy on the bed at night is common. That makes the potential health risks a widespread issue.

Advice from the health community unfortunately is often at odds with how people feel about their pets. Recommendations for good “sleep hygiene” (a judgmental term if there ever was one) often include banishing pets from the bedroom. Ditto for households where someone has asthma or allergies. I recall an intense discussion a few months ago in an online cancer forum, where someone posted that he was about to undergo a stem cell transplant and had been told by his oncologist to temporarily get rid of his dog.

The authors of this latest study in fact suggest that pet owners – especially young children and people who are immune-compromised – should be discouraged from sharing a bed with or kissing their pets.

In all fairness, the health implications of sleeping with our pets, kissing them or being licked by them haven’t been particularly well-studied. Nor is this level of human-animal contact often considered by hospitals and nursing homes when they craft their policies on animal visitors. Having some extra data about the potential risks isn’t automatically a bad thing, and being aware of the risks can be especially helpful for those who might be more vulnerable.

But should the results of this study make us change our ways? While it’s entirely possible to get plague or MRSA from the family pet, the incidence of cases like these is very low, at least in the United States. Some could probably have been prevented altogether with appropriate care – flea prevention, for instance, to reduce the likelihood that a pet will bring plague-carrying fleas into the house. Routine veterinary care, vaccinations and supervision to prevent dogs and cats from roaming freely all over the neighborhood also can go a long way toward enhancing the health of animals and humans alike.

As far as I’m concerned, the bottom line is this: The evidence that animal companions benefit our health far outweighs any of the bad stuff that might happen. So my cat will continue to share the bed at night – if, of course, that’s OK with her.

Photo: Wikimedia Commons

Those difficult patients

Admit it: Every so often you secretly wonder if your doctor thinks you’re a difficult patient.

Now you can calculate your odds, sort of. The Journal of General Internal Medicine recently published a study that attempts to quantify how many patients are considered difficult and what makes them so.

The researchers surveyed 750 adult patients who visited a primary care walk-in clinic at Walter Reed Army Medical Center in Washington, D.C., from 2005 to 2007. The patients were assessed for issues such as symptoms, expectations, functional status and the presence of any mental disorder. After their visit with the doctor they were surveyed two more times – as they were leaving the clinic and again two weeks after the visit.

Clinicians were surveyed as well to measure their perception of the visit and to rate each of the 38 participating doctors on their psychosocial skills.

Previous studies have estimated that up to 15 percent of patient encounters are perceived as difficult by clinicians, and this study landed in the same ballpark, putting 17.8 percent of the patients in this rather undesirable category. Do the math for a physician who sees, say, 35 patients a day, and roughly six of those encounters, on average, will be considered difficult.

What made these patients difficult? A whole range of things, apparently. They were more likely to have five or more symptoms, more likely to report being stressed, and more likely to have depression or anxiety. They also were less likely to trust the doctor or feel satisfied with the visit, and less likely to report improvement in their symptoms two weeks after the visit. Other behaviors such as disrespect or belligerence can make the encounter difficult too, of course, but what made this study especially interesting was how it sought to identify some of the less obvious contributing factors.

Lest you think this is a one-way street, the researchers analyzed the surveys filled out by the participating physicians and came up with some conclusions about the doctors as well, namely that physicians who viewed certain patients as difficult often had less experience and weaker psychosocial skills.

In a companion article that appeared this week in American Medical News, one of the study’s co-authors, Dr. Jeffrey L. Jackson, suggested that older, more experienced physicians probably have a better perspective on dealing with patients whom they perceive as difficult. His advice: Younger doctors need to accept that some patient encounters will be challenging and should learn how to manage these visits more smoothly.

A few thoughts come to mind. First, patients with complex medical needs often are seen as difficult, a finding confirmed by other studies as well as this one. (Is “difficult” even the right word? Or is it more accurate to call these patients “medically challenging”?) It can take a lot of work and repeated visits to sort through and manage all their issues, as this vignette from the American Family Physician journal illustrates all too clearly:

A 45-year-old woman is being treated for depression, chronic daily headaches, type 2 diabetes, hypertension, and hyperlipidemia. She is taking eight prescription medications. Multiple antidepressants have been tried without improvement in her depressive symptoms. Her blood sugar level remains out of target range. She does not exercise and admits to overeating when feeling depressed. She discloses difficulties with her employer and says her headaches are worsening because of stress. Her weight is slowly increasing.

Were I this patient, I’d be frustrated, stressed and unhappy about the lack of progress too – but let’s face it, it’s frustrating for physicians as well, who tend to be programmed for success rather than failure.

It’s not surprising that patient acuity has become one of the issues of contention in the push to assign more primary care to nurse practitioners and physician assistants. Doctors often are reluctant to have mid-level professionals skim off the more straightforward cases, leaving the physician to manage all the patients whose needs are complicated. Even when doctors are up for the challenge, it can become overwhelming and lead to burnout when they’re forced to function at their utmost cognitive capacity all day long – particularly when their patients’ health shows little sign of improvement.

A second observation: Psychology seems to play a role in making some patients more difficult, suggesting there needs to be greater attention to this aspect of the patient’s health.

And third, sometimes it isn’t specifically the patient or the doctor who’s the problem; it’s the health care system itself. Fewer doctors (especially in primary care), more patients and shorter visits make it harder for doctors and patients to develop trust in each other, form a solid relationship and adequately address all the patient’s concerns. These “soft” objectives can be just as important as the concrete goal of improving the patient’s health, but they often take more time than most systems allow. The result? Everyone’s dissatisfied and the situation is that much more likely to become what doctors and patients alike dread: difficult.

The weekly rundown, Feb. 23

Blog highlights from the past week:

Most-read posts: Wedding ring = better health?; Zone of restriction; Linkworthy 3.1: Food for thought.

Most-read from the archives: No respect for the thyroid, from Jan. 21, 2010.

Most blog traffic: Wednesday, Feb. 16.

Link with the most clicks: an in-depth article about statins, which appeared in “Linkworthy 3.1.”

Search term of the week: “fat lady singers.”

Too many CT scans?

Are CT scans being overused? More to the point, are they being overused in hospital emergency rooms?

The flames seem to be finally dying down after a contentious online debate sparked by Dr. Robert Centor, who blogged recently about a study showing increased use of CT scanning in hospital emergency departments, and suggested this is most likely due to the patient load and pressures on emergency-room doctors.

“It appears that too often CT scanning takes the place of a careful history and examination,” he observes. “This can occur when the emergency physician is drowning in patients.”

The ensuing discussion was argumentative – one poster accused Dr. Centor of being “smug” – but it shed some interesting light on an issue that’s far more complicated than it appears on the surface.

The No. 1 cause of excess CT scans is fear of litigation, one physician said. Someone else stated, “We order too many scans because we can.”

In the high-stakes emergency setting, physicians can’t afford to miss a diagnosis because they’ve decided to skip a test, wrote another physician: “My experience doing emergency medicine is that a lot of CT scanning is ordered either to rule out conditions where we are expected to have 100% certainty not to miss (subarachnoid hemorrhages are one example already mentioned), or where the demand for certainty is being driven by the patient or our inpatient colleagues.”

The debate continued in a second post, where one of the key questions became: How do we know patients are receiving too many CT scans, and how do you define “too many”?

So why should the average person care about this issue? There’s no question that imaging technology has aided greatly in diagnosing what’s wrong with the patient. It has helped reduce uncertainty and in many cases eliminated the need for expensive and potentially risky exploratory surgery. But in much of the public discussion, the focus has been on the benefits rather than the downside. We talk about the certainty of diagnosis and less about the incidental findings that can lead to more testing and possible harm to patients. We talk about how beneficial it is to obtain high-definition images of what’s happening inside the body and not so much about the long-term impact of all that radiation exposure.

There are some signs the tide may be turning. The risks vs. the benefits of medical imaging are being discussed more often, especially among health care professionals, in ways that are thoughtful and evidence-based. Consider Dr. Bob Wachter, one of the leading patient safety gurus in the U.S., who blogged recently about the potential harm inherent in CT scanning. “Even if the risks turn out to be less than we fear, most skeptics now agree that we’re causing a lot of cancers, and that many could be prevented if we took a few sensible steps,” he wrote.

Dr. Wachter concludes the medical community needs to “Just Say No more than we ever have before.”

While this is a sound clinical concept, I suspect it’s easier said than done. According to a study published in December in the Annals of Emergency Medicine and conducted with a cross-section of patients who came to the ER with abdominal pain, patients felt more confident about the care they were receiving if a CT scan was part of the medical evaluation. They also underestimated how much radiation is contained in the average CT scan, and many couldn’t accurately recall their previous history of CT scans.

Patients aren’t the only ones contributing to the overuse of CT scans in the United States, but their expectations clearly are among the factors. A little education and awareness by the public certainly wouldn’t hurt and it might even help lead toward a more judicious, restrained use of CT scans.

West Central Tribune photo by Ron Adams

Linkworthy 3.1: Food for thought

There’s been a lot to think about in the health care news that has been dished up lately, starting with the story of CBS reporter Serene Branson, who unexpectedly stumbled over her words during a live broadcast at last weekend’s Grammy Awards ceremony.

Am I the only one who was bothered by the number of media outlets that seized on the footage and ran it over and over and over? Or by the noisy chorus of speculation about what might have prompted Branson’s symptoms or the medical treatment she should have received?

As it turns out, Branson had a complex migraine with aura – not a stroke or even a seizure, as many of the armchair diagnosticians suggested. In an interview today on the “Early Show,” she said, “I was scared. I didn’t know what had gone on and I was embarrassed and fearful.”

Episodes like this can sometimes be a teachable moment… or are they? Gary Schwitzer, publisher of HealthNewsReview, blogged this week about having mixed emotions over how the NBC “Today Show” covered the incident. “Granted, Ms. Branson is a public figure whose performance was captured on video,” he wrote. “Does that mean she abdicates any right to privacy about what did or did not happen to her? Is such speculation vital for public discussion? Worthy of 5 minutes-plus of network television airtime? Or is this a matter of capitalizing on a person’s misfortune because you know the story is drawing lots of eyeballs?”

Good questions all, with further comments from readers.

Here’s an interesting piece of news via Public Citizen’s Health Research Group, which recently got its hands on a transcript of a webinar held earlier this month for members of the American Society of Plastic Surgeons and the American Society for Aesthetic Plastic Surgery. The issue: Surgeons apparently were advised to tone down their communication with patients about the risk of developing anaplastic lymphoma from breast implants.

This particular complication has only recently come to the attention of the public. Although the risk appears to be extremely small, a handful of published case reports indicates that some women have had to undergo chemotherapy and/or radiation, and some have had recurrence of their disease.

According to the transcript obtained by PCHRG, however, the surgeons who participated in the webinar were told to use the word “condition” in communicating with their patients, “rather than disturb them by saying this is a cancer, this is a malignancy.”

It can be a fine line between giving patients adequate information and scaring them with too much. But it’s a whole ‘nother story to deliberately manipulate the terminology in order to present patients with a watered-down version of the facts. I think this is a bad call; readers are welcome to disagree (or agree) in the comment section below.

Millions of Americans are taking statins to lower their cholesterol. This blockbuster class of drugs has been widely viewed as an important advance in the war on heart disease. But are statins really All That? Perhaps not. A very thought-provoking article in the most recent issue of Proto, a magazine published by Massachusetts General Hospital, delves into the wisdom of what we think we know about statins.

Among the questions it examines: Do statins really save lives? Do they benefit women as well as men? Why have some of the clinical studies on the use of statins produced contradictory findings? This is an article that goes well beyond the superficial, showing why so much in medicine is anything but cut and dried. If you’ve always wanted to know more about statins or about evidence-based care in general, it’s worth a read.

A while back I blogged about an essay contest featuring personal stories about patient and provider experiences with paying for health care. The sponsoring organization, Costs of Care, has posted a handful of new stories on its blog, and here they are: Getting an estimate, Cruel shoes, A medical student’s dilemma, A $1,000 coding error, and Not colon cancer. Recommended reading.

Patients are supposed to participate in their care, right? It sounds easy but all too often it isn’t – especially when the patient is sick. Jessie Gruman, president of the Center for Advancing Health and a current cancer patient, blogged this past week about the collision between ideals and reality in A Valentine to shared decision-making. During her first visit with her oncologist, she was still recovering from surgery and was “foggy with fatigue,” she writes. By week 3, she felt well enough to make decisions about which chemotherapy regimen to pursue. Week 5? “I have almost no recollection of this meeting,” she writes. “I feel so sick I can barely sit up.”

Why, she wonders, should we put so much energy into developing “this sweet moment of converging sanity, capacity and data?” Her conclusion: “Because we need a good model.” Go there to read the rest.

This edition of Linkworthy concludes with a pair of anonymous blog entries, both written by health care professionals and both having to do with cancer: Tragedy of cancer in a small child and Today. How do you explain tragic outcomes? How do the professionals who care for patients make their peace with death when they’re forced to confront it over and over? “i don’t believe there are reasons for accidents and bad genes and screwed up dna, but i do believe there are reasons for faith,” the anonymous OncRn writes. “Especially today.”

Photo: Wikimedia Commons

Zone of restriction

The No White Foods Diet sounds simple: Just eliminate everything white from your plate, and weight loss will follow.

Longtime dieters have probably heard of the Grapefruit Diet, a high-protein, very low calorie diet that involves a limited menu of food choices and half a grapefruit or a glass of grapefruit juice with every meal. Then of course there’s the Atkins Diet, extremely low on carbohydrates and high on proteins and fats.

What these and many other fad diets have in common is restriction. Restriction in variety, restriction in quantity, restriction in calories, restriction in food groups such as carbs, sugar or dairy.

In theory they help people lose weight and become healthier. But do they really? More to the point, can any restricted diet be sustained over the long term?

It’s not an idle question. Hundreds of thousands of Americans are looking for a better way to eat and many of them are wondering: Should they eliminate carbohydrates? Should they cut out sugar? What about going vegetarian?

But the jury is still out on many of the restrictive fad diets. Although the highly popular Atkins diet has helped countless people lose weight, it remains controversial for its high-protein, low-carb philosophy. For one thing, it limits the consumption of fruits and vegetables, which nutrition experts widely agree are important for a healthful diet. For another, critics are concerned that long-term pursuit of the Atkins diet may lead to heart disease and increased risk of kidney stones and gallstones.

Then there’s the restriction on carbs – no more than 40 grams a day, although the human body generally needs at least three times that much in order to function properly. Refined sugar, milk, white potatoes, white rice, and bread and pasta made with white flour are supposed to be eliminated completely and forever.

When WebMD reviewed the Atkins diet, one of the experts they spoke with was Barbara Rolls, author of “Volumetrics” and a professor at Penn State University, who had this to say:

“No one has shown, in any studies, that anything magical is going on with Atkins other than calorie restriction. The diet is very prescriptive, very restrictive, and limits half of the foods we normally eat. In the end it’s not fat, it’s not protein, it’s not carbs, it’s calories. You can lose weight on anything that helps you to eat less, but that doesn’t mean it’s good for you.”

The sad truth about many of these diets is that they ultimately fail, at least in part because many people find the restrictions too unrealistic and too difficult to live with for the long term.

I learned this firsthand a few years ago when I made a short-lived foray into veganism. Initially, the prospect of eliminating meat, fish, dairy and eggs sounded doable. But soon I found myself spending more and more time and energy planning meals and tinkering with recipes, trying to meet the vegan restrictions and still achieve some kind of nutritional balance. Besides being high-maintenance, it was monotonous beyond belief – something the proponents of restricted diets often don’t warn you about. When I began contemplating whether I needed to take a multivitamin supplement to make up for the nutrients I wasn’t getting at the dinner table, it became apparent that a vegan diet didn’t really make much sense, at least for me. I’m sure there are people who thrive happily on a vegan diet, but I’m not one of them.

Indeed, amid all the clamor about obesity and overconsumption of food, there’s an issue we seem to have missed: malnutrition.

A study from Australia that recently appeared in the Nutrition and Dietetics journal found that as many as one in three elderly hospital patients and up to 70 percent of nursing home residents are either malnourished or at risk of malnutrition. Author and researcher Karen Charlton also found that malnutrition often isn’t seen as a clinical priority and hence is underdiagnosed. She called it “the skeleton in the closet of many Australian hospitals.”

Malnutrition is more likely to be a risk for certain populations – people who have undergone bariatric surgery, for instance, or have cancer or kidney disease. For those with eating disorders, it can be deadly. And although undernourishment is far more common in developing nations where many people go hungry, it also occurs in wealthy countries, in the form of both hunger and the overconsumption of an unbalanced diet.

The National Eating Disorders Association makes several important points about what can happen when people pursue diets that are overly rigid or restrictive: For one thing, restrictive diets are often short on vital micronutrients such as calcium. For another, they can affect muscle strength, metabolism, coordination and the ability to concentrate.

Perhaps one of the best-known studies on the effects of food deprivation was carried out at the University of Minnesota from 1944 to 1945. Known as the Minnesota Starvation Experiment, it involved 36 male volunteers who were subjected to an extremely low-calorie diet, then renourished with a specially designed rehabilitation diet. Although the main purpose of the experiment was to study the effects of famine and famine recovery in the aftermath of World War II, it resulted in several other important findings as well, one of them being the psychological effects of food restriction and calorie deprivation. During the starvation phase, the subjects became extremely preoccupied with food and experienced distress, depression and social withdrawal. Physical effects that were observed included a lowering of heart and breathing rates, reduced body temperature and reduced basal metabolism.

The average American consumer is unlikely to reach these extremes. But on a lesser scale, the experiment illustrates some of the issues that can arise when diets become highly restrictive.

So what’s the answer? A balanced diet, defined as “getting the right types and amounts of foods and drinks to supply nutrition and energy for maintaining body cells, tissues, and organs, and for supporting normal growth and development.” It should include all the major food groups in adequate proportions.

While the food pyramid might sound trite to many people, it’s as good a place as any to start. Concerned about carbs? Instead of cutting them out completely, consider switching to whole-grain products. Want to reduce your fat intake from dairy? Try low-fat or fat-free options. It can be harder for vegetarians to meet all their daily nutrition requirements but it’s entirely possible, as long as they’re including enough protein, iron, calcium, zinc and other key nutrients.

The DASH eating plan is one of the few so-called diets that seems to have stood the test of time. Short for Dietary Approaches to Stop Hypertension, it was designed to reduce blood pressure but has also been found to help lower cholesterol, insulin resistance and weight, and perhaps even to benefit neurocognitive functioning. It emphasizes fruits, vegetables, whole grains, low-fat or non-fat dairy, lean meats, fish, nuts and legumes, plus limited amounts of sugar. The health benefits of following the DASH diet have been well studied and mostly reinforce its positive effects. Although it has some restrictions – mainly on calories and sodium – it’s not unrealistically rigid, nor does it demand that people eliminate entire categories of food from their plate.

Making dietary changes isn’t easy, period. Even with a balanced diet, it can be challenging to manage the many requirements. Extreme restrictions can make it harder than it needs to be, though. It really does come down to achieving a good balance – not too much and not too little of anything.

Food photos: Wikimedia Commons. Logo: Troy Murphy, West Central Tribune

The weekly rundown, Feb. 16

Blog highlights from the past week:

Most-read posts: Matters of the heart; The weeknight dinner dilemma.

Up and coming: The great debate: Cameras in the delivery room.

Most blog traffic: Monday, Feb. 14.

Link with the most clicks: A list of the 20 least-healthful restaurant foods, from “The weeknight dinner dilemma.”

Search term of the week: “exploring the world of raccoons.” (Don’t ask; I’ve never blogged about raccoons and have absolutely no clue how someone managed to find this blog with this particular search term.)

Current top search terms: “wedding rings”; “no show patient”; “portion distortion”; “raw milk”.

Raccoon photo: Wikimedia Commons

The great debate: Cameras in the delivery room

For most parents, few things can match the wonder of photos and video capturing the arrival of a new child into the world.

But at some hospitals, families who bring a camera into the delivery room will be informed that picture-taking is off limits. This is the policy at several hospitals in Maryland and at least one in Massachusetts, where neither photos nor video are allowed until after the baby is safely delivered and the medical team has given permission for pictures.

It’s a substantial reversal of the long-standing practice of encouraging families to be involved and engaged in the birth of their child, and it has sparked some passionate debate about the pros and cons of allowing cameras in the delivery room. The New York Times led the pack with a recent story and online discussion about the issue.

Here’s some background from the Times article, which includes the story of a Maryland mom who was so upset about the camera restrictions that she started an online petition:

For the hospital, the issue is not about rights but about the health and safety of the baby and mother and about protecting the privacy of the medical staff, many of whom have no desire to become instant celebrities on Facebook or YouTube.

Their concerns come against a backdrop of medical malpractice suits in which video is playing a role. A typical case is one settled in 2007 that involved a baby born at the University of Illinois Hospital with shoulder complications and permanent injury; video taken by the father in the delivery room showed the nurse-midwife using excessive force and led to a payment to the family of $2.3 million.

Nationwide, photography and videography have been allowed in many delivery rooms for decades. But in recent years, technology creep has forced some hospitals to rethink their policies as they seek to balance safety and legal protection against the desire by some new mothers to document all aspects of their lives, including the entire birth process.

It seems there’s plenty to argue about. At the Times’ “Room for Debate” opinion section, an obstetrician-gynecologist questions how far parents should go in capturing and sharing the birth experience. “The first time I saw a camera lens creeping into my field of view while delivering a baby, I thought the biggest issue was the appropriateness of exposing others to really intimate videos,” wrote Dr. Amy Tuteur. “Our vacation videos are bad enough. Are we really going to force friends and relatives to watch such a personal event?”

Photographing or videotaping a live birth is a decision that ought to be negotiated between the parents and the OB team, she concluded.

Jennifer Margulis, a contributing editor with Mothering magazine, had a completely different take: “Anytime you’re told you may not record what happens to you, be very wary of the people you are dealing with… Hospitals don’t want you taking pictures because they fear you might record their mistakes.”

At a deeper level, the whole debate raises issues about how we experience major life events when there’s a camera involved, says Raymond DeVries, a professor with the Center for Bioethics and Social Sciences in Medicine at the University of Michigan Medical School. If you’re going to bring a camera into the delivery room, think twice, he wrote. “Birth is an intimate moment to be experienced and cherished. Women in labor should not be thinking ‘I’m not decent for Facebook.’ When the camera is calling the shots we have lost something precious about human experience.”

The delivery room seems to be the flashpoint for a larger shift in how we view privacy and patients’ expectations of participating in their health care. Several months ago I blogged about allowing family members in the room when someone is being resuscitated. Do families have the right to be present, even when what they’re witnessing is bound to be disturbing and might invade the patient’s privacy? Many would argue that they do, but it has made for considerable tension between patients and health care professionals.

Health care workers often are ambivalent about the lengths to which they can accommodate the wishes of patients and families. What if a family member wielding a video camera gets in the way at a crucial moment? What if the family’s presence becomes disruptive?

Privacy boundaries can be crossed for health care workers as well as for patients. Consider a 2007 incident at Children’s Hospital in Boston, where the family of a young cancer patient installed a webcam in the child’s hospital room so a favorite relative could see what was happening – but staff didn’t know the webcam was there until it was accidentally discovered by a nurse.

Judging from the online discussion at the New York Times, it’s a volatile topic. An anesthesiologist who attends C-sections wrote that “the focus needs to be on the patient and not rescuing a visitor from the floor after they have fainted (yes, it happens a lot).” From someone else: “It’s a medical procedure, not a fraternity kegger, for Pete’s sake. Why do people treat everything like it’s primarily some sort of photo op?”

A mother wrote:

I expressly forbade family members from videotaping me during labor and delivery. It’s not a pretty process in the best of circumstances, and still snapshots of me holding the baby after arrival suffice for commemorating the occasion. I don’t comprehend the sort of exhibitionism that would drive someone to want their delivery posted on Facebook.

Someone else argued that cameras are everywhere and we need to get used to it. “Cameras are documenting many facets of our lives and medical people should not be treated any different than other people,” he wrote.

Another commenter wondered whatever happened to pleasing the customer: “I’m not spending $10,000 a year on insurance to have my service providers dictate chapter and verse of how I experience the care I pay them to provide.”

It seems reasonable for hospitals to have some basic rules about the presence of cameras in the delivery room, if for no other reason than to make it clear what’s expected of families and to prevent the picture-taking from becoming disruptive. How far a hospital’s policy should go is less clear. Should there be an outright ban on photos and videos during the actual delivery? Should this be negotiated on a case-by-case basis between families and the OB team? What do readers think?

Photo courtesy of Brent and Gretchen Schlosser