Free-range kids

Do kids fare better emotionally, socially and physically when they’re allowed plenty of unstructured time to play?

Children don’t benefit from being overly programmed, researchers contend in a series of articles appearing in the latest edition of the American Journal of Play. In fact, they argue that the tendency of many parents to schedule their child’s day down to the nanosecond may actually be harmful, giving rise to a generation more prone to depression, suicide and narcissism than that of their parents or grandparents.

Peter Gray, the journal’s guest editor and a research professor of psychology at Boston College, outlines the changes that have taken place in children’s play:

In the 1950s and 1960s, and to a lesser degree in the 1970s and 1980s, it was possible to walk through almost any North American neighborhood – after school, on weekends or any time in the summer – and see children outdoors at play. Today, in many neighborhoods, it is hard to find groups of children outdoors at all, and, if you do find them, they are likely to be wearing uniforms and following the directions of coaches while their parents dutifully watch and cheer.

Many adults, especially once they reach middle age, tend to look back on their own childhood with a certain amount of romanticism, exaggerating the good times and creating a standard against which the current state of childhood can’t possibly measure up.

But Gray’s description seems to be more than mere observation. Various studies have objectively reported a steady decline in the amount of time children spend in unstructured play. One of the larger studies, carried out by a research team at the University of Michigan, conducted an assessment in 1981 and again in 1997 of how children spent their day. Among the findings: Children in 1997 played less and had less overall free time than they did 16 years earlier.

The journal authors explain that without time to play, kids are missing out on opportunities to be physically active, tap into their creativity, gain social skills, manage risk, regulate their emotions and learn to become self-reliant – abilities they’ll need later in life.

Age-mixed play seems to have a special value, allowing younger children to learn from older peers and older children to develop skills in nurturing, teaching and leadership.

In an interview, authors Hara Estroff Marano and Lenore Skenazy raise a provocative question: When kids spend time in free play, can it help reduce the incidence of bullying? Marano says yes:

Most kids have built-in, internal restraints against bullying, or, at least, they used to – when kids were allowed to play with each other and develop social skills and when normal abilities to adjudicate disputes would come into play and be sharpened by playing. Lack of play is creating many of the conditions that allow bullies to exist.

Marano also cites research demonstrating how play stimulates the genes that govern nerve growth in the frontal cortex, the executive portion of the brain. Play might not seem goal-directed but it helps create the mental foundation that in turn allows kids to mature in their ability to govern their emotions and behavior, she said.

The two women argue that parental anxiety is one of the key factors behind the steady decline in children’s play time. They may be right but there seem to be plenty of other reasons as well. For one thing, the rise in the number of parents who both work has led, by necessity, to family lives that are more scheduled. For another, families are smaller, thus shrinking the available pool of neighborhood kids for spontaneous play. Between the increasing demands of schoolwork and the array of sports and enrichment opportunities now available to kids, play time also is being edged out by competing activities that may not have existed a generation ago.

How can parents turn this trend around? It may take some deliberate engineering. Marano suggests gathering a group of like-minded parents, setting aside an outdoor space for their children to play, possibly installing some playground equipment – and then going indoors and allowing the kids to play on their own.

It’s the essential job of parents to prepare their children for independent adulthood, she said. “Play is the next great facilitator of that system. In addition to all the great things play does for the brain, the peer play of childhood is important in giving kids social skills to be used in all kinds of situations and is important in shaping a whole generation.”

Photo: Wikimedia Commons

Taking concussions more seriously

In the past five years, brain experts have learned more about concussions than in all the previous years put together.

It’s now known that a concussion is actually a mild form of brain injury, with potential long-term consequences if it’s not managed correctly. We now know that when children and teens sustain a sports-related concussion, their still-developing brains take longer to heal than an adult’s, and that cognitive rest needs to be part of the recovery process.

The science has literally been a game-changer. This year Minnesota became one of the latest states to implement a law requiring young athletes to be removed from play if they’re suspected of having a concussion. The new state law also requires coaches and officials to undergo training on how to recognize and manage concussions. The hope is that, down the road, fewer sports-related concussions will go undiagnosed and fewer kids will end up with memory problems or long-term cognitive disabilities from a blow to the head.

It may take some time, though, for all of this knowledge to reach the general public. In a report published last year in the journal Pediatrics, researchers came to the rather disturbing conclusion that many parents don’t take concussions as seriously as they should.

The researchers analyzed the medical records of 434 children admitted to a children’s hospital in Hamilton, Ontario, with a diagnosis of acquired brain injury. Among a subgroup of 341 of these children, 32 percent were formally diagnosed with a concussion.

The kids who were diagnosed with a concussion left the hospital sooner and returned to school sooner, the researchers found. In fact, they noted an apparent trend: Simply being labeled as having a concussion seemed to be a strong predictor for how soon these young patients were sent home from the hospital.

Do parents understand what is meant by the term “concussion”? Many of them may not realize that a concussion involves injury to the brain and hence regard it as less serious, the researchers wrote. They argue for calling a spade a spade: The term “mild traumatic brain injury” should be used so parents have a more accurate understanding of their child’s head injury and can make better decisions.

Concern for the long-term impact of concussion extends to the world of professional sports as well. There are signs that pro baseball and the NFL are beginning to take head injuries more seriously. But only this week, an editorial in Canada’s National Post blasted the National Hockey League for caring more about market share than “the woozy, battered elephant in the room.”

If people don’t take concussions seriously enough, perhaps the real issue is that many of them are relying on outmoded information. After all, it wasn’t that long ago that young athletes were sent right back onto the field after taking a blow to the head, and it was considered acceptable to do so. As the state of knowledge continues to evolve, many of these practices are being discarded in favor of approaches backed by the accumulating evidence on the long-term impact of concussions.

Could some concussions be prevented altogether? Although this issue doesn’t seem to be talked about very widely, it seems ripe for further study. A study last year found that in the decade from 1997 to 2007, sports-related concussions among pre-teens and adolescents doubled in incidence, in spite of an overall slight decline in the number of kids participating in organized sports.

Some of this may simply be due to better recognition of sports-related concussions. But there also appears to be an overall increase in the number of concussions among middle-school and high school-aged kids, period. Is it because many young athletes these days are larger? Is it because they play more aggressively? The reasons aren’t clear.

Science will no doubt continue to advance what we know about concussions. It’s to be hoped that athletes will some day be much better protected against permanent damage from a blow to the head, and that there’ll be a better public understanding that concussions can be serious indeed.

West Central Tribune photo by Ron Adams

Rethinking hospital readmissions

There’s long been a belief that hospital readmissions are largely preventable, and that if a patient is readmitted after an earlier hospital stay, it’s a sign of lower-quality care. In recent years, hospitals in the U.S. and elsewhere have poured untold amounts of time and resources into lowering their readmission rates.

A new study from Canada has come along to challenge this thinking. The study, which appeared this week in the Canadian Medical Association Journal, analyzes emergency readmissions among 4,800 patients at 11 hospitals in Ontario and concludes that only about one in five could have been prevented.

Moreover, the researchers found that the readmission rate may be lower than previously thought. In this particular study, 13.5 percent of patients in the sample group had an urgent, unplanned readmission within six months of a hospital stay.

One study doesn’t make a trend. For one thing, the analysis didn’t include patients who were readmitted after being discharged to a nursing home and who may have been older, more frail and more vulnerable. For another, the determination of whether an admission was avoidable was to some extent subjective.

But it certainly calls into question the widespread belief, especially among health policy folks, that readmissions are a reliable indicator of whether a hospital is “good.” It also challenges the assumption that readmissions can be avoided if only hospitals work diligently and hard enough.

Among the many nuggets of information in the Canadian study: Patients who had an urgent readmission were more likely than the sample group as a whole to have chronic or serious health conditions and a history of previous hospital admissions.

There appeared to be multiple reasons why these patients had to be readmitted. Case reviewers found that some readmissions were obviously avoidable but others were less clear-cut. Errors in patient management were among the most common factors. So were surgery-related complications and medication-related issues. Some patients were readmitted because they developed a hospital-associated infection, others because there was a diagnostic error during their initial hospital stay.

One point worth noting: Readmissions that were deemed preventable mostly occurred within a few days after the patient left the hospital. This could suggest many things – for instance, that this subset of patients was perhaps simply more sick to begin with, or not quite ready yet to be sent home.

Another crucial point: The case reviewers couldn’t always clearly determine whether a readmission was truly avoidable and often needed more information before classifying it as preventable or not.

Dr. Carl van Walraven, a clinical epidemiologist at the Ottawa Hospital Research Institute and lead author of the study, told the Ottawa Citizen, “Not all urgent readmissions are avoidable, despite the care that is provided. This means that a lot of them are caused by a patient’s condition, or other factors that are not treatable or modifiable.”

The use of metrics, or statistically measurable indicators of hospital care, is widespread in the industry. Increasingly, the federal government, payers and quality assurance organizations are tying metrics to how much hospitals are paid – more money for hospitals that meet the standard, less for those that don’t. But what if the metrics are based on a faulty assumption, i.e., that most unplanned readmissions are avoidable?

If anything, this study seems to underscore how difficult it is to define and measure quality in health care. Can every pressure ulcer be prevented? Perhaps not, although the most serious pressure ulcers are probably avoidable and the goal should be to make them a very rare occurrence. Much more can be done to lower the rate of hospital-acquired infections but it may not be possible to get down to zero.

This isn’t to say that high-quality hospital care doesn’t matter, because it does. But hospitals deal with sick human beings in all their infinite variety and the results aren’t always standard or predictable. The risk with metrics is that they can reduce the definition of quality to “that which can be measured” instead of the complex, nuanced, many-faceted creature that seems to be emerging the more we study it.

Photo: Wikimedia Commons

Deep-fried everything on a stick

I’m going to reveal a secret: I’ve lived in Minnesota all my life but have never taken part in that quintessential rite of late summer – the consuming of deep-fried cheese curds (or deep-fried anything, for that matter) at the Minnesota State Fair.

Does this mean I’m not a true Minnesotan?

It’s hard to comprehend how the hot, greasy, calorie-laden culinary offerings at the state fair can be so bad for us, yet so decadently appealing.

What do we love? Time magazine compiled a list a few weeks ago of America’s favorite traditional fair food: funnel cakes, pie, corn dogs, cotton candy, caramel apples, corn on the cob.

Tame, tame, tame.

The state fair food booths these days are capable of so much more.

Among the new treats at the Minnesota State Fair this year (the fair opens Thursday and goes through Labor Day, Sept. 5) are deep-fried cookie dough, chocolate-covered jalapeno peppers on a stick, and a concoction that’s been dubbed the Breakfast Lollipop, a sausage patty dipped in corn muffin batter, deep fried and served (what else?) on a stick with maple syrup on the side.

The Kill-Me-Now-With-a-Heaping-Helping-of-Cholesterol award undoubtedly goes to the Wisconsin State Fair, which this year introduced… wait for it… deep-fried butter, described by the vendors as “pure butter, deep fried with a crisp coating.” Diners can consume it solo or go for a combination platter that also includes a stick of chocolate-covered bacon and a cheeseburger sandwiched between Krispy Kreme doughnuts.

If you want the unusual, no one does it better than the Texas State Fair, where the concessionaires stage an annual food contest. One of the winning entries last year: fried beer. The Texas State Fair also brought the world such gastronomic delights as deep-fried latte, fried banana splits and chicken-fried bacon.

Calories? It’s safe to say that many of the most popular state fair foods are appallingly high in calories, fat and sodium. WebMD spoke to a nutritionist from Texas who guesstimated about 660 calories for a serving of fried macaroni and cheese and 500 calories for a slice of fried cheesecake.

Figure on 444 calories and 29 grams of fat for a deep-fried Snickers bar. As for the giant turkey leg, it clocks in at 1,136 calories and 54 grams of fat.

Why do we eat and even relish this stuff? Scientists who study these things point to a number of reasons. Fat, for instance, enhances flavor and texture and makes food taste better. Some of the appeal seems to be at least partly rooted in physiology; we often crave chocolate when we’re stressed or unhappy because it contains sugar, a source of energy.

On a deeper level, there’s evidence that consuming salt, fat and sugar actually alters the chemistry of the brain by stimulating a desire for more of the same. This then prompts us to seek out and eat the kinds of food that meet what in essence is a neurological need.

Finally, there’s the emotional component. Smell and memory have a strong, primal connection centered within the hippocampus of the brain. The whiff of hot fresh corn dogs and deep-fried cheese curds often triggers happy memories of being at state fairs past, making us want to relive those experiences all over again.

There are plenty of healthful alternatives at the state fair, of course. And although many nutritionists would probably advise against giving in to the deep-fried temptations, there seems to be a consensus that unless you’re on a restricted diet, a teensy bite or two of a deep-fried candy bar won’t be the end of the world.

I think I’ve just talked myself into trying the deep-fried cheese curds.

Photo courtesy of the Minnesota State Fair

The supermom blues

Would supermoms be happier if they let go of the expectation that they can do it all?

A new study suggests that working mothers who believe they can successfully juggle work and family are more likely to be depressed than moms who are willing to make tradeoffs. The study, “Even Supermoms Get the Blues: Employment, Gender Attitudes and Depression,” was presented this past weekend at the annual meeting of the American Sociological Association.

Researchers analyzed responses from 1,600 women who participated in the National Longitudinal Survey of Youth, administered by the U.S. Department of Labor. The women in this sample group were all 40 years old, married, and represented a mix of working and stay-at-home mothers.

As young adults, the women were asked to rank how much they agreed with statements such as “A woman who fulfills her family responsibilities doesn’t have time for a job outside the home,” “Working wives lead to more juvenile delinquency” and “A woman is happiest if she can stay at home with her children.”

The researchers then went back and measured levels of depression among these women when they turned 40. Across the board, working mothers were less likely to report symptoms of depression than non-working mothers, a finding that reinforces a number of earlier studies. But among the working mothers, those who had expressed a supermom attitude as young adults were at higher risk of depression by age 40 than women who were more realistic about the work-life balance.

The study didn’t examine all the contributing factors, but lead author Katrina Leupp, a graduate student in sociology at the University of Washington, said there are probably many issues that feed into supermom depression.

“Employed women who expected that work-life balance was going to be hard are probably more likely to accept that they can’t do it all,” Leupp said in an accompanying news release. For example, they might be more comfortable with tradeoffs such as leaving work early to pick up the kids or dividing household chores with their spouse, she said.

Women who believe they can successfully combine work and family with minimal tradeoffs are probably more likely to become frustrated or to feel they’re failing if they fall short of this ideal, Leupp said.

Is this solely a women’s issue? Men, too, are dealing with changing roles and increased expectations that they’ll share the responsibility for earning a paycheck, raising children and doing household chores. Although men weren’t addressed in this particular study, it’s probably fair to say they can become unhappy as well when there’s a perceived imbalance between work and the rest of their lives.

Perhaps realistic expectations are simply an overall indicator of people’s ability to cope with what life hands them – a trait that could apply whether they’re a working parent or not.

Leupp’s advice: “For better mental health, working moms should accept that they can’t do it all.”

Photo: Wikimedia Commons

Matters of experience

It was a bittersweet moment for Liz O’Brien. Dr. Wilson, her longtime dentist, was retiring and turning over his practice to his young associate.

Dr. Riley “looked younger than my own kids,” O’Brien blogged recently at MedPage Today. “I could feel my skepticism rising and my initial smile of welcome turning into a wooden grin. I hoped she didn’t notice. How much experience could this new dentist have?”

It’s an age-old question: Do patients benefit when their doctor is young, fresh out of training and up to date on the latest skills and knowledge, or are they better off with someone whose wisdom has been honed by years of experience?

As it turns out, the answer isn’t clear. Various studies have attempted to identify the effect of age on physician skills and produced conflicting results. Some found that as the physician’s experience increased, so did the quality of care on some outcomes – but not on others. A couple of studies found that physician performance peaked after a certain number of years in practice and then declined.

The study that has perhaps been cited most often appeared in 2005 in the Annals of Internal Medicine, where it generated a considerable stir. The authors reviewed 62 previously published studies and found that in more than half, the results suggested that the longer physicians were out of medical school, the more likely they were to provide lower-quality care.

For example, in 15 of the studies that were reviewed, physicians who were in practice longer were less likely to follow guidelines on the appropriate use of diagnostic and screening tests. In the largest of these studies, physicians who had graduated from medical school more than 20 years previously were “consistently less likely” to follow recommended cancer screening guidelines, the authors of the Annals study wrote.

Perhaps even more significantly, a handful of the studies reported longer hospital stays and worse outcomes for patients whose doctors were farther along in their careers.

Taken together, these findings fly in the face of the usual assumption that the longer physicians are in practice, the more skilled and astute they become.

The authors of the study suggested several possible explanations:

Perhaps most plausible is that physicians’ “toolkits” are created during training and may not be updated regularly. Older physicians seem less likely to adopt newly proven therapies and may be less receptive to new standards of care. In addition, practice innovations that involve theoretical shifts, such as the use of less aggressive surgical therapy for early-stage breast cancer or protocols for reducing length of stay, may be harder to incorporate into the practice of physicians who have trained long ago than innovations that add a procedure or technique consistent with a physician’s pre-existing knowledge.

Another factor might be the cultural shift that has been taking place in medicine, ushering in concepts such as evidence-based care and performance evaluation. What we might be seeing is the “cohort effect,” the study’s authors wrote. “That is, when the current generation of more recently trained physicians has been in practice for a longer time, there may be smaller differences between their practice and those of their younger colleagues than our data would suggest.”

And yet this doesn’t seem to be the whole story.

Many competencies can only be developed by years of experience, one physician wrote in response to the Annals study. “I certainly thought I knew more when I finished my residency than I think I know now. However, like the teenager who knows everything, I could not always decide when best to use the information. Use of that knowledge comes with experience.”

If older doctors can become set in their ways, younger and more inexperienced physicians might be overconfident in their skills and possibly more aggressive in their use of prescription drugs and other therapeutic interventions that don’t necessarily benefit the patient.

What does it all mean for patients trying to decide between an older physician vs. a younger one? Sometimes the considerations are practical. A relative of mine who was looking for a new physician decided to steer clear of older doctors because he wanted a longer-term partnership, not one that would likely end in a few years when the doctor retired. It can also be easier to get an appointment with younger doctors who are building a practice and will take new patients. Often, however, it comes down to personal preference and what patients are looking for in the relationship.

What seems to matter most – and what the Annals study didn’t really address – are the individual qualities that separate an OK physician from a great one: conscientiousness, a willingness to listen, openness to new ideas, intellectual curiosity, the ability to work well with a team, and the humility to recognize what they know and what they don’t know. This can describe older physicians just as easily as younger ones.

Methodological weaknesses aside, the real message of the Annals study may be the importance of lifelong learning in ensuring doctors stay on top of their game as they progress through their careers.

There also seems to be a lesson here about not judging a book by its cover. O’Brien reflects on her first meeting with her longtime dentist: “Dr. Wilson was a young guy, but he had a bunch of diplomas on his office wall. He had been referred by someone I knew, seemed capable, calm, and nice, and I had trusted him.”

He might have been inexperienced but he was determined to be a good dentist, she wrote. “In his case, things worked out fine – very fine indeed. I hope it will work out for Dr. Riley too. I intend to give her a chance.”

The well-balanced doggie bowl

First there was MyPlate, the federal government’s new symbol of a well-balanced dinner plate for the human species. Now our canine companions have their own version called – yep, you guessed it – MyBowl.

The interactive doggie nutrition tool was unveiled last week by its collaborators, petMD and Hill’s Science Diet.

With so many brands and varieties of dog food on the market, why would owners need help determining what’s best for their dog? Because it’s not that simple, say veterinary experts. “Canine nutrition is a complex topic, one that many owners, and frankly, many veterinarians don’t fully understand,” says Dr. Jennifer Coates, who writes the Nutrition Nuggets blog at the petMD online Dog Nutrition Center.

One longstanding myth is that dogs are carnivores and should only eat meat. Dogs actually are omnivores and need some carbohydrates for a well-balanced diet, along with protein, essential fats and oils, and plenty of fresh water, explained Dr. Coates.

When Hill’s and petMD conducted consumer surveys, one of the things they learned was that a majority of dog owners believe, incorrectly, that canine nutritional needs are the same as those of humans. They also learned that only about 1 in 10 owners actually knows the right proportion of nutrients their dog should receive, and that many dog owners would welcome more guidance on what to feed their dog.

So what should the well-balanced dog bowl contain? There should be two to three carbohydrate sources, such as whole-grain wheat, brown rice, whole corn and potatoes. Protein, one of the building blocks for growth and energy, should be another main ingredient; chicken, beef, lamb, fish and eggs are examples of quality sources of protein. At least one source of fat or oil is necessary to support heart health, brain function and a healthier coat and skin; quality sources include olive oil, soybean oil and pork fat. Finally, dogs should always have access to lots of fresh water to prevent dehydration.

One of the goals of MyBowl obviously is to sell Hill’s dog food. But you don’t have to buy the Hill’s brand to use the interactive tool and learn a little more about canine nutritional needs. For clarity, depth of information and visual appeal, I give MyBowl a rating of four paws up.

I have a question for the folks at Hill’s and petMD, though. When is there going to be a MyBowl for cats? Our household’s feline contingent wants to know.

Deconstructing ‘The Big C’

I’m not a fan of movies or TV shows about cancer. Real-life experience was quite enough for me, thankyouverymuch. In any case, the reality is often far removed from the entertainment world’s concept of what the cancer experience should look like.

So I’ve never watched “The Big C,” the Showtime comedy/drama that chronicles the life and times of Cathy Jamison, a fictional Minneapolis teacher who learns she has late-stage melanoma. (“The Big C,” which is in its second season, airs on Showtime at 9:30 p.m. Mondays.)

Darlene Hunt, the show’s creator and executive producer, has taken the creative path less traveled for this TV series.  For one thing, the main character has melanoma. It’s a departure from the standard television fare of breast-cancer stories and a long-overdue recognition of a cancer that tends to receive less attention. Its edgy approach also makes “The Big C” very unlike many of the treacle-laden dramas in which cancer has an unwelcome starring role.

If this piques your interest, there will be a chance today to hear from the executive producer herself about her inspiration for creating the show and its cast of characters. Hunt will be the guest on Frankly Speaking About Cancer, an Internet radio show of the Cancer Support Community. It airs at 3 p.m. CDT today; if you miss it, you can access it later via the archive.

Critics have given high marks to “The Big C.” But I’m far more interested in the reactions from people who’ve had firsthand experience with cancer. “Cheerleader-level annoying” is how Mary Elizabeth Williams, herself a melanoma survivor, recently described Cathy’s character.

The show seems to accurately capture many of the smaller moments, such as the pitying expressions that family, friends and acquaintances often don around someone with cancer, Williams wrote. “But convincingly, wittily depicting the terrible ordinariness of it? That’s so much harder to get right.”

Dr. Elaine Schattner, an oncologist and breast cancer survivor who blogs at Medical Lessons, has been following “The Big C” and blogging about many of the episodes. Are the details accurate from a medical standpoint? Well, for one thing, the show’s first episode wasn’t clear about the doctor’s role, Dr. Schattner writes. “His white coat is too short, in the style of a medical student’s. He uses few polysyllabic words. He looks both well-rested and neat. In one strange scene, the patient and doctor meet for lunch at a pleasant outdoor restaurant. That’s not how oncology’s practiced, at least how I know it.”

Dr. Schattner also points out that many of the medical and scientific details are either glossed over or left out.

On the WebMD forums, reactions to the show when it made its debut last year were mixed. A couple of people said it was one of their favorite shows but someone else called it “insultingly unrealistic.”

“People who are sick do not have the energy to drive around in fancy cars and have hot sex with people they barely know,” she wrote. “They need to stay home and be sick, which is why it’s so sad.”

In online discussions elsewhere, the show’s dark humor resonated with some commenters but was a turn-off to others.

There in fact doesn’t seem to be any particular script for how people handle cancer (or other diseases) in real life. Some are angry; some are resigned. Some cope with humor and laughter. Some withdraw; some go off the deep end; many are scared and confused. Some use their illness as an opportunity to set life’s reset button, others find peace in the path they’ve already chosen.

I’m not sure Cathy should be expected to be the poster girl for How To Do Cancer Right. Perhaps one of the benefits of shows like “The Big C” is that they can get people thinking and talking about what’s fiction, what’s real, what’s emotionally honest and what isn’t.

If you’ve ever watched “The Big C,” what did you think of the show? Did it change some of your views about what it’s like to have cancer?

Scarcity in the midst of plenty

Rural America supplies most of the nation’s food, but when it comes to eating fruits and vegetables, rural residents often seem to fall short.

These rather contradictory findings appear in a new study by the Essentia Institute of Rural Health in Duluth, Minn. Using data from the 2009 Behavioral Risk Factor Surveillance System, collected by the CDC from more than 400,000 U.S. households, the study’s authors found that in only 11 states did rural adults consume more fruits and vegetables than adults who lived in urban or suburban neighborhoods.

In some states the differences were quite striking. In Kentucky, for instance, about 17 percent of rural adults reported eating the recommended five servings of fruit and vegetables per day, compared to just under 22 percent of non-rural adults. In Minnesota, 19.5 percent of rural adults and 22.7 percent of non-rural adults reported meeting the daily fruit and vegetable requirement.

When the researchers further analyzed the data, they found that rural Americans who ate more fruits and vegetables had several other factors in common. These individuals were more likely to be women, more likely to be married or living with a partner, and more likely to be living in a household without children. They also had higher incomes, were more physically active, were less likely to be obese and generally reported their health as good or excellent.

Much of this is echoed in other studies that have found a consistent association among income, education, social stability and better health. Across the board, these attributes tend to be clustered in urban areas, while those who live in rural communities on average are less educated and earn less money.

Although Essentia’s study didn’t specifically address the cost or availability of fresh produce, the authors noted in an accompanying news release that affordability is a serious issue, particularly for lower-income households.

“You could be a rural person living next to a huge farm that produces fruits and vegetables and not have the means to buy them, so people in the city, who are farther removed from the source, tend to be the more likely consumers. That really brings up issues of access and cost,” said Nawal Lutfiyya, a senior research scientist and chronic disease epidemiologist for the Essentia Institute of Rural Health and lead author of the study.

Another finding emphasized in the study: Rural adults who had children living with them were more likely to skip the produce aisle than rural adults who did not have children living in the household.

This was concerning, Lutfiyya said. “Adequate fruit and vegetable consumption reduces the risk for a number of diseases and early death. Our hope is that identifying groups that are at risk can lead to better targeted public health interventions.”

Photo: Wikimedia Commons

Knock, knock: exam-room etiquette

You’re waiting in the exam room and finally hear a knock at the door, announcing that the doctor has arrived.

What happens next? A) The doctor flings open the door; B) The doctor waits for your response before opening the door.

This little nuance of the doctor-patient encounter was discussed at some length recently in one of the online patient forums. It started with a question from one participant: “Have you ever noticed – they knock on the door a couple of times and then barge right in without waiting for a response?”

Whether the doctor knocks and waits for permission to come in might not seem like a big deal. To a busy physician who’s racing to stay on schedule, it might even be a pointless, unnecessary delay.

Where the doctor-patient relationship is concerned, however, it’s often viewed, at the very least, as the polite thing to do. More than this, it can help set the tone for a positive encounter.

As participants in the online forum pointed out, there can be several reasons why patients might be uncomfortable when physicians fail to observe the door-knocking etiquette.

If the patient is slow at disrobing and the doctor is punctual or even ahead of schedule, it can be awkward when the doctor barges into the room without asking, one person noted. This same person also observed, “A couple of times I’ve been really, really anxious and/or not feeling well, and sort of wanted a few seconds to compose myself.”

Other people said they knit, read, or talk on their cell phone while they wait. These folks might appreciate a five-second warning allowing them to end the conversation and put the diversions away.

One individual in the forum confessed to snooping around and playing with the blood pressure cuff and otoscope while waiting for the doctor to arrive.

Within the medical profession itself, most of the experts seem to agree: The doctor should knock on the door and (preferably) wait for the patient’s permission before coming in. Indeed, one of the many skills on which future physicians are evaluated is their ability to communicate with the patient, starting with that all-important first impression.

“Always knock before entering the examining room,” advises an online guide on preparing for the U.S. Medical Licensing Exam, or USMLE. “Knocking on the door before entering is the first step in building trust and showing respect.”

In an essay from 2008 in the New England Journal of Medicine, Dr. Michael Kahn takes this one step further by proposing “etiquette-based medicine” as a guideline for physician behavior. Training programs might not be able to successfully instill empathy and compassion in every medical student, but students can certainly learn how to behave well, he wrote.

A checklist Dr. Kahn suggests for a first meeting with a hospitalized patient starts with: “1. Ask permission to enter the room; wait for an answer.”

I think I’d feel a little silly giving the doctor permission to come in. After all, technically speaking it’s not my hospital room or exam room; I just happen to be a temporary occupant. And does it really matter whether the doctor barges in vs. waiting a second or two for a response?

On the other hand, most patients notice and appreciate the courtesy, small though it is. And they definitely notice when the doctor skips the door-knocking routine altogether – and this goes for staff as well as the physician. One scathing online review raked an Illinois doctor over the coals for, among other things, allowing two staff members to walk into the exam room without knocking first.

“How do you say ‘I don’t give a damn’?” wonders Kristin Baird of the Baird Group, a consulting firm whose expertise is in the patient experience.

Baird recently accompanied her sister, who had been undergoing cancer treatment, to a doctor visit. She writes: “The physician opened the exam-room door without knocking, stood with one hand on the door handle and the other on his hip as he coldly announced, ‘There’s nothing I can do for you. You need blood tests drawn, so go to the hospital.'”

There’s something to be learned from every experience, Baird says. “What did I learn today? I learned that there are many ways to say, ‘I don’t give a damn'” – starting with not bothering to knock on the door.

Photo: Wikimedia Commons