Showing skin: When medical photos get provocative

Picture, if you will, a magazine cover photo of a (presumably) unconscious and mostly nude female patient, one leg raised and the other sprawled sideways, being positioned for surgery by three men in scrubs.

Outpatient Surgery magazine published a photo exactly like this on its cover last month to accompany a story about safe surgical positioning, igniting a barrage of criticism from readers who felt the image was degrading, distasteful, insulting and offensive. (Click here to judge for yourself.)

What were the editors thinking? Nothing, apparently, if we’re to believe editor-in-chief Dan O’Connor, who penned a mea culpa of sorts this month, claiming the blowback “caught us by surprise.”

His rationale: “Anyone who’s been around surgery knows that patients are very frequently left much more exposed than the patient on our cover was.”

Really? Really?

Illustrations for medical articles and public health messages tend to be a sensitive subject. Remember the firestorm over Georgia’s Strong4Life childhood obesity campaign? Or the FDA’s failed attempt to add graphic photos to cigarette packages? Or the Minnesota Department of Health’s colonoscopy campaign featuring huge billboards of a backside with pants at half-mast? What one person sees as blunt, refreshing honesty can look to someone else like crass overkill.

Titillation as a sales tactic is nothing new either. But it’s arguably a whole different ethical proposition when this strategy is used to depict the care of patients who are vulnerable, forced to compromise their dignity and control and, in the case of surgery, unconscious and unaware of what’s being done to them.

For the record, the magazine cover photo didn’t strike me as “pornographic,” as one critic charged. But I’d call it deliberately provocative, perhaps even exploitive.

Why the choice of a female patient? Did the three people in scrubs all have to be men? Why her entire anatomy below the waist and not, say, an arm or shoulder? Why an obviously young and slender woman instead of a middle-aged man? And what was up with the double entendre in the tagline’s promise of “maximum exposure” to patient safety tips?

The article itself tackled a serious and important subject. Patients who are sedated, especially for lengthy surgeries, can be at significant risk of developing pressure ulcers that are painful, debilitating and costly to treat. Safe, proper positioning during surgery is one of the keys to avoiding this complication.

But never mind the fact that the patient demographic at greatest risk of pressure ulcers is older adults. Sex sells, so we get a cover photo of a shapely upraised leg instead of one that’s bony and arthritic. In the same way, we get mammography images featuring one attractive, semi-nude young woman after another, even though breast cancer is most prevalent in middle-aged and older women. With all that skin on display, the underlying message can be cheapened and ultimately lost.

I suppose you could argue that even when medical imagery is tarted up with firm young flesh, it at least has the benefit of depicting something real. After all, not everyone was unhappy with the surgery magazine’s choice of cover photo; one (female) surgical nurse told the editor she thought the patient “was covered appropriately.”

It raises the question, however: appropriate according to whose definition – the patient’s or the OR staff’s?

Tellingly, no one spoke up to say the photo was inaccurate.

To give him his due, Mr. O’Connor recognizes that health care practices long regarded as standard may leave patients feeling exposed and compromised. He writes: “Well, here’s a thought. If we think patients would be uncomfortable with how we treat them when they are anesthetized, then is there something wrong with how we’re treating patients?”

Well, here’s a thought for you, Mr. O’Connor. Why don’t you ask patients themselves? I bet they’d have something to say.

But enough of my opinion. What do readers think?

Medicine’s privilege gap

Have doctors become increasingly removed from the everyday struggles of their patients, especially patients who occupy the lower rungs of the socioeconomic ladder?

A letter to the editor in the July issue of Academic Medicine raises some thought-provoking questions about a “privilege gap” that’s opening up in medicine.

It starts at the very beginning, with the selection process into medical school, writes Dr. Farzon Nahvi, an emergency medicine resident at New York University’s Bellevue Hospital:

Data from the Association of American Medical Colleges show that over 60% of medical students come from families in the top quintile of household income, with only 20% coming from families who earned in the bottom three quintiles. Similarly, the median family income of American medical students is over $100,000. In other words, the average medical student comes from the upper 15% of America.

This is anything but reflective of the patient population, Dr. Nahvi goes on to explain: “They are all of America: rich, poor and in between.”

And it has an impact, he maintains:

The unfortunate consequence of this is that patients sometimes struggle to be understood by well-meaning but, ultimately, privileged doctors who sometimes cannot relate to patients of other backgrounds.

Being privileged does not necessarily make a physician incapable of understanding the daily lives of his or her patients, of course. And many physicians resent (often rightfully so) the stereotypes that portray them as money-grubbing, golf-playing, Beamer-driving plutocrats who consider themselves above the masses.

Yet the statistics cited by Dr. Nahvi don’t lie. And they’re a problem for a society in which the health gap between the well off and the not so well off has been extensively documented. As Dr. Nahvi points out, how can doctors be aware of the issues their low-income patients face – unable to afford prescription drugs, for instance, or unable to take time off work to get to the pharmacy – when “it often doesn’t occur to the more privileged that such issues even exist”?

If medicine in the U.S. is becoming a bastion of privilege, it’s probably because it increasingly takes privilege to survive the rigors and costs of becoming a doctor.

The cost of a medical education is a significant burden for aspiring doctors; a report from the Association of American Medical Colleges puts the median amount of medical school debt at $170,000 for the Class of 2012 (and this doesn’t include any debt students may have accumulated from their preceding four years of college).

Then there’s the protracted training time to consider: four years of undergraduate education, four years of medical school and, at a minimum, three years of residency before doctors actually start earning real money. Once they’ve arrived, they can start acquiring the trappings of an upper-middle-class lifestyle – but this is small comfort to the bright young high-schooler from a low-income family who dreams of being a doctor but lacks the financial wherewithal to even get a foot in the door.

One could also argue that the medical school admission process itself tends to favor students with the “right” kind of background, i.e. those who already possess strong socioeconomic advantages.

So what’s the solution? Dr. Nahvi writes:

The stopgap fix is to better train all students to deal with all types of patients. A true long-term solution, however, is to steer more representative slices of America – individuals from all income levels – into medicine. There are many ideas for how to do this, from special recruitment strategies to arrangements for financial aid. Fundamentally though, for change to occur, admission committees need to recognize the importance of getting more middle- and low-income students into our medical education system.

Doing so won’t be easy, because it’s not just about money. Many other ingredients come into play: a solid grade school and high school education, parents and teachers who encourage careers in medicine and hold aspiring students to high expectations, and even local role models who can show young people that someone like them can successfully become a doctor.

There doesn’t seem to be much public discussion about how to narrow the privilege gap in medicine. Since part of the solution likely will lie at the community level, maybe this needs to change.

Every move you make, the patient is watching you

Patients have a way of hanging onto every nonverbal cue they notice, no matter how small.

Was the doctor frowning as he/she entered the exam room? Did the nurse at the hospital bedside seem harried while checking your vital signs? What was up with that brusque welcome from the receptionist? Was that an eye roll behind your back?

In any setting, body language is a big deal. The fact that people are watching, interpreting and judging, especially when it comes to leaders, is illustrated vividly by New York Times columnist Adam Bryant in his latest piece, “Are You Mad at Me?”

One day a colleague pulled him aside and asked to speak privately with him in the conference room. The question, wrote Bryant, “came out of the blue. ‘Are you mad at me?’”

He goes on to explain:

I was puzzled, but I realized later what was going on. As an editor, I faced a lot of tight deadlines, and I would often have just a short window to get a story into shape for the next day’s paper. I’m guessing I was thinking hard about some story as I walked through the newsroom one day – probably furrowing my brow, my mind a million miles away – when I briefly locked eyes with my colleague, who was startled enough by my body language to later pull me into a conference room to wonder if the air needed to be cleared between us.

Bryant writes, “I learned a memorable lesson that day about how people can read so much into subtle, and often unintended, cues.”

How much more so in health care, where patients often feel vulnerable, needy and at the mercy of a system that may or may not be responsive – and where misinterpreting the cues can have very real consequences.

The literature on the role of body language is varied and fascinating. One study carried out in the Netherlands found that people had a positive physiological response to pictures of happy facial expressions. But they showed signs of higher anxiety in response to facial expressions that were fearful or angry – and their reaction to angry signals was heightened when they were already anxious. Other studies have found people are quicker and more accurate at detecting angry faces than happy faces.

Less is known about the impact of nonverbal cues in the health care setting. For just one example of how body language can influence the doctor-patient interaction, however, consider a study, published last year, that examined the differences in nonverbal communication between white and African-American doctors when talking to older patients.

The study found that white doctors tended to treat all their older patients the same way. But black doctors often gave contradictory nonverbal cues to their white patients – smiling, for example, while crossing their arms or legs.

Other studies have found a similar pattern among female physicians interacting with male patients.

Most of these studies are fairly small and don’t really explore the impact of the doctor’s body language on quality of care or outcomes. Disparities in health care are well known to exist, however, and unconscious social bias can be very difficult to root out and change.

Meanwhile, patients notice these subtle cues and draw their own conclusions, accurately or not.

“How do you say, ‘I don’t give a damn’?” asks Kristin Baird, owner of the Baird Group and a consultant on improving the patient experience.

She relates the experience of accompanying her sick sister to see the doctor. The doctor barged into the exam room, stood with one hand on the door handle and informed her sister there was nothing he could do and that she should go to the hospital.

“Everything about his demeanor, both verbal and nonverbal, screamed: ‘I don’t give a damn about you. You’re not worthy of my time, so don’t bother me,’” Baird writes. “This was hard enough for me to witness, and I wasn’t the one in need of his care.”

Maybe it wasn’t personal. Maybe the doctor had just had a terrible encounter with the previous patient, or was distracted by a family crisis or suffering from burnout or depression. It leaves a mark, however, and it can become one more piece of baggage for patients to tote around and unpack in their next visit to the health care system.

There seem to be a lot of angry patients these days, and at least some of it stems from the nonverbal cues that shape people’s perceptions of how they’re treated: “The person at the front desk barely made eye contact with me.” “The doctor just sat there and looked at the computer.” “The nurse/physical therapist/radiologic technologist seemed in such a hurry.”

One consequence might be heightened anxiety that keeps the patient from communicating honestly and prevents the doctor from reaching the right diagnosis. If patients perceive that their participation in their care is met with annoyance or secret eye rolls, they may become frustrated or outright hostile. Or they may take the opposite route and decide it’s better to be passive and uninvolved. None of these responses is beneficial to their health.

Like hawks, patients are watching every move their clinician makes and rendering judgment, whether it’s fair and accurate or not.

5 health messages from Angelina Jolie

It’s been more than two weeks since actress Angelina Jolie revealed the bombshell story of her preventive double mastectomy, and I’m still trying to wrap my head around the implications.

This was a news item that was hard to miss, given the reaction and commentary it ignited. For those who’ve been out of the loop, here’s the story in a nutshell: Jolie went public with a New York Times essay on May 14, telling her story of recently undergoing a double mastectomy to lower her risk of developing breast cancer. Her mother died at age 56 of ovarian cancer, and Jolie herself carries a mutation in the BRCA1 gene, heightening her chances of someday being diagnosed with breast and/or ovarian cancer.

“Once I knew that this was my reality, I decided to be proactive and to minimize the risk as much as I could,” Jolie wrote.

She explained that although the decision wasn’t easy, “I feel empowered that I made a strong choice that in no way diminishes my femininity.”

It’s a compelling personal story but the ensuing reactions made it clear there was much more to it than one woman’s choice. After a lot of reading, I saw these main issues that kept rising to the surface:

1. Genetic testing – helpful or not? Jolie urged women to get tested for BRCA mutations, especially if there’s a family history of breast and/or ovarian cancer. What’s left unsaid is that only about 1 percent of women in the U.S. carry a harmful BRCA1 or BRCA2 mutation. To be sure, genetic testing could help these women weigh their options, but it may not be useful to the population as a whole. For women who test negative for a mutation, it might even create a false sense of security, since the vast majority of breast and ovarian cancers are not tied to any obvious risk factors.

Nor is it enough to simply have access to genetic information; people also need guidance to help them make sense of the information and make decisions based on their own values and tolerance of risk.

2. Risk isn’t always perceived accurately. By undergoing a preventive mastectomy, Jolie was able to lower her estimated risk of breast cancer from 87 percent to under 5 percent. But these figures are population-level estimates – the average lifetime risk among women with a similar mutation and family history – not a prediction of what will happen to any one individual. Moreover, even a drastic measure such as a preventive bilateral mastectomy does not lower the risk to zero, nor does it lower the risk of developing other types of cancer.

3. Be careful of the anecdote. Personal stories resonate with people. Jolie put a human face on the ordeal of learning you have the BRCA gene and pre-emptively having both breasts removed. But this is one person’s story; the experience may be quite different for someone else.

Jolie writes that her surgery and breast reconstruction were complication-free. “The results can be beautiful,” she says. No doubt this is the case for some people but it glosses over the possibility of scarring, infection, repeat surgeries and all the other things that can make these procedures anything but beautiful.

4. Surgery to remove healthy body parts, even when heightened risk is present, is a drastic measure. Maybe this speaks to how Americans have been conditioned to fear breast cancer. Some of this fear may be justified. In spite of massive investment in research, treatment options remain limited for metastatic, or widespread, breast cancer. The fact remains, however, that it’s a very aggressive way to try to prevent disease.

Interestingly, studies going back at least a decade indicate that most women who undergo a preventive mastectomy are happy with their decision and feel less anxious about their risk for cancer. Unfortunately some of the public discussion about Jolie’s story has become muddled over the distinction between preventive mastectomy and mastectomy once cancer is diagnosed. These are two different things that cannot accurately be weighed on the same scale.

5. Individual medical decisions are exactly that – individual choices. I wouldn’t judge Angelina Jolie for her choice. Only she can determine what level of risk she’s willing to live with and what she’s willing to do to reduce that risk. For someone else, the decision might be entirely different.

The crux seems to be whether patients have accurate, realistic information and a good understanding of their personal values and preferences – a principle that applies to countless other health decisions, from whether to take a prescription drug to making end-of-life choices. Maybe by sharing her story, Jolie has helped move along a complicated conversation that needs to happen.

A health scare, but not enough to change their ways

A heart attack or a cancer diagnosis is usually life-changing, yet many people do little afterwards to alter their lifestyle or behavior in ways that might reduce their future risk.

Various studies have been cropping up lately, all with the same conclusion. One can’t help connecting the dots and wondering what it bodes for the long-term health picture.

The bigger question here, though, isn’t “what.” It’s “why.”

The latest study comes from Canada, where researchers found that even when people had a history of coronary heart disease or stroke that put them in a higher risk group, they weren’t much more likely than the general population to adopt three key changes associated with reducing their risk of a second heart attack or stroke: smoking cessation, regular physical activity and a healthy diet.

The study used epidemiological data on more than 154,000 individuals from 17 countries. Of the 7,500 participants who reported a previous history of heart attack or stroke, about 18 percent continued to smoke and 60 percent didn’t follow the recommendations for a healthier diet.

Not surprisingly, those who lived in higher-income countries fared better on all three measures.

Here’s another study, this time from the cancer front: Researchers who looked at survivors of melanoma, the most serious form of skin cancer, found that about one in four skipped the use of sunscreen and 2 percent continued to visit tanning salons.

The study results “blew my mind,” Dr. Anees Chagpar, the study’s author, told CBS News.

Other studies have found that cancer survivors are just as likely as everyone else to be overweight and inactive, even though these two factors are tied to a higher risk of recurrence for certain forms of cancer.

Is this a huge collective failure of patients to heed the so-called teachable moment in health care? Or does it signal something deeper?

I suspect it’s the latter. As anyone who has attempted to adopt a healthier lifestyle can attest, changing your ways is often very difficult. It takes a high degree of motivation and support to persevere, and the stress of a serious health event can add complicating factors that might not be addressed or even recognized.

Depression, for example, is common among heart attack survivors, yet the possibility of post-heart attack depression is rarely discussed with these patients. Multiple studies have found that among those who develop depression after their heart attack, the majority are undiagnosed and untreated. That they may struggle and fail to adopt healthier lifestyle habits should not be surprising.

One survivor, responding to a frank entry on the Heart Sisters blog about depression and heart attack survival, put it this way: “Physically I am not the same person and don’t think I ever will be. Everyday life details are not important to me anymore. I see myself stepping further and further behind and no one understands.”

Ditto for the physical and mental toll of cancer treatment, which can leave survivors with long-lasting fatigue, cognitive impairment, nerve damage and more. Although efforts are underway to improve survivorship care in the U.S., progress is slow and uneven, leaving many survivors – perhaps the majority – still under the radar.

The health care system itself hasn’t completely figured out who should handle the “teachable moment.” Should it be the cardiologist? The oncologist? The primary care doctor? A rehab nurse? In the meantime, opportunities to talk to patients about making behavior changes are being missed.

Then there’s the question of who pays to help people change their habits after a major health event – and I’m assuming here that many will need some support, even if it’s only minimal.

It takes staff and resources to provide the education that may be necessary, and reimbursement is often low. Although many health insurance plans include coverage for smoking cessation, there’s considerable variation in what they offer, and some states don’t cover tobacco cessation at all for their Medicaid enrollees.

We could ask people to pay out of pocket for their patient education, nutritional counseling, depression screening and tobacco quit services, but this doesn’t mean they can afford it or that they would make it a financial priority – or, indeed, that they would recognize they might need all of these.

Maybe the whole notion that a health scare should be enough to make people change their ways is flawed. It might be motivation only for some. For others, the motivating factor may be something very different. If we hold off on the tsk-tsk’ing long enough, we might start to figure out what really lies behind the seeming lack of lifestyle change and what can be done to have a more positive impact.

Health care’s new paternalism?

Is the patient’s blood pressure at 120 over 80 or below and controlled with one or more medications if necessary? Check.

Normal body mass index? Check.

Recommended screenings carried out according to the recommended schedule? Check, check, check.

But here’s the real question: Are all of these goals important to the patient, or does he/she see them as frustrating, burdensome and perhaps impossible to achieve?

The push toward better health care has organizations and clinicians focusing a tremendous amount of energy these days on patient outcomes. Few policymakers seem to have asked, however, where patients fit into this and how they feel about having their health goals – appropriate weight, appropriate blood pressure, glucose and cholesterol levels, appropriate prescription medications and so on – essentially dictated to them.

Is it going too far to call it a new form of health care paternalism?

If you listen closely, you can hear the beginnings of some pushback from people like Dr. Victor Montori of the Mayo Clinic, who last week talked to the Star Tribune of Minneapolis about hitting the pause button on all the checklists and having a heart-to-heart conversation with patients about what they really want.

An excerpt from the newspaper’s interview with Dr. Montori:

He argues that doctors must take into account the patients’ “values and preferences.” If one drug can bring their blood sugar down a notch, but doesn’t make them feel better, is it worth taking? “It’s making sure we don’t make any decisions about them without them,” he said. It’s a strategy that stops demanding perfection from patients and focuses on the treatments that are most important to them. “So they only get what they need and what they want.”

This is squarely at odds with the current approach of setting goals and measures and expecting all the players to reach these targets in order to achieve quality care.

There’s much at stake. By fiscal year 2015, fully 30 percent of Medicare payments to hospitals will be based on outcomes. Medical practices are dealing with similar pressure to reach specific goals in patient care.

This isn’t to say health care shouldn’t be held accountable for results. Outcomes do matter. But when an organization’s fiscal health and people’s livelihoods are on the line, it’s not hard to see why there would be a rising tide of all-around frustration when patients can’t – or won’t – meet the prescribed goals.

Meetings of the Rice Memorial Hospital board here in Willmar are normally rather subdued, but one of the most animated discussions I’ve seen in months erupted this week over the issue of patient adherence.

The doctors in the room spoke of the challenge of persuading their patients to adhere to the standard, and the frustration of being penalized when they don’t.

The hospital leaders in the room spoke of the unenviable task of being asked to meet goals that may hinge on patient decisions and behaviors beyond the hospital’s control.

The implications go deeper than this, though. What about hospitals that care for high numbers of elderly, frail and medically complex patients whose outcomes may not be as favorable as those of a younger, healthier population? What about the 90-year-old who has been living with congestive heart failure for a decade and has decided it’s time to stop aggressive management of the condition?

Should hospitals and medical practices become more selective about the patients they’re willing to take and start turning away those deemed to be too sick or too complicated or less likely to be compliant? Most Americans would agree it’s unethical (or at least unfair) to cherry-pick the “best” patients, but there’s no denying this looms as one of the unintended consequences of outcome-based payment. Left unanswered in all of this is who, exactly, will care for the sickest and most vulnerable when the reimbursement model is rigged against them.

Finally, there’s the issue of patient autonomy. The patient’s right to make his or her own medical decisions is one of the core tenets in American health care. This basic value seems to mesh uneasily, however, with performance-based payment. What happens to patients who don’t want to take a particular medication because the regimen is too burdensome or the side effects are intolerable? What about the patient who simply wants to feel better and function better rather than meeting specific target numbers?

To make things even more complicated, all of this is taking place alongside a growing emphasis on patient-centered health care and shared decision-making.

It’s far from clear how all of this is supposed to fit together, or the extent to which the average consumer is aware of the push-pull between giving patients more say in their care and, at the same time, deciding on their behalf what the measure of their health should be.

Don’t look for this to be resolved anytime soon. No one ever said health care was simple.

Change fatigue in health care

The patient was at the hospital for a straightforward same-day surgery but the admission process fell far short of efficient. She was directed hither and yon – first to the registration desk, then to the lab, then to another location for a pre-surgery assessment, then a fourth stop to collect a medical history.

When an accompanying friend asked one of the nurses why all these steps couldn’t be consolidated at the bedside, the response, delivered with a shrug, was “That’s the process – we’ve tried a lot of other things but they never work for long.”

What’s being described here is more than an obvious failure to be patient-centered. Tammy Merisotis, of GE Healthcare, sees it as a prime example of the apathy brought on by change fatigue.

“Change fatigue occurs when staff are expected to make multiple or continued changes in workflow process and patient protocols, without seeing the benefits of those changes in their everyday work,” she writes. “As they are bombarded with constant change, it is easy for people to become disengaged and resistant to change.”

Change fatigue is nothing new. But there seems to be much more of it these days, especially in health care, which is under intense pressure to change, change and change some more.

Although accurate estimates are hard to come by, it’s thought that as many as half of all organizations are attempting to implement three or more major changes at the same time.

In and of itself, change isn’t necessarily bad. A good chunk of the transformation taking place in health care is arguably for the better – more emphasis on evidence-based practices, more emphasis on safety, greater attention to how patients experience the health care system, and more awareness of the role of organizational culture in fostering and sustaining high-quality care, to name a few.

But the dark underbelly of all this transformation is that the people carrying out the work can grow tired of the continual demand to adapt and change, particularly when they may not see any immediate benefit. Burnout has always been an occupational hazard in the high-stress environment of health care; change fatigue is upping the ante even more.

Consider what nurses in an online forum had to say about the growing practice of “bedside report,” or including the patient and family in the exchange of information between nurses as they go on and off shift.

“I only hope it will be a short lived fad, but we were told by our new manager this will not be an option,” grumbled one nurse who was struggling with the logistics of how the bedside report process is supposed to happen.

Elsewhere, nurses shared their experiences with working in a unit that closed because of hospital restructuring.

“We were given practically no notice and they did not help us with a transfer, it was up to us to find a new opportunity,” one person wrote. “I’ve been in a new unit for about 4 months and it’s a constant struggle being the new one on the unit, learning a whole new way of doing things and just learning a new specialty.”

The sense of continually coping with change seems to extend throughout the health care world. Although the patient-centered medical home is viewed by many as the direction in which primary care needs to go, early adopters are learning that it requires deep, structural and sustained change in order to be successful.

Take, for example, an assessment of the first national demonstration project involving medical practices that were transitioning to the patient-centered medical home model. Even in physician groups that were highly motivated, change fatigue was a serious concern, wrote the authors of the analysis, which appeared in the Annals of Family Medicine:

The work is daunting and exhausting and occurring in practices that already felt as if they were running as fast as they could. This type of transformative change, if done too fast, can damage practices and often result in staff burnout, turnover and financial distress.

Surprisingly, there’s been little substantive research on how organizations cope with major changes. But in a study published a few years ago in the Personnel Psychology journal, researchers who tracked employees at a large government office going through major organizational restructuring found that the changes had a huge impact, not only on people’s emotional well-being but on their job performance as well.

Workers who perceived the changes as negative were at higher risk of calling in sick more frequently and performing more poorly on the job, the researchers found. These folks also were more likely to quit.

Another risk is that organizations may simply run out of steam, with new initiatives that go nowhere or fall short of what they intended to accomplish.

One lesson the researchers learned from their study: Although change is usually inevitable, change fatigue isn’t. Organizations that managed change most successfully were those that kept workers in the communication loop and emphasized what was positive about the changes, the researchers said.

“Communicate, communicate, communicate,” is the advice of Angelo Kinicki, one of the authors of the Personnel Psychology study. “In life, stuff happens. What matters is not so much what that stuff is, objectively speaking, but what matters is how we interpret it.”

Moving in with the adult children

Are middle-aged adults ready to have their aging parents move in with them?

Sort of, but if the results of a recent survey are any indication, many adult children anticipate there will be some difficult moments – if, indeed, they’ve thought about it at all.

The survey was released this week, just in time for Mother’s Day. It was commissioned by Visiting Angels, an in-home senior care company with some 450 private-duty agencies across the United States, and provides a revealing snapshot of how adult children view their responsibilities toward aging parents.

The good news: Most of the survey participants (there were 1,118 respondents age 40 and over, three-fourths of whom were women) said they would do whatever it takes to look out for their parents. Nor were they motivated by hopes of an inheritance; three out of four said they would pay for a parent’s care out of their own pocket if necessary.

The not-so-good news: Most adult children haven’t spent much time thinking about any of this. In fact, almost three-fourths of the survey respondents said they didn’t have a plan for taking care of an aging parent. And about half had never even talked to their parents about the kind of care they want as they age.

This is no small issue. Middle-aged adults increasingly are caught between managing their own lives and caring for an older parent (and often looking out for their own children as well).

Some interesting statistics from a Pew Research Center report released in January:

- Nearly half of American adults in their 40s and 50s have a parent who is 65 or older, and about one in five provided financial support to an aging parent.

- The majority believed adult children have a responsibility to take care of their parents.

- Adult children are a leading source of emotional support as well as financial and practical help for older parents. Also, their responsibilities usually increase as their parents get older.

But as the Visiting Angels survey revealed, decisions about a parent’s living arrangements add a whole new layer of issues. The survey found, for instance, that lack of room and lack of privacy were the two biggest concerns among adult children contemplating having a parent move in with them. Only 31 percent wanted their parents to move in with them, and four out of 10 said they preferred to have their parents remain in their own home with a caregiver.

The survey uncovered potential for conflict among siblings as well. When asked which of the adult children should have the most responsibility for Mom and Dad, here was the breakdown: Adult children who live closest to their parents topped the list, followed by the child perceived to have fewer other responsibilities, i.e. no spouse and/or children, and lastly by the child who was the most secure financially. (The survey didn’t ask whether adult children felt this was fair.)

In a final and telling statistic, close to half of the survey respondents anticipated some kind of conflict with a parent, a sibling or a spouse or significant other over decisions on how to care for an aging parent.

When AgingCare.com recently posed the question, “Do you regret the decision to have an elderly parent move in with you?”, there were more than 200 responses ranging from positive to frustrated to outright at-the-end-of-my-rope.

“Yes, I regret agreeing to move cross country, leaving my good paying job and becoming primary caretaker for my father-in-law. God help me, I am so looking forward to this being over,” one person wrote.

Someone else lamented how her parents took over the family’s life after moving into a specially built addition to her home. “What I forgot when we decided to let them live with us is that they have never had a respect for our privacy and my father and I have never gotten along,” she confessed.

Yet another commenter was enduring sleepless nights, wondering whether to have her mother move in with her. “If she lives here our lives will completely revolve around her,” she wrote.

But then there were responses like this one: “I love having my Dad live with me… When I was a kid, he worked all the time, so I missed out on a lot of time with him. Now I REALLY know my Dad, and he is a wonderful man.”

Many adult children seem realistic about the issues they’re likely to face as their parents grow older, even if they haven’t thought too deeply about it yet.

But if the survey results, and the voices of those who’ve already been there, are any indication, adult children ought to start thinking well ahead of time about the what-ifs. Too often, decisions aren’t made until there’s a crisis and everyone’s emotions are running high, says Larry Meigs, CEO of Visiting Angels.

His advice: “You need to meet now with your parents and siblings to decide on a solution that appeals to everyone involved.”

A clinician who looks like me

The face of America, and the people who seek health care every day at clinics, hospitals, urgent care centers and emergency rooms, is becoming ever more diverse. But you’d never know it by looking at the average U.S. medical school, where the faculty remains resolutely white and male.

The Association of American Medical Colleges recently took up this imbalance in an article examining the situation and what medical schools are doing to cultivate a more diverse leadership.

It matters because when medical school leaders and faculty come from varied backgrounds, they bring a more inclusive approach to how medical students – the physician workforce of the future – are trained, Dr. Hannah Valantine, senior associate dean of faculty affairs at the Stanford University Medical School Office of Diversity and Leadership, told the AAMC Reporter recently.

“We are facing complex problems that will require diverse perspectives to solve,” she said. “The extent to which we can retain diverse faculty will drive our excellence in education, research and patient care.”

Disparities in health and in health care unfortunately are pervasive. They’re manifested in many ways: how people live, whether their environment is safe, whether they have access to health insurance and affordable care. At least some of the disparities, however, seem to be rooted in a health care system that doesn’t always recognize or appreciate the differences, both clinical and cultural, that make human beings so diverse.

Take, for example, a cardiovascular health initiative that was one of the topics of discussion last month at the 10th annual national summit on health disparities. The initiative, now in its second phase, is aimed at giving doctors more knowledge about treating minority patients and improving their cardiovascular outcomes.

Speakers at the event said physicians often are unaware of the nuances in treating patients of diverse ethnic and racial backgrounds, hence may fail to recognize heightened risk or important early signs of chronic disease. One of them is the so-called triglyceride paradox, or the fact that blacks can have high levels of high-density lipoproteins (“good” cholesterol) and low triglycerides yet still be at high risk of cardiovascular disease.

To what extent does the failure to see these nuances reflect assumptions that “every patient is like me,” i.e. white? Some studies have noted that black patients are referred for cardiac catheterization or bypass surgery less often than white patients, even when their symptoms are the same, which suggests at least some level of inequality.

“Clearly there is some subconscious bias that is going on,” Dr. Conrad Smith of the University of Pittsburgh Medical Center told MedPage Today last week.

Physicians need to be aware of the differences in how they approach patients of varying backgrounds and the impact this has on outcomes, Dr. Smith said. “The education of physicians is going to be paramount if we want to close that gap.”

This isn’t to say prejudice and discrimination are rampant among health care professionals. The vast majority are skilled and well-intentioned. Yet their training and background may not necessarily have equipped them to recognize their own assumptions.

Consider, for example, the implications this may have for conducting end-of-life discussions. Americans strongly favor telling patients the truth when further medical care is futile; some other cultures view such disclosure as harmful and believe the patient should not be told.

U.S. clinicians might become deeply frustrated, perhaps even angry, with immigrant or refugee patients who refuse testing and treatment for tuberculosis. What they may not understand is the stigma associated with TB in these cultures and differing practices in when and how medication is prescribed.

It’s not hard to see how misunderstandings can arise. Sometimes these spill over into the clinician-patient relationship, not only in how people communicate with each other (Do they feel their perspective is heard? Do they feel their values are respected?) but in the quality of care the patient receives.

Some studies have found that doctors relate better to patients when they share common ground, such as socioeconomic background – in other words, patients who aren’t perceived as “other.” These same studies also have found that when doctor and patient come from similar backgrounds, the doctor is more likely to take the patient’s symptoms seriously and more likely to trust that the patient will follow medical advice.

All of this suggests that unconscious bias in favor of one’s own tribe is very real. It also suggests that the doctor-patient relationship is stronger, and the comfort level greater, when an increasingly diverse patient population can receive care from someone with whom they identify.

That the health professions are having a robust discussion about instilling diversity and cultural competency within their ranks is an indication of how much progress has been made in the past couple of decades. In fact, there have been calls to broaden the definition of what constitutes diversity to include religion, gender identity and sexual orientation. In a newly published study, students in the medicine, physician assistant and physical therapy programs at the University of Colorado supported the value of an inclusive, respectful campus environment – but they also reported that disparaging remarks and offensive behavior toward minorities of all kinds persist.

“According to these students, the institution must embrace a broader definition of diversity, such that all minority groups are valued, including individuals with conservative viewpoints or strong religious beliefs, the poor and uninsured, GLBT individuals, women and non-English speakers,” the researchers concluded.

Few people are without bias in some form and this goes for every walk of life, not just health care. The challenge lies in recognizing it and overcoming it so all patients get the care they need, even if the clinician doesn’t look like them.

There’s a health app for that… but will it help?

If you’re someone like Jared Sieling, you’ll track everything you eat for weeks and then pore over the data to uncover your nutritional shortcomings. Other self-trackers monitor their heart rate during workouts, log how many steps they take each day or chart how far and how fast they perform on their daily run.

Are these “elite livers,” as some of them call themselves, a standout example of highly engaged health consumers, or are they just, well, obsessed? And where does the average person fit into this?

The rising popularity of self-tracking was explored recently by the Star Tribune of Minneapolis, where a Twin Cities Quantified Self meetup took place this month. Sieling told the newspaper that self-tracking has helped him zero in on where his health habits could improve – more protein in his diet, for instance.

Most Americans track their personal health in some way or another. We weigh ourselves on the bathroom scale, check our blood pressure with a home monitor or wear a pedometer to count how many steps we take each day. As athlete Erin Klegstad told the Star Tribune, “I just like to see the numbers and [it's] fun to watch it all. It’s taught me what my body can do and makes me push harder.”

But with the growth of hundreds of new apps, monitors and gadgets, self-tracking has reached a whole new level of data collection and analysis. Many people are starting to wonder: Are health apps really as beneficial as they’re cracked up to be?

Heart attack survivor and Heart Sisters blogger Carolyn Thomas attended a conference last fall in Silicon Valley, where she encountered a whole passel of mobile health app designers. All of them were convinced that technology is the answer to healthier living, but Thomas couldn’t help wondering if any of them knew what it’s like to be a real patient.

She writes:

It struck me that the imaginary patient using this technology wasn’t anything like me, or my readers, or most of the heart patients I meet or hear from or talk to on any given day. Instead, the patient that the hypemeisters talked about seemed to be some kind of fairy tale fantasy patient: tech-savvy, highly motivated, compliant, eager to track every possible health indicator 24/7, and most of all – oh, did I mention? – NOT SICK.

In other words, an apparent case of “healthy privilege,” or the tendency of the healthy to assume everyone else is like them, Thomas wrote.

Jessie Gruman, president and founder of the Center for Advancing Health in Washington, D.C., is recovering from stomach cancer and needs to eat something every hour, a task that takes time, planning and mental energy every single day. But when she downloaded a popular app to help keep a food diary, she found it was more time-consuming to use the app than to actually eat the food.

“I used the app for three days before I resorted to pen and paper,” she wrote a few months ago in an “Open Letter to Mobile Health App Developers and Their Funders.”

Gruman offers this real-life advice to the app people:

1. We will not use mobile apps that add to the time we spend caring for ourselves.

2. We do not respond well to nagging.

3. We favor apps that are linked to (or associated with) our clinicians.

There’s a big difference between those who use technology to track every aspect of their daily health and those who are living with chronic conditions and may not have the energy or commitment to use the technology in a meaningful way, Thomas writes.

“I do consider myself an engaged patient, but I do not own a smartphone, a 7-inch tablet, or an iPad – nor, as a person with a chronic illness living on a modest disability pension, could I afford them,” she writes.

Even the medical community is struggling to figure out the role that self-tracking can, or should, play in personal health.

Dr. Michael Joyner, a physiologist and professor of anesthesiology at the Mayo Clinic, told the Star Tribune of Minneapolis that the proliferation of health apps offers some hope for improving the management of chronic diseases. But at the same time, it’s possible to go overboard with detailed self-tracking, he acknowledges. “Are they living life or are they tracking life?”

Dr. Leslie Kernisan, who provides care for geriatric patients, recently pondered whether she should be prescribing apps – and if so, which ones?

Apps can be useful if they help patients meet their medical goals or provide information that’s clinically useful in some way, she wrote. “But I worry that we’ll end up making the same mistakes with apps as we’ve often made with the prescription of medications: recommendations based on marketing rather than thoughtful assessment of expected value, and prescription of apps for every little medical condition rather than choosing a few high-yield apps based on a whole-person approach to managing healthcare.”

Here’s some final food for thought: A 2011 consumer survey of smartphone users found that one in four health-related apps were used only once, and nearly three out of four were abandoned by the 10th use.

Are health apps mostly a niche technology for the worried well and the data junkies? Or do they hold some promise for real, live, actual patients in need of tools to help them manage their health in ways that are constructive rather than burdensome? At the moment, the answer isn’t at all clear.