Learning to give bad news

Bad news awaited the patient as she recovered from surgery… but the first person to enter her hospital room while she anxiously waited for the pathology report wasn’t the doctor; it was a medical student.

Could this situation have been handled better? Kyle Amber, a fourth-year student at the University of Miami Miller School of Medicine in Miami, Fla., writes about the experience in a recent issue of Academic Medicine, the journal of the Association of American Medical Colleges:

“Is it cancer?” she asked, as if completely unfazed by the three hours she waited to be seen at this busy clinic that treats patients with poor access to care. All I could reply was, “The resident will come into the room in a few minutes to discuss the findings with you.” Let down, she nodded and told me she understood. Even as I attempted to comfort my patient, I knew it would take quite some time before a resident would be able to see her in this busy teaching clinic. An hour after I had left the room, the resident walked into the room, introduced himself to the patient, and told her that the pathology demonstrated a widely disseminated Stage IV cancer.

If the patient is about to receive bad news, the first person in the room should not be a medical student, he concludes. “This is no way to deliver bad news.”

Breaking bad news to patients and families is one of the most difficult tasks for a doctor to undertake. It’s stressful, and few physicians ever look forward to it or truly become comfortable with it. Yet at some point during their training, they have to confront it.

How do you teach this skill to medical students? There’s a fair amount of debate in medical academia over whether empathy can be taught. However, students can learn and practice communication skills that increase both their comfort level and their competence. The evidence also suggests that training can instill awareness and behaviors that make it less likely for bad news to be delivered in ways that are insensitive or devoid of compassion.

At one time, doctors received little if any formal training in how to communicate bad news. They mostly learned by watching more seasoned mentors carry out this task. Hopefully they saw good examples, but maybe they didn’t. These days, medical schools are recognizing that it’s an important skill for students to learn, and most training programs have incorporated it into their curricula.

But there seems to be an unwritten rule that medical students should never, ever be the bearer of bad news in a real-life clinical setting. That responsibility lies with more senior colleagues who are directly involved in caring for the patient.

It’s tough, then, for a medical student to be placed in the position of being with a patient who has been waiting for hours to hear the outcome of her surgery, yet be unable to do anything about that patient’s most immediate need – the need for information. Sometimes a student’s most memorable lesson is about how to handle things differently next time.

Medicine’s privilege gap

Have doctors become increasingly removed from the everyday struggles of their patients, especially patients who occupy the lower rungs of the socioeconomic ladder?

A letter to the editor in the July issue of Academic Medicine raises some thought-provoking questions about a “privilege gap” that’s opening up in medicine.

It starts at the very beginning, with the selection process into medical school, writes Dr. Farzon Nahvi, an emergency medicine resident at New York University’s Bellevue Hospital:

Data from the Association of American Medical Colleges show that over 60% of medical students come from families in the top quintile of household income, with only 20% coming from families who earned in the bottom three quintiles. Similarly, the median family income of American medical students is over $100,000. In other words, the average medical student comes from the upper 15% of America.

This is anything but reflective of the patient population, Dr. Nahvi goes on to explain: “They are all of America: rich, poor and in between.”

And it has an impact, he maintains:

The unfortunate consequence of this is that patients sometimes struggle to be understood by well-meaning but, ultimately, privileged doctors who sometimes cannot relate to patients of other backgrounds.

Being privileged does not necessarily make a physician incapable of understanding the daily lives of his or her patients, of course. And many physicians resent (often rightfully so) the stereotypes that portray them as money-grubbing, golf-playing, Beamer-driving plutocrats who consider themselves above the masses.

Yet the statistics cited by Dr. Nahvi don’t lie. And they’re a problem for a society in which the health gap between the well off and the not so well off has been extensively documented. As Dr. Nahvi points out, how can doctors be aware of the issues their low-income patients face – unable to afford prescription drugs, for instance, or unable to take time off work to get to the pharmacy – when “it often doesn’t occur to the more privileged that such issues even exist”?

If medicine in the U.S. is becoming a bastion of privilege, it’s probably because it increasingly takes privilege to survive the rigors and costs of becoming a doctor.

The cost of a medical education is a significant burden for aspiring doctors; a report from the Association of American Medical Colleges puts the median amount of medical school debt at $170,000 for the Class of 2012 (and this doesn’t include any debt students may have accumulated from their preceding four years of college).

Then there’s the protracted training time to consider: four years of undergraduate education, four years of medical school and, at a minimum, three years of residency before doctors actually start earning real money. Once they’ve arrived, they can start acquiring the trappings of an upper-middle-class lifestyle – but this is small comfort to the bright young high-schooler from a low-income family who dreams of being a doctor but lacks the financial wherewithal to even get a foot in the door.

One could also argue that the medical school admission process itself tends to favor students with the “right” kind of background, i.e. those who already possess strong socioeconomic advantages.

So what’s the solution? Dr. Nahvi writes:

The stopgap fix is to better train all students to deal with all types of patients. A true long-term solution, however, is to steer more representative slices of America – individuals from all income levels – into medicine. There are many ideas for how to do this, from special recruitment strategies to arrangements for financial aid. Fundamentally though, for change to occur, admission committees need to recognize the importance of getting more middle- and low-income students into our medical education system.

Doing so won’t be easy, because it’s not just about money. Many other ingredients come into play: a solid grade school and high school education, parents and teachers who encourage careers in medicine and hold aspiring students to high expectations, and even local role models who can show young people that someone like them can successfully become a doctor.

There doesn’t seem to be much public discussion about how to narrow the privilege gap in medicine. Since part of the solution likely will lie at the community level, maybe this needs to change.

A clinician who looks like me

The face of America, and the people who seek health care every day at clinics, hospitals, urgent care centers and emergency rooms, is becoming ever more diverse. But you’d never know it by looking at the average U.S. medical school, where the faculty remains resolutely white and male.

The Association of American Medical Colleges recently examined this issue in an article that looks at the situation and at what medical schools are doing to cultivate more diverse leadership.

It matters because when medical school leaders and faculty come from varied backgrounds, they bring a more inclusive approach to how medical students – the physician workforce of the future – are trained, Dr. Hannah Valantine, senior associate dean of faculty affairs at the Stanford University Medical School Office of Diversity and Leadership, told the AAMC Reporter recently.

“We are facing complex problems that will require diverse perspectives to solve,” she said. “The extent to which we can retain diverse faculty will drive our excellence in education, research and patient care.”

Disparities in health and in health care unfortunately are pervasive. They’re manifested in many ways: how people live, whether their environment is safe, whether they have access to health insurance and affordable care. At least some of the disparities, however, seem to be rooted in a health care system that doesn’t always recognize or appreciate the differences, both clinical and cultural, that make human beings so diverse.

Take, for example, a cardiovascular health initiative that was one of the topics of discussion last month at the 10th annual national summit on health disparities. The initiative, now in its second phase, is aimed at giving doctors more knowledge about treating minority patients and improving their cardiovascular outcomes.

Speakers at the event said physicians often are unaware of the nuances in treating patients of diverse ethnic and racial backgrounds, and hence may fail to recognize heightened risk or important early signs of chronic disease. One such nuance is the so-called triglyceride paradox: blacks can have high levels of high-density lipoproteins (“good” cholesterol) and low triglycerides yet still be at high risk of cardiovascular disease.

To what extent does the failure to see these nuances reflect assumptions that “every patient is like me,” i.e. white? Some studies have noted that black patients are referred for cardiac catheterization or bypass surgery less often than white patients, even when their symptoms are the same, which suggests at least some level of inequality.

“Clearly there is some subconscious bias that is going on,” Dr. Conrad Smith of the University of Pittsburgh Medical Center told MedPage Today last week.

Physicians need to be aware of the differences in how they approach patients of varying backgrounds and the impact this has on outcomes, Dr. Smith said. “The education of physicians is going to be paramount if we want to close that gap.”

This isn’t to say prejudice and discrimination are rampant among health care professionals. The vast majority are skilled and well-intentioned. Yet their training and background may not necessarily have equipped them to recognize their own assumptions.

Consider, for example, the implications this may have for conducting end-of-life discussions. Americans strongly favor telling the truth to patients when further medical care is futile; other cultures view this as harmful and believe the patient should not be told.

U.S. clinicians might become deeply frustrated, perhaps even angry, with immigrant or refugee patients who refuse testing and treatment for tuberculosis. What they may not understand is the stigma associated with TB in these cultures and differing practices in when and how medication is prescribed.

It’s not hard to see how misunderstandings can arise. Sometimes these spill over into the clinician-patient relationship, not only in how people communicate with each other (Do they feel their perspective is heard? Do they feel their values are respected?) but in the quality of care the patient receives.

Some studies have found that doctors relate better to patients when they share common ground, such as socioeconomic background – in other words, patients who aren’t perceived as “other.” These same studies also have found that when doctor and patient come from similar backgrounds, the doctor is more likely to take the patient’s symptoms seriously and more likely to trust that the patient will follow medical advice.

These findings suggest that unconscious bias in favor of one’s own tribe is very real. They also argue for a better doctor-patient relationship and a greater comfort level when an increasingly diverse patient population can receive care from someone with whom they identify.

That the health professions are having a robust discussion about instilling diversity and cultural competency within their ranks is an indication of how much progress has been made in the past couple of decades. There have in fact been calls to broaden the definition of what constitutes diversity to include religion, gender identity and sexual orientation. In a newly published study, students in the medicine, physician assistant and physical therapy programs at the University of Colorado supported the value of an inclusive, respectful campus environment – but they also reported that disparaging remarks and offensive behavior toward minorities of all kinds persist.

“According to these students, the institution must embrace a broader definition of diversity, such that all minority groups are valued, including individuals with conservative viewpoints or strong religious beliefs, the poor and uninsured, GLBT individuals, women and non-English speakers,” the researchers concluded.

Few people are without bias in some form, and this goes for every walk of life, not just health care. The challenge lies in recognizing it and overcoming it so all patients get the care they need, even if the clinician doesn’t look like them.

The ethics of medical job shadowing: Who benefits?

The patient was about to undergo surgery when the surgeon came into the hospital room with “a clan of about six ‘student doctors’.”

It was uncomfortable and embarrassing:

As the young students craned their necks to get better views of my condition, I remember feeling so overwhelmed and self-conscious that I didn’t voice any of the several concerns I had outlined to discuss with my surgeon. I’m ashamed to admit it, but I remained quiet as my surgeon prattled on about what he was going to do.

Now imagine the student doctors are high school or college-aged students who aren’t even in medical school yet but are job-shadowing a physician mentor to learn more about a possible career in medicine.

Does the discomfort level ratchet up even more?

Some intense online discussion erupted recently over the ethics of allowing students to observe patient care as part of the career exploration process.

Job shadowing is a time-honored way for teens and young adults to gain a firsthand look at what medicine is like so they can make informed, realistic choices about pursuing it as their life’s work, writes Dr. Elizabeth Kitsis, director of bioethics education and assistant professor of medicine at Albert Einstein College of Medicine in New York.

But she admits to feeling uneasy about its impact on patients and on their care.

She wonders: Do patients know the “student” observer in the exam room might actually be a high school student? Are they allowed to ask the student to stay out of the room? What if they don’t want a student present but are too shy or intimidated to say so?

More to the point, does the presence of a high school or college student inhibit the discussion of sensitive medical issues? Maybe the patient won’t bring up worries about having contracted a sexually transmitted infection, or might give less than honest answers about alcohol consumption, Dr. Kitsis writes.

She asks: “Are these concerns outweighed by the benefit derived by premed students from being in the room? I suspect that students would respond in the affirmative. But I am not sure what patients would say.”

To help answer some of these questions, Dr. Kitsis and a colleague combed the published research in a study of their own. What they found was a dearth of information on best practices in medical job shadowing, on how to measure the effectiveness of such programs and on how to address the ethical and practical considerations.

She and Dr. Michelle Goldsammler concluded that more research is needed, and they suggest the development of guidelines and a code of conduct for premedical students who want to shadow a practicing physician.

Judging from the online conversation, most medical students find it valuable indeed to shadow an experienced physician to gain an understanding of a career they’re seriously considering.

Jimmy Tam Huy Pham, a medical student, shadowed in several different settings before entering medical school. Most patients were OK with having him in the room, although others declined, he wrote.

“The bottom line is: exposure to any aspects of medicine (and as much of it as possible) is crucial for an individual who is about to spend the next 8-11 years of their life (subsequent to high school) in a profession where repetitive exposure and practice to procedures and patient interaction are keys to learn,” he wrote.

“Patients were always asked whether or not they were happy to have me present and, of course, their wishes always respected,” someone else wrote. “Personally, I feel that shadowing was invaluable in helping me to decide that I do truly want to pursue a career in medicine.”

Several people pointed out that shadowing experience is an unofficial entrance requirement at many medical schools in both the U.S. and the U.K.

Many of the patients who joined the online discussion had a different perspective, though.

“A curious teenager in my exam room? Never,” was one person’s response.

Someone else wrote that she did not want “some kid who has nothing to do with my care tagging along with my doctor.” Nor did she want to be in the position of having to say yes or no to allowing a student in the room, she wrote. “Why should patients have to go through that? It’s not the patient’s responsibility to help young people figure out if they want to be doctors.”

Previous studies have found that a majority of patients are willing to accept the involvement of medical students in their care – and that many in fact find it rewarding to help contribute to a student’s education. But most of these studies concentrated on students who have already entered medical school and are able to bring training and commitment to their encounters with patients. Does it – or should it – make a difference when the student is still in the career exploration phase?

Society holds a stake in the training of future health care professionals. After all, today’s student is tomorrow’s doctor, nurse, physical therapist or physician assistant. But do patients have a moral obligation to participate in this process? And when they do participate, what can be done to help ensure it doesn’t compromise their care or their privacy? It would seem these are questions that need to be examined and satisfactorily answered.

The looming shortage of doctors

Not too long ago, the Association of American Medical Colleges unveiled a new print ad depicting a patient sitting alone and distressed in an exam room. The stark message: “By the time you notice America’s doctor shortage, it will be too late.”

A new round of ads released this month warns, “Careful what you cut.”

What many rural communities have known for years is increasingly catching up with everyone else: The supply of doctors won’t be enough to meet future demand.

The AAMC’s workforce estimates aren’t encouraging. By 2020, the U.S. is projected to have a shortfall of 90,000 doctors, according to data collected and analyzed by the AAMC Center for Workforce Studies.

Part of the shortage is on the demand side. As the baby-boom generation ages, there will be a dramatic increase in the number of Americans older than 65 – precisely the population that tends to use health care services the most. Easier access to health insurance, the result of the federal health care reform law, also is expected to bring millions of formerly uninsured patients into the system, many of them with pent-up health needs that will need addressing.

But the absolute number of doctors is also anticipated to decline in future years. Although at least half of the projected shortage is among primary care doctors, the other half will be among specialties not customarily thought of as being in short supply – surgeons, oncologists, endocrinologists and more.

In practical terms, what it means for patients is the likelihood of longer waits in the future to see a doctor, more difficulty obtaining timely appointments, and possibly delays in care that could have long-term health consequences.

This is obviously a vastly oversimplified picture of what’s happening in the U.S. physician workforce right now. It doesn’t even touch on the many other issues churning alongside the basic math of supply vs. demand – the number of physicians who will be retiring in the next decade, for instance, or the tendency for physicians to gravitate toward non-rural practice, or the soaring cost of a medical education, or the staggering educational debt that students accumulate and its impact on their choice of specialty.

There’s another factor, though, that much of the public may be less aware of: Federal funding for residency training, which all physicians go through after completing four years of medical school, has not increased to meet current demands. The result is a bottleneck that has reduced U.S. capacity to get physicians fully trained and into the workforce.

Physicians – and indeed all the health professions – have unique training needs. Although much of their initial learning takes place in the classroom, at some point they must be unleashed on actual, live patients to hone their skills. Without this hands-on experience, there’s no other way to become familiar with health and illness and the variety of ways these manifest themselves across the spectrum of patients. There’s no other way to become proficient at diagnosing, treating, prescribing and performing procedures.

It takes time, money and resources to provide the necessary programs and supervision at teaching hospitals where residency training takes place – which is why outside funding is so critical.

The Association of American Medical Colleges has ramped up its lobbying effort with Congress this year to eliminate a freeze on Medicare funding for residency training that has been in place since 1997 – a full 15 years. Even a relatively modest 15 percent increase would be enough for teaching hospitals to prepare 4,000 additional doctors per year, the AAMC argues; it estimates that 10,000 more doctors need to be trained annually to fully address the pending shortage.

Certainly there are other sources of residency funding – state governments, private businesses and the teaching hospitals themselves, to name a few. The number of residents in training in the U.S. in fact has grown despite the Medicare cap. State governments and cash-strapped hospitals may not be able to sustain this indefinitely, however, nor does a fragmented approach necessarily ensure that the right types and amounts of primary care doctors and specialists are being trained.

Redesigning the health care delivery system to make it more team-centered could help blunt some of the growing shortfall in the physician workforce, yet this is unlikely to be the total answer. Patient care will still demand the skills and training of a doctor.

Considering the success that physician practices here in rural central Minnesota have had in hiring new doctors the last couple of years, it may seem there’s no real urgency to address a looming shortage of doctors.

But what’s critical to keep in mind is that the pipeline from medical student to full-fledged physician is long – seven years at a minimum, and usually longer for subspecialists. Is it OK to delay action until the problem becomes more painfully evident? The AAMC says no: “The United States cannot afford to wait until the physician shortage takes full effect because by then, it will be too late.”

Lessons in Professionalism 101

A couple of years ago there was an online flap over a MedPage Today blog post about a group of medical students and a vending machine that dispensed beer.

Writer Charles Bankhead was attending a cardiology conference and was on his way to a breakfast meeting when he encountered the young doctors-in-training gathered enthusiastically around the beer machine. He writes:

As they drained the cans and vowed to return often to worship at this altar, something told me this wasn’t the first time they had consumed beer before breakfast.

And then a thought occurred to me. One day I might have to entrust my heart to the care of these young men.

I found the thought unsettling.

He goes on to muse about medical students these days and their behavior, both bad and good, and winds up with some uplifting observations about the medical trainees who helped staff street-corner clinics in New Orleans, ministering to the many needs in the days and weeks after Hurricane Katrina.

“I hope I can remind myself of that the next time I see a group of young doctors gathered around a beer machine,” he concludes.

Students have rarely been known for maturity or prim behavior, but when it comes to medical students, should there be a higher standard?

Bankhead’s blog entry was reposted at Kevin, MD, where the reaction was decidedly mixed. What’s the big deal? wondered several commenters. One person’s assessment: “Honestly, if you get heart palpitations from the idea of late 20s male doctors finding the idea of a beer dispenser awesome, I don’t want you as my patient any more than you want me as your doctor.”

Not so fast, cautioned another commenter, who observed that “it has been my experience that this type of ‘childish’ behavior (getting excited about a beer machine and drinking DURING a conference) generally does spill over into practice.” These are the future doctors who make fun of patients, ogle the nurses and talk more about their extracurricular activities than the business at hand, the commenter wrote.

So who’s right?

As medical students head back to school this fall and the new crop of residents that began training in July gains some seasoning, it’s an opportune time to consider one of the more intangible lessons of medical school: teaching students how to be professional.

There doesn’t seem to be any formal curriculum for Professionalism 101. Yet it would be hard to find a U.S. medical school that doesn’t value professionalism or create expectations for how students should conduct themselves around each other, the staff and their future patients.

A paper published in 2010 in the International Journal of Medical Education attempted to pin down the teaching of professionalism. Is it shaped by medical school coursework? By being selective about who gains entrance to medical school? By strong role models on the faculty?

The researchers reviewed about three dozen previously published studies and came up with some interesting findings. One of the things that seemed to matter most was the learning environment – whether it was supportive and whether students had a variety of opportunities to gain experience and insight into what it means to practice professionally. The researchers found it was also important to have faculty and staff who set a positive example.

There’s been some debate over the role of the medical school admissions process in selecting students with the right qualities. But as the authors of the review note, it can be difficult to determine at the outset of a medical student’s career whether he or she is capable of learning professionalism. Furthermore, there don’t seem to be any reliable tools for identifying candidates who might not measure up.

Finally, assessment and feedback mattered. The researchers wrote that it’s especially important to monitor medical students for unprofessional or disruptive behavior and to have policies for dealing with problem behavior.

Judging from the online discussions at some of the student doctor forums, most medical students are keenly aware of the professional expectations placed upon them. A student who was ticketed for illegally netting crabs at the seashore fretted that the infraction might result in a rejected application for a residency training program. Some students had bigger problems: a DWI arrest, an assault charge, poor academic performance.

We expect a lot of doctors; perhaps we expect too much. Should a certain amount of partying or immature behavior be acceptable among medical students? Or should even an indiscreet Facebook post be grounds for disciplinary action? Some years back I had a conversation with a medical school instructor who spoke of the challenge of curing young medical students of casual terminology such as the word “butt.” Professionalism seems to be a many-layered creature that extends to language and demeanor, especially around patients, yet it’s not always easy to know where (and when) the line should be drawn.

Here’s a bit of evidence, though, that students who grasp the concept of professionalism might also be better able to practice it: In a study that appeared a few years ago in the New England Journal of Medicine, researchers found that doctors who were disciplined by the state medical board often showed warning signs of problem behavior back in medical school. Indeed, problem behavior in medical school was among the strongest risk factors for disciplinary action later in the physician’s career. Other studies, though, have found only a weak association between medical school behavior and future professionalism.

It points to the need to define more clearly what, exactly, is meant by medical professionalism. Most future doctors do in fact measure up; it’s a minority who don’t, which is one reason the public also has a stake in the education that helps shape medical students into good doctors.

Gender bias in the halls of medicine?

Going to medical school is no guarantee for women doctors that they’ll break through the glass ceiling, it seems. A new study by the University of Michigan Health System uncovered a $12,000-a-year gap between what male and female doctors earn in one of medicine’s most rarefied areas – research and academia.

The researchers compared men and women who were mid-career physician researchers and found they weren’t paid equally, even though they did comparable work. Over the course of a 30-year career, the difference amounted to $350,000.

The report’s publication this month in the Journal of the American Medical Association comes on the heels of a new report by the University of Minnesota on the persistent gap in what women earn compared to men.

What makes the JAMA study especially notable is that it’s among the few to compare apples to apples. The researchers controlled for specialty and for hours worked. They also adjusted for academic rank, leadership positions and research time – and still came up with a disparity. Nor did the pay gap seem to be influenced by whether the female physician-researchers were raising children, because it persisted even among women who weren’t mothers.

A couple of factors may help explain some of the differences in pay: The female physician-researchers in the study held fewer of the top positions that command higher pay. They also were less likely to be in the higher-paying medical specialties.

It’s not clear whether this was the result of institutional bias, and addressing this question was outside the scope of the study. Perhaps it simply indicates this group of women didn’t pursue leadership positions in academic medical research or that they were less likely to choose higher-paying medical specialties.

The findings from the JAMA study are not isolated. A 2010 study by the Mongan Institute for Health Policy at Massachusetts General Hospital came up with almost the same conclusion: Women in academic medicine make less money, even when differences in work hours, research, teaching, patient care and other professional activities are taken into account.

It all raises the issue: Is there gender bias within the medical profession?

Once upon a time, women weren’t admitted to medical school. In a doubling-down of sexist attitudes, it was presumed that if they wanted a career involving patient care, they could become nurses.

Women who were the first to break through this barrier didn’t face an easy time of it. Dr. Susan Love, well known as a surgeon and outspoken advocate for breast cancer prevention, has spoken often about the obstacles she faced in medical school in the 1960s. There were virtually no female surgeons so she had few role models or mentors. And because she’s a surgeon who’s both female and gay, it “meant that I was never going to be accepted into the ‘old boys club,’” she says. “It meant that I had to be better and I had to serve my patients so well that they would come to me for that reason and not because someone had referred them.”

At the time, between 5 and 10 percent of students entering medical school were female, and quotas were imposed to give these women a chance. These days, women are represented in far greater numbers. According to figures from the Association of American Medical Colleges, 47.3 percent of new medical school applicants in the fall of 2011 were female.

They’re not distributed equally across specialties, though. Women tend to cluster in certain areas – pediatrics, internal medicine, family medicine, obstetrics/gynecology – and gravitate less often toward some specialties such as orthopedic surgery or neurosurgery that continue to be dominated by men and often are higher-paying.

Despite the strides made by women in medicine, the progress may not be enough. More recently, female enrollment in medical school has declined, for reasons that remain unclear. Are women becoming disenchanted with the prospect of the high stress and long hours that accompany a career in medicine? Have women with undergraduate degrees in the sciences found better opportunities in fields other than medicine?

Even the public tends to think of doctors in terms of “he” rather than “she.”

It’s the goal of most industries, including medicine, to have a diverse workforce. Indeed, a culturally and gender-diverse physician workforce is seen as one way of helping reduce the ethnic and economic disparities in health that persist in the United States. To be sure, there are medical schools, research institutes, hospitals and medical practices that back this wholeheartedly in their hiring and pay policies. But if it isn’t modeled – and supported – as widely as possible in the medical teaching and research institutions where students are molded into doctors, it’s hard to see how any meaningful progress can be achieved.

Primary care’s bad rap

Primary care’s often-negative reputation as stressful and unrewarding apparently starts early in the medical education process – possibly before students even enter medical school, a recent study has found.

The study appeared earlier this year in the journal Family Medicine. More recently, the findings and their implications for family medicine were explored in an interview that the American Academy of Family Physicians conducted with one of the study’s authors, Dr. Julie Phillips, an assistant professor of family medicine at Michigan State University College of Human Medicine.

Primary care has struggled for several years with perceptions that it’s boring, stressful, demanding, low-paying and hemmed in with constraints on everything from insurer requirements to time pressures in the exam room. Whether this is perception or reality, it has had an impact: Fewer students who enter medical school are choosing a career in primary care.

The authors of the study wanted to learn more about how primary care is perceived by medical students and whether their perceptions are changed by what they experience during their training.

Surveys were conducted among 983 medical students at three medical schools between 2006 and 2008. The students were asked to rate statements such as “primary care physicians have too much administrative work to do” and “time pressures keep primary care physicians from developing good patient relationships.” Similar questions were posed about the students’ perception of specialty physicians.

Perhaps the most eye-opening conclusion of the study is this: Negative views of the daily routine of primary care were already present in many of the students at the beginning of their training. What’s more, these views didn’t really change as students progressed through medical school, even after they had a chance to directly observe and participate in patient care.

What to make of these findings? It’s clear that “contemporary physicians struggle to meet the high expectations set by patients and their profession with limited time and resources,” the authors wrote. “Our data demonstrate that students are paying attention to the struggle.”

The results were “kind of discouraging,” Phillips told AAFP News Now. She said she also was surprised that the students’ perceptions were formed so early. “That makes me think that some of their views of what it’s like to be a doctor actually don’t come from medical school but from the larger cultural perception of what physician work is like – and especially what primary care is like.”

There were some glimmers of hope. Students who completed a primary care clerkship (typically during the third year of medical school) and had seen real-life primary care in action were more positive about the ability of primary care doctors to develop good relationships with their patients, in spite of the time constraints in the exam room. “It may be that actually spending time observing physicians helps to break some negative stereotypes,” the study’s authors noted.

The researchers also learned that some students will choose primary care regardless of their perceptions about the daily grind. This suggests that individual values and goals play an important role in the career choices of medical students, the authors wrote. “The study reinforces the importance of admitting students with primary care-oriented values and primary care interest and reinforcing those values over the course of medical school, if we are to produce greater numbers of primary care physicians.”

We’ve come a long way from the romanticized ideal of the family doctor that prevailed a generation or two ago. But did the ideal ever really match the reality? If you talk to physicians privately, some of them will admit there’s a great deal of grumbling about the profession and not enough focus on what makes it rewarding. To be sure, there are all too many reasons for doctors to be frustrated and exhausted and discouraged, but at what point do the negatives start to drown out everything else?

Phillips challenged the medical profession to become more involved in supporting new models of care, such as the patient-centered medical home, that can breathe new life into primary care and make it a better career choice. Family doctors also should try to share what’s good about their specialty, she said. “Students listen to what we say. We should try to be positive about the great things in our everyday work, because there are many wonderful things about being a family physician.”

The decline of empathy

It’s probably safe to say that most medical students start their training with a strong desire to be empathetic toward their patients.

But these ideals often don’t survive the grueling process of becoming a doctor, a recent study published in the American Journal of Pharmaceutical Education has found.

Researchers at Midwestern University in Chicago and Thomas Jefferson University in Philadelphia wanted to know whether exposing first-year medical and pharmacy students to a theatrical exercise depicting the challenges of aging would improve their ability to empathize.

It did. The study involved 370 students at the Chicago College of Pharmacy and the Chicago College of Osteopathic Medicine at Midwestern University. They were asked to complete a test measuring their empathy levels before and immediately after the skit, and they scored significantly better afterwards. Unfortunately, though, the effects weren’t long-lasting. When the empathy test was administered again (one week later for the pharmacy students, 26 days later for the medical students), most of the scores had returned to their original baseline.

This doesn’t necessarily mean these students were completely lacking in empathy. But it points to the difficulty of instilling and maintaining this quality in the future health care workforce as they progress through their training.

Previous studies have reinforced this. The one that’s probably cited most often appeared in 2008 in the journal Academic Medicine and examined what the authors describe as “hardening of the heart” during medical training. It tracked four classes of students at the University of Arkansas for Medical Sciences and documented notable declines in empathy scores, especially after the first and third years of medical school.

Similar findings have been reported among students in dental school and in postgraduate medical education.

Given the intensity of medical training, it seems inevitable that students would undergo some hardening of the heart, if for no other reason than to cope with the sheer burden. The academic demands are rigorous. The reality of providing care for actual patients, which starts in their third year, can be overwhelming and disillusioning for many students, especially when patients die or don’t fare well. Training environments that place medical students at rock-bottom in the pecking order are exhausting at best and abusive at worst. Add in the ever-present anxiety about grades and educational loans, and it’s no wonder that empathy begins to take a back seat.

Here’s a peek at the emotional state of a second-year medical student at Johns Hopkins who is about to transition into hospital training:

Suddenly the theoretical becomes practical, the “nice-to-knows” become “must-knows,” and simple clinical scenarios become ethical dilemmas. The vicissitudes become quite intense: one moment you feel ready to save a life as you stand triumphant over a mannequin, then suddenly you’re hovering in the pediatrics emergency department hearing the gurgle of a seizing child and feel completely helpless to handle such situations.

But it all raises the question: Is empathy something you can teach to medical students, or is it innate? If it’s teachable, can the curriculum be strengthened to foster and develop empathy? If it’s innate, should the admission process put more emphasis on selecting students who have this quality?

The 2008 study in Academic Medicine uncovered some interesting nuances. Overall, students who chose specialties in fields with a high amount of patient contact and continuity of care – family medicine, internal medicine, pediatrics, obstetrics-gynecology, psychiatry – had higher scores on the empathy scale than those in other specialties such as surgery or radiology. Male students in these core specialties actually scored higher than the population norm at the beginning of their training, while the largest decline in empathy scores took place among male students in non-core specialties. Female medical students started at the norm, and scores then declined among those who chose non-core specialties.

Of note, few if any of these students entered medical school with a deficiency in their empathetic abilities. But something about the training process may have diminished their capacity to feel empathy for others. To date there haven’t been many studies to determine whether this is temporary or whether it persists throughout their career. The study in the pharmacy education journal suggests that even when students are exposed to academic exercises specifically designed to increase their empathy, the effects may be short-lived.

Despite a lot of study, especially in the past decade or so, there remain many questions about how best to foster empathy in students who will be spending the rest of their working years in patient care. It matters because patients are more likely to comply, more likely to receive an accurate diagnosis and more likely to be satisfied with the clinician-patient relationship when empathy is present, the pharmacy journal authors wrote. “Empathy is an important component of the healthcare provider-patient relationship that has been linked to optimal patient outcomes.”

A new breed of doctor

Most people probably skipped right over the announcement late last week, but for anyone thinking about going to medical school, it was pretty big news. For the first time in 20 years, the Medical College Admission Test, or MCAT, has been revised.

Starting in 2015, students who aspire to become doctors will be tested on more than just their knowledge of the sciences. They’ll also need to have a good understanding of psychology, sociology and biology and how these forces help shape individual health and behavior.

How best to educate future doctors has long been a subject for debate. Should students be accepted into medical school on the basis of their grades and test scores alone, or should other factors be considered as well? How important is it for pre-med students to have a grounding in non-science disciplines such as psychology or the humanities? Who’s likely to make a better doctor – someone who’s outstanding in science but mediocre in people skills, or someone who’s merely good in science but excellent in people skills?

The MCAT matters because it’s one of the major determinants for who gets into medical school and who doesn’t – and, ultimately, what the future physician workforce will look like.

The revisions to the exam have been brewing for many months and reflect an ever-broadening definition of what it takes to be a good doctor. It’s no longer enough to be a science nerd with a solid background in organic chemistry. As Dr. Darrell D. Kirch, president and CEO of the Association of American Medical Colleges put it, “it also requires an understanding of people.”

The new version of the MCAT adds two sections: one on the psychological, social and biological foundations of behavior, and one on critical analysis and reasoning skills. A writing section has been dropped but the rigor of the science sections remains unchanged, and the test will still be a marathon. It’ll take students about six and a half hours to complete the whole thing, versus the four and a half hours it takes now.

I checked out an online preview guide for the test. Make no mistake, it’s very difficult. Here’s a sample question from the new section on the psychological, social and biological foundations of behavior: “How does cognitive dissonance explain the occurrence of persistent conformity? Memories change to reduce discomfort resulting from providing answers that differ from: A. answers identified as correct. B. memories of others. C. previously provided answers. D. original memories.” (The answer is D.)

Curious to know what pre-medical students think of these changes in the MCAT, I visited an online student doctors forum, where most of the reaction can be summed up in one word: “Brutal.”

“I think we lucked out not having to take this,” one pre-med student commented.

MedPage Today interviewed an aspiring doctor who had a different perspective. Adam Gardner, 29, earned a master’s degree in international affairs, then decided he wanted to become a doctor. He’s currently loading up on science prerequisites in preparation for taking the MCAT in June.

He said that while he appreciates the idea behind trying to create a more well-rounded doctor who can interact with patients on a deeper level, he worries that the additional test sections may require students like him, who decide later in life to go to medical school, to take more classes in order to prepare for the test.

“Adding these new things to the test could drag this out even longer for people who want to get it done,” he said.

But then again, Gardner said, having a knowledge of the social sciences would lead to a better doctor-patient relationship.

“I’ve met a lot of doctors … and some of them are pretty cool people,” he said. “But a lot of them are not terribly social and easy to get along with, and I think having a more rounded background will create doctors who can deal with patients better.”

Critics didn’t waste any time weighing in with their reaction to the newly revised MCAT. One of the objections being voiced most frequently is that the heightened emphasis on psychology, sociology and critical thinking skills is too “touchy-feely” and will have the effect of dumbing down the test, thereby lowering the bar for who gets admitted to medical school.

It’s a valid concern. You could just as easily argue, however, that an admissions process structured in favor of the sciences can end up unfairly excluding students who are well-rounded and hard-working, took all the right pre-med college courses, earned good grades and have all the makings of a good doctor but simply didn’t do as well on a science-oriented test.

It comes back to the original question: What are the most important qualities for doctors to have? Knowledge of the sciences will always be critical, but is there room for training programs to also emphasize analytical thought, the social sciences and the underpinnings of human behavior? The new version of the MCAT suggests that indeed there is. In 20 years or so, we’ll find out whether it’s made a difference.