Two things are infinite: the universe and human stupidity. But as for the universe, I am not yet absolutely sure.
(Einstein)
But the tune ends too soon for us all (Ian Anderson)
On the channel "ARTE regards", a series of programmes about bridges. This second one, titled "Alte Baukunst aus Holz und Stein" (old building craft in wood and stone), features a bridge in Verzasca and another in Chengyang.
Why a century-old battle over vaccination continues to rage.
Epidemics have piggybacked their way on wars across the world for thousands of years. The Plague of Galen, which decimated Rome in 165 A.D., entered the empire with soldiers returning from the Near East. Infectious diseases, more than swords or guns, helped Pizarro conquer the Incas. In the early eighteen-seventies, a smallpox pandemic that accompanied the Franco-Prussian War killed more than half a million Europeans.
Smallpox claimed the lives of tens of thousands of French soldiers, yet the Prussians lost fewer than five hundred men. That was because Prussia vaccinated its entire army against the virus, and France did not. There has never been a more dramatic demonstration of a vaccine’s power to alter the course of history. By the end of the decade, several European countries had passed laws mandating vaccination.
Those measures worked. In 1899, smallpox took just over a hundred lives in Germany, a nation of fifty million people. When epidemics broke out in the United States, as they did at about the same time, Surgeon General Walter Wyman seized on the European data, urging compulsory vaccination for all Americans. The disease had become so easy to prevent, he wrote, that “the smallpox patient of to-day is scarcely deserving of sympathy.” Throughout the United States, at the dawn of the Progressive era, dozens of laws and regulations were established to empower police officers, public-health officials, and even the armed forces to vaccinate at will, and, if necessary, at gunpoint.
As Michael Willrich, a historian at Brandeis, notes in his meticulously researched book “Pox: An American History” (Penguin; $27.95), those efforts helped bring an end to smallpox as a serious public-health threat in the United States. But the victory may have come at a cost, initiating what Willrich describes as “one of the most important civil liberties struggles of the twentieth century”: a ferocious clash between personal freedom and public health. The clash reverberates to this day, as a growing number of American parents refuse to inoculate their children against common childhood diseases. Universal vaccination may well be the greatest success story in medical history. But the movement also has a political history—and it’s far less reassuring.
Smallpox was among the most lethal viruses to afflict humanity, killing anywhere from twenty-five to sixty per cent of those it infected and leaving others scarred for life. The first symptoms were fever, malaise, body aches, and vomiting; soon victims suffered mouth sores and then the disease’s ghoulish signature, a savage rash. Within forty-eight hours, the lesions would spread across the body. The patient’s face would become badly swollen, the pain acute. A 1900 handbook of naval medicine describes the final stages: “The pustules break, matter oozes out, crusts form, first on the face and then over other parts of the body.”
Nobody is certain when or how smallpox first appeared. But the virus emerged in Europe sometime between the fifth and seventh centuries and was often endemic during the Middle Ages. By 1700, variolation—deliberate infection with smallpox—had been tried successfully as a preventive measure. It was dangerous, but far less so than the disease itself. Dried smallpox scabs were blown into the nose of an individual, who then contracted a mild form of the disease but was immune afterward. The Colonial preacher Cotton Mather, who had learned about variolation from his slave, attempted to introduce the new practice during a 1721 epidemic in Boston. Mather was denounced as an “inoculation minister,” and his house was firebombed. The actual vaccine—the world’s first—was invented by Edward Jenner, a British country doctor, at the end of the eighteenth century. After noting that milkmaids rarely got the disease, he theorized, correctly, that exposure to cowpox—a virus similar to smallpox but much less virulent—conferred resistance.
Today, Americans expect the federal government to respond to (and contain) any serious contagion. That wasn’t true in the late nineteenth century, the period of Willrich’s focus. The idea of calling for federal aid was unusual, and in the Deep South it was unthinkable. Then, in the mid-eighteen-nineties, after decades of relative quiescence, smallpox began to spread through the communities of Kentucky and other Southern states. Panic kept pace. As a member of the Kentucky Board of Health put it, “One case of small-pox in a tramp will create far more alarm in any community in Kentucky than a hundred cases of typhoid fever and a dozen deaths in the leading families.” Finding themselves defenseless against the virus, communities sought help from the United States Marine Hospital Service—the precursor to the U.S. Public Health Service.
The service dispatched doctors who rode from town to town like U.S. Marshals, brandishing masks and needles instead of badges and guns. They vaccinated the healthy and quarantined the sick. Once an epidemic was under way, those doctors were granted broad police-like powers, and they established the first foothold of federal authority in the South since the end of the Civil War and Reconstruction.
The logic used by the increasingly powerful federal government was straightforward: the good of the community had to outweigh objections raised by a minority. After all, what could be worse than a smallpox epidemic? Willrich offers an answer: curtailing basic civil liberties. As he points out in this eloquent, if not always persuasive, book, compulsory vaccination collided with fundamental medical and religious beliefs held by millions; it overran the rights of parents and, most painfully, contradicted strongly held, and particularly American, notions of personal liberty. For all those reasons, not to mention the fact that the vaccine itself occasionally killed people, the resistance was intense. Residents of many neighborhoods burned down their “pesthouses” (isolation hospitals for infectious patients), fled when vaccinators approached, fought with police, forged vaccination certificates, and often simply refused to deliver sick relatives to authorities.
Public-health officials stoked resentment by applying the laws without subtlety or restraint, Willrich points out, most of all to immigrants in the North and blacks in the South. (At the end of the nineteenth century, a black man in Kentucky was required to carry a certificate of vaccination—or display a scar on his arm—in order to move about freely. No such law applied to white men.) In February, 1901, an epidemic erupted in New York, and a vaccination squad moved en masse into the crowded Italian sections of the Upper East Side, where they thought the infection had originated. Willrich writes:
They followed the same method on each block. With policemen stationed on the roofs, at the front doors, and in the backyards, doctors and police entered the tenements and rapped on doors, rousing men, women, and children. Frightened and furious, the residents moved into the lighted areas, where doctors inspected their faces for pocks and their arms for the mark of vaccination. . . . Everyone lacking a good mark had to submit to vaccination.
Infected children were routinely taken from their mothers’ arms and sent to a pesthouse, where they often died—then, as now, there was no successful treatment. Many of those who were hauled away spoke no English; more often than not, they belonged to families that had fled despotic nations to live in what they thought was a democracy.
The anti-vaccine activists were aided by an inexplicable epidemiological anomaly. A new form of the virus appeared in the eighteen-nineties, one that wasn’t nearly as deadly as those which had come before; it seemed to kill just one per cent of the infected, and many doctors were not even sure the rashes were a sign of smallpox. Some thought it was a different disease entirely, and others wondered if the virus had become weak enough to ignore. The decline in virulence made opposition to vaccination much easier—especially given the uncertain risk posed by the vaccine itself.
These days, vaccines used in the United States are subject to years of clinical testing, several layers of regulatory approval, and the final judgment of the Food and Drug Administration. The oversight continues after vaccines are introduced. In the nineteen-nineties, federal health officials called for the live oral polio vaccine to be removed from the market, because it caused the disease in about ten children out of the millions who received it each year. A newer version, which has been used widely for more than a decade, eliminates even that small risk. But in 1900 the F.D.A. didn’t exist, and neither did any federal rules about how to make, test, or deliver vaccines.
By introducing laws that compelled vaccination without any safety regulations to go along with them, the government did nothing to reassure those who regarded compulsion with dread or indignation. The smallpox vaccine was produced in the least pleasant way imaginable: cows were infected with vaccinia virus, which is similar enough to smallpox to stimulate protective antibodies, and then virus-laden pus would be extracted from their sores. A brew of the material was then spread on a patch of skin that had been cut and then rubbed raw. The vaccine was effective and relatively safe. Yet no statistics were available to the public, and people often confused vaccination with the less precise technique of variolation, which had proved lethal to as many as one in fifty of those who received it. When smallpox was killing thirty per cent of its victims or more, the odds clearly favored vaccination. In a mild epidemic, where few people died, rejection of the vaccine made far more sense.
But the social calculus of vaccination can never be reduced to the estimation of individual benefit. When most members of a community are vaccinated, they protect those who are not by eliminating the viral reservoirs in the population. The effect is known as herd immunity. Some people, because they are too young or have particularly weak immune systems owing to cancer or other illnesses, cannot be vaccinated. For them, herd immunity is the only defense. As long as the majority are vaccinated, then, a few can decline without courting harm, but when vaccination rates fall below a certain level this protection quickly begins to vanish. At that point, someone who refuses a vaccine imperils not only his own health but that of everyone he encounters.
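A standard way to quantify that threshold—a textbook epidemiological rule of thumb, not a figure from Willrich's book—is in terms of an infection's basic reproduction number, $R_0$, the average number of people one contagious person infects in a fully susceptible population. Transmission can no longer sustain itself once the immune share of the population exceeds

$$ 1 - \frac{1}{R_0}. $$

For a virus as contagious as measles, with $R_0$ commonly estimated at twelve to eighteen, that works out to roughly ninety-two to ninety-four per cent, which is why even modest pockets of refusal can be enough to reopen the door to outbreaks.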
The revolt against the smallpox vaccine took many forms, and one of them was litigation. A wave of lawsuits was filed at the beginning of the twentieth century, all aimed at protecting an individual’s right to do with his body as he chose. The most prominent case, Jacobson v. Massachusetts, was brought by Henning Jacobson, a Swedish immigrant living in Cambridge during a smallpox outbreak there in 1902. Jacobson, a minister, refused to comply with the city’s vaccination order, because, he said, a vaccine had made him sick when he was a child. He also believed that vaccines had made his son sick, and said that he knew others who had suffered as well. The case reached the Supreme Court in 1904, at a time when infectious diseases remained the leading cause of death in the United States. The Court upheld the right of states to compel vaccination, ruling that, although personal autonomy mattered greatly, the state was entitled to protect its citizens from threats to their health and welfare.
The Jacobson decision provided states with legal support, but it did little to resolve fundamental conflicts over coercive public-health measures. Battles over vaccination still rage, and the Internet has only amplified the tumult. Lawsuits continue to be filed—and, like Jacobson, many people rely wholly on anecdotal evidence to argue that vaccines pose a greater threat than the diseases they prevent. Large numbers of American and British parents refuse to vaccinate their children for measles, because they fear that the vaccine can cause autism, despite many studies showing measles vaccines to be safe and unrelated to autism. There are even pediatricians who decline to treat children who have been vaccinated. Vaccines work by stimulating resistance in the immune system; unfortunately, resistance to vaccination itself has proved no less formidable.
Willrich’s sympathy with those turn-of-the-century opponents of compulsory vaccination can be instructive, but it sometimes leads him to overreach. “A natural affinity linked abolitionism and antivaccinationism,” he writes. “Both upheld bodily self-possession as the sine qua non of human freedom; both distrusted institutions; and each evoked public scorn in its time as the dangerous cause of a lunatic fringe.” Some anti-vaccine activists had legitimate complaints; others truly were, and are, a lunatic fringe, and the costs of their lunacy are borne by the larger society. It’s absurd to argue that people who opposed the most effective public-health measure in history have a “natural affinity” with the movement to free human beings who were owned by other human beings.
After all, what makes it easy to be a vaccine dissenter these days is the fact that most people aren’t. Because of routine vaccination, measles—which kills at least a hundred and fifty thousand people in the developing world each year—long ago ceased to be a significant threat in the United States. This creates a paradox. Public-health officials must struggle constantly with the consequences of their own success: the dangers of complacency are real. Vaccine-preventable illnesses have made a strong resurgence in the past decade in the United States, fuelled almost wholly by fear. There is currently a measles outbreak in Minnesota; last year, pertussis (whooping cough) cases, and deaths, reached a record high in California.
Willrich reminds us, invaluably, that vaccination is never simply a medical and technological matter; it’s also a political one. In 2009, a novel influenza virus, H1N1, emerged in Mexico and threatened a global pandemic. The World Health Organization, erring on the side of safety, declared the highest level of international alert. And then the virus mostly fizzled out. As with the smallpox outbreaks around 1900, the threat seemed to fade. Today, more than a third of Americans say they would decline a flu shot for themselves or their children. Yet nobody can predict the ways in which a virus will mutate or how virulent it may become. It could fade, as H1N1 appears to have done; or, like the virus that caused the 1918 flu pandemic, it could claim the lives of millions.
It is a remarkable fact that smallpox, a scourge for thousands of years, has now vanished from the earth, except for two tiny vials, one locked in a highly secure facility at the Centers for Disease Control, in Atlanta, and another stored in a similarly secure vault in Siberia. Before the virus was eradicated, in 1977, it killed three hundred million people in the twentieth century alone. But there is no reason to think that pandemics are behind us. A public-health establishment that regards vaccination as merely a technical matter has failed to register the genuine clash of values—not least between social welfare and individual liberty—that Willrich describes. Making the case for vaccination means taking its opponents seriously; it means taking the time to understand the reasons for resistance, and it means figuring out how to prevent and allay mistrust. Modern vaccination is a triumph of medicine; its decline would be a failure of politics.
The animals have 20 copies of a key tumor-fighting gene; humans have just one
Elephants have evolved extra copies of a gene that fights tumour cells, according to two independent studies, offering an explanation for why the animals so rarely develop cancer.
Why elephants do not get cancer is a famous conundrum that was posed — in a different form — by epidemiologist Richard Peto of the University of Oxford, UK, in the 1970s. Peto noted that, in general, there is little relationship between cancer rates and the body size or age of animals. That is surprising: the cells of large-bodied or older animals should have divided many more times than those of smaller or younger ones, so should possess more random mutations predisposing them to cancer. Peto speculated that there might be an intrinsic biological mechanism that protects cells from cancer as they age and expand.
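To make the expected scaling explicit—an illustrative back-of-the-envelope model, not a calculation from Peto or from either new paper—suppose an animal has $N$ dividing cells, each undergoing about $d$ divisions over its lifetime, and that each division independently initiates a tumour with some small probability $p$. The naive lifetime cancer risk is then roughly

$$ P(\text{cancer}) \approx 1 - (1 - p)^{N d}, $$

which climbs steeply with both body size (through $N$) and lifespan (through $d$). Peto's observation was that real cancer rates across species do not follow this curve, implying that something else must be suppressing cancer in large, long-lived animals.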
At least one solution to Peto's paradox may now have been found, according to a pair of papers published independently this week. Elephants have 20 copies of a gene called p53 (or, more properly, TP53) in their genome, whereas humans and other mammals have only one. The gene is known as a tumour suppressor, and it snaps into action when cells suffer DNA damage, churning out copies of its associated p53 protein and either repairing the damage or killing off the cell.
The elephant's tale
Uncovering TP53's role has taken a few years. Joshua Schiffman, a paediatric oncologist and scientist at the University of Utah in Salt Lake City, first heard about Peto’s paradox three years ago at an evolution conference, when Carlo Maley, an evolutionary biologist now at Arizona State University in Tempe, revealed he had found multiple copies of TP53 in the African elephant's genome.
Schiffman specializes in treating children missing one of their TP53 gene's two alleles, which leads them to develop cancer. So after hearing Maley's talk, he wondered whether elephants held some biological insight that could help his patients. He teamed up with Maley, who had not yet published his work, and asked elephant keepers at Salt Lake City’s zoo whether they could spare some elephant blood so that he could test how the p53 protein works in the mammals' white blood cells.
At about the same time, in mid-2012, Vincent Lynch, an evolutionary geneticist at the University of Chicago in Illinois, was preparing for a lecture on Peto’s paradox, and wondered about mechanisms that could explain it. “Right before I gave the lecture, I searched the elephant genome for p53, and 20 hits came up,” says Lynch.
Schiffman and Lynch’s teams have now independently revealed their findings — Schiffman's in the Journal of the American Medical Association, and Lynch's in a paper posted to the bioRxiv.org preprint site, which is under review at the journal eLife.
Using zoo autopsy records for 36 mammals — from striped grass mice to elephants — Schiffman’s team recorded no relationship between body size and cancer rate. (Around 3% of elephants get cancer, according to the team’s analysis of hundreds of captive-elephant deaths).
The researchers found that elephants produce extra copies of the p53 protein, and that elephant blood cells seem exquisitely sensitive to DNA damage from ionizing radiation. The animals' cells carry out a controlled self-destruction called apoptosis in response to DNA damage at much higher rates than do human cells. Schiffman suggests that, instead of repairing the DNA damage, compromised elephant cells have evolved to kill themselves to nip nascent tumours in the bud. “This is a brilliant solution to Peto’s paradox,” he says.
Mammoth set
Lynch’s team — working with African and Asian elephant skin cells from the San Diego Zoo in California — found similar results. They also discovered more than a dozen TP53 copies in two extinct species of mammoth, but just one copy in elephants’ close living relatives, manatees and hyraxes (small, furry mammals). Lynch thinks that the extra copies evolved as the lineage that led to elephants expanded in size. But he thinks that other biological mechanisms are involved too.
Mel Greaves, a cancer biologist at the Institute of Cancer Research in London, agrees that TP53 cannot be the only explanation. “As large animals get bigger, they become more and more sluggish,” he notes, thereby slowing their metabolism and the pace at which their cells divide. And protective mechanisms can only do so much to stop cancer, he adds. “What would happen if elephants smoked and had a bad diet?” he says. “Would they really be protected from cancer? I doubt it.”
Simon breathalysed himself before surgery. Johnny operated on one hour's sleep. As an increasing number of doctors feel the strain, we find out why the experts don't get help
It was the summer of 2012 when Simon, then a 37-year-old anaesthetist, found himself one morning drunk and sobbing in a London pub. Questions filled his head, foggy with booze: "How did it come to this? How did I throw it all away?"
A letter from the General Medical Council lay in his lap. He'd been convicted of drink-driving and was now suspended from being a doctor. Simon was an alcoholic, drinking as much as 30 units every day. Faced with the wreckage of his career, he was suicidal.
The year before, Simon (not his real name) had been breathalysing himself before he went to work at the hospital, terrified he'd kill somebody in theatre. Unable to cope with the stress of his double life (and because, paradoxically, he was a good doctor), he had resigned from his job before he could hurt a patient. He had given in to his addiction, been prosecuted for driving under the influence, been ordered before the GMC and had left the profession he loved and for which he had once had a natural talent.
Simon was brought to rock bottom by a combination of personal factors: the break-up of his marriage; his mother's cancer; geographical dislocation from his family; his own self-loathing and need to achieve; and a pattern of heavy drinking which had started at medical school in order to fit in and cope with stress. "I went to a grammar school and had always worked hard. I walked into medical school," he says. "But I was shy and I immediately saw that if I drank heavily, it could feel like I fitted in more." When his life derailed, he drank rather than ask for help. What characterised this period of his life was fear: fear of failing; fear of his drinking being found out; fear of losing his job and being stigmatised.
"There was this immense sense of loss," he remembers of that morning in London. "That it was all gone, and that I'd never get it back."
David Emson lives daily with the reality of loss. His wife, Daksha, a brilliant young London-based psychiatrist, suffered from bipolar affective disorder. Her fear of the stigma attached to mental health problems ended in tragedy. Known as one of the brightest young psychiatrists of her year and on course to be made a consultant, Daksha was terrified that, if her illness were discovered, it would cost her her job.
She was so secretive about her condition that her only treatment took the form of hurried consultations in hospital corridors. Most of the time, she was not treated at all. During a period in which she had stopped taking medication, after the birth of her child, her disorder took hold. Suffering from violent delusions and obsessed with evil spirits, she set herself and her three-month-old baby on fire. The baby, who also had multiple stab wounds, died immediately. Daksha, who was 34, died three weeks later in a burns unit.
In the inquiry that followed, the stigma of mental illness within the NHS was cited as a contributory factor. Also cited were inadequacies in both perinatal mental health services and NHS occupational health services – but a significant contributor was the fact that Daksha was both a doctor and a patient. She had managed to convince those around her, including the doctor who was treating her unofficially, that she was in control of her symptoms.
David Emson's cigarette habit hangs over his house in east London like a fog. You can see it on his teeth and fingers, too. He didn't smoke before Daksha died, but now he can't stop. He apologises in advance for his incoherent trains of thought, shaped by bitterness at a system that failed his wife, his own guilt at not spotting Daksha's descent into mania before it was too late and his untreated post-traumatic stress disorder. He tells me at times he still feels suicidal.
It has been just over 13 years since he came home from work as a radiologist and raced upstairs to find Daksha and Freya – his "button-nose" – ablaze. He was arrested as an initial murder suspect. His clothes were taken for forensic analysis and he was interviewed for hours. He had to make phone calls for help from the police station in his underpants. When he saw Freya in the mortuary, his impulse was to "climb in there with her". In his distress, he pulled chunks of his hair out and placed them beside his dead child.
That Emson still lives in the house that was the scene of such horror seems impossible to comprehend, but the house, he says, is a connection to them. He takes me into the sitting room and shows me a shrine to "his girls". There are candles and photographs, one showing Daksha holding her daughter close to her, beaming into the camera. When that photograph was taken, the mania had already taken hold, but nobody knew it.
Daksha Emson's case was a complicated one. After attempting suicide at medical school, she was diagnosed with bipolar disorder. She qualified as a psychiatrist – "She wanted to understand her illness," her husband says – and kept her mental history secret. For years she had been under the informal care of a consultant unconnected with her own training hospital in order to avoid the stigma of her disease affecting her career. And, for a time, it worked.
"One of her colleagues got 'found out' and Daksha was terrified," Emson says. "It was that fear that, if I'm found out now, I'm sacked, it's all gone. And for her, a committed doctor about to be made a consultant, it wasn't just her job that she feared being taken away, it was [the risk of] her whole life being swept away with it.
"When she came off her meds to get pregnant, I'd monitor her bloods and she made me send them to the lab under a different name. During this time, at least one other psychiatrist would come to see her for help and she would write private prescriptions for her for antidepressants, so nobody would find out. We discussed her illness often at home, what works, what doesn't, the early warning signs, everything was planned when she was trying to get pregnant."
After she gave birth, she stayed off the drugs to breastfeed the baby and became increasingly unwell, but successfully hid it from her husband and from the psychiatrist she saw occasionally.
"In the end," Emson recalls, "we were all relying on Daksha's insight. She was a victim of her illness and so was Freya and so was I, and so were other people. But she was also a victim of her wellness. She would say, 'Dave, I know my illness, I understand my illness…'" He pauses, then adds sadly, "What she wanted above all was the anonymity to be under the radar, to not stand out."
Between 10 and 20% of doctors become depressed at some point in their career, and they have a higher risk of suicide than the general population, according to research cited in the Journal of Mental Health in 2011. A survey sent round to members of the UK-based Doctors Support Network, a self-help group for doctors with mental health issues, found that 68% of the 116 doctors who took part had a diagnosis of depression; others reported diagnoses of bipolar disorder, anxiety, eating disorders and addictions.
Dr Clare Gerada, former president of the Royal College of General Practitioners, is clear that the number of doctors becoming affected by mental illness or addiction is a frontline issue that could have catastrophic consequences. NHS occupational health services have been drastically cut in recent years, which coincides with increased workloads and stress. "There are very many reasons why doctors are becoming ill," Gerada says. "For GPs, it's the pressure of the workload, the denigration of what they are trying to do. For others, it is the loss of team structure. If you are a paediatrician now, after you've told parents their child has died, you have no support. In my day, you'd have been supported in that role by a senior member of the team."
An atmosphere of fear and uncertainty pervades the NHS, adding to doctors' anxiety about being perceived as weak or unwell. Doctors do not find it easy to get the right help, even if it is available to them. Their problems are, Gerada says, deep-rooted, psychological and social, part of a stigma in the NHS attached to weakness, addiction or mental illness.
"First," she says, "there is a belief that doctors don't get ill, that they themselves see it as a sign of weakness. Then you have the fact that doctors are put on pedestals, that they wear a white coat and speak a different language. Then there is the worry that admitting depression or addiction will ruin their careers. Then you have their obsessive personality traits, a doctor's attention to detail and wanting to work especially hard – the very things that make them good doctors. Then there is the fact that doctors are frightened they are going to end up being treated by a colleague."
As well as 15 years as a clinical consultant, Dr Frances Burnett has been assessing and supervising doctors for the GMC for the last decade. "Doctors may not recognise that they are becoming ill," she says, "and even if they do, they may understate their symptoms in order to keep working. Seeking help early enough, before things get out of hand, is important, and is often difficult for doctors because of the practicalities of cancelling clinics and the numbers of patients who will be let down. For GPs, this is especially difficult as it creates a financial as well as a clinical burden on colleagues. I have assessed doctors who have been working in conditions of enormous stress but kept going because they are dedicated to the job, and this has led them to behave in strange or inappropriate ways – for example, shoplifting."
Johnny (not his real name), a 50-year-old consultant at the top of his profession, was recently diagnosed with bipolar II disorder. "I reckon I've had it for 20 years of my career," he says. Until relatively recently, he would often climb into his car in the middle of the night, wind down the roof and drive for two or three hours, smoking. He'd go home for an hour's nap, then go straight to work in the operating theatre. He had mood swings, which alienated his family, and non-specific anxiety, which made it impossible for him to sleep.
"The first or second time I saw a psychiatrist, I managed to pull the wool over their eyes – I'd done that through 10 years of therapy, too." In treatment, he fell back on his status as an experienced doctor. It was not a normal patient-doctor relationship. He convinced the other doctor that nothing was wrong. "I sometimes wonder, if I'd been a train driver, would I just have gone to the GP and got a prescription like other people?"
Johnny is terrified he will be found out. His level of paranoia and fear of being identified as "ill" is astonishing. It's as if he is concealing a crime.
"I haven't told anybody at work. Why? I don't know, partly I see it as a sign of weakness. We are supposed to be curing people. We are not supposed to be weak. I don't want people thinking, 'He's gone bonkers.' And in the back of my mind, if something goes wrong, if I make a mistake, I don't want people thinking…" He trails off.
"It's a political business, being a hospital consultant. You don't show anybody any weakness. I don't want people thinking I need help."
He won't tell me his specialism, except to say that it is one of the most stressful. He sees people die regularly. He can't ask for help at work if he feels unable to cope. "There is no help available. I lose a child, I lose a 20-year-old, and I go round the back of the hospital and have a fag and then it's straight back to work. There's no debrief. There is absolutely no pastoral support, no help for doctors with mental illness, no post-traumatic stress counselling."
He estimates he has had four days off sick in 17 years. Only once did he take himself home: "I knew I wasn't safe to be at work." He pretended to have norovirus: "I picked that because I knew they wouldn't want me in for four days if I'd had diarrhoea, but I thought, 'What am I going to do in the future? I can't always have norovirus.'"
There is a GP practice in London that is not what it seems. Patients go in through the front door, but operating out the back is a separate practice set up to give doctors confidential healthcare. Doctors such as Johnny and Simon (both of whom were treated here) sit with the rest of the patients, but are called to see different doctors. The scheme is called the Practitioner Health Programme (PHP) and was set up in 2008 as a two-year pilot by the government in response to the damning judgment of the inquiry into Daksha Emson's death.
PHP currently treats, confidentially, 500-600 doctors in the London area. "We are saving lives," says Gerada, its medical director. "We have masses and masses of letters from doctors telling us that. Doctors need a healthcare system of their own."
Almost all of the addicted doctors it sees – like Simon – are sober after six months, and 90% of those will continue to be so five years later. PHP has the financial power to pay for a doctor's rehab (the cost of treatment represents a sound investment, given that it costs £500,000 to train a doctor). In Simon's case, as with so many of the doctors who end up at PHP, he was in denial: "I was told that I was among the most ill they'd seen."
On 28 August 2012, within a week of his first PHP consultation, he was admitted to Clouds rehab in Wiltshire. He hasn't had a drink since. After a year of voluntary medical work – allowed within the parameters of his GMC suspension – Simon regained his licence to practise. At the beginning of this year, he began working as a doctor again, in an intensive care unit. PHP, he stresses, gave him his life back and supported him through the deeply stressful GMC hearings.
For doctors suffering from mental illness, PHP provides proper diagnosis, care and treatment. Johnny's medication, for example, makes life a lot easier, and help has allowed him to "get off the shifting ground", as he puts it, although he knows that he is by no means 100% well.
The key to the scheme's success is that the doctors it treats can self-refer and tell nobody; no one knows they need help. For those in trouble with the GMC, PHP offers support at hearings, detailed psychiatric reports and the right care. Max Henderson, one of three psychiatrists seeing patients at PHP, says: "We knew we had to create it in the back of a normal surgery because if we'd asked [the doctors] to attend a place with a small sign on the door, or anywhere, in fact, where people might see them and make some kind of link, then that would stop them coming."
With confidentiality established, Henderson was able to diagnose Johnny's illness quickly, partly because he knew the psychological denial he was up against. Even now, Johnny (who has finally managed to tell his wife about his diagnosis) insists that Henderson issues his prescriptions, rather than his own GP, who is not remotely connected with his work. "I don't want my GP knowing," Johnny says. "That was a clear condition I had."
"By the time I see them, the doctors have developed unhelpful coping strategies that have to be unpicked," Henderson explains. "But the key issue is creating an environment where they allow you to be the doctor and them to be the patient. Many of the people who end up with us have seen other doctors who, however well-meaning, have said things like, 'What do you think? What treatment would you like?' I have to work hard to treat these doctor-patients like everybody else." But Henderson also provides some sobering context: "The surprise for me is that a lot of doctors I see become mentally ill not because of the clinical work they do, but because of the way they are managed in whatever health service they work in. Since 2009, the NHS has been characterised by fear and uncertainty."
However game-changing PHP is for the rising number of doctors it treats, it is available only to those in the London area (although discussions are taking place about a Dublin launch). Where doctors outside London go for help is a big question.
"Some doctors seek help privately or out of their local area," Burnett says, "and while this can work well, there is a chance that they actually get much worse care than a typical patient, as attempts are made to keep their health problems secret. This is a particular problem for doctors with serious mental health issues who may not have access to the normal range of interventions available to most patients."
In addition to geography, another concern of doctors treating doctors (aside from the worry that funding for PHP may not continue) is that a growing number of the doctors in trouble – either through depression or addiction – are young.
Henderson says: "We are seeing sharp year-on-year rises in the number of young doctors, junior doctors who have another 30 years of their career ahead of them. They look at their consultants and they panic at the way their lives look. We'll get these doctors better, but where are they going to be when they are in their 60s? The worry is the next generation. Basically, the training regimes now have increased demands and reduced levels of support."
Research into patients attending MedNet, a confidential consultation service for doctors and dentists in the London area, backs this up. The largest age group using the advice and support service is those between 30 and 39 years old.
Dr Michael Wilks, ex-chairman of the Sick Doctors Trust, which offers a confidential telephone support line for doctors worried by their own behaviour but not yet in trouble with the GMC, says that its evidence also suggests the age of doctors in need of help is getting younger. The helpline takes increasingly frequent calls about drug use in younger doctors, be it cocaine or prescription drugs. (It's important to remember that these calls are just from those doctors who are worried and brave enough to pick up the phone.) One doctor who contacted the helpline anonymously had taken so many Nurofen Plus that he had a gastric bleed.
"And the things with [drugs]," says Wilks, himself a recovering alcoholic, "is that you don't get away with it for so long. You get addicted faster and in the case of writing out a false prescription for yourself, there will be a police investigation and a GMC suspension. We are actively going into medical schools these days to tell the students of the stresses they face, so that they can be aware as soon as it begins to happen to them."
The GMC itself has a dilemma: how does it hold the line between protecting the public and dealing compassionately with increasing numbers of struggling doctors brought before its panels with health-related "offences"? A series of new procedures is under discussion, intended to help doctors in trouble. (How well it is succeeding is a moot point; specialists complain about the protracted back-to-work procedures required.)
With its lengthy process of tribunal hearings and the power to bar doctors from practising, the GMC is often cast as demeaning and punitive for an addicted or ill doctor. But it can, Burnett stresses, do some good: "At least one excellent doctor whom I have treated for depression considered giving up medicine altogether rather than risk facing the GMC… I remember a doctor who had used alcohol to cope with a very stressful job and a rota that made it difficult for him to get regular sleep, who was referred after a driving offence. Latterly, [he] recognised that without the involvement of the GMC and the structure that was imposed on him to manage his own health, he would probably have become an alcoholic."
Some medical schools – understanding the threat of drink and drugs – are beginning to introduce "fitness to practise" hearings. These hearings are intended to keep student behaviour in check. At best, they will nip a bad habit in the bud, but at worst, as Henderson sees it, they "introduce an early punitive threat to medical students who are still in the process of growing up".
In David Emson's huge file of material gathered over the years following Daksha's death, there is correspondence he received from a female relative of a senior doctor. The doctor, whom the woman says she "worried about continually", was buying his antidepressants in secret on the internet. "He struggles with depression but says he dare not seek help because he might lose his job or at the very least be less well-regarded… The trouble is, some men, particularly doctors, are especially sceptical about anonymity as it means so much to them and they believe that it will be broken."
Emson hands me the letters as if to ask: how many more of them are out there?
Johnny understands this. He is paranoid about his own confidential condition, but he has noticed since taking his medication that his junior doctors have begun to confide in him about their weaknesses and anxieties. He suspects it is because his manner has changed and softened – and this is good. Yet fear prevents him from helping them: "One day, I'd love to be able to tell them about me, so they feel they are supported. But that moment has not yet come."
The medical establishment is gradually acknowledging these hard facts, as indicated by a recent article in the Journal of the American Medical Association: "Overdiagnosis and Overtreatment in Cancer: An Opportunity for Improvement." The article was written by a working group formed by the National Cancer Institute last year "to develop a strategy to improve the current approach to cancer screening and prevention."
The three authors state: "Over the past 30 years, awareness and screening have led to an emphasis on early diagnosis of cancer. Although the goals of these efforts were to reduce the rate of late-stage disease and decrease cancer mortality, secular trends and clinical trials suggest that these goals have not been met; national data demonstrate significant increases in early-stage disease, without a proportional decline in later-stage disease."
In other words, increased screening has led to increased diagnosis of cancer but has not significantly decreased mortality. The problem with screening, the authors note, is that "cancers are heterogeneous and can follow multiple paths, not all of which progress to metastases and death, and include indolent disease that causes no harm during the patient’s lifetime."
That is, screening often detects growths that represent no significant threat and yet are nonetheless often treated with surgery, chemotherapy and radiation, all of which degrade health. The authors state: "Physicians, patients, and the general public must recognize that overdiagnosis is common and occurs more frequently with cancer screening." [Italics in original.]
"Policies that prevent or reduce the chance of overdiagnosis and avoid overtreatment are needed," the authors assert, "while maintaining those gains by which early detection is a major contributor to decreasing mortality and locally advanced disease."
One policy change that the authors recommend would be to avoid using the term "cancer" to describe tumors or other abnormalities that are not life-threatening. When patients hear the word "cancer," they often demand further tests and treatment, even when medically unjustified, and physicians are too often eager to comply.
The JAMA article, if anything, downplays the problems with cancer testing. For example, the authors state that "colon and cervical cancer are examples of effective screening programs in which early detection and removal of precancerous lesions have reduced incidence as well as late-stage disease." As I stated in a column last year, "Why I Won't Get a Colonoscopy," the value of colonoscopies has not been clearly demonstrated.
In that same column, I quoted Gilbert Welch, a professor of medicine at the Dartmouth Institute for Health Policy and Clinical Practice, writing in The New York Times that screening healthy people leads to "needless appointments, needless tests, needless drugs and needless operations (not to mention all the accompanying needless insurance forms)."
Welch, author of the excellent book Overdiagnosed: Making People Sick in the Pursuit of Health (Beacon Press, 2011), added, "This process doesn’t promote health; it promotes disease. People suffer from more anxiety about their health, from drug side effects, from complications of surgery. A few die. And remember: these people felt fine when they entered the health care system."
Welch and a colleague estimate in The New England Journal of Medicine that 70,000 American women were overdiagnosed with breast cancer in 2008. As I have reported previously, men who take a prostate-specific antigen test and receive a cancer diagnosis have been estimated to be 47 times more likely to get unnecessary, harmful treatments—biopsies, surgery, radiation, chemotherapy—than they are to have their lives extended.
The Affordable Care Act represented a reasonable step toward reforming over-priced, under-performing American medicine. But true reform will require ending the epidemic of overtesting and overtreatment, which is bankrupting us without improving our health.