Megan Danielczak couldn’t stand living with her husband, but couldn’t afford to live without him. So she came up with a plan that was boilerplate noir: Hire a killer to murder him, and collect the life-insurance payout. She met the hit man in a Walmart parking lot on Valentine’s Day last year, and gave him a down payment of three gold rings and $402 in cash, and a promise of another $4,500 on the back end. Fortunately for Danielczak’s husband, if unfortunately for her, the hit man was an undercover cop. She is now serving two years in a Wisconsin prison, having been convicted of solicitation to commit first-degree intentional homicide.
Stories of unconsummated contract killings make headlines on a regular basis. Sometimes the motive is shockingly impersonal: Last year, a Houston man allegedly took out a $2,000 contract on the police officer who had been slapping his business’s vehicles with tickets. More often, the crime can be traced to an intimate but fractured relationship. In February, federal authorities charged an Indiana man and his girlfriend with murder for hire, after the two allegedly solicited a hit on the man’s ex-wife following a child-custody battle. The couple agreed to a fee of $5,000 to $10,000, “depending on the job’s complexity.” As in the Danielczak case, both the Houston and Indiana plots were foiled by undercover law-enforcement officers.
Criminologists have a name for a person who hires a hit man: instigator. They also confirm what news stories suggest: Lots of instigators get caught because they don’t know what they’re doing. After all, most of us don’t socialize with professional killers. The average person therefore looks to acquaintances or neighbors for referrals, or finds his way to criminal bottom-feeders who are likely to be inept and inexperienced. The former may be inclined to call law enforcement, while the latter may lose their nerve or botch the job. Which helps explain why so many murders for hire don’t produce any dead bodies.
In 2003, the Australian Institute of Criminology published an analysis of 163 contract-killing cases (some completed, others merely attempted) in Australia; it remains one of the most significant studies ever conducted of the subject. The authors determined that 2 percent of all murders in Australia were contract killings and that contracts were, in some cases, surprisingly affordable. One unfulfilled contract was for 500 Australian dollars; another job was completed for just $2,000. Among other key findings, nearly 20 percent of all contracts involved a romantic relationship gone wrong, and 16 percent were financially motivated.
Another study, this one of contract killings in Tennessee, found instigators pretty evenly split between men and women. This is notable, given that almost all conventional murders are committed by men. But it tracks with the fact that women are almost as likely as men to wish someone dead. In The Murderer Next Door: Why the Mind Is Designed to Kill, David M. Buss, an evolutionary psychologist, reports that “91 percent of men and 84 percent of women have had at least one vivid fantasy about killing someone.”
What of the people who are hired to kill? Reid Meloy, a forensic psychologist who has consulted on a dozen murder-for-hire cases, told me that virtually all of the contract killers he’s examined display moderate to severe psychopathy. “Psychopathy, as a constellation of personality traits, gives them both the aggression and the emotional detachment to be able to carry out an act like this for money,” he says. Other experts I spoke with believe that both parties to a contract killing are engaged in psychological distancing. The contractor comforts himself by saying, This is my job. I’m just following orders. The instigator thinks, I’m not a murderer—he’s the one pulling the trigger.
Park Dietz, a forensic psychiatrist who has testified in court cases of criminals ranging from serial killers (Jeffrey Dahmer) to deranged assassins (John Hinckley Jr.), has another theory as to why homicidal people hire help. “My prime suspect is the depiction of hit men in popular culture, such as films, TV, video games, and novels,” Dietz told me, noting that the last time he entered hit man into Netflix, hundreds of results appeared. According to Dietz, such entertainment gives “the illusion that this is a service available to anyone.” In a world where dangerous or unpleasant tasks are routinely outsourced, a viewer might think, Well, why not this too?
This article appears in the July 2019 print edition with the headline “Hired Guns.”
WordPress 5.6 is set to add a UI that allows users to opt into auto-updates for major versions of core. Previously, developers could turn these updates on by setting the WP_AUTO_UPDATE_CORE constant to true or by using the allow_major_auto_core_updates filter. Version 5.6 exposes this setting in the UI to make it more accessible for users.
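For reference, both pre-5.6 mechanisms remain supported. A minimal sketch of the two approaches (the constant belongs in wp-config.php, the filter in a plugin or mu-plugin; this is a configuration fragment, not a standalone script):

```php
<?php
// In wp-config.php: opt this site into all core auto-updates,
// including major releases.
define( 'WP_AUTO_UPDATE_CORE', true );

// Or, in a plugin / mu-plugin: enable major core auto-updates
// via the filter WordPress checks before applying a major update.
add_filter( 'allow_major_auto_core_updates', '__return_true' );
```

Either mechanism takes precedence over whatever a user later selects in the 5.6 UI, which is part of what the discussion below is about.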
Jb Audras posted a dev note on the feature yesterday with instructions for how developers can extend it to add more options.
A previous version of this UI specified that the setting refers to major versions:
Keep my site up-to-date with regular feature updates (major versions).
This was changed 11 days ago to remove the wording that tells users which versions the setting controls.
“The idea was to make the wording more general, and maybe easier to understand,” Jb Audras said. “As minor updates are already automatically updated (since 3.7), new users may not understand what is behind the ‘major versions’ term.”
The new wording makes the setting less clear, not more. Users may not understand what “major versions” are, but “feature updates” is even vaguer. Does it include updates to existing features, or only the introduction of brand-new features? A better option might be to link “major versions” to documentation on HelpHub.
Audras said he is open to having the wording changed but that so far those testing the beta don’t seem to have a problem with it. String freeze is scheduled for November 10, and after that no more wording updates can be committed.
Contributors are also discussing adding a filter that would allow developers to hide the auto-updates UI for major versions. Mike Schroder noted that this would be especially useful for hosting companies that are handling updates in a different way. Some developers or agencies may want to use the filter to prevent their clients from turning auto-updates on for major versions.
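No such filter has landed yet, and its name is still being debated on Trac. Purely as an illustration of how a host or agency might use one if it ships (the filter name below is hypothetical, not a real WordPress API):

```php
<?php
// Hypothetical sketch only: the real filter name is still under
// discussion on the Trac ticket. A host managing core updates
// itself might hide the major-version auto-updates checkbox:
add_filter( 'hypothetical_core_auto_updates_settings_ui', '__return_false' );
```

As the next paragraphs note, core committers are debating whether hiding the UI outright is preferable to disabling it with an explanation.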
Core Committer Jonathan Desrosiers said he is not in favor of using a filter to hide the UI, since the page it appears on is only accessible to users who already have the ability to update core:
If that change is made (disabling the form when the constant is defined or allow_major_auto_core_updates filter is used), then I am not sure the UI should ever be hidden. As raised by @aaroncampbell in today’s weekly meeting, the update page is only accessible to those with the update_core capability (trusted users). While there may be valid use cases for wholesale hiding the new feature, I haven’t seen one yet. To me, disabling the form and explaining why the form cannot be used to update the desired behavior is more valuable to the site owner, as they would be better equipped to make an adjustment.
If you want to contribute to the conversation, check out the dev note on the new auto-updates interface for major versions and the Trac ticket for a filter that would hide the UI.
“It’s not true that no one needs you anymore.” These words came from an elderly woman sitting behind me on a late-night flight from Los Angeles to Washington, D.C. The plane was dark and quiet. A man I assumed to be her husband murmured almost inaudibly in response, something to the effect of “I wish I was dead.”
I didn’t mean to eavesdrop, but couldn’t help it. I listened with morbid fascination, forming an image of the man in my head as they talked. I imagined someone who had worked hard all his life in relative obscurity, someone with unfulfilled dreams—perhaps of the degree he never attained, the career he never pursued, the company he never started.
At the end of the flight, as the lights switched on, I finally got a look at the desolate man. I was shocked. I recognized him—he was, and still is, world-famous. Then in his mid‑80s, he was beloved as a hero for his courage, patriotism, and accomplishments many decades ago.
As he walked up the aisle of the plane behind me, other passengers greeted him with veneration. Standing at the door of the cockpit, the pilot stopped him and said, “Sir, I have admired you since I was a little boy.” The older man—apparently wishing for death just a few minutes earlier—beamed with pride at the recognition of his past glories.
For selfish reasons, I couldn’t get the cognitive dissonance of that scene out of my mind. It was the summer of 2015, shortly after my 51st birthday. I was not world-famous like the man on the plane, but my professional life was going very well. I was the president of a flourishing Washington think tank, the American Enterprise Institute. I had written some best-selling books. People came to my speeches. My columns were published in The New York Times.
But I had started to wonder: Can I really keep this going? I work like a maniac. But even if I stayed at it 12 hours a day, seven days a week, at some point my career would slow and stop. And when it did, what then? Would I one day be looking back wistfully and wishing I were dead? Was there anything I could do, starting now, to give myself a shot at avoiding misery—and maybe even achieve happiness—when the music inevitably stops?
Though these questions were personal, I decided to approach them as the social scientist I am, treating them as a research project. It felt unnatural—like a surgeon taking out his own appendix. But I plunged ahead, and for the past four years, I have been on a quest to figure out how to turn my eventual professional decline from a matter of dread into an opportunity for progress.
Here’s what I’ve found.
The field of “happiness studies” has boomed over the past two decades, and a consensus has developed about well-being as we advance through life. In The Happiness Curve: Why Life Gets Better After 50, Jonathan Rauch, a Brookings Institution scholar and an Atlantic contributing editor, reviews the strong evidence suggesting that the happiness of most adults declines through their 30s and 40s, then bottoms out in their early 50s. Nothing about this pattern is set in stone, of course. But the data seem eerily consistent with my experience: My 40s and early 50s were not an especially happy period of my life, notwithstanding my professional fortunes.
So what can people expect after that, based on the data? The news is mixed. Almost all studies of happiness over the life span show that, in wealthier countries, most people’s contentment starts to increase again in their 50s, until age 70 or so. That is where things get less predictable, however. After 70, some people stay steady in happiness; others get happier until death. Others—men in particular—see their happiness plummet. Indeed, depression and suicide rates for men increase after age 75.
This last group would seem to include the hero on the plane. A few researchers have looked at this cohort to understand what drives their unhappiness. It is, in a word, irrelevance. In 2007, a team of academic researchers at UCLA and Princeton analyzed data on more than 1,000 older adults. Their findings, published in the Journal of Gerontology, showed that senior citizens who rarely or never “felt useful” were nearly three times as likely as those who frequently felt useful to develop a mild disability, and were more than three times as likely to have died during the course of the study.
One might think that gifted and accomplished people, such as the man on the plane, would be less susceptible than others to this sense of irrelevance; after all, accomplishment is a well-documented source of happiness. If current accomplishment brings happiness, then shouldn’t the memory of that accomplishment provide some happiness as well?
Maybe not. Though the literature on this question is sparse, giftedness and achievements early in life do not appear to provide an insurance policy against suffering later on. In 1999, Carole Holahan and Charles Holahan, psychologists at the University of Texas, published an influential paper in The International Journal of Aging and Human Development that looked at hundreds of older adults who early in life had been identified as highly gifted. The Holahans’ conclusion: “Learning at a younger age of membership in a study of intellectual giftedness was related to … less favorable psychological well-being at age eighty.”
This study may simply be showing that it’s hard to live up to high expectations, and that telling your kid she is a genius is not necessarily good parenting. (The Holahans surmise that the children identified as gifted might have made intellectual ability more central to their self-appraisal, creating “unrealistic expectations for success” and causing them to fail to “take into account the many other life influences on success and recognition.”) However, abundant evidence suggests that the waning of ability in people of high accomplishment is especially brutal psychologically. Consider professional athletes, many of whom struggle profoundly after their sports career ends. Tragic examples abound, involving depression, addiction, or suicide; unhappiness in retired athletes may even be the norm, at least temporarily. A study published in the Journal of Applied Sport Psychology in 2003, which charted the life satisfaction of former Olympic athletes, found that they generally struggled with a low sense of personal control when they first stopped competing.
Recently, I asked Dominique Dawes, a former Olympic gold-medal gymnast, how normal life felt after competing and winning at the highest levels. She told me that she is happy, but that the adjustment wasn’t easy—and still isn’t, even though she won her last Olympic medal in 2000. “My Olympic self would ruin my marriage and leave my kids feeling inadequate,” she told me, because it is so demanding and hard-driving. “Living life as if every day is an Olympics only makes those around me miserable.”
Why might former elite performers have such a hard time? No academic research has yet proved this, but I strongly suspect that the memory of remarkable ability, if that is the source of one’s self-worth, might, for some, provide an invidious contrast to a later, less remarkable life. “Unhappy is he who depends on success to be happy,” Alex Dias Ribeiro, a former Formula 1 race-car driver, once wrote. “For such a person, the end of a successful career is the end of the line. His destiny is to die of bitterness or to search for more success in other careers and to go on living from success to success until he falls dead. In this case, there will not be life after success.”
Call it the Principle of Psychoprofessional Gravitation: the idea that the agony of professional oblivion is directly related to the height of professional prestige previously achieved, and to one’s emotional attachment to that prestige. Problems related to achieving professional success might appear to be a pretty good species of problem to have; even raising this issue risks seeming precious. But if you reach professional heights and are deeply invested in being high up, you can suffer mightily when you inevitably fall. That’s the man on the plane. Maybe that will be you, too. And, without significant intervention, I suspect it will be me.
The Principle of Psychoprofessional Gravitation can help explain the many cases of people who have done work of world-historical significance yet wind up feeling like failures. Take Charles Darwin, who was just 22 when he set out on his five-year voyage aboard the Beagle in 1831. Returning at 27, he was celebrated throughout Europe for his discoveries in botany and zoology, and for his early theories of evolution. Over the next 30 years, Darwin took enormous pride in sitting atop the celebrity-scientist pecking order, developing his theories and publishing them as books and essays—the most famous being On the Origin of Species, in 1859.
But as Darwin progressed into his 50s, he stagnated; he hit a wall in his research. At the same time an Austrian monk by the name of Gregor Mendel discovered what Darwin needed to continue his work: the theory of genetic inheritance. Unfortunately, Mendel’s work was published in an obscure academic journal and Darwin never saw it—and in any case, Darwin did not have the mathematical ability to understand it. From then on he made little progress. Depressed in his later years, he wrote to a close friend, “I have not the heart or strength at my age to begin any investigation lasting years, which is the only thing which I enjoy.”
Presumably, Darwin would be pleasantly surprised to learn how his fame grew after his death, in 1882. From what he could see when he was old, however, the world had passed him by, and he had become irrelevant. That could have been Darwin on the plane behind me that night.
It also could have been a younger version of me, because I have had precocious experience with professional decline.
As a child, I had just one goal: to be the world’s greatest French-horn player. I worked at it slavishly, practicing hours a day, seeking out the best teachers, and playing in any ensemble I could find. I had pictures of famous horn players on my bedroom wall for inspiration. And for a while, I thought my dream might come true. At 19, I left college to take a job playing professionally in a touring chamber-music ensemble. My plan was to keep rising through the classical-music ranks, joining a top symphony orchestra in a few years or maybe even becoming a soloist—the most exalted job a classical musician can hold.
But then, in my early 20s, a strange thing happened: I started getting worse. To this day, I have no idea why. My technique began to suffer, and I had no explanation for it. Nothing helped. I visited great teachers and practiced more, but I couldn’t get back to where I had been. Pieces that had been easy to play became hard; pieces that had been hard became impossible.
Perhaps the worst moment in my young but flailing career came at age 22, when I was performing at Carnegie Hall. While delivering a short speech about the music I was about to play, I stepped forward, lost my footing, and fell off the stage into the audience. On the way home from the concert, I mused darkly that the experience was surely a message from God.
But I sputtered along for nine more years. I took a position in the City Orchestra of Barcelona, where I increased my practicing but my playing gradually deteriorated. Eventually I found a job teaching at a small music conservatory in Florida, hoping for a magical turnaround that never materialized. Realizing that maybe I ought to hedge my bets, I went back to college via distance learning, and earned my bachelor’s degree shortly before my 30th birthday. I secretly continued my studies at night, earning a master’s degree in economics a year later. Finally I had to admit defeat: I was never going to turn around my faltering musical career. So at 31 I gave up, abandoning my musical aspirations entirely, to pursue a doctorate in public policy.
Life goes on, right? Sort of. After finishing my studies, I became a university professor, a job I enjoyed. But I still thought every day about my beloved first vocation. Even now, I regularly dream that I am onstage, and wake to remember that my childhood aspirations are now only phantasms.
I am lucky to have accepted my decline at a young enough age that I could redirect my life into a new line of work. Still, to this day, the sting of that early decline makes these words difficult to write. I vowed to myself that it wouldn’t ever happen again.
Will it happen again? In some professions, early decline is inescapable. No one expects an Olympic athlete to remain competitive until age 60. But in many physically nondemanding occupations, we implicitly reject the inevitability of decline before very old age. Sure, our quads and hamstrings may weaken a little as we age. But as long as we retain our marbles, our quality of work as a writer, lawyer, executive, or entrepreneur should remain high up to the very end, right? Many people think so. I recently met a man a bit older than I am who told me he planned to “push it until the wheels came off.” In effect, he planned to stay at the very top of his game by any means necessary, and then keel over.
But the odds are he won’t be able to. The data are shockingly clear that for most people, in most fields, decline starts earlier than almost anyone thinks.
According to research by Dean Keith Simonton, a professor emeritus of psychology at UC Davis and one of the world’s leading experts on the trajectories of creative careers, success and productivity increase for the first 20 years after the inception of a career, on average. So if you start a career in earnest at 30, expect to do your best work around 50 and go into decline soon after that.
The specific timing of peak and decline varies somewhat depending on the field. Benjamin Jones, a professor of strategy and entrepreneurship at Northwestern University’s Kellogg School of Management, has spent years studying when people are most likely to make prizewinning scientific discoveries and develop key inventions. His findings can be summarized by this little ditty:
Age is, of course, a fever chill
that every physicist must fear.
He’s better dead than living still
when once he’s past his thirtieth year.
The author of those lines was Paul Dirac, a winner of the Nobel Prize in Physics. Dirac overstates the point, but only a little. Looking at major inventors and Nobel winners going back more than a century, Jones has found that the most common age for producing a magnum opus is the late 30s. He has shown that the likelihood of a major discovery increases steadily through one’s 20s and 30s and then declines through one’s 40s, 50s, and 60s. Are there outliers? Of course. But the likelihood of producing a major innovation at age 70 is approximately what it was at age 20—almost nonexistent.
Much of literary achievement follows a similar pattern. Simonton has shown that poets peak in their early 40s. Novelists generally take a little longer. When Martin Hill Ortiz, a poet and novelist, collected data on New York Times fiction best sellers from 1960 to 2015, he found that authors were likeliest to reach the No. 1 spot in their 40s and 50s. Despite the famous productivity of a few novelists well into old age, Ortiz shows a steep drop-off in the chance of writing a best seller after the age of 70. (Some nonfiction writers—especially historians—peak later, as we shall see in a minute.)
This research concerns people at the very top of professions that are atypical. But the basic finding appears to apply more broadly. Scholars at Boston College’s Center for Retirement Research studied a wide variety of jobs and found considerable susceptibility to age-related decline in fields ranging from policing to nursing. Other research has found that the best-performing home-plate umpires in Major League Baseball have 18 years less experience and are 23 years younger than the worst-performing umpires (who are 56.1 years old, on average). Among air traffic controllers, the age-related decline is so sharp—and the potential consequences of decline-related errors so dire—that the mandatory retirement age is 56.
In sum, if your profession requires mental processing speed or significant analytic capabilities—the kind of profession most college graduates occupy—noticeable decline is probably going to set in earlier than you imagine.
If decline not only is inevitable but also happens earlier than most of us expect, what should we do when it comes for us?
Whole sections of bookstores are dedicated to becoming successful. The shelves are packed with titles like The Science of Getting Rich and The 7 Habits of Highly Effective People. There is no section marked “Managing Your Professional Decline.”
But some people have managed their declines well. Consider the case of Johann Sebastian Bach. Born in 1685 to a long line of prominent musicians in central Germany, Bach quickly distinguished himself as a musical genius. In his 65 years, he published more than 1,000 compositions for all the available instrumentations of his day.
Early in his career, Bach was considered an astoundingly gifted organist and improviser. Commissions rolled in; royalty sought him out; young composers emulated his style. He enjoyed real prestige.
But it didn’t last—in no small part because his career was overtaken by musical trends ushered in by, among others, his own son, Carl Philipp Emanuel, known as C.P.E. to the generations that followed. The fifth of Bach’s 20 children, C.P.E. exhibited the musical gifts his father had. He mastered the baroque idiom, but he was more fascinated with a new “classical” style of music, which was taking Europe by storm. As classical music displaced baroque, C.P.E.’s prestige boomed while his father’s music became passé.
Bach easily could have become embittered, like Darwin. Instead, he chose to redesign his life, moving from innovator to instructor. He spent a good deal of his last 10 years writing The Art of Fugue, not a famous or popular work in his time, but one intended to teach the techniques of the baroque to his children and students—and, as unlikely as it seemed at the time, to any future generations that might be interested. In his later years, he lived a quieter life as a teacher and a family man.
What’s the difference between Bach and Darwin? Both were preternaturally gifted and widely known early in life. Both attained permanent fame posthumously. Where they differed was in their approach to the midlife fade. When Darwin fell behind as an innovator, he became despondent and depressed; his life ended in sad inactivity. When Bach fell behind, he reinvented himself as a master instructor. He died beloved, fulfilled, and—though less famous than he once had been—respected.
The lesson for you and me, especially after 50: Be Johann Sebastian Bach, not Charles Darwin.
How does one do that?
A potential answer lies in the work of the British psychologist Raymond Cattell, who in the early 1940s introduced the concepts of fluid and crystallized intelligence. Cattell defined fluid intelligence as the ability to reason, analyze, and solve novel problems—what we commonly think of as raw intellectual horsepower. Innovators typically have an abundance of fluid intelligence. It is highest relatively early in adulthood and diminishes starting in one’s 30s and 40s. This is why tech entrepreneurs, for instance, do so well so early, and why older people have a much harder time innovating.
Crystallized intelligence, in contrast, is the ability to use knowledge gained in the past. Think of it as possessing a vast library and understanding how to use it. It is the essence of wisdom. Because crystallized intelligence relies on an accumulating stock of knowledge, it tends to increase through one’s 40s, and does not diminish until very late in life.
Careers that rely primarily on fluid intelligence tend to peak early, while those that use more crystallized intelligence peak later. For example, Dean Keith Simonton has found that poets—highly fluid in their creativity—tend to have produced half their lifetime creative output by age 40 or so. Historians—who rely on a crystallized stock of knowledge—don’t reach this milestone until about 60.
Here’s a practical lesson we can extract from all this: No matter what mix of intelligence your field requires, you can always endeavor to weight your career away from innovation and toward the strengths that persist, or even increase, later in life.
Like what? As Bach demonstrated, teaching is an ability that decays very late in life, a principal exception to the general pattern of professional decline over time. A study in The Journal of Higher Education showed that the oldest college professors in disciplines requiring a large store of fixed knowledge, specifically the humanities, tended to get evaluated most positively by students. This probably explains the professional longevity of college professors, three-quarters of whom plan to retire after age 65—more than half of them after 70, and some 15 percent of them after 80. (The average American retires at 61.) One day, during my first year as a professor, I asked a colleague in his late 60s whether he’d ever considered retiring. He laughed, and told me he was more likely to leave his office horizontally than vertically.
Our dean might have chuckled ruefully at this—college administrators complain that research productivity among tenured faculty drops off significantly in the last decades of their career. Older professors take up budget slots that could otherwise be used to hire young scholars hungry to do cutting-edge research. But perhaps therein lies an opportunity: If older faculty members can shift the balance of their work from research to teaching without loss of professional prestige, younger faculty members can take on more research.
Patterns like this match what I’ve seen as the head of a think tank full of scholars of all ages. There are many exceptions, but the most profound insights tend to come from those in their 30s and early 40s. The best synthesizers and explainers of complicated ideas—that is, the best teachers—tend to be in their mid-60s or older, some of them well into their 80s.
That older people, with their stores of wisdom, should be the most successful teachers seems almost cosmically right. No matter what our profession, as we age we can dedicate ourselves to sharing knowledge in some meaningful way.
A few years ago, I saw a cartoon of a man on his deathbed saying, “I wish I’d bought more crap.” It has always amazed me that many wealthy people keep working to increase their wealth, amassing far more money than they could possibly spend or even usefully bequeath. One day I asked a wealthy friend why this is so. Many people who have gotten rich know how to measure their self-worth only in pecuniary terms, he explained, so they stay on the hamster wheel, year after year. They believe that at some point, they will finally accumulate enough to feel truly successful, happy, and therefore ready to die.
This is a mistake, and not a benign one. Most Eastern philosophy warns that focusing on acquisition leads to attachment and vanity, which derail the search for happiness by obscuring one’s essential nature. As we grow older, we shouldn’t acquire more, but rather strip things away to find our true selves—and thus, peace.
At some point, writing one more book will not add to my life satisfaction; it will merely stave off the end of my book-writing career. The canvas of my life will have another brushstroke that, if I am being forthright, others will barely notice, and will certainly not appreciate very much. The same will be true for most other markers of my success.
What I need to do, in effect, is stop seeing my life as a canvas to fill, and start seeing it more as a block of marble to chip away at and shape something out of. I need a reverse bucket list. My goal for each year of the rest of my life should be to throw out things, obligations, and relationships until I can clearly see my refined self in its best form.
And that self is … who, exactly?
Last year, the search for an answer to this question took me deep into the South Indian countryside, to a town called Palakkad, near the border between the states of Kerala and Tamil Nadu. I was there to meet the guru Sri Nochur Venkataraman, known as Acharya (“Teacher”) to his disciples. Acharya is a quiet, humble man dedicated to helping people attain enlightenment; he has no interest in Western techies looking for fresh start-up ideas or burnouts trying to escape the religious traditions they were raised in. Satisfied that I was neither of those things, he agreed to talk with me.
I told him my conundrum: Many people of achievement suffer as they age, because they lose their abilities, gained over many years of hard work. Is this suffering inescapable, like a cosmic joke on the proud? Or is there a loophole somewhere—a way around the suffering?
Acharya answered elliptically, explaining an ancient Hindu teaching about the stages of life, or ashramas. The first is Brahmacharya, the period of youth and young adulthood dedicated to learning. The second is Grihastha, when a person builds a career, accumulates wealth, and creates a family. In this second stage, the philosophers find one of life’s most common traps: People become attached to earthly rewards—money, power, sex, prestige—and thus try to make this stage last a lifetime.
The antidote to these worldly temptations is Vanaprastha, the third ashrama, whose name comes from two Sanskrit words meaning “retiring” and “into the forest.” This is the stage, usually starting around age 50, in which we purposefully focus less on professional ambition, and become more and more devoted to spirituality, service, and wisdom. This doesn’t mean that you need to stop working when you turn 50—something few people can afford to do—only that your life goals should adjust.
Vanaprastha is a time for study and training for the last stage of life, Sannyasa, which should be totally dedicated to the fruits of enlightenment. In times past, some Hindu men would leave their family in old age, take holy vows, and spend the rest of their life at the feet of masters, praying and studying. Even if sitting in a cave at age 75 isn’t your ambition, the point should still be clear: As we age, we should resist the conventional lures of success in order to focus on more transcendentally important things.
I told Acharya the story about the man on the plane. He listened carefully, and thought for a minute. “He failed to leave Grihastha,” he told me. “He was addicted to the rewards of the world.” He explained that the man’s self-worth was probably still anchored in the memories of professional successes many years earlier, his ongoing recognition purely derivative of long-lost skills. Any glory today was a mere shadow of past glories. Meanwhile, he’d completely skipped the spiritual development of Vanaprastha, and was now missing out on the bliss of Sannyasa.
There is a message in this for those of us suffering from the Principle of Psychoprofessional Gravitation. Say you are a hard-charging, type-A lawyer, executive, entrepreneur, or—hypothetically, of course—president of a think tank. From early adulthood to middle age, your foot is on the gas, professionally. Living by your wits—by your fluid intelligence—you seek the material rewards of success, you attain a lot of them, and you are deeply attached to them. But the wisdom of Hindu philosophy—and indeed the wisdom of many philosophical traditions—suggests that you should be prepared to walk away from these rewards before you feel ready. Even if you’re at the height of your professional prestige, you probably need to scale back your career ambitions in order to scale up your metaphysical ones.
When the New York Times columnist David Brooks talks about the difference between “résumé virtues” and “eulogy virtues,” he’s effectively putting the ashramas in a practical context. Résumé virtues are professional and oriented toward earthly success. They require comparison with others. Eulogy virtues are ethical and spiritual, and require no comparison. Your eulogy virtues are what you would want people to talk about at your funeral. As in He was kind and deeply spiritual, not He made senior vice president at an astonishingly young age and had a lot of frequent-flier miles.
You won’t be around to hear the eulogy, but the point Brooks makes is that we live the most fulfilling life—especially once we reach midlife—by pursuing the virtues that are most meaningful to us.
I suspect that my own terror of professional decline is rooted in a fear of death—a fear that, even if it is not conscious, motivates me to act as if death will never come by denying any degradation in my résumé virtues. This denial is destructive, because it leads me to ignore the eulogy virtues that bring me the greatest joy.
How can I overcome this tendency? The Buddha recommends, of all things, corpse meditation: Many Theravada Buddhist monasteries in Thailand and Sri Lanka display photos of corpses in various states of decomposition for the monks to contemplate. “This body, too,” students are taught to say about their own body, “such is its nature, such is its future, such is its unavoidable fate.” At first this seems morbid. But its logic is grounded in psychological principles—and it’s not an exclusively Eastern idea. “To begin depriving death of its greatest advantage over us,” Michel de Montaigne wrote in the 16th century, “let us deprive death of its strangeness, let us frequent it, let us get used to it; let us have nothing more often in mind than death.”
Psychologists call this desensitization, in which repeated exposure to something repellent or frightening makes it seem ordinary, prosaic, not scary. And for death, it works. In 2017, a team of researchers at several American universities recruited volunteers to imagine they were terminally ill or on death row, and then to write blog posts about either their imagined feelings or their would-be final words. The researchers then compared these expressions with the writings and last words of people who were actually dying or facing capital punishment. The results, published in Psychological Science, were stark: The words of the people merely imagining their imminent death were three times as negative as those of the people actually facing death—suggesting that, counterintuitively, death is scarier when it is theoretical and remote than when it is a concrete reality closing in.
For most people, actively contemplating our demise so that it is present and real (rather than avoiding the thought of it via the mindless pursuit of worldly success) can make death less frightening; embracing death reminds us that everything is temporary, and can make each day of life more meaningful. “Death destroys a man,” E. M. Forster wrote, but “the idea of Death saves him.”
Decline is inevitable, and it occurs earlier than almost any of us wants to believe. But misery is not inevitable. Accepting the natural cadence of our abilities sets up the possibility of transcendence, because it allows the shifting of attention to higher spiritual and life priorities.
But such a shift demands more than mere platitudes. I embarked on my research with the goal of producing a tangible road map to guide me during the remaining years of my life. This has yielded four specific commitments.
The biggest mistake professionally successful people make is attempting to sustain peak accomplishment indefinitely, trying to make use of the kind of fluid intelligence that begins fading relatively early in life. This is impossible. The key is to enjoy accomplishments for what they are in the moment, and to walk away perhaps before I am completely ready—but on my own terms.
So: I’ve resigned my job as president of the American Enterprise Institute, effective right about the time this essay is published. While I have not detected deterioration in my performance, it is only a matter of time. Like many executive positions, the job is heavily reliant on fluid intelligence. Also, I wanted freedom from the consuming responsibilities of that job, to have time for more spiritual pursuits. In truth, this decision wasn’t entirely about me. I love my institution and have seen many others like it suffer when a chief executive lingered too long.
Leaving something you love can feel a bit like a part of you is dying. In Tibetan Buddhism, there is a concept called bardo, which is a state of existence between death and rebirth—“like a moment when you step toward the edge of a precipice,” as a famous Buddhist teacher puts it. I am letting go of a professional life that answers the question Who am I?
I am extremely fortunate to have the means and opportunity to walk away from a job. Many people cannot afford to do that. But you don’t necessarily have to quit your job; what’s important is striving to detach progressively from the most obvious earthly rewards—power, fame and status, money—even if you continue to work or advance a career. The real trick is walking into the next stage of life, Vanaprastha, to conduct the study and training that prepare us for fulfillment in life’s final stage.
Time is limited, and professional ambition crowds out things that ultimately matter more. To move from résumé virtues to eulogy virtues is to move from activities focused on the self to activities focused on others. This is not easy for me; I am a naturally egotistical person. But I have to face the fact that the costs of catering to selfishness are ruinous—and I now work every day to fight this tendency.
Fortunately, an effort to serve others can play to our strengths as we age. Remember, people whose work focuses on teaching or mentorship, broadly defined, peak later in life. I am thus moving to a phase in my career in which I can dedicate myself fully to sharing ideas in service of others, primarily by teaching at a university. My hope is that my most fruitful years lie ahead.
Because I’ve talked a lot about various religious and spiritual traditions—and emphasized the pitfalls of overinvestment in career success—readers might naturally conclude that I am making a Manichaean separation between the worlds of worship and work, and suggesting that the emphasis be on worship. That is not my intention. I do strongly recommend that each person explore his or her spiritual self—I plan to dedicate a good part of the rest of my life to the practice of my own faith, Roman Catholicism. But this is not incompatible with work; on the contrary, if we can detach ourselves from worldly attachments and redirect our efforts toward the enrichment and teaching of others, work itself can become a transcendental pursuit.
“The aim and final end of all music,” Bach once said, “should be none other than the glory of God and the refreshment of the soul.” Whatever your metaphysical convictions, refreshment of the soul can be the aim of your work, like Bach’s.
Bach finished each of his manuscripts with the words Soli Deo gloria—“Glory to God alone.” He failed, however, to write these words on his last manuscript, “Contrapunctus 14,” from The Art of Fugue, which abruptly stops mid-measure. His son C.P.E. added these words to the score: “Über dieser Fuge … ist der Verfasser gestorben” (“At this point in the fugue … the composer died”). Bach’s life and work merged with his prayers as he breathed his last breath. This is my aspiration.
Throughout this essay, I have focused on the effect that the waning of my work prowess will have on my happiness. But an abundance of research strongly suggests that happiness—not just in later years but across the life span—is tied directly to the health and plentifulness of one’s relationships. Pushing work out of its position of preeminence—sooner rather than later—to make space for deeper relationships can provide a bulwark against the angst of professional decline.
Dedicating more time to relationships, and less to work, is not inconsistent with continued achievement. “He is like a tree planted by streams of water,” the Book of Psalms says of the righteous person, “yielding its fruit in season, whose leaf does not wither, and who prospers in all he does.” Think of an aspen tree. To live a life of extraordinary accomplishment is—like the tree—to grow alone, reach majestic heights alone, and die alone. Right?
The secret to bearing my decline—to enjoying it—is to become more conscious of the roots linking me to others. If I have properly developed the bonds of love among my family and friends, my own withering will be more than offset by blooming in others.
When I talk about this personal research project I’ve been pursuing, people usually ask: Whatever happened to the hero on the plane?
I think about him a lot. He’s still famous, popping up in the news from time to time. Early on, when I saw a story about him, I would feel a flash of something like pity—which I now realize was really only a refracted sense of terror about my own future. Poor guy really meant I’m screwed.
But as my grasp of the principles laid out in this essay has deepened, my fear has declined proportionately. My feeling toward the man on the plane is now one of gratitude for what he taught me. I hope that he can find the peace and joy he is inadvertently helping me attain.
In April 2018, I spent three days in Austin, Texas, in the company of more than 2,500 people, most of them women, who are deeply concerned about the problem of workplace sexual harassment. The venue was the city’s convention center, and when a man named Derek Irvine took the vast stage and said that there had been “an uprising in the world of those who refuse to be silent,” the crowd roared its support. He introduced a panel of speakers who have been intimately involved with the #MeToo movement: Tarana Burke, the creator of the original campaign and hashtag; Ronan Farrow, who broke the Harvey Weinstein story in The New Yorker; and Ashley Judd, one of the actors who says she was harassed by Weinstein. Adam Grant, the author of many highly regarded books on management theory and a professor at the Wharton School, interviewed them, and their remarks were often interrupted by loud, admiring applause.
The session ended with a standing ovation, which was not surprising, given the moral authority of the speakers. What was surprising, however, was the makeup of the audience: This was a gathering not of activists, but of professionals who work in human resources. The event was a convention called Workhuman, put on by a software company.
For 30 years, ever since Anita Hill testified at Clarence Thomas’s Supreme Court confirmation hearings, HR has been almost universally accepted as the mechanism by which employers attempt to prevent, police, and investigate sexual harassment. Even the Equal Employment Opportunity Commission directs Americans to their HR offices if they experience harassment. That the #MeToo movement kept turning up so many shocking stories at so many respected places of employment seemed to me to reflect a massive failure of human resources to do the job we have expected it to perform. Even Harvey Weinstein’s company, after all, had an HR department.
I went to Texas to get a sense of how the people who work in the field were feeling about this exposure of their profession’s shortcomings. Each morning at the convention, I fished around in my suitcase for something that looked businessy and then clip-clopped across the street to the convention center, joining a stream of similarly attired women. (HR is a profession of women; 75 percent of the field’s workers are female—as, of course, are the overwhelming majority of employees who experience sexual harassment.) Our numbers grew in strength until we became a river of Banana Republic blazers and Ann Taylor wrap dresses and J.Crew slingbacks, a crowd of professional women, ages 25 to 60, all in an aggressively upbeat mood, many in chunky jewelry.
As I got to know them, I found the Workhuman attendees to be extremely personable and helpful, eager to wave me over to lunch tables and coffee groups. But they evinced an oddly detached attitude toward #MeToo. On the one hand, they were inspired by the movement. On the other hand, they did not exhibit any particular sense of responsibility for the kinds of failures that have allowed harassment to flourish. When Farrow said that #MeToo was about the “elaborate systems in place that could be utilized by the most powerful, the wealthiest, men” and Grant replied that the “reporting systems in companies tend not to work very well,” I thought the crowd might take offense, but no one seemed insulted.
The problem of sexual harassment wasn’t merely one of “bad apples,” Grant continued, but also of “bad barrels.” I looked around the hall, thinking that in this analogy the harassing men were the apples, and the systems that protect them—such as HR—were the bad barrels. But no one else appeared to take his remark that way. The audience members applauded graciously when Farrow paid them a high compliment: He was “so happy to be speaking to this room,” to people who took preventing sexual harassment as their “sacred charge.”
No one called for reforming or replacing HR. Just the opposite: The answer to the failures of HR, it seemed, was more HR.
The experience left me with a question: If HR is such a vital component of American business, its tentacles reaching deeply into many spheres of employees’ work lives, how did it miss the kind of sexual harassment at the center of the #MeToo movement? And given that it did, why are companies still putting so much faith in HR? I returned to these questions many times over the course of the following year, interviewing workplace experts, lawyers, management consultants, and workers in the field.
Finally, I realized I had it all wrong. The simple and unpalatable truth is that HR isn’t bad at dealing with sexual harassment. HR is actually very good at it.
In the old days, there was personnel: payroll, hiring, and—should things go terribly awry—pink slips. It was an office where the clatter of a typewriter signaled that volumes of paperwork were being shifted from inbox to outbox, and where employees could be just as bloodlessly reshuffled from “in” to “out.” It was women’s work, and in the popular imagination it was the terrain of the spinster: humorless, a stickler.
Human resources performs all of these old functions, along with a host of new ones. Employees often imagine that the “resources” on offer are the benefits that flow to them from that department, but in the term’s 19th-century origins, it is the workers themselves who are the resources, one more asset—along with equipment, factories, and capital—at the company’s disposal. Most HR reps today would never dream of speaking about employees as a type of commodity (at least not to their face), although it can be hard to understand what, exactly, these reps are talking about, because the field is rich with jargon: onboarding, balanced scorecards, cultural integration, the 80/20 rule.
On The Office, Michael Scott once said of Toby, the Dunder Mifflin HR rep: “If I had a gun with two bullets, and I was in a room with Hitler, bin Laden, and Toby, I would shoot Toby twice.” Over the past year, every time a friend asked what I was working on and I mentioned the letters HR, there was a remarkably consistent response: a quiet groan and a brief, skyward look—not a two-bullet look, but not a one-bullet look, either.
Fairly or not, HR is seen as the division of the company that slows things down, generates endless memos, meddles in employees’ personal business, holds compulsory “trainings,” and ruins any fun and spirit-lifting thing employees come up with. A notorious Fast Company cover story, published in 2005, is called “Why We Hate HR.” Its author, Keith H. Hammonds, laid out a string of damning questions that have resonated with businesspeople ever since:
Why are annual performance appraisals so time-consuming—and so routinely useless? Why is HR so often a henchman for the chief financial officer, finding ever-more ingenious ways to cut benefits and hack at payroll? Why do its communications—when we can understand them at all—so often flout reality? Why are so many people processes duplicative and wasteful, creating a forest of paperwork for every minor transaction? And why does HR insist on sameness as a proxy for equity?
But the real reason many workers don’t love human resources is that while the department often presents itself as functioning like a union—the open door for worker complaints, the updates on valuable new benefits—it is not a union. In a strong job market, HR is the soul of generosity, making employees feel valued and significant. But should the economy change, or should management decide to go in another direction, HR can just as quickly become assassin as friend. The last face you’ll see is Jane’s—your pal from HR, who hands out the discounted tickets to Knott’s Berry Farm and sends the blast emails about Chipotle Friday—and she’ll be dry-eyed while collecting your employee badge and invoking the executioner’s code: COBRA.
Jane’s not a bad person—she’s just carrying out orders from far up the ladder. And when it comes to sexual harassment, women understand that Jane reports to upper management, not some neutral body that stands in allegiance with right moral action. If employers judged HR departments by their ability to prevent sexual harassment, most would have gotten a failing grade long ago. What HR is actually responsible for—one of the central ways the department “adds value” to a company—is serving as the first line of defense against a sexual-harassment lawsuit. These two goals are clearly aligned, but if the past year has taught us anything, it’s that you can achieve the latter without doing much of anything at all about the former.
In October 2014, Ellen DeGeneres did something on her talk show that we can hardly imagine in today’s environment: She made an extended joke about sexual harassment. “Last week we had our mandatory sexual-harassment training seminar,” she told the audience. “We have it every year for all of the employees, and it combines frank discussions about workplace behavior and … mind-numbing boredom.” The people in the audience laughed appreciatively—they knew exactly what she meant. Then she introduced a game: “Sexual-Harassment Training or Late-Night Movie?” And, with the eager participation of the audience, she read lines of dialogue and asked the crowd to guess their source.
Ellen’s joke depended on our common understanding that in the decades since Anita Hill’s testimony, HR has created a huge body of instructional films, computer training modules, seminar scripts, and written policies on sexual harassment. That a subject as urgent and—in its own, lurid way—bound with eros, fear, and guilt created an oeuvre known primarily for its stupefying dullness should have been a clue that the serious issue of harassment was being funneled through a bureaucracy whose aim was not (at least not purely) protecting women workers.
Hill’s testimony riveted the nation. It occurred years before the forensically prurient Starr Report became part of breakfast-table discourse; before hard-core pornography became a subject of open conversation; before sex workers were interviewed, respectfully, on staid national news programs. It was unprecedented: a dignified and extremely well-educated woman testifying before a group of male senators about pubic hair on a Coke can, all while the camera whirred before her and the entire country looked on. It was, in other words, exactly the kind of sui generis event that should not have resonated on a deeply personal level with any woman, save perhaps some of Clarence Thomas’s law clerks. Yet it did resonate with women—millions of them. Their response was nonpartisan, unifying, nationwide, and—for many men—eye-opening. The concept, if not the linguistic formulation, of “Me too” was born almost overnight. Hill’s composure in the face of withering and often humiliating male commentary (including, let us not forget, that of Joe Biden) was stirring. “I am not given to fantasy,” she said simply. “This is not something I would have come forward with if I were not absolutely sure.”
Hill’s testimony gave American women a way of understanding something that the Supreme Court had decided five years earlier, in the famous Meritor Savings Bank v. Vinson case, which established that sexual harassment is a form of discrimination as defined by Title VII of the Civil Rights Act. A potent combination of factors was born: Women could sue for sexual harassment, and their employer could be on the line for big damages. That last fact caught the attention of American employers and is the true father of the system that Ellen and so many other Americans have mocked.
At solving the problem, HR is not great. At creating protocols of “compliance” to defend a company against lawsuits? By that criterion, it has been a smashing success. How do we know? Partly because employers are so devoted to it; the first thing many an executive will do when a company is under scrutiny for sexual harassment is heap praise on its crackerjack HR team, and describe the accused men as outliers.
Pam Teren, an employment lawyer in Los Angeles, graduated from law school and began working at a firm in 1990. “I thought I’d probably never have a sexual-harassment case,” she told me. The next year, Anita Hill testified, and these cases poured in. She told herself, “This is a five-year window. Because how simple is this? Don’t grab women. Don’t stare at their chests.” We both laughed—it really was pretty obvious. She figured that men would catch on quickly and the window would close. But she was wrong. Like thousands of lawyers across the country, she has been taking sexual-harassment cases ever since. Her entire career has been devoted to this work.
One aspect of the #MeToo movement that has puzzled observers is the nature of its inciting incident: the reports of Harvey Weinstein’s alleged sexual harassment. Why is it that such a singular bit of horror—with its luxury hotels, its glamorous locales, its involvement of famous actors, and its promises of Hollywood stardom—launched a movement that united women from all walks of life and all types of jobs?
The reality is that #MeToo was waiting to happen. Women’s anger and frustration had been a simmering pot, its lid jittering. Something was going to cause it to boil over soon enough. The anger was about harassment; the frustration was about the system that had been created to address it.
In the months before Weinstein’s crimes were revealed, in October 2017, three prominent American women spoke out against human-resources departments. In February, an Uber engineer named Susan Fowler wrote a 3,000-word blog post alleging sexual harassment at the company. She described her experiences with a male manager who propositioned her and wrote that she “expected that I would report him to HR, they would handle the situation appropriately, and then life would go on.” It didn’t. On April 6, Nancy Erika Smith, who represented Gretchen Carlson in her harassment suit against Fox News chief Roger Ailes, spoke at the annual Women in the World conference and told the audience a stark truth: “HR is not your friend. HR will not help you.” That same day, Anita Hill wrote in The Washington Post, “There are still companies that pay lip service to human-resources departments while quietly allowing women to be vilified when they come forward.” All of this set the table for what has happened in the wake of Weinstein.
In fact, the movement could have begun a full year earlier, in 2016, when a special task force from the EEOC released its findings on sexual harassment. The occasion was the 30th anniversary of the Meritor case. The task force had been charged with determining how much progress the country had made since that historic decision. Its finding: very little. “Much of the training done over the last 30 years has not worked as a prevention tool,” the task force found. That’s an incredible statement—three decades of failure.
The EEOC report is a government white paper for the ages: sprawling, maddeningly unfocused, almost willfully opaque. But wade through it with pen in hand, and you realize it is also a startling document. It reveals that sexual harassment is “widespread” and “persistent,” and that 85 percent of workers who are harassed never report it. It found that employees are much more likely to come up with their own solution—such as avoiding the harasser, downplaying the harassment, or simply enduring it—than to seek help from HR. They are far more likely to ask a family member or co-worker for advice than to file a complaint, because they fear that they will face repercussions if they do.
Anti-harassment training, and the centrality of HR in resolving women’s problems, has been an ever-growing part of employees’ lives since 1998, when a pair of Supreme Court decisions, Faragher v. City of Boca Raton and Burlington Industries, Inc. v. Ellerth, changed the way courts look at harassment claims. If a company uses a Faragher-Ellerth defense, it is asserting that despite clear policy and regular training, the employee who was harassed failed to make a report to HR and so the employer was unable to resolve the problem. This is why all of that training—the videos and online courses and worksheets—seems so useless: because it’s designed to serve as a defense against an employment lawsuit. The task force cited a study that found “no evidence that the training affected the frequency of sexual harassment experienced by the women in the workplace.” The task force also said that HR trainings and procedures are “too focused on protecting the employer from liability,” and not focused enough on ending the problem.
The findings are depressing. Yet in a PowerPoint presentation accompanying the report, the task force maintains a hopeful, at times even chipper, tone. “The good news,” it pronounces after delivering the dire facts, is that “we have some creative ideas.”
One of these ideas comes in a bold assertion: “An ‘It’s On Us’ campaign in the workplace could be a game changer.” Here the EEOC is referring to a campaign that was introduced during President Barack Obama’s administration to reduce sexual assault on college campuses. There’s no indication that the campaign did anything at all to reduce sexual assault, and after an exciting and widely publicized launch it has been largely forgotten. Other suggestions include bystander-intervention training and civility training—the latter of which even the EEOC admits hasn’t been “rigorously evaluated” as a tool for preventing harassment.
“Creative ideas”? A year before Weinstein, the agency ultimately responsible for fighting sexual harassment in America was grasping at straws.
When I was in Austin, I asked conference attendees why HR has accomplished so little. Over and over, they gave me a version of the same answer: They don’t have power. They can deliver the trainings and write the policies, they can take reports and conduct investigations, but unless the harasser is of relatively low status within the organization, they have little say in the outcome. Most of the time, if the man is truly important to the company, the case is quickly whisked out of HR’s hands, the investigation delivered to lawyers and the final decision rendered by executives. These executives are under no legal imperative to terminate an alleged offender or even to enforce a particular sanction, only to ensure that the woman who made the report is safe in the future.
Making a show out of tossing highly placed harassers to the curb proved good for business during the early and middle phases of #MeToo. But social change that is built on a popular movement is destined to fade when that movement becomes less fashionable. Already the country has begun to move on. The steady drumbeat of famous men being fired has slowed, and a backlash against what are perceived as unfair punishments has gathered strength. In April, Taffy Brodesser-Akner wrote a New York Times Magazine cover story about allegations of an extensive pattern of sexual harassment and discrimination at Sterling Jewelers, a chain of retailers that includes Kay, Jared, and Zales. Here was a major writer reporting on an important subject—it was the kind of piece that a year earlier would have gotten the attention of the country. But as the #MeToo movement wanes, outrage has turned to resigned acceptance.
Like everyone else who understands the problem, including the EEOC, the HR workers I met at the conference reported that there is only one way to eradicate harassment from a workplace: creating a climate and culture, starting at the very top of the company, that establishes harassment will not be tolerated and will be punished severely. Middle managers can’t change the culture of a company; only the most senior people can do that. And expecting an HR worker—with a car loan, a mortgage, college tuition around the corner—to risk her job in a fight against management on behalf of an employee she barely knows is unrealistic.
Throughout the course of writing this article, I encountered many alternatives to the traditional system of dealing with sexual harassment. I talked with one hugely successful high-stakes litigator in Los Angeles named Mark Baute. “I’m brutal in a courtroom,” he told me cheerfully, “and I’ve got the goods.” Baute has sometimes been asked to speak to executives at big companies. When he is introduced, none of the men in the audience takes much notice of him, because the subject is HR. But then, in his booming courtroom voice, he says, “I’m here so that this company doesn’t hire someone like me to come in and destroy your career.” The men put down their phones. He hammers them with every possible outcome of harassing a co-worker, and then he shames them: “Tell me something. If you’re such a stud, why do you only date women who depend on you for employment? Why can’t you walk outside of this building and find someone else to date?”
Baute’s approach is aggressive, unconventional, and apparently very effective. It struck me as exactly the kind of “creative idea” that the EEOC might have been looking for. But no one is talking about taking it mainstream. Instead, we can expect to see HR deploy new techniques aimed more at protecting companies than at protecting employees.
So-called love contracts are becoming popular, requiring employees who are dating to report to HR to sign paperwork affirming that they are willingly taking part in a consensual relationship. It is like a posting of the banns: a semipublic profession of intention, HR squirreling away one more signed document to indemnify the company from the human impulses of its workers. This could be understood as HR mission creep. Or it could be understood as another mile marker of the journey we are all on, as religion falls away and customs erode, and new norms of behavior are reverse-engineered from employment law. Title VII is the Bible, compliance training is the sermon, and the HR office is the confessional.
HR is no match for sexual harassment. It pits male sexual aggression against a system of paperwork and broken promises, and women don’t trust it. For 30 years, we’ve invested responsibility in HR, and it hasn’t worked out. We have to find a better way.