During the seven years that I worked as a management consultant, I spent a lot of time trying to look older than I was. I became pretty good at furrowing my brow and putting on somber expressions. Those who saw through my disguise assumed I made up for my youth with a fabulous education in management. They were wrong about that. I don’t have an M.B.A. I have a doctoral degree in philosophy—nineteenth-century German philosophy, to be precise. Before I took a job telling managers of large corporations things that they arguably should have known already, my work experience was limited to part-time gigs tutoring surly undergraduates in the ways of Hegel and Nietzsche and to a handful of summer jobs, mostly in the less appetizing ends of the fast-food industry.
The strange thing about my utter lack of education in management was that it didn’t seem to matter. As a principal and founding partner of a consulting firm that eventually grew to 600 employees, I interviewed, hired, and worked alongside hundreds of business-school graduates, and the impression I formed of the M.B.A. experience was that it involved taking two years out of your life and going deeply into debt, all for the sake of learning how to keep a straight face while using phrases like “out-of-the-box thinking,” “win-win situation,” and “core competencies.” When it came to picking teammates, I generally held out higher hopes for those individuals who had used their university years to learn about something other than business administration.
After I left the consulting business, in a reversal of the usual order of things, I decided to check out the management literature. Partly, I wanted to “process” my own experience and find out what I had missed in skipping business school. Partly, I had a lot of time on my hands. As I plowed through tomes on competitive strategy, business process re-engineering, and the like, not once did I catch myself thinking, Damn! If only I had known this sooner! Instead, I found myself thinking things I never thought I’d think, like, I’d rather be reading Heidegger! It was a disturbing experience. It thickened the mystery around the question that had nagged me from the start of my business career: Why does management education exist?
Management theory came to life in 1899 with a simple question: “How many tons of pig iron bars can a worker load onto a rail car in the course of a working day?” The man behind this question was Frederick Winslow Taylor, the author of The Principles of Scientific Management and, by most accounts, the founding father of the whole management business.
Taylor was forty-three years old and on contract with the Bethlehem Steel Company when the pig iron question hit him. Staring out over an industrial yard that covered several square miles of the Pennsylvania landscape, he watched as laborers loaded ninety-two-pound bars onto rail cars. There were 80,000 tons’ worth of iron bars, which were to be carted off as fast as possible to meet new demand sparked by the Spanish-American War. Taylor narrowed his eyes: there was waste there, he was certain. After hastily reviewing the books at company headquarters, he estimated that the men were currently loading iron at the rate of twelve and a half tons per man per day.
Taylor stormed down to the yard with his assistants (“college men,” he called them) and rounded up a group of top-notch lifters (“first-class men”), who in this case happened to be ten “large, powerful Hungarians.” He offered to double the workers’ wages in exchange for their participation in an experiment. The Hungarians, eager to impress their apparent benefactor, put on a spirited show. Huffing up and down the rail car ramps, they loaded sixteen and a half tons in something under fourteen minutes. Taylor did the math: over a ten-hour day, it worked out to seventy-five tons per day per man. Naturally, he had to allow time for bathroom breaks, lunch, and rest periods, so he adjusted the figure downward by approximately 40 percent. Henceforth, each laborer in the yard was assigned to load forty-seven and a half tons of pig iron per day, with bonus pay for reaching the target and penalties for failing.
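For readers who want to see just how much of that number is measurement and how much is extrapolation, here is a minimal back-of-the-envelope sketch in Python, assuming only the figures reported above (the variable names and rounding are mine):

```python
# A sketch of Taylor's arithmetic, using only the figures given in the text:
# ten men, sixteen and a half tons, something under fourteen minutes,
# a ten-hour day, and a quota of forty-seven and a half tons.
observed_tons = 16.5       # loaded by the ten Hungarians in the timed burst
crew_size = 10
trial_minutes = 14
workday_minutes = 10 * 60

tons_per_man_per_day = (observed_tons / crew_size) * (workday_minutes / trial_minutes)
print(round(tons_per_man_per_day, 1))   # roughly 71 -- Taylor called it 75

quota = 47.5
print(round(1 - quota / 75, 2))         # roughly 0.37 -- the "approximately 40 percent" discount
```

Everything past the first line of output rests not on the stopwatch but on the discount, a point taken up below.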
When the Hungarians realized that they were being asked to quadruple their previous daily workload, they howled and refused to work. So Taylor found a “high-priced man,” a lean Pennsylvania Dutchman whose intelligence he compared to that of an ox. Lured by the promise of a 60 percent increase in wages, from $1.15 to a whopping $1.85 a day, Taylor’s high-priced man loaded forty-five and three-quarters tons over the course of a grueling day—close enough, in Taylor’s mind, to count as the first victory for the methods of modern management.
Taylor went on to tackle the noble science of shoveling and a host of other topics of concern to his industrial clients. He declared that his new and unusual approach to solving business problems amounted to a “complete mental revolution.” Eventually, at the urging of his disciples, he called his method “scientific management.” Thus was born the idea that management is a science—a body of knowledge collected and nurtured by experts according to neutral, objective, and universal standards.
At the same moment was born the notion that management is a distinct function best handled by a distinct group of people—people characterized by a particular kind of education, way of speaking, and fashion sensibility. Taylor, who favored a manly kind of prose, expressed it best in passages like this:
… the science of handling pig iron is so great and amounts to so much that it is impossible for the man who is best suited to this type of work to understand the principles of this science, or even to work in accordance with these principles, without the aid of a man better educated than he is.
From a metaphysical perspective, one could say that Taylor was a “dualist”: there is brain, there is brawn, and the two, he believed, very rarely meet.
Taylor went around the country repeating his pig iron story and other tales from his days in the yard, and these narratives formed something like a set of scriptures for a new and highly motivated cult of management experts. This vanguard ultimately vaulted into the citadel of the Establishment with the creation of business schools. In the spring of 1908, Taylor met with several Harvard professors, and later that year Harvard opened the first graduate school in the country to offer a master’s degree in business. It based its first-year curriculum on Taylor’s scientific management. From 1909 to 1914, Taylor visited Cambridge every winter to deliver a series of lectures—inspirational discourses marred only by the habit he’d picked up on the shop floor of swearing at inappropriate moments.
Yet even as Taylor’s idea of management began to catch on, a number of flaws in his approach were evident. The first thing many observers noted about scientific management was that there was almost no science to it. The most significant variable in Taylor’s pig iron calculation was the 40 percent “adjustment” he made in extrapolating from a fourteen-minute sample to a full workday. Why time a bunch of Hungarians down to the second if you’re going to daub the results with such a great blob of fudge? When he was grilled before Congress on the matter, Taylor casually mentioned that in other experiments these “adjustments” ranged from 20 percent to 225 percent. He defended these unsightly “wags” (wild-ass guesses, in M.B.A.-speak) as the product of his “judgment” and “experience”—but, of course, the whole point of scientific management was to eliminate the reliance on such inscrutable variables.
One of the distinguishing features of anything that aspires to the name of science is the reproducibility of experimental results. Yet Taylor never published the data on which his pig iron or other conclusions were based. When Carl Barth, one of his devotees, took over the work at Bethlehem Steel, he found Taylor’s data to be unusable. Another, even more fundamental feature of science—here I invoke the ghost of Karl Popper—is that it must produce falsifiable propositions. Insofar as Taylor limited his concern to prosaic activities such as lifting bars onto rail cars, he did produce propositions that were falsifiable—and, indeed, were often falsified. But whenever he raised his sights to management in general, he seemed capable only of soaring platitudes. At the end of the day his “method” amounted to a set of exhortations: Think harder! Work smarter! Buy a stopwatch!
The trouble with such claims isn’t that they are all wrong. It’s that they are too true. When a congressman asked him if his methods were open to misuse, Taylor replied, No. If management has the right state of mind, his methods will always lead to the correct result. Unfortunately, Taylor was right about that. Taylorism, like much of management theory to come, is at its core a collection of quasi-religious dicta on the virtue of being good at what you do, ensconced in a protective bubble of parables (otherwise known as case studies).
Curiously, Taylor and his college men often appeared to float free from the kind of accountability that they demanded from everybody else. Others might have been asked, for example: Did Bethlehem’s profits increase as a result of their work? Taylor, however, rarely addressed the question head-on. With good reason. Bethlehem fired him in 1901 and threw out his various systems. Yet this evident vacuum of concrete results did not stop Taylor from repeating his parables as he preached the doctrine of efficiency to countless audiences across the country.
In the management literature these days, Taylorism is presented, if at all, as a chapter of ancient history, a weird episode about an odd man with a stopwatch who appeared on the scene sometime after Columbus discovered the New World. Over the past century Taylor’s successors have developed a powerful battery of statistical methods and analytical approaches to business problems. And yet the world of management remains deeply Taylorist in its foundations.
At its best, management theory is part of the democratic promise of America. It aims to replace the despotism of the old bosses with the rule of scientific law. It offers economic power to all who have the talent and energy to attain it. The managerial revolution must be counted as part of the great widening of economic opportunity that has contributed so much to our prosperity. But, insofar as it pretends to a kind of esoteric certitude to which it is not entitled, management theory betrays the ideals on which it was founded.
That Taylorism and its modern variants are often just a way of putting labor in its place need hardly be stated: from the Hungarians’ point of view, the pig iron experiment was an infuriatingly obtuse way of demanding more work for less pay. That management theory represents a covert assault on capital, however, is equally true. (The Soviet five-year planning process took its inspiration directly from one of Taylor’s more ardent followers, the engineer H. L. Gantt.) Much of management theory today is in fact the consecration of class interest—not of the capitalist class, nor of labor, but of a new social group: the management class.
I can confirm on the basis of personal experience that management consulting continues to worship at the shrine of numerology where Taylor made his first offering of blobs of fudge. In many of my own projects, I found myself compelled to pacify recalcitrant data with entirely confected numbers. But I cede the place of honor to a certain colleague, a gruff and street-smart Belgian whose hobby was to amass hunting trophies. The huntsman achieved some celebrity for having invented a new mathematical technique dubbed “the Two-Handed Regression.” When the data on the correlation between two variables revealed only a shapeless cloud—even though we knew damn well there had to be a correlation—he would simply place a pair of meaty hands on the offending bits of the cloud and reveal the straight line hiding from conventional mathematics.
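To make the joke concrete: an ordinary least-squares fit will dutifully return a line for any cloud of points, however shapeless; only the fit statistics reveal whether the line means anything. The sketch below uses random, purely illustrative data (nothing from any actual engagement) to show how little the Two-Handed Regression had to improve on:

```python
# Illustration only: least squares returns a line even when there is no relationship.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 100, size=200)
y = rng.uniform(0, 100, size=200)           # no genuine relationship to x

slope, intercept = np.polyfit(x, y, deg=1)  # a line comes out regardless
r = np.corrcoef(x, y)[0, 1]

print(f"fitted line: y = {slope:.2f}x + {intercept:.2f}")
print(f"correlation r = {r:.2f}")           # near zero: the line is an artifact
```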
The thing that makes modern management theory so painful to read isn’t usually the dearth of reliable empirical data. It’s that maddening papal infallibility. Oh sure, there are a few pearls of insight, and one or two stories about hero-CEOs that can hook you like bad popcorn. But the rest is just inane. Those who looked for the true meaning of “business process re-engineering,” the most overtly Taylorist of recent management fads, were ultimately rewarded with such gems of vacuity as “BPR is taking a blank sheet of paper to your business!” and “BPR means re-thinking everything, everything!”
Each new fad calls attention to one virtue or another—first it’s efficiency, then quality, next it’s customer satisfaction, then supplier satisfaction, then self-satisfaction, and finally, at some point, it’s efficiency all over again. If it’s reminiscent of the kind of toothless wisdom offered in self-help literature, that’s because management theory is mostly a subgenre of self-help. Which isn’t to say it’s completely useless. But just as most people are able to lead fulfilling lives without consulting Deepak Chopra, most managers can probably spare themselves an education in management theory.
The world of management theorists remains exempt from accountability. In my experience, for what it’s worth, consultants monitored the progress of former clients about as diligently as they checked up on ex-spouses (of which there were many). Unless there was some hope of renewing the relationship (or dating a sister company), it was Hasta la vista, baby. And why should they have cared? Consultants’ recommendations have the same semantic properties as campaign promises: it’s almost freakish if they are remembered in the following year.
In one episode, when I got involved in winding up the failed subsidiary of a large European bank, I noticed on the expense ledger that a rival consulting firm had racked up $5 million in fees from the same subsidiary. “They were supposed to save the business,” said one client manager, rolling his eyes. “Actually,” he corrected himself, “they were supposed to keep the illusion going long enough for the boss to find a new job.” Was my competitor held to account for failing to turn around the business and/or violating the rock-solid ethical standards of consulting firms? On the contrary, it was ringing up even higher fees over in another wing of the same organization.
And so was I. In fact, we kind of liked failing businesses: there was usually plenty of money to be made in propping them up before they finally went under. After Enron, true enough, Arthur Andersen sank. But what happened to such stalwarts as McKinsey, which generated millions in fees from Enron and supplied it with its CEO? The Enron story wasn’t just about bad deeds or false accounts; it was about confusing sound business practices with faddish management ideas, celebrated with gusto by the leading lights of the management world all the way to the end of the party.
If you believed our chief of recruiting, the consulting firm I helped to found represented a complete revolution from the Taylorist practices of conventional organizations. Our firm wasn’t about bureaucratic control and robotic efficiency in the pursuit of profit. It was about love.
We were very much of the moment. In the 1990s, the gurus were unanimous in their conviction that the world was about to bring forth an entirely new mode of human cooperation, which they identified variously as the “information-based organization,” the “intellectual holding company,” the “learning organization,” and the “perpetually creative organization.” “R-I-P. Rip, shred, tear, mutilate, destroy that hierarchy,” said über-guru Tom Peters, with characteristic understatement. The “end of bureaucracy” is nigh, wrote Gifford Pinchot of “intrapreneuring” fame. According to all the experts, the enemy of the “new” organization was lurking in every episode of Leave It to Beaver.
Many good things can be said about the “new” organization of the 1990s. And who would want to take a stand against creativity, freedom, empowerment, and—yes, let’s call it by its name—love? One thing that cannot be said of the “new” organization, however, is that it is new.
In 1983, a Harvard Business School professor, Rosabeth Moss Kanter, beat the would-be revolutionaries of the nineties to the punch when she argued that rigid “segmentalist” corporate bureaucracies were in the process of giving way to new “integrative” organizations, which were “informal” and “change-oriented.” But Kanter was just summarizing a view that had currency at least as early as 1961, when Tom Burns and G. M. Stalker published an influential book criticizing the old, “mechanistic” organization and championing the new, “organic” one. In language that eerily anticipated many a dot-com prospectus, they described how innovative firms benefited from “lateral” versus “vertical” information flows, the use of “ad hoc” centers of coordination, and the continuous redefinition of jobs. The “flat” organization was first explicitly celebrated by James C. Worthy, in his study of Sears in the 1940s, and W. B. Given coined the term “bottom-up management” in 1949. And then there was Mary Parker Follett, who in the 1920s attacked “departmentalized” thinking, praised change-oriented and informal structures, and—Rosabeth Moss Kanter fans please take note—advocated the “integrative” organization.
If there was a defining moment in this long and strangely forgetful tradition of “humanist” organization theory—a single case that best explains the meaning of the infinitely repeating whole—it was arguably the work of Professor Elton Mayo of the Harvard Business School in the 1920s. Mayo, an Australian, was everything Taylor was not: sophisticated, educated at the finest institutions, a little distant and effete, and perhaps too familiar with Freudian psychoanalysis for his own good.
At the Hawthorne plant of the Western Electric Company, outside Chicago, a researcher named Homer Hibarger had been testing theories about the effect of workplace illumination on worker productivity. His work, not surprisingly, had been sponsored by a maker of electric lightbulbs. While a group of female workers assembled telephone relays and receiver coils, Homer turned the lights up. Productivity went up. Then he turned the lights down. Productivity still went up! Puzzled, Homer tried a new series of interventions. First, he told the “girls” that they would be entitled to two five-minute breaks every day. Productivity went up. Next it was six breaks a day. Productivity went up again. Then he let them leave an hour early every day. Up again. Free lunches and refreshments. Up! Then Homer cut the breaks, reinstated the old workday, and scrapped the free food. But productivity barely dipped at all.
Mayo, who was brought in to make sense of this, was exultant. His theory: the various interventions in workplace routine were as nothing compared with the new interpersonal dynamics generated by the experimental situation itself. “What actually happened,” he wrote, “was that six individuals became a team and the team gave itself wholeheartedly and spontaneously to cooperation … They felt themselves to be participating, freely and without afterthought, and were happy in the knowledge that they were working without coercion.” The lessons Mayo drew from the experiment are in fact indistinguishable from those championed by the gurus of the nineties: vertical hierarchies based on concepts of rationality and control are bad; flat organizations based on freedom, teamwork, and fluid job definitions are good.
On further scrutiny, however, it turned out that two workers who were deemed early on to be “uncooperative” had been replaced with friendlier women. Even more disturbing, these exceptionally cooperative individuals earned significantly higher wages for their participation in the experiment. Later, in response to his critics, Mayo insisted that something so crude as financial incentives could not possibly explain the miracles he witnessed. That didn’t make his method any more “scientific.”
Mayo’s work sheds light on the dark side of the “humanist” tradition in management theory. There is something undeniably creepy about a clipboard-bearing man hovering around a group of factory women, flicking the lights on and off and dishing out candy bars. All of that humanity—as anyone in my old firm could have told you—was just a more subtle form of bureaucratic control. It was a way of harnessing the workers’ sense of identity and well-being to the goals of the organization, an effort to get each worker to participate in an ever more refined form of her own enslavement.
So why is Mayo’s message constantly recycled and presented as something radically new and liberating? Why does every new management theorist seem to want to outdo Chairman Mao in calling for perpetual havoc on the old order? Very simply, because all economic organizations involve at least some degree of power, and power always pisses people off. That is the human condition. At the end of the day, it isn’t a new world order that the management theorists are after; it’s the sensation of the revolutionary moment. They long for that exhilarating instant when they’re fighting the good fight and imagining a future utopia. What happens after the revolution—civil war and Stalinism being good bets—could not be of less concern.
Between them, Taylor and Mayo carved up the world of management theory. According to my scientific sampling, you can save yourself from reading about 99 percent of all the management literature once you master this dialectic between rationalists and humanists. The Taylorite rationalist says: Be efficient! The Mayo-ist humanist replies: Hey, these are people we’re talking about! And the debate goes on. Ultimately, it’s just another installment in the ongoing saga of reason and passion, of the individual and the group.
The tragedy, for those who value their reading time, is that Rousseau and Shakespeare said it all much, much better. In the 5,200 years since the Sumerians first etched their pictograms on clay tablets, come to think of it, human beings have produced an astonishing wealth of creative expression on the topics of reason, passion, and living with other people. In books, poems, plays, music, works of art, and plain old graffiti, they have explored what it means to struggle against adversity, to apply their extraordinary faculty of reason to the world, and to confront the naked truth about what motivates their fellow human animals. These works are every bit as relevant to the dilemmas faced by managers in their quest to make the world a more productive place as any of the management literature.
In the case of my old firm, incidentally, the endgame was civil war. Those who talked loudest about the ideals of the “new” organization, as it turned out, had the least love in their hearts. By a strange twist of fate, I owe the longevity of my own consulting career to this circumstance. When I first announced my intention to withdraw from the firm in order to pursue my vocation as an unpublishable philosopher at large, my partners let me know that they would gladly regard my investment in the firm as a selfless contribution to their financial well-being. By the time I managed to extricate myself from their loving embrace, nearly three years later, the partnership had for other reasons descended into the kind of Hobbesian war of all against all from which only the lawyers emerge smiling. The firm was temporarily rescued by a dot-com company, but within a year both the savior and the saved collapsed in a richly deserved bankruptcy. Of course, your experience in a “new” organization may be different.
My colleagues usually spoke fondly of their years at business school. Most made great friends there, and quite a few found love. All were certain that their degree was useful in advancing their careers. But what does an M.B.A. do for you that a doctorate in philosophy can’t do better?
The first point to note is that management education confers some benefits that have little to do with either management or education. Like an elaborate tattoo on an aboriginal warrior, an M.B.A. is a way of signaling just how deeply and irrevocably committed you are to a career in management. The degree also provides a tidy hoard of what sociologists call “social capital”—or what the rest of us, notwithstanding the invention of the PalmPilot, call a “Rolodex.”
For companies, M.B.A. programs can be a way to outsource recruiting. Marvin Bower, McKinsey’s managing director from 1950 to 1967, was the first to understand this fact, and he built a legendary company around it. Through careful cultivation of the deans and judicious philanthropy, Bower secured a quasi-monopoly on Baker Scholars (the handful of top students at the Harvard Business School). Bower was not so foolish as to imagine that these scholars were of interest on account of the education they received. Rather, they were valuable because they were among the smartest, most ambitious, and best-connected individuals of their generation. Harvard had done him the favor of scouring the landscape, attracting and screening vast numbers of applicants, further testing those who matriculated, and then serving up the best and the brightest for Bower’s delectation.
Of course, management education does involve the transfer of weighty bodies of technical knowledge that have accumulated since Taylor first put the management-industrial complex in motion—accounting, statistical analysis, decision modeling, and so forth—and these can prove quite useful to students, depending on their career trajectories. But the “value-add” here is far more limited than Mom or Dad tend to think. In most managerial jobs, almost everything you need to know to succeed must be learned on the job; for the rest, you should consider whether it might have been acquired with less time and at less expense.
The best business schools will tell you that management education is mainly about building skills—one of the most important of which is the ability to think (or what the M.B.A.s call “problem solving”). But do they manage to teach such skills?
I once sat through a presentation in which a consultant, a Harvard M.B.A., showed a client, the manager of a large financial institution in a developing country, how the client company’s “competitive advantage” could be analyzed in terms of “the five forces.” He even used a graphic borrowed directly from guru-of-the-moment Michael Porter’s best-selling work on “competitive strategy.” Not for the first time, I was embarrassed to call myself a consultant. As it happens, the client, too, had a Harvard M.B.A. “No,” he said, shaking his head with feigned chagrin. “There are only three forces in this case. And two of them are in the Finance Ministry.”
What they don’t seem to teach you in business school is that “the five forces” and “the seven Cs” and every other generic framework for problem solving are heuristics: they can lead you to solutions, but they cannot make you think. Case studies may provide an effective way to think business problems through, but the point is rather lost if students come away imagining that you can go home once you’ve put all of your eggs into a two-by-two growth-share matrix.
Next to analysis, communication skills must count among the most important for future masters of the universe. To their credit, business schools do stress these skills, and force their students to engage in make-believe presentations to one another. On the whole, however, management education has been less than a boon for those who value free and meaningful speech. M.B.A.s have taken obfuscatory jargon—otherwise known as bullshit—to a level that would have made even the Scholastics blanch. As students of philosophy know, Descartes dismantled the edifice of medieval thought by writing clearly and showing that knowledge, by its nature, is intelligible, not obscure.
Beyond building skills, business training must be about values. As I write this, I know that my M.B.A. friends are squirming in their seats. They’ve all been forced to sit through an “ethics” course, in which they learned to toss around yet more fancy phrases like “the categorical imperative” and discuss borderline criminal behavior, such as what’s a legitimate hotel bill and what’s just plain stealing from the expense account, how to tell the difference between a pat on the shoulder and sexual harassment, and so on. But, as anyone who has studied Aristotle will know, “values” aren’t something you bump into from time to time during the course of a business career. All of business is about values, all of the time. Notwithstanding the ostentatious use of stopwatches, Taylor’s pig iron case was not a description of some aspect of physical reality—how many tons can a worker lift? It was a prescription—how many tons should a worker lift? The real issue at stake in Mayo’s telephone factory was not factual—how can we best establish a sense of teamwork? It was moral—how much of a worker’s sense of identity and well-being does a business have a right to harness for its purposes?
The recognition that management theory is a sadly neglected subdiscipline of philosophy began with an experience of déjà vu. As I plowed through my shelfload of bad management books, I beheld a discipline that consists mainly of unverifiable propositions and cryptic anecdotes, is rarely if ever held accountable, and produces an inordinate number of catastrophically bad writers. It was all too familiar. There are, however, at least two crucial differences between philosophers and their wayward cousins. The first and most important is that philosophers are much better at knowing what they don’t know. The second is money. In a sense, management theory is what happens to philosophers when you pay them too much.
The idea that philosophy is an inherently academic pursuit is a recent and diabolical invention. Epicurus, Descartes, Spinoza, Locke, Hume, Nietzsche, and most of the other great philosophers of history were not professors of philosophy. If any were to come to life and witness what has happened to their discipline, I think they’d run for the hills. Still, you go to war with the philosophers you have, as they say, not the ones in the hills. And since I’m counting on them to seize the commanding heights of the global economy, let me indulge in some management advice for today’s academic philosophers:
■ Expand the domain of your analysis! Why so many studies of Wittgenstein and none of Taylor, the man who invented the social class that now rules the world?
■ Hire people with greater diversity of experience! And no, that does not mean taking someone from the University of Hawaii. You are building a network—a team of like-minded individuals who together can change the world.
■ Remember the three Cs: Communication, Communication, Communication! Philosophers (other than those who have succumbed to the Heideggerian virus) start with a substantial competitive advantage over the PowerPoint crowd. But that’s no reason to slack off. Remember Plato: it’s all about dialogue!
With this simple three-point program (or was it four?) philosophers will soon reclaim their rightful place as the educators of management. Of course, I will be charging for implementation.