Promising Practices
A forum for government's best ideas and most innovative leaders.

The Origins of Office Speak


“Here’s your ‘buzzword bingo’ card for the meeting,” Wally says to Dilbert, handing him a piece of paper. “If the boss uses a buzzword on your card, you check it off. The objective is to fill a row.”

They go to the meeting, where their pointy-haired boss presides. “You’re all very attentive today,” he observes. “My proactive leadership must be working!”

“Bingo, sir,” says Wally.

This 1994 comic strip by Scott Adams is a perfect caricature of office speak: An oblivious, slightly evil-seeming manager spews conceptual, meaningless words while employees roll their eyes. Yet even the most cynical cubicle dwellers are fluent in buzzwords. An email might be full of verbal calisthenics, with offers to “reach out,” “run it up the flagpole,” and “circle back.” There are nature metaphors like “boil the ocean” and “streamline,” and food-inspired phrases like “soup to nuts” and “low-hanging fruit.” For the fiercest of office workers, there’s always the violent imagery of “pain points,” “drilling down,” and “bleeding edge.”

Over time, different industries have developed their own tribal vocabularies. Some of today’s most popular buzzwords were created by academics who believed that work should satisfy one’s soul; others were coined by consultants who sold the idea that happy workers are effective workers. The Wall Street lingo of the 1980s all comes back to “the bottom line,” while the techie terms of today suggest that humans are creative computers, whose work is measured in “capacity” and “bandwidth.” Corporate jargon may seem so meaningless that it’s best described as “bullshit,” but it actually reveals a lot about how workers think about their lives.

The mechanistic worker came of age amid a whirl of turbines at the turn of the century. The Second Industrial Revolution was well underway, and the massive companies run by titans like Andrew Carnegie and Henry Ford relied on factory assembly lines.

In 1911, Frederick Winslow Taylor published The Principles of Scientific Management, a book with one goal: destroy worker inefficiency. His theory, often called “Taylorism,” was all about maximizing every action on an assembly line. “There was a shift to the logic of science and efficiency,” Rakesh Khurana, a professor at Harvard Business School and soon-to-be-dean of Harvard College, told me. “Divide work into its smallest component parts, figure out the timing, remove any unnecessary inefficiencies. That was the way work was organized, and that had a huge impact on the way corporate culture was organized.” The words used to talk about workers in books and boardrooms were accordingly mechanistic, emphasizing accuracy, precision, incentives, and maximized production.

This idea started to shift in the late 1920s and ’30s. In 1924, the Australian sociologist George Elton Mayo started running a series of experiments at Hawthorne Works, a large factory of the Western Electric Company in the suburbs of Chicago. He set out with a simple task: figure out how the brightness of the lights in the factory affected worker productivity. But his team got some surprising results: Whenever the lights changed—no matter whether they got dimmer or brighter—workers got better at their jobs. They concluded that the workers’ physical environment wasn’t what made them better—it was that they thought their bosses were paying attention to them.

Mayo and his team quickly changed their focus: Instead of thinking of workers as cogs in a vast machine, they began thinking of them as living units of a large, complex social organism.

“In the 1930s, you begin getting this human relations perspective, in many ways in opposition to the scientific imagery,” Khurana said. “This is really about this notion that managers don’t understand the psychology of workers. By treating them as machines, they not only deny their humanity; it actually results in ineffective management, social disorganization, lack of cooperation, and an increase in tensions between labor and management.” Although the methodology of the Hawthorne experiment has since been criticized, the results triggered a shift in how researchers thought about workers.

This seemed to come at just the right time: The Great Depression had set in, and industries were in an existential crisis. “Alienation, absenteeism, labor turnover, wildcat strikes—these came to be associated not with meeting the workers’ economic needs, but their psychological and social needs,” Khurana said.

World War II liberated these theories from the halls of academia. Suddenly, organizational science was seen as a possible tool for understanding what had happened to nations like Germany and Japan. “What was it about the culture of those societies that led them to suddenly shift from what was seen as quite enlightened and advanced to suddenly becoming very authoritarian? The government became interested in this, and they started funding all sorts of studies,” Khurana said.

At the same time, American companies were changing. “Most of the large organizations that were emerging at this time were not in any single business,” Khurana said. “They were large, diversified conglomerates that had been created as a consequence of World War II and of the huge mergers and acquisitions activity that took place in the 1950s and ’60s. Firms like Pepsico owned trucking companies, even though they were in the food business.”

This made it more difficult for workers to feel a connection to their companies, Khurana said. “What people were very much focused on was: How can we get workers to feel differently about their jobs?”

For academics, this was as much a question of sociology as efficiency. It soon became a question of money, too: “As a manager, how can I maximize profits by creating a certain emotional atmosphere at my company?”

In trying to answer this question, office speak was born.

The Self-Actualizers

In the 1950s, two schools of thought began to emerge. At Carnegie Mellon, academics were working on what they called management science—a theory of decision-making inspired by the computers that had come out during World War II. Meanwhile, at MIT, three professors—Douglas McGregor, Edgar Schein, and Richard Beckhard—were creating a new field called organizational development.

Schein, now 86, is largely credited with coining the term organizational culture (the linguistic cousin of corporate culture). “In the 1960s, there was an emphasis on humanistic psychology, involving the worker, because then they would work better,” he told me. “We were interested in how groups and leadership could be made more effective. So we started something called the human relations lab.”

A pair of hypotheses rose out of these labs. As McGregor explained in his 1960 book The Human Side of Enterprise, managers could think of their employees in one of two ways: as lazy work-haters who need to be closely supervised (Theory X), or as ambitious self-motivators who thrive in an atmosphere of trust (Theory Y). “This introduced the idea that effective managers believe in their people and trust them and don’t feel that they have to monitor them all the time,” Schein said.

Although the researchers didn’t necessarily favor one theory over the other, Theory Y fit perfectly with the zeitgeist of the ’60s. It drew on Abraham Maslow’s increasingly popular theory of the hierarchy of needs, which positioned “self-actualization” as the highest goal of human life. Inspired by Maslow, Michael Murphy and Dick Price founded the Esalen Institute in 1962 to nurture the burgeoning Human Potential Movement, and Look magazine’s George Leonard helped bring it into the mainstream. Theory Y extended this worldview into the realm of work: Jobs, much like meditation and mind-enhancing drugs, were seen as a way to discover untapped inner power and find personal fulfillment. Over the years, the idea has stuck: In 2001, The Human Side of Enterprise was voted the fourth most influential management book of the 20th century by the Academy of Management.

In the decades that followed, academics continued to come up with memorable buzzwords. British psychologist Raymond Cattell repurposed the word synergy, which was originally a Protestant term for cooperation between the human will and divine grace. The UC Berkeley philosopher Thomas Kuhn popularized the term paradigm shift in his 1962 book, The Structure of Scientific Revolutions. And, much later, Harvard professor Clayton Christensen coined the term disrupt, which has become a favorite in today’s climate of start-up worship. But more importantly, academics have had a big effect on how workers work, all thanks to one group of people: consultants.

The Optimizers

Douglas McGregor may have written the fourth most influential management book of the 20th century, but Peter Drucker wrote the third: In his 1954 manifesto, The Practice of Management, he wrote that “the manager is the dynamic, life-giving element in every business.” Over the next five decades, Drucker helped companies find new ways to turn “resources”—people, in other words—into productivity engines.

In 1981, Drucker started working with one of his biggest clients: General Electric. The company had just been taken over by Jack Welch, who was looking to overhaul its management in the midst of a recession. Over the next decade, Welch systematically redesigned the culture of the organization, hitting a peak in 1989 with his Work-Out program, which was designed to help managers and employees solve problems faster. In the language of Work-Out, low-hanging fruit were problems that were easily identified and solved. Other fantastic jargon from the program included rattlers, or obvious problems (so named because they “make a lot of noise”), and pythons, or challenging problems that come from bloated bureaucracy. Somewhat ironically, Welch wrote that Work-Out would create “a company where jargon and double-talk are ridiculed and candor is demanded.”

Although Work-Out is credited with reinvigorating General Electric, other attempts to overhaul company culture failed miserably. After AT&T was broken up into multiple companies in 1984, the newly independent telephone service provider Pacific Bell hired two associates of Charles Krone, a California-based management consultant known for following the teachings of the Armenian mystic George Gurdjieff. His “leadership development” program, known as “kroning,” maintained that certain words helped employees communicate better, improving the health of the organization. Some 23,000 employees went through the $40 million training program, learning new terms like task cycle and functioning capabilities that were supposed to help them care more about their work and express themselves more clearly.

Instead, the company’s language became incredibly opaque. For example, its 1987 “statement of principles” defined “interaction” as:

The continuous ability to engage with the connectedness and relatedness that exists and potentially exists, which is essential for the creations necessary to maintain and enhance viability of ourselves and the organization of which we are a part.

When the San Francisco Chronicle reported that the training had caused widespread discontent, the California Public Utilities Commission started an investigation, and the program was discontinued. “Perhaps one thing that we learn from the Krone case,” wrote University of Richmond professor Joanne Ciulla in 2004, “is that attempts at engineering appropriate attitudes and emotions can actually undercut genuine feelings for a company.”

But even if firms like Bain, McKinsey, and Boston Consulting Group didn’t import New Age values into their consulting the way Krone and his associates did, they did develop distinctive, pseudo-scientific language to pitch themselves to clients. “They all had to come up with something new,” John Van Maanen, a management professor at MIT, told me.

For example, consultants are responsible for a lot of the veiled language used by today’s HR departments. “The consulting industry came up with a whole slew of euphemisms for firing people that has become universal,” said Matthew Stewart, the author of The Management Myth. “There’s a whole body of kind of Orwellian speak about developing human capital and managing people and all that.” Streamline, restructure, let go, create operational efficiencies: All of these are roundabout ways of saying that people are about to lose their jobs. The common theme among them is efficiency—after all, these are human resources, and what are resources for if not the company’s bottom line?

Read more at The Atlantic

