CHAPTER TWENTY-ONE

Management

The Problems of Success

THE BEST-KEPT SECRET IN MANAGEMENT is that the first systematic applications of management theory and management principles did not take place in business enterprise. They occurred in the public sector. The first systematic and deliberate application of management principles in the United States—undertaken with full consciousness of its being an application of management—was the reorganization of the U.S. Army by Elihu Root, Teddy Roosevelt’s secretary of war. Only a few years later, in 1908, came the first “city manager” (in Staunton, Virginia), the result of a conscious application of such then-brand-new management principles as the separation of “policy” (lodged in an elected and politically accountable city council) from “management” (lodged in a nonpolitical professional, accountable managerially). The city manager, by the way, was the first senior executive anyplace to be called a manager; in business, this title was still quite unknown. Frederick W. Taylor, for instance, in his famous 1912 testimony before the U.S. Congress, never used the term but spoke of “the owners and their helpers.” And when Taylor was asked to name an organization that truly practiced “Scientific Management,” he did not name a business but the Mayo Clinic.

Thirty years after the first city manager, Luther Gulick applied management and management principles to the organization of a federal government that had grown out of control in the New Deal years. It was not until 1950 and 1951, that is, more than ten years later, that similar management concepts and principles were systematically applied in a business enterprise to a similar task: the reorganization of the General Electric Company after it had outgrown its earlier, purely functional organization structure.

Today, surely, there is as much management outside of business as there is in business—maybe more. The most management-conscious of our present institutions are probably the military, followed closely by hospitals. Forty years ago the then-new management consultants considered only business enterprises as potential clients. Today half of the clients of a typical management consulting firm are nonbusiness: government agencies, the military, schools and universities, hospitals, museums, professional associations, and community agencies like the Boy Scouts and the Red Cross.

And increasingly, holders of the advanced degree in Business Administration, the MBA, are the preferred recruits for careers in city management, in art museums, and in the federal government’s Office of Management and Budget.

Yet most people still hear the words business management when they hear or read management. Management books often outsell all other nonfiction books on the bestseller lists; yet they are normally reviewed on the business page. One “graduate business school” after another renames itself “School of Management.” But the degree it awards has remained the MBA, the Master of Business Administration. Management books, whether textbooks for use in college classes or books for the general reader, deal mainly with business and use business examples or business cases.

That we hear and read business management when the word management is spoken or printed has a simple explanation. The business enterprise was not the first of the managed institutions. The modern university and the modern army each antedate the modern business enterprise by a half century. They emerged during and shortly after the Napoleonic Wars. Indeed, the first “CEO” of a modern institution was the chief of staff of the post-Napoleonic Prussian army, an office developed between 1820 and 1840. In spirit as well as in structure, both the new university and the new army represented a sharp break with their predecessors. But both concealed this—deliberately—by using the old titles, many of the old rites and ceremonies and, especially, by maintaining the social position of the institution and of its leaders.

No one could, however, have mistaken the new business enterprise, as it arose in the third quarter of the nineteenth century, for a direct continuation of the old and traditional “business firm”—the “counting house” consisting of two elderly brothers and one clerk that figures so prominently in Charles Dickens’s popular books published in the 1850s and 1860s, and in so many other nineteenth-century novels, down to Thomas Mann’s Buddenbrooks, published in 1901.

For one, the new business enterprise—the long-distance railroad as it developed in the United States after the Civil War, the Universal Bank as it developed on the European Continent, or the trusts such as United States Steel, which J. P. Morgan forged in the United States at the turn of the twentieth century—was not run by the “owners.” Indeed, it had no owners; it had “shareholders.” Legally, the new university or the new army was the same institution it had been since time immemorial, however much its character and function had changed. But to accommodate the new business enterprise, a new and different legal persona had to be invented, the “corporation.” A much more accurate term is the French Société Anonyme, the anonymous collective owned by no one and open to investment by everyone. In the corporation, shares become a claim to profits rather than to property. Share ownership is, of necessity, separate from control and management, and easily divorced from both. And in the new corporation capital is provided by large, often by very large, numbers of outsiders, with each of them holding only a minute fraction and with none of them necessarily having an interest in, or—a total novelty—any liability for, the conduct of the business.

This new “corporation,” this new “Société Anonyme,” this new “Aktiengesellschaft,” could not be explained away as a reform, which is how the new army, the new university, and the new hospital presented themselves. It clearly was a genuine innovation. And this innovation soon came to provide the new jobs—at first, for the rapidly growing urban proletariat, but increasingly also for educated people. It soon came to dominate the economy. What in the older institutions could be explained as different procedures, different rules, or different regulations became in the new institution very soon a new function, management, and a new kind of work. And this then invited study; it invited attention and controversy.

But even more extraordinary and unprecedented was the position of this newcomer in society. It was the first new autonomous institution in hundreds of years, the first to create a power center that was within society yet independent of the central government of the national state. This was an offense, a violation of everything the nineteenth century (and twentieth-century political scientists still) considered the “law of history,” and frankly a scandal.

Around 1860 one of the leading social scientists of the time, the Englishman Sir Henry Maine, declared in his book Ancient Law that the progress of history is “from status to contract.” Few phrases have ever become as popular and as widely accepted as this one.

And yet, at the very time at which Maine proclaimed that the law of history demands the elimination of all autonomous power centers within society, the business enterprise arose. And from the beginning it was clearly a power center within society and clearly autonomous.

To many contemporaries it was, and understandably so, a totally unnatural development and one that bespoke a monstrous conspiracy. The first great social historian America produced, Henry Adams, clearly saw it this way. His important novel, Democracy, which he wrote during the Grant administration, portrays the new economic power as itself corrupt and, in turn, as corrupting the political process, government, and society. Henry’s brother, Brooks Adams, a few decades later, further elaborated on this theme in one of the most popular political books ever published in the United States, The Degradation of the Democratic Dogma.

Similarly, the Wisconsin economist, John R. Commons—the brain behind the progressive movement in Wisconsin, the father of most of the “reforms” that later became the social and political innovations of the New Deal, and, last but not least, commonly considered the father of America’s “business unionism”—took very much the same tack. He blamed business enterprise on a lawyers’ conspiracy leading to a misinterpretation of the Fourteenth Amendment to the Constitution by which the corporation was endowed with the same “legal personality” as the individual.

Across the Atlantic in Germany, Walter Rathenau—himself the successful chief executive of one of the very large new “corporations” (and later one of the earliest victims of right-wing political terror in Germany, assassinated in 1922 while serving as foreign minister of the new Weimar Republic)—similarly felt that the business enterprise was something radically new, something quite incompatible with prevailing political and social theories, and indeed a severe social problem.

In Japan, Shibusawa Eiichi, who had left a promising government career in the 1870s to construct a modern Japan through building businesses, also saw in the business enterprise something quite new and distinctly challenging. He tried to tame it by infusing it with the Confucian ethic; and Japanese big business as it developed after World War II is very largely made in Shibusawa’s image.

Everyplace else, the new business enterprise was equally seen as a radical and dangerous innovation. In Austria, for instance, Karl Lueger, the founding father of the “Christian” parties that still dominate politics in Continental Europe, was elected lord mayor of Vienna in 1897 on a platform that defended the honest and honorable small businessman—the shopkeeper and the craftsman—against the evil and illegitimate corporation. A few years later, an obscure Italian journalist, Benito Mussolini, rose to national prominence by denouncing “the soulless corporation.”

And thus quite naturally, perhaps even inevitably, concern with management, whether hostile to it or friendly, concentrated on the business enterprise. No matter how much management was being applied to other institutions, it was the business enterprise that was visible, prominent, controversial, and above all, new, and therefore significant.

By now, however, more than a hundred years after management arose in the early large business enterprises of the 1870s, it is clear that management pertains to every single social institution. In the last hundred years every major social function has become lodged in a large and managed organization. The hospital of 1870 was still the place where the poor went to die. By 1950 the hospital had become one of the most complex organizations, requiring management of extraordinary competence. The labor union in developed countries is run today by a paid managerial staff, rather than by the politicians who are nominally at the head. Even the very large university of 1900 (and the largest then had only five thousand students) was still simple, with a faculty of, at most, a few hundred, each professor teaching his own specialty. It has by now become increasingly complex—including undergraduate, graduate, and postgraduate students—with research institutes and research grants from government and industry and, increasingly, with a large administrative superstructure. And in the modern military, the basic question is the extent to which management is needed and the extent to which it interferes with leadership—with management apparently winning out.

The identification of management with business can thus no longer be maintained. Even though our textbooks and our studies still focus heavily on what goes on in a business—and typically, magazines having the word management in their title (for example, Britain’s Management Today or Germany’s Management Magazin) concern themselves primarily if not exclusively with what goes on in business enterprises—management has become the pervasive, the universal organ of a modern society.

For modern society has become a “society of organizations.” The individual who conforms to what political and social theorists still consider the norm has become a small minority: the individual who stands in society directly and on his own, with no intermediary institution of which he is a member and an employee between himself and the sovereign government. The overwhelming majority of all people in developed societies are employees of an organization; they derive their livelihood from the collective income of an organization, see their opportunity for career and success primarily as opportunity within an organization; and define their social status largely through their position within the ranks of an organization. Increasingly, especially in the United States, the only way in which the individual can amass a little property is through the pension fund, that is, through membership in an organization.

And each of these organizations, in turn, depends for its functioning on management. Management makes an organization out of what otherwise would be a mob. It is the effective, integrating, life-giving organ.

In a society of organizations, managing becomes a key social function and management the constitutive, the determining, the differential organ of society.

The New Pluralism

The dogma of the “liberal state” is still taught in our university departments of government and in our law schools. According to it, all organized power is vested in one central government. But the society of organizations is a pluralist society. In open defiance of the prevailing dogma, it contains a diversity of organizations and power centers. And each has to have a management and has to be managed. The business enterprise is only one; there are the labor unions and the farm organizations, the health-care institutions and the schools and universities, not to mention the media. Indeed, even government is increasingly becoming a pluralist congeries of near-autonomous power centers, very different indeed from the branches of government of the American Constitution. There is the civil service, for instance. The last president of the United States who had effective control of the civil service was Franklin D. Roosevelt fifty years ago; in England it was Winston Churchill; in Russia, Stalin. Since their time the civil service in all major countries has become an establishment in its own right. And so, increasingly, has the military.

In the nineteenth century the “liberal state” had to admit the parties, though it did so grudgingly and with dire misgivings. But the purpose of the parties was the conquest of government. They were, so to speak, gears in the governmental machine and had neither existence nor justification outside of it.

No such purpose animates the institutions of the new pluralism.

The institutions of the old pluralism, that is, of medieval Europe or of medieval Japan (the princes and the feudal barons, the free cities, the artisans, the bishoprics and abbeys) were themselves governments. Each indeed tried to annex as much of the plenitude of governmental power as it could get away with. Each levied taxes and collected customs duties. Each strove to be granted the right to make laws, and to establish and run its own law courts. Each tried to confer knighthoods, patents of nobility, or titles of citizenship. And each tried to obtain the most coveted right of them all, the right to mint its own coins.

But the purpose of today’s pluralist institution is nongovernmental: to make and to sell goods and services, to protect jobs and wages, to heal the sick, to teach the young, and so on. Each exists only to do something that is different from what government does or, indeed, to do something so that government need not do it.

The institutions of the old pluralism also saw themselves as total communities. Even the craft guild, the powerful woolen weavers of Florence, for instance, organized itself primarily to control its members. Of course, weavers got paid for selling woolen goods to other people. But their guild tried as hard as possible to insulate the members against economic impacts from the outside by severely restricting what could be made, how much of it, and how and at what price it could be sold, and by whom. Every guild gathered its members into its own quarter in the city, over which it exerted governmental control. Every one built its own church with its own patron saint. Every one built its own school; there is still the Merchant Taylors’ School in London. Every one controlled access to membership in the guild. If the institutions of the old pluralism had to deal with the outside at all, they did so as “foreign relations” through formal pacts, alliances, feuds, and, often enough, open war. The outsider was a foreigner.

The institutions of the new pluralism have no purpose except outside of themselves. They exist in contemplation of a “customer” or a “market.” Achievement in the hospital is not a satisfied nurse, but a cured former patient. Achievement in business is not a happy work force, however desirable it may be; it is a satisfied customer who reorders the product.

All institutions of the new pluralism, unlike those of the old, are single-purpose institutions. They are tools of society to supply one specific social need, whether making or selling cars, giving telephone service, curing the sick, teaching children to read, or providing benefit checks to unemployed workers. To make this single, specific contribution, they themselves need a considerable measure of autonomy, however. They need to be organized in perpetuity, or at least for long periods of time. They need to dispose of a considerable amount of society’s resources, of land, raw materials, and money, but above all of people, and especially of the scarcest resource of them all, highly trained and highly educated people. And they need a considerable amount of power over people, and coercive power at that. It is only too easy to forget that in the not-so-distant past, only slaves, servants, and convicts had to be at the job at a time set for them by someone else.

This institution has—and has to have—power to bestow or to withhold social recognition and economic rewards. Whichever method we use to select people for assignments and promotions—appointment from above, selection by one’s peers, even rotation among jobs—it is always a power decision made for the individual rather than by him, and on the basis of impersonal criteria that are related to the organization’s purpose rather than to the individual’s purpose. The individual is thus, of necessity, subjected to a power grounded in the value system of whatever specific social purpose the institution has been created to satisfy.

And the organ through which this power is exercised in the institution is the organ we call management.

This is new and quite unprecedented. We have neither political nor social theory for it as yet.

This new pluralism immediately raises the question, Who takes care of the commonweal when society is organized in individual power centers, each concerned with a specific goal rather than with the common good?

Each institution in a pluralist society sees its own purpose as the central and the most important one. Indeed, it cannot do otherwise. The school, for instance, or the university could not function unless they saw teaching and research as what makes a good society and what makes a good citizen. Surely nobody chooses to go into hospital administration or into nursing unless he or she believes in health as an absolute value. And as countless failed mergers and acquisitions attest, no management will do a good job running a company unless it believes in the product or service the company supplies, and unless it respects the company’s customers and their values.

Charles E. Wilson, GM’s president (later President Eisenhower’s secretary of defense), never said, “What is good for General Motors is good for the country.” What he actually said is “What is good for the country is good for General Motors, and vice versa.” But that Wilson was misquoted is quite irrelevant. What matters is that everybody believed that he not only said what he was misquoted to have said, but that he actually believed it. And indeed no one could run General Motors—or Harvard University, or Misericordia Hospital, or the Bricklayers Union, or the Marine Corps—unless he believed that what is good for GM, or Harvard, or Misericordia, or the Bricklayers, or the Marines is indeed good for the country and is indeed a “mission” that, if not divinely ordained, is still essential to society.

Yet each of these missions is one and only one dimension of the common good—important yes, indispensable perhaps, and yet a relative rather than an absolute good. As such, it must be limited, weighed in the balance with, and often subordinated to, other considerations. Somehow the common good must be made to emerge out of the clash and clamor of special interests.

The old pluralism never solved this problem. This explains why suppressing it became the “progressive cause” and the one with which the moral philosophers of the modern age (that is, of the sixteenth through the nineteenth centuries) aligned themselves.

Can the new pluralism do any better? One solution is, of course, to suppress the pluralist institutions. This is the answer given by totalitarianism and is indeed its true essence. The totalitarian state, whether it calls itself Fascist, Nazi, Stalinist, or Maoist, makes all institutions subservient to and extensions of the state (or of the omnipotent party). This saves the “state” of modern political theory, but at the sacrifice of individual freedom, of free thought and free expression, and of any limitation on power altogether. The state (or the party) is then indeed the only power center, as traditional theory preaches. But it can maintain its monopoly on power only by being based on naked terror, as Lenin was the first to realize. And even at that horrible price, it does not really work. As we now know—and the experience of all totalitarian regimes is exactly the same, whether they call themselves Right or Left—the pluralist institutions persist behind the monolithic facade. They can be deprived of their autonomy only if they and society altogether are rendered unable to perform, for instance, through Stalin’s purges or Mao’s Cultural Revolution. What the totalitarian regimes have proved is that modern society has to be a “society of organizations,” and that means a pluralist society. The only choice is whether individual freedom is being maintained or is being suppressed and destroyed, albeit to no purpose other than naked power.

The opposite approach to that of the totalitarian is the American one. The United States, alone among modern nations, never fully accepted the dogma of the liberal state. It opposed to it, quite early in its history, a pluralist political theory, that of John C. Calhoun’s “concurrent majority.” In the way in which Calhoun presented his theory in the 1830s and 1840s, that is, as a pluralism exercised through the individual states and intended to prevent the breakup of the Union over slavery, the “concurrent majority” did not survive the Civil War. But thirty years later, Mark Hanna, the founder of the modern Republican party and of modern American politics altogether, reformulated Calhoun’s pluralism as a concurrent majority of the major “interests”: farmers, workers, business. Each of these three “estates of the realm” can effectively veto the majority. It must not impose its will on the others. But it must be able to prevent the others from imposing their will on it. Another thirty years later, Franklin D. Roosevelt made this the basic political creed of the New Deal. In Roosevelt’s system government became the arbiter whose job it is to make sure that no one interest gets too powerful. When Roosevelt came in, “capital”—business as a term came later, and management later still—appeared to be far too powerful. Farmers and workers were thus organized to offset the business power. And then, not so many years later, when the labor power seemed to become too great, farmers and business were organized to offset and balance labor power, and so on.

Each of the “interests” is free to pursue its own goals regardless of the common good; it is indeed expected to do so. In the darkest days of World War II, in 1943, when American troops still lacked arms and ammunition, John L. Lewis, the founder of the Congress of Industrial Organizations (that is, of modern American unionism) and the powerful head of the coal miners’ union, called a coal strike to get higher wages for his men, defying national wage controls. President Roosevelt attacked him publicly for endangering the nation’s survival. Lewis retorted: “The President of the United States is paid to look after the nation’s survival. I am paid to look after the interests of the coal miners.” And while the newspapers attacked Lewis harshly, public opinion apparently felt that Lewis had only said aloud what the Roosevelt administration had practiced all along. It gave Lewis enough support to win the strike.

This example, however, shows that the American pluralist doctrine is hardly adequate. Indeed, just as the old pluralism did, it has given birth to so many vested interests and pressure groups that it is almost impossible to conduct the business of government, let alone to conduct it for the common good.

In 1984–85 practically everyone in the United States agreed that the country needed a drastic tax reform to replace an increasingly complicated and irrational tax code with one that had only a few tax rates and eliminated exemptions. But no such code could be enacted. Every single exemption became the sacred cause of a vested interest. And even though some of them represented only a few hundred or a few thousand voters, each of them could and did block tax reform.

Is there a way out? The Japanese seem to be the only ones so far able to reconcile a society of organizations with the pursuit of the common good. The major Japanese interests are expected to take their cue from “what is good for the country”; they are then expected to fit what is good for themselves into the framework of a public policy designed to serve the national interest.

It is doubtful, however, whether even Japan can long maintain this approach. It reflects a past in which Japan saw herself as isolated in a hostile and alien world—so that all of Japan, regardless of immediate interests, had to hang together lest it hang separately. Will this attitude survive Japan’s success? And could such an approach have a chance in the West, where interests are expected to behave as interests?

Is this a problem of management, it will be asked? Is it not a problem of politics, of government, or of political philosophy? But if management does not tackle it, then almost inevitably political solutions will be imposed. When, for instance, the health-care institutions in America, the hospitals and the medical profession, did not take responsibility for spiraling health-care costs, government imposed restrictions on them, for example, the Medicare restrictions on the care of the aged in hospitals. These rules clearly are not concerned with health care at all and may even be detrimental to it. They are designed to serve short-run fiscal concerns of government and employers, that is, designed to substitute a different but equally one-sided approach for the one-sided, self-centered approach of the health-care “interests.”

This must be the outcome unless the managements of the institutions of the new pluralism see it as their job to reconcile concern for the common good with the pursuit of the special mission for the sake of which their institution exists.

The Legitimacy of Management

Power has to be legitimate. Otherwise it has only force and no authority, is only might and never right. To be legitimate, power has to be grounded outside of itself in something transcending it that is accepted as a genuine value, if not as a true absolute, by those subject to the power—whether descent from the gods or apostolic succession; divine institution or its modern, totalitarian counterpart, the scientific laws of history; the consent of the governed, popular election or, as in so much of modern society, the magic of the advanced degree. If power is an end in itself, it becomes despotism, both illegitimate and tyrannical.

Management has to have power to do its job, whatever the organization. In that respect there is little difference between the Catholic diocese, the university, the hospital, the labor union, and the business enterprise. And because the governing organ of each of these institutions has to have power, it has to have legitimacy.

And here we encounter a puzzle. The management of the key institutions of our society of organizations is by and large accepted as legitimate. The single exception is the management of the business enterprise. Business enterprise is seen as necessary and accepted as such. Indeed, society is often more concerned with the survival of a large business or an industry than it is with that of any other single institution. If a major business is in trouble, there is a crisis and desperate attempts to salvage the company. But at the same time, business management is suspect. And any exercise of management power is denounced as usurpation, with cries from all sides for legislation or for judicial action to curb if not to suppress managerial power altogether.

One common explanation is that the large business enterprise wields more power than any other institution. But this simply does not hold water. Not only is the business enterprise hemmed in on all sides in its power—by government and government regulations, by labor unions, and so on. The power of even the largest and wealthiest business enterprise is insignificant next to that of the university, now that a college degree has become a prerequisite for access to any but the most menial jobs. The university and its management are often criticized, but their legitimacy is rarely questioned.

The large labor union in Western Europe and in American mass-production industries surely has more power than any single business enterprise in its country or industry. Indeed in Western Europe, both in Britain and on the Continent, the large labor union became society’s most powerful institution in the period after World War II, more powerful sometimes than the nation’s government. The unions’ exercise of their power during this period was only too often self-serving, if not irresponsible. But even their bitterest critics in Western Europe and in the United States rarely questioned the unions’ legitimacy.

Another explanation—the prevalent one these days—is that the managements of all other institutions are altruistic, whereas business is profit-seeking and therefore out for itself and materialistic. But even if it is accepted that for many people nonprofit is virtuous, and profit dubious, if not outright sinful, the explanation that profit undermines the legitimacy of business management is hardly adequate. In all Western countries the legitimacy of owners, that is, of real capitalists, and their profits is generally accepted without much question. That of a professional management is not, yet professional management obtains profits for other people rather than for itself—and its main beneficiaries today are the pension funds of employees.

And then there is the situation in Japan. In no other country, not even in France or in Sweden, was the intellectual climate of the postwar period as hostile to “profit” as in Japan, at least until 1975 or so. The left-wing intelligentsia of Japan in the universities or the newspapers might have wanted to nationalize Japan’s big businesses. But it never occurred even to the purest Marxist among them to question the necessity of management or its legitimacy.

The explanation clearly lies in the image which Japanese management has of itself and which it presents to its society. In Japanese law, as in American and European law, management is the servant of the stockholders. But this the Japanese treat as pure fiction. The reality which is seen as guiding the behavior of Japanese big-business management (even in companies that are family-owned and family-managed like Toyota) is management as an organ of the business itself. Management is the servant of the going concern, which brings together in a common interest a number of constituencies: employees first, then customers, then creditors, and finally suppliers. Stockholders are only a special group of creditors, rather than “the owners” for whose sake the enterprise exists. As their performance shows, Japanese businesses are not run as philanthropies and know how to obtain economic results. In fact, the Japanese banks, which are the real powers in the Japanese economy, watch economic performance closely and move in on a poorly performing or lackluster top management much faster than do the boards of Western publicly held companies. But the Japanese have institutionalized the going concern and its values through lifetime employment, under which the employees’ claim to job and income comes first—unless the survival of the enterprise itself is endangered.

The Japanese formulation presents very real problems, especially at a time of rapid structural change in technology and economy when labor mobility is badly needed. Still, the Japanese example indicates why management legitimacy is a problem in the West. Business management in the West (and in particular business management in the United States) has not yet faced up to the fact that our society has become a society of organizations of which management is the critical organ.

Thirty years ago or so, when the serious study of management began, Ralph Cordiner, then CEO of the General Electric Company, tried to reformulate the responsibility of corporate top management. He spoke of its being the “trustee for the balanced best interest of stockholders, employees, customers, suppliers and plant communities”—the groups which would now be called stakeholders or constituencies. As a slogan this caught on fast. Countless other American companies wrote it into their Corporate Philosophy statement. But neither Mr. Cordiner nor any of the other chairmen and presidents who embraced his rhetoric did what the Japanese have done: institutionalize their professions. They did not think through what the best-balanced interest of these different stakeholders would mean, how to judge performance against such an objective, and how to create accountability for it. The statement remained good intentions. And good intentions are not enough to make power legitimate. In fact, good intentions as the grounds for power characterize the “enlightened despot.” And enlightened despotism never works.

The term enlightened despot was coined in the eighteenth century—with Voltaire probably its greatest and most enthusiastic exponent—when the divine right of princes was no longer generally accepted as a ground of legitimate power. The prince with the best intentions among eighteenth-century enlightened despots and the very model of the progressive, the enlightened liberal, was the Austrian emperor Joseph II (reigned 1765–90). Every one of the reforms that he pioneered was a step in the right direction—the abolition of torture; religious toleration for Protestants, Jews, and even atheists; universal free education and public hospitals in every county; abolition of serfdom; codification of the laws; and so on. Yet his subjects, and especially his subjects in the most advanced parts of his empire, the Austrian Netherlands, rose against him in revolt. And when, a few years later, the French Revolution broke out, the enlightened despots of Europe toppled like ninepins. They had no constituency to support them.

Because Ralph Cordiner and his contemporaries never even tried to ground management power in institutional arrangements, their assertion very rapidly became enlightened despotism. In the 1950s and 1960s it became corporate capitalism, in which an enlightened “professional” management has absolute power within its corporation, controlled only by itself and irremovable except in the event of catastrophe. “Stock ownership,” it was argued, had come to be so widely dispersed that shareholders no longer could interfere, let alone exercise control.

But this is hubris: arrogance and sinful pride, which always rides before a fall. Within ten years after it had announced the independence of management in the large, publicly owned corporation, “corporate capitalism” began to collapse. For one, stock ownership came to be concentrated again, in the hands of the pension funds.

And then inflation distorted values, as it always does, so that stock prices, which are based on earnings expectations, came to appear far lower than book values and liquidation values. The result was the wave of hostile takeovers that has been inundating the American economy these last years and is spilling over into Europe now. Underlying it is the assertion that the business enterprise exists, and solely, for the sake of stockholder profits, and short-run, immediate profits at that.

By now it has become widely accepted—except on Wall Street and among Wall Street lawyers—that the hostile takeover is deleterious and in fact one of the major causes of the loss of America’s competitive position in the world economy. One way or another, the hostile takeover will be stopped (on this see also Chapter 28 of this volume). It may be through a “crash”; speculative booms always collapse in the end. It may be through such changes as switching to different classes of common stock, with the shares owned by the outside public having a fraction of the voting power of the insiders’ shares, or by giving up voting rights for publicly held common shares altogether. (I owe this suggestion to Mr. Walter Wriston, the chairman emeritus of New York’s Citibank.)

No matter how the hostile takeover boom is finally stopped, it will have made certain that the problem of management legitimacy has to be tackled. We know some of the specifications for the solution. There have to be proper safeguards of the economic performance of a business: its market standing, the quality of its products or services, and its performance as an innovator. There has to be emphasis on, and control of, financial performance. If the takeover boom has taught us one thing, it is that management must not be allowed substandard financial performance.

But somehow the various “stakeholders” also have to be brought into the management process (for example, through the company’s pension plan as a representative of the company’s employees for whom the pension plan is the trustee). And somehow the maintenance of the wealth-producing and the job-producing capacity of the enterprise, that is, the maintenance of the going concern, needs to be built into our legal and institutional arrangements. It should not be too difficult. After all, we built the preservation of the going concern into our bankruptcy laws all of ninety years ago when we gave it priority over all other claims, including the claims of the creditors. But whatever the specifics, business management has to attain legitimacy; its power has to be grounded in a justification outside and beyond it and has to be given the “constitutional” sanction it still largely lacks.

Closely connected to the problem of the legitimacy of management is management’s compensation.

Management, to be legitimate, must be accepted as “professional.” Professionals have always been paid well and deserve to be paid well. But it has always been considered unprofessional to put money ahead of professional responsibility and professional standards. This means that there have to be limitations on managerial incomes. It is surely not professional for a chief executive officer to give himself a bonus of several millions at the very time at which the pay of the company’s other employees is cut by 30 percent, as the chief executive officer of Chrysler did a few years ago. It is surely not professional altogether for people who are employees and not “owners” to pay themselves salaries and bonuses greatly in excess of what their own colleagues, that is, other members of management, receive. And it is not professional to pay oneself salaries and bonuses that are so far above the norm as to create social tension, envy, and resentment. Indeed there is no economic justification for very large executive incomes. German and Japanese top managers surely do as good a job as American top managers—perhaps, judging by results, an even better one. Yet their incomes are, at the most, half of what American chief executives of companies in similar industries and of similar size are sometimes being paid.

But there is also work to be done on the preparation, testing, and selection of, and on the succession to, the top-management jobs in the large business enterprises; on the structure of top management; and on performance standards for top management and the institutional arrangements for monitoring and enforcing them.

Business management is not yet fully accepted as legitimate in the West because it has not yet realized the full implications of its success. Individual executives, even those of the biggest company, are largely anonymous. They only make asses of themselves if they try to behave as if they were aristocrats. They are hired hands like the rest of us. On the day on which they retire and move out of the executive suite they become “nonpersons” even in their old company. But while in office they represent; individually almost faceless, collectively they constitute a governing group. As such their behavior is seen as representative. What is private peccadillo for ordinary mortals becomes reprehensible misconduct and indeed betrayal if done by a leader. For not only is the leader visible; it is his duty to set an example.

But then there is also the big question of what is now being called the “social responsibility” of management. It is not, despite all rhetoric to the contrary, a social responsibility of business but of all institutions—otherwise we would hardly have all the malpractice suits against American hospitals or all the suits alleging discrimination against American colleges and universities. But business is surely one of the key institutions of a society of organizations and as such needs to determine what its social responsibilities are—and what they are not.

Surely business, like anyone else, is responsible for its impacts: responsibility for one’s impacts is, after all, one of the oldest tenets of the law. And surely, business, like anyone else, is in violation of its responsibilities if it allows itself impacts beyond those necessary to, and implicit in, its social purpose, for example, producing goods and services. To overstep these limits constitutes a tort, that is, a violation.

But what about problems that do not result from an impact or any other activity of business and yet constitute grave social ills? Clearly it is not a responsibility of business, or of any organization, to act where it lacks competence; to do so is not responsibility but irresponsibility. Thus when a former mayor of New York City in the 1960s called for “General Electric and the other big corporations of New York City to help solve the problem of the Black Ghetto by making sure that there is a man and father in the home of every Black Welfare Mother,” he was not only ridiculous. He demanded irresponsibility.

But also management must not accept “responsibility” if by doing so it harms and impedes what is its first duty: the economic performance of the enterprise. This is equally irresponsible.

But beyond these caveats there is a no-man’s-land where we do not even fully understand what the right questions are. The problems of New York, for instance, are in no way caused by business. They were largely caused by public policies business had warned against and fought against: primarily by rent control, which, as it always does, destroys the very housing the poor need, that is, decent, well-maintained older housing; by demagogic welfare policies; and by equally demagogic labor-relations policies. And yet when New York City was on the verge of self-destruction, in the late 1960s and early 1970s, a small group of senior executives of major New York business enterprises mobilized the business community to reverse the downward slide and to renew New York City—people like Austin Tobin of the Port of New York Authority; David Rockefeller of the Chase Manhattan Bank; Walter Wriston and William Spencer of Citibank; Felix Rohatyn of Lazard Frères, the private bankers; the top management of Pfizer, a pharmaceutical company; and several others. They did this not by “taking responsibility” for things they lacked competence in, for example, the problems of the black ghetto. They did it by doing what they were highly competent to do: they started and led the most dramatic architectural development of any major city since Napoleon III had created a new Paris and Francis Joseph a new Vienna a hundred years earlier. The black ghetto is still there, and so are all the ills associated with it, for example, crime on the streets. But the city has been revitalized.

And this did not happen because these businesses and their managements needed the city; excepting only the Port of New York Authority, they could all have moved out, as a good many of their colleagues—IBM, for instance, or General Electric, or Union Carbide—were doing. These businesses and their top managements acted because the city needed them, though, of course, they benefited in the end if only because a business—and any other institution—does better in a healthy rather than a diseased social environment.

Is there a lesson in this? There surely is a challenge.

Altogether, for the management of big business to attain full legitimacy, it will have to accept that to remain “private” it must discharge a social, and that means a “public,” function.

The Job as Property Right

When, in 1985, a fair-size Japanese company found itself suddenly threatened by a hostile takeover bid made by a group of American and British “raiders”—the first such bid in recent Japanese history—the company’s management asserted that the real owners of the business, and the only ones who could possibly sell it, were not the stockholders, but the employees. This was considerable exaggeration, to be sure. The real owners of a major Japanese company are the banks, as has already been said. But it is true that the rights of the employees to their jobs are the first and overriding claim in a large Japanese company, except when the business faces a crisis so severe that its very survival is at stake.

To Western ears the Japanese company statement sounded very strange. But actually the United States—and the West in general—may be almost as far along as Japan in making the employees the dominant interest in business enterprise, and not only in the large one. All along, of course, the employees’ share of the revenues of a business, almost regardless of size, exceeds what the “owners” can possibly hope to get: it ranges from four times as large (that is, 7 percent of revenues for after-tax profits, as against 25 percent for wages and salaries) to twelve times as large (that is, 5 percent for profits versus 60 percent of revenues for wages and salaries). The pension fund, moreover, has greatly increased the share of revenues that goes into the “wage fund,” to the point that in poor years the pension fund may claim the entire profit and more. And American law now gives the pension fund priority over the stockholders and their property rights in a company’s liquidation, far beyond anything Japanese law and Japanese custom give to the Japanese worker.
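The two multiples follow directly from the percentages just cited; as a quick check of the arithmetic, treating each figure as a share of total revenues:

\[
\frac{\text{wages and salaries}}{\text{after-tax profits}} = \frac{25\%}{7\%} \approx 3.6 \quad (\text{roughly “four times”}), \qquad \frac{60\%}{5\%} = 12 \quad (\text{“twelve times”}).
\]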

Above all, the West, with the United States in the lead, is rapidly converting the individual employee’s job into a new property right and, paradoxically, at the very time at which the absolute primacy of stockholder short-term rights is being asserted in and by the hostile takeover.

The vehicle for this transformation in the United States is not the union contract or laws mandating severance pay as in many European countries. The vehicle is the lawsuit. First came the suit alleging discrimination, whether in hiring an employee, in firing, in promotion, in pay, or in job assignment—discrimination on grounds of race or sex or age or handicap. But increasingly these suits do not even allege discrimination, but violation of “due process.” They claim that the employer has to treat the employee’s job, including the employee’s expectations for pay and promotion, as something the enjoyment of which and of its fruits can be diminished or taken away only on the basis of preset and objective standards and through an established process which includes an impartial review and the right to appeal. But these are the features that characterize “property” in the history of the law. In fact, they are the only features a right must possess to be called property in the Western legal tradition.

And as few managements yet seem to realize, in practically every such suit the plaintiff wins and the employer loses.

This development was predictable. Indeed, it was inevitable. And it is irreversible. It is also not “novel” or “radical.” Whatever gives access to a society’s productive resources—and thereby to a livelihood and to social function and status, and constitutes a major, if not the major, avenue to economic independence, however modest—has always become a “property right” in Western society. And this is what the job has become, and especially the knowledge worker’s job as a manager or a professional.

We still call land “real” property. For until quite recently it was land alone that gave to the great majority of mankind—95 percent or more—what “property” gives: access to, and control over, society’s productive resources; access to a livelihood and to social status and function; and finally a chance at an estate (the term itself meant, at first, a landholding) and with it economic independence.

In today’s developed societies, however, the overwhelming majority—all but 5 or 10 percent of the population—find access to and control over productive resources and access to a livelihood and to social status and function through being employees of organizations, that is, through their jobs. For highly educated people the job is practically the only access route. Ninety-five percent, or more, of all people with college degrees will spend their entire working lives as employees of an organization. Modern organization is the first, and so far the only, place where we can put large numbers of highly educated people to productive work and pay them for applying knowledge.

For the great majority of Americans, moreover, the pension fund at their place of employment is their only access to an “estate,” that is to a little economic independence. By the time the main breadwinner in the American family, white collar or blue collar, is forty-five years old, the claim to the pension fund is likely to be the family’s largest asset, far exceeding in value the equity in the home or the family’s personal belongings, for example, their automobiles.

Thus the job had to become a property right—the only question is in what form and how fast.

Working things like this out through lawsuits may be “as American as apple pie,” but it is hardly as wholesome. There is still a chance for management to take the initiative in this development and to shape the new property rights in the job so that they equally serve the employee, the company, and the economy. We need to maintain flexibility of employment. We need to make it possible for a company to hire new people and to increase its employment. And this means that we must avoid the noose the Europeans have put around their necks: the severance pay which the law of so many Continental countries mandates makes it so expensive to lay off anybody that companies simply do not hire people. That Belgium and Holland have such extraordinarily high unemployment is almost entirely the result of these countries’ severance-pay laws. But whichever way we structure the new property rights which the job embodies, there will be several requirements which every employer, that is, every organization, will have to satisfy. First, there must be objective and equal performance standards for everyone performing a given job, regardless of race, color, sex, or age. Second, to satisfy the requirements of due process, the appraisal against these standards of performance has to be reviewed by somebody who is truly disinterested. Finally, due process demands a right of appeal—something which, by the way, as “authoritarian” a company as IBM has had for more than half a century.

The evolution of the job into a “property right” changes the position of the individual within the organization. It will change equally, if not more, the position of the organization in society. For it will make clear what at present is still nebulous: organized and managed institutions have increasingly become the organs of opportunity, of achievement, and of fulfillment for the individual in the society of organizations.

Conclusion

There is still important work ahead—and a great deal of it—in areas that are conventionally considered “management” in the schools of management, in management journals, and by practicing managers themselves. But the major challenges are new ones, and well beyond the field of management as we commonly define it. Indeed, it will be argued that the challenges I have been discussing are not management at all, but belong in political and social theory and public law.

Precisely. The success of management has not changed the work of management. But it has greatly changed management’s meaning. Its success has made management the general, the pervasive function, and the distinct organ of our society of organizations. As such, management inevitably has become “affected with the public interest.” To work out what this means for management theory and management practice will constitute the “management problems” of the next fifty years.

(1986)
