The Election of Donald Trump and the Law of Small Numbers: A Statistical Note on the Choice of Incompetent Presidents.


Burton Weltman

Making too much out of too little.

Statisticians have long warned us not to violate what they call the Law of Small Numbers, that is, not to make too much out of too little by drawing big conclusions from a small sample of evidence.  At the same time, psychologists have told us that we are hardwired to do just that, programmed to reach hasty conclusions that would not survive reasoned reflection.  Without someone or something to make us stop and think, we all too often make decisions that we later regret.  And all of this, they tell us, is a result of evolution.

The tendency to reach hasty conclusions was, in fact, an evolutionary advantage for our puny ancestors, little rat-like mammals scurrying around trying to avoid being eaten by large predators.  For them, seeing a potential predator in a given place more than once was probably a good reason to avoid that place forevermore.  Taking extra precautions such as this was a key to survival.  But the fact of the matter was that the appearance and reappearance of that predator in that place was often more likely a matter of chance than a pattern of behavior.  There was probably nothing to fear, but better safe than sorry was the order of the day.  The extra precaution was wise, even though it was not statistically necessary.

We humans today are still operating under that primitive imperative of better safe than sorry, and we almost inevitably jump to broad conclusions based on limited data.  But what was a wise thing for our ancestors to do may be unwise for us.  Concluding, for example, that since your buddy was able to pick five winners in five horse races, you should place your life’s savings on his sixth tip, is probably unwise.  The sample of five winners in five races is just too small to reach a reasonable conclusion that your buddy knows what he is doing.  Unless, of course, you know that he has inside information and that the fix is in for the sixth race.

Making a silk purse out of a sow’s ear.

So, what does that have to do with the presidential election of Donald Trump?  Just this.  If you look at our system of electing Presidents in the United States, and read what commentators have been saying about it since the adoption of our Constitution, it is hard not to conclude that it is an extremely inefficient process.  Since at least the first quarter of the nineteenth century, the process has failed to select the best and most qualified people to be our Presidents.  And with the degeneration and declining influence of our political parties in recent years, and with the increasing influence of big money and the mass media, the process has gotten even worse in recent decades.  It is, at best, a random process, giving us maybe a fifty-fifty chance of having a decent or a disastrous President.

Under these circumstances, we have actually been very lucky in this country to have had so few disastrous Presidents in our history.  Yet we are surprised when someone as disastrous as Donald Trump gets elected.  We should not be surprised.  Our surprise is a function of our being fooled by the Law of Small Numbers.  Since a good majority of our Presidents have been at least decent, with disastrous Presidents seeming the exception rather than the rule, we think we have a system of elections that works reasonably well.  Well, we don’t.  And that has once again been proven to us in spades.

There are many specific reasons why Trump was elected.  It was seemingly a perfect storm of things that went wrong or went against Hillary Clinton’s campaign, and went right or in favor of Trump’s.  But there are also many things wrong with our system of electing Presidents which contributed to his victory, from the undemocratic Electoral College to the extraordinary length of our election campaigns.

Some of these things will likely never be fixed, including the Electoral College, but some can.  In particular, we need strong political parties that are organized from the bottom up, and a strong program of public financing of election campaigns.  For information and ideas about public campaign financing, you can consult the website of Democracy Matters, a grassroots organization dedicated to taking big money out of our politics.  With stronger political parties and public financing, we can minimize the influence of demagoguery through the mass media of the sort that Trump successfully engaged in during the last election, and we can minimize the undemocratic influence of billionaires and big corporations on our elections.

These things are doable.  And we should, at least, learn from this last election not to trust in the illusion of conclusions that violate the Law of Small Numbers.  Given the nature of our electoral system, and the odds of the game, something like Trump was to be expected.  We can do better.

1/31/17

 

False Equivalencies Equal Bad History and Bad Politics: Populism vs. Nativism/Sanders vs. Trump.


Burton Weltman

The current election cycle has featured two candidates for President, Bernie Sanders and Donald Trump, who were outsiders within the Democratic and Republican Parties, respectively.  Both succeeded beyond anyone’s expectations, seemingly even theirs.  Sanders came close to winning the Democratic nomination, and Trump actually won the Republican nomination.  Pundits and politicians have been grasping for months for an explanation of these candidates’ success.

A frequent explanation given by observers is that both candidates are “populists,” and that both are channeling the motives and emotions that were represented by the Populists of the late nineteenth century.  In turn, the challenges that Sanders and Trump have made to their parties’ establishments have been considered by the pundits to be equivalent.[1]  It is, however, neither historically nor politically accurate to label them both as populists, and it does a disservice to political discourse to propagate the idea that their challenges are equivalent.

Populism/populism.  Populism (with a capital “P”) was a late nineteenth-century political movement that hoped to sustain the viability of small farmers and small factories in this country through cooperative programs that would give them the economies of scale of big businesses and big farms (cooperative purchasing and selling agreements, sharing expensive equipment, working collectively on various tasks), and through government regulations that would keep big businesses and big banks from trampling on the little guys (limits on railroad rates, storage fees, bank loans, and price gouging).

Contrary to much present-day popular belief, Populists did not naively promote wild-eyed programs that had no chance of implementation or success.  Many Populist proposals were adopted at the state level, and they worked.  Some were enacted at the federal level, and they worked, too.  There is, in fact, no good reason why we have to have giant corporate farms or giant corporate businesses in most things.  Small can be beautiful, and can work.

The United States Supreme Court, however, was controlled in the late nineteenth century by a group of Justices who literally believed that laissez-faire was written into the Constitution, and they overturned most Populist legislation on Constitutional grounds.  This did not mean that Populists were unrealistic.  New Dealers faced a similar obstacle with a conservative Supreme Court during the 1930’s.  They were eventually able to overcome that hurdle.  Populist ideas were widely popular.  Given some changes in circumstances, Populism could conceivably have become the conventional wisdom of the country, and Populist policies might have resulted in a very different and possibly better America.

William Jennings Bryan is one of the reasons Populism failed.  He is often labeled a Populist, but he was not a Populist.  He ran his 1896 campaign as essentially a Silverite.  The Silverites believed that if the federal government backed its currency with silver and not merely gold, all would be well with small farmers and businesses.  Populists supported the coining of silver money, but did not see it as a cure-all.  Rather than promoting Populism, Bryan’s campaign for President, with its single-minded focus on a silver bullet solution, helped to kill it.  He is responsible in large part for the Populists’ reputation as being unrealistic.

Populist programs were revived during the New Deal by Secretary of Agriculture Henry Wallace through the Department of Agriculture.  He promoted the TVA (there were going to be five such projects), cooperative farm programs, cooperative small factories, and Greenbelt cities.  And all of these programs worked.  Wallace’s programs died as a result of the conservative political backlash of the late 1930’s, and so did Populism.

Populism was a big and broad movement.  As such, it included many different factions and tendencies.  Populism had a racist and xenophobic element at its fringes.  So did the Socialist movement, the Progressive movement, and the labor movement at that time.  In fact, the taint of racism and xenophobia has attached itself to almost every movement in American history.  But they were not major themes in Populism and they have not been major elements in Leftwing movements generally, as they have been in Rightwing movements to the present day.

Populism (with a capital “P”) was a producers’ movement that focused on the ways and means of producing things, with a goal of sustaining smallish scale production.  The populism (with a small “p”) that survives to the present day is a consumers’ movement that focuses on getting better wages, working and living conditions, and social services for ordinary people and people who are hard up.  The Bernie Sanders campaign was part of that movement.  And his campaign began with some very sensible proposals about what could be done now at the state level (single payer healthcare, higher minimum wage, environmental protections and many other populist programs can be adopted at the state level) and at the federal level (executive orders can do a lot).  As success went to his head, Sanders’ claims became progressively less realistic, but that does not mean that his campaign was based on naivete and wild-eyed proposals.

Nativism.  Nativism is not populism.  It is the Rightwing response to populism.  Nativism is the use of fear of racial and ethnic minorities as a means of promoting the social status quo and protecting the social position of those in control of a society.  Nativism, along with its components racism and xenophobia, has typically risen in this country in times of populist upheaval and social reform.  Nativism is an attempt to squelch social change through fear and hatred.  Its mantra is that change will help only Them and will hurt Us.

Contrary to popular opinion among pundits, George Wallace was not a populist.  He was a racist nativist.  Donald Trump is not a populist.  He is a racist and xenophobic nativist.  Nativism, racism and xenophobia are founded on the status anxiety of people with a little something, who are willing to support those who have the best and most of everything, in order to fend off and stay above those who have little or nothing.  Racism is the answer to the question of why Southern white small farmers supported with their lives a system of slavery that well served a few rich planters, but hurt them in every way other than in their ability to feel themselves above the black slaves.  Racism is the answer to the question of why so many whites today are so opposed to Obamacare.

Populism is a positive program of reform based on hope.  Nativism is a negative program of hate based on fear.  Populists and nativists are appealing to some of the same constituencies, but there is no equivalency in the appeals.

Consequences.  In turn, I would predict that there will be little equivalency in the consequences of the Sanders and Trump campaigns.  Assuming that Trump loses, his has been a campaign based on the lies that immigrants are taking American jobs, committing lots of crimes, and living on the dole paid for by good white American taxpayers.  His base has been old white people.  The lies will out and the old people will die out.  That could and should be the end of his influence.  Sanders’ campaign was based on truths about healthcare and wages, and his base has been young people.  His truths and his base will likely only grow so that even if he is personally done, his movement could and should live on.  Of course, if Trump wins, then all bets are off and God help us.

9/23/16

[1] See, for example, John Judis, “All the Rage,” The New Republic, September 19, 2016.

Do unto others before they do unto you: The Devolution of Conservatism from Burke to Trump And the Evolution of Pragmatic Liberalism from Madison to Obama.


 

Burton Weltman

 

“We’ve got what they want, and we aim to keep it.”

Vice President Spiro Agnew

 

Prelude: A Concern with Unintended Consequences.

My purposes in writing this essay are twofold.  First, I will outline what I see as the devolution of conservatism from its high point at its origins in the eighteenth century to its low point in the present day as blatant racism, ethnocentrism and mere obstructionism.  I will focus on the historic concern of conservatives with the potential for unintended negative consequences in undertaking social reform, and their claim that negative results invariably overwhelm any positive change.  Edmund Burke, the father of conservatism, voiced this concern during the eighteenth century as a legitimate question of whether and how we can predict the results of social reform.

What began as a legitimate concern about unintended consequences devolved over the years into an excuse by conservative politicians to oppose any change that might negatively impact their wealthy sponsors.  That practice eventually devolved into a justification for opposing any program that might help racial and ethnic minorities, a coded appeal to the racial fears of white people.  In the current election cycle, what had been a coded appeal to bigotry has become open fearmongering and hate peddling by Donald Trump.  I will argue that the turning point in the devolution of conservatism came with the advent of Social Darwinism at the turn of the twentieth century, and the acceptance of its basic premises by most conservative politicians.

Second, I will argue that the evolution in the early twentieth century of pragmatism as a comprehensive social theory and practice undermined the rationale for conservatism and transformed the rationale for liberalism.  Backed by the methods of the then newly emerging social and physical sciences, pragmatism offered a way for social reforms to be subjected to experimental methods, ongoing evaluation, and continuous revision.  This pragmatic review process could effectively mitigate most legitimate concerns about the unintended consequences of reform, rendering conservatism obsolete.  Politics could safely become a realm of continuous social reform, which is the position represented by President Obama.

Act I.  Actions, Reactions, and Reactionaries: The Birth of Liberalism and Conservatism.

“To every action there is always opposed an equal reaction.”

                 Isaac Newton.

“Ambition must be made to counteract ambition.”

               James Madison.

“We must all obey the law of change.  It is the most powerful law of nature.”

              Edmund Burke.

 

Setting the Scene: Let us reason together.

It was the turn of the eighteenth century.  Europeans had suffered through almost two centuries of political upheaval and religious wars.  The Protestant Reformation had precipitated the Catholic Counter-Reformation, which had led to Protestants and Catholics slaughtering each other, and to both Christian groups killing Muslims and Jews.  At the same time, the decline of feudalism had precipitated the economic upheaval of nascent capitalism, with land enclosures creating massive unemployment and unrest.

Europe was, however, about to enter a period that contemporaries called the Enlightenment, in which prominent intellectuals and their backers tried to leave behind the superstitions, authoritarianism and violence of previous centuries.  And it was a period of relative calm compared to the recent past, despite the rivalry of England and France, which fought a series of imperial wars from the 1690’s through the 1810’s.  During one of those wars, the French helped a group of North American colonies gain their independence from England and establish the United States.  Calmness and control were watchwords in culture and society during the period.  These goals were reflected in the scientific and political theories and practices of the time, which included the rise of liberalism and conservatism as political philosophies.[1]

Isaac Newton’s World: Inertia, Friction and Orderly Change.

The eighteenth century marks the definitive opening act of modern science and politics.  By modern, I mean the theories and practices from which we most closely derive our own ideas today.  There are many people who can be cited as precursors of modernity, for example Bacon and Galileo in the physical sciences.  But their ideas were not given full exposition until the work of Isaac Newton at the turn of the eighteenth century.  Newton established a framework that dominated the physical sciences for some two hundred years.  Most notably, in his Three Laws of Motion, Newton reversed scientific theories that dated back to Aristotle, and rejected common sense human experience as well.

In his First Law of Motion, Newton claimed that something in motion would continue moving in a straight line forever unless it was disturbed by some change in circumstances, some force that pushed it out of its inertial course.  That law directly contradicted Aristotle’s ancient theory, and our common sense experience, that a thing must be continuously pushed by a force in order to continue in motion.  In our common experience, things grind to a halt unless they are pushed.  That is mainly the result of friction, but since we live in a world of friction, we usually take it for granted, and do not factor it in as a countervailing force in our thinking about things.  Since we have little experience of things moving in a vacuum, where there is no friction, Newton’s First Law is counter-intuitive to most of us.

Newton’s Second Law of Motion describes the change in circumstances, that is, the force, necessary to change the inertial course of something – to start it, stop it, or redirect it.  His Third Law emphasizes that for every action, there is an equal and opposite reaction.  Push and you will be pushed back.  This also seems counter-intuitive to most of us, as we do not experience as pushback the inertial resistance of something we are pushing.  We merely think of it as the heaviness of the object, not that the object is pushing back at us.

In his Laws of Motion, theories of gravity, and other work, Newton described a mechanical universe of complementary and competing forces, in which things take their customary course ad infinitum, unless they are forced to change by natural or unnatural circumstances.  These Laws of Motion were not only counter-intuitive to common sense experience, they also described a more orderly picture of the world than was experienced by most people.  Most Europeans were still reeling from the consequences of the religious and political wars of the sixteenth and seventeenth centuries, and the social and economic upheaval of nascent capitalism.  Most ordinary people lived precarious lives in circumstances that seemed in constant turmoil.  In the religious and political beliefs of most people, the only thing that kept things going and kept them in order was the constant intervention of God, the King and/or some strong outside force.

Newton disagreed.  Although Newton was a deeply religious man, who spent more time and effort on his studies of religion and ethics than on science, his scientific theories delineated a universe very different from that portrayed in conventional religious and political theory.  Contrary to the conventional view of the world as constantly teetering on turmoil, he portrayed a universe which was essentially stable, and in which ordinary people could choose to keep things the same or change them.  He was, thereby, describing the essence of our modern world view.[2]

Newtonian Politics and The Rise of Conventional Political Ideology.  Developments in political theory and practice during the eighteenth century followed a course similar to that of physics.  Reflecting a Newtonian view of the universe, newly evolving political theories described a political world that operated mechanically and predictably, instead of on the edge of chaos, and in which people could choose their governments, no longer tethered to Divine Right Kings.  In this political development, liberalism came first, and conservatism came in reaction.

The liberal and conservative ideologies that emerged during this time dominated political theory and practice in England and America for some two hundred years.  They are still influential today.  Aspects of these ideologies were developed by Thomas Hobbes and John Locke during the seventeenth century, Hobbes a conservative forerunner, Locke a liberal forerunner.  Their ideas were given full exposition during the eighteenth century in the theories and practices of the liberal James Madison and the conservative Edmund Burke.[3]

Liberalism: The Obvious Truth.  The term “liberal” began as an ethical concept that denoted generosity.  A liberal person was someone, usually a person of station and means, who gave generously to the less fortunate in society.  During the eighteenth century, the term was extended to politics.  In politics, a liberal was a social reformer and social planner, usually a person of station and with a formal education, whose proposals were designed to make society fairer and more efficient, and were generously intended to help the less fortunate and oppressed in society.

Political liberals, like most devotees of the Enlightenment, believed in the power of Reason (with a capital R).  They generally held that one could derive self-evident truths through reasoning, and then develop social policies based thereon.  They were planners, who thought that if something was wrong, they could rationally design a fix for it.  They were impatient with tradition, which they saw as the sepulchral grip of the dead hand of the past choking the present, and insisted on change as the function of reason.  Nature was, to them, something to be tamed and made to work for humans.  Finely landscaped gardens, neatly plowed and hedged wheat fields, and clearly mapped roads and routes were their ideal of nature.  Human nature had similarly to be tamed and bounded, even as social problems were being solved.

Most eighteenth century liberals assumed a social hierarchy in which the People would instinctively defer to their natural leaders, that is, to those in the social and educational elite of society, so long as those leaders fulfilled their natural obligations to rule on behalf of the People.  Government was the result of a contract with the People, and the People acted as a check on the elite.  The Declaration of Independence and the Constitution of our Founding Fathers exemplified eighteenth century liberal theories.  Based on “self-evident truths,” the Declaration outlines a philosophy of “liberty, equality and the pursuit of happiness” that is derived from Reason, and that balances the rights and duties of subjects with the powers and duties of their rulers.

The Constitution follows the philosophy of the Declaration in establishing a government of separate powers that were expected to check and balance each other, even as they worked together to “promote the general welfare” and provide other social goods for “We, the People.”  The Constitution describes a Newtonian political universe of actions and reactions.  Its original provisions even established different mechanisms and constituencies for the selection of members of the different branches of government.  The purpose of this complicated process was to ensure that no one group in society would dominate the government, and that the majority could not oppress minority groups.  It was also intended to facilitate the selection of members of the elite to most offices.

While the Founders were concerned with restraining politicians from running wild and ruining things, the Constitution also assumes an active government and continuous social reform.  It provides the federal government with powers to make changes in almost every area of society, including the government itself.  It is a short document, short on specifics, and therefore needs constantly to be interpreted and re-interpreted according to changes in society.  It also contains provisions for amending itself and, thereby, assumes that government must be changed as society changes.  Liberal social reform is incorporated into the fabric of the Constitution.

Critics of the Enlightenment have frequently contended that liberals of that time foolishly believed in the inevitability of progress.  That is not the case.  While many Enlightenment liberals, including Thomas Jefferson, the primary author of the Declaration of Independence, and James Madison, the primary expositor of the Constitution, may in some ways have been foolish, they believed only in the possibility, and not the inevitability, of progress.  The weakness in their proposals often lay in the paucity of evidence on which they were based.  Relying heavily on examples from ancient history, especially those of Greece and Rome, and on inevitably biased accounts of recent events, the Founding Fathers often rushed to judgments that proved wrong.  Although they relied on the best available evidence, that evidence was often not good enough.

The rationale for the American Revolution was, for example, based on an inappropriate comparison of George III with Charles I, and on inaccurate reports from England about the doings and desires of the King.  The Revolution may have been a mistake.  The Founding Fathers were also seemingly mistaken in their expectations of the outcome of the Revolution, which is why they so quickly abandoned the Articles of Confederation, for which they had fought, and established a very different government in the Constitution.  Government and politics under the Constitution, in turn, turned out to be very different from what they had intended and expected.[4]  This weakness in the predictive powers of liberal reformers opened the door for a conservative counterattack.

Conservatism: Old Truths are the Best.  Edmund Burke is almost universally considered the father of modern conservatism.  He was also almost universally considered by his contemporaries to be a man of principle.  As an example, although Burke opposed the liberal philosophies embodied in the Declaration of Independence and the Constitution, he supported the American revolutionaries in their battle for independence from British rule.  A conservative supporting a revolution, and bucking his own political party and party leadership to boot.  To most of us today, this seems like odd behavior for the ur-conservative.  But that points to the difference between what most people think of as conservatism today, the way it is represented by most so-called conservatives of the Social Darwinian school, and what conservatism represented in the past.

The term “conservative” began as an ethical concept that denoted caution and frugality.  The term was extended to politics during the late eighteenth and early nineteenth centuries as part of the Romantic revolt against the Enlightenment and against liberal rationalism.  It is popularly thought that conservatives have always opposed all social change, and that they have wanted everything to stay the same or even go back to the way things were in the past.  This is not the case.  Conservatives have historically accepted cautious social change.  People who oppose any and all progressive social change are more accurately called “right-wingers.”  Right-wingers generally represent interest groups that benefit from the status quo, and that fear social reform would entail a loss of power, profit and/or status.  And it is so-called “reactionaries,” not Burkean conservatives, who peddle nostalgia for the so-called “good old days” (which usually weren’t so good), and who want things to go back to the way they supposedly were in the past.

In contrast to right-wingers and reactionaries, Burke believed in incremental evolutionary change.  He rejected planned change, but accepted adaptive change.  He believed that society is strongest when it changes so gradually that the changes are barely noticed from generation to generation, and may only be recognized from a long historical distance.  He believed that tradition was the distilled wisdom of the ages.  And he believed that human reason was too weak and short-sighted to safely predict the consequences of social planning.  Burke insisted that the unintended negative consequences of social reforms were almost inevitably going to be greater than the positive effects.  However bad things were now, they would likely be worse if people took action to remedy the situation.

Burke’s insistence on the limits of reason, and his concern with the unintended consequences of reform, are the most powerful legacy he left to conservatives.  These ideas have historically been conservatives’ strongest argument against social reform.  They constitute an almost universal argument that can be used against almost any proposed reform.  Burke did not, however, oppose all reform.  He would support social reform if the survival of the social system seemed to require it, and if conscience and human decency seemed to demand it.

Burke believed in a hierarchical society controlled by an elite upper class.  But Burke’s elite could not merely pursue their own self-interest, even if it was justified with some sort of trickle-down theory of social benefits, as right-wingers proclaim today.  Burke’s elite were burdened with the obligation of caring for society, which included the noblesse oblige of the upper class to take care of the masses, a sort of mandatory charitable giving.  He was a vehement opponent of democracy, which he warned would lead to the subjugation of society by an ignorant mass.  But he also opposed oppression of the masses and persecution of racial and religious minorities by the elite.  His insistence on treating people decently was considered a matter of honor among conservatives during the nineteenth century, even if it was a principle that was almost always more honored in the breach.  It is a legacy that is all but gone among so-called conservatives today.

It was based on what he considered respect for tradition and the demands of decency that Burke supported the American revolutionaries.  He claimed that the King and Parliament had taken advantage of the British victory over the French in America in 1763 to radically change the terms on which the American colonies were being governed, and that tradition was being violated.  He thought also that the British government was being too harsh in its treatment of the colonists, and that noblesse oblige was being violated.  As a result, he believed the Americans were justified in rebelling against British misrule.

Burke had a deep respect for the facts.  The historical facts, the facts of evolutionary social change, and the facts of present-day problems were the foundation of his conservative ideology.  He accepted what was, and he did not hanker after what could be or what had been.  He challenged both liberals and reactionaries with what he saw as the facts.  This respect for facts also made him flexible.  He disdained Reason (with a capital R), but attempted to be reasonable.  He was the founder of conservative ideology, but he was not a conservative ideologue.

Act II. Dogmatism versus Pragmatism: Ideologues versus Ideology.

“It is not the strongest of the species that survive, nor the most intelligent, but the most responsive to change.”

“If the misery of the poor be caused not by the laws of nature, but by our institutions, great is our sin.”

             Charles Darwin.

“The social order is fixed by laws of nature precisely analogous to those of the physical order.”

“Millionaires are a product of natural selection…Poverty and misery will exist in society just so long as vice exists in human nature.”

            William Graham Sumner.

“Our institutions, though democratic in form, tend to favor in substance privileged plutocracy.”

“Selfishness is the outcome of limited observation and imagination.”

           John Dewey.

Setting the Scene: Trying to find order in the midst of disorder.

It was the turn of the twentieth century. Change was the order of the day.  The nineteenth had been a century of revolution.  Europeans and Americans had suffered through the beginnings of the industrial revolution, which had produced enormous wealth for plutocrats but misery for the working classes, huge cities ringed by wealthy suburbs but with slums in their center, an abundance of goods but want among the masses, powerful inventions but large-scale environmental degradation, and miraculous medical advances but widespread disease.  There had also been a host of political revolutions, civil wars and other upheavals, as democratic aspirations gradually overcame aristocratic opposition in Europe and America.

The intellectual world was upended by the emergence of the specialized physical and social sciences, with their empirical and statistical methods, replacing the traditional emphasis on the classics and on Reason.  A cultural revolution was instigated by the publication in 1859 of Charles Darwin’s The Origin of Species, and it gathered force over the remainder of the century.  The book put the theory of evolution and the consequences of evolution at the center of moral, intellectual and political life, where they remain today.

Charles Darwin’s World: Pragmatism, Relativism, and Probabilities.

The turn of the twentieth century was the age of Darwin.  Evolution was both the rage and a source of outrage.  Agnostics and atheists saw it as vindication of their beliefs or non-beliefs.  Protestant fundamentalists and Biblical literalists, in turn, damned it as sacrilege.  Scientists saw it as encouragement to take a more probabilistic and relativistic view of their fields.  Philosophical positivists and intellectual absolutists damned that as nihilism.  And it led some leading liberals and conservatives to revise their respective political beliefs, much to the chagrin of purists in both camps who damned that as unprincipled and immoral backsliding.

Theories of Evolution.  Darwin’s was not the first theory of evolution.  In the early nineteenth century, Jean-Baptiste Lamarck had proposed what became a widely popular theory of evolution in which he claimed that creatures could genetically pass on to their progeny characteristics that they had acquired during their lifetimes.  Under Lamarck’s theory, for example, it could be said that giraffes acquired their long necks by dint of successive generations of giraffes stretching up to reach leaves at the tops of trees.  This theory implied that human families, ethnic groups, and racial groups could improve themselves through personal achievements that they then passed down to their descendants.  This seemed to mean that people were ultimately responsible for their own biological and social successes and failures.  A moral value could be attached to biological characteristics and to social success or failure.  People got what they deserved.

Darwin rejected Lamarck’s theory.  His theory was based instead on two key ideas, random variation and natural selection, and it was these ideas that generated most of the opposition to his theory among religious fundamentalists.  Darwin claimed that new characteristics are not acquired through personal effort but through random genetic variation, essentially through what we would call mutation.  We cannot tell how or why these mutations occur.  It is pure happenstance to us.

This idea outraged many religious people and was greeted with glee by atheists.  It does not, however, necessarily mean that God is out of the evolutionary picture.  What is random to us humans could be planned by God.  It does not even mean that the creation stories in the Book of Genesis are invalid, if you read the stories metaphorically rather than literally.  The Catholic Church and most liberal Protestant groups read the Bible metaphorically and, therefore, have had no problems with Darwin’s theories.  But Protestant fundamentalists and Biblical literalists have rejected this view, and have rejected evolutionary theory.  They have, in turn, created havoc with the science programs in many American school districts from that time to the present.

Darwin also claimed that species survive and thrive based on their adaptability, which he called natural selection.  Under natural selection, a creature either successfully responds to environmental changes and challenges, or it fails and disappears.  Living things survive by trying to fit themselves into the existing environment.  They are assimilationists.  But they also try to better fit the environment to themselves.  They are social and environmental reformers.  The impetus for social reform is, thus, built into the structure of life.  Without it, we would die out.

Cultural relativism and ethical pragmatism are implicit in Darwin’s theory, and political and religious dogmatists have rejected Darwinian ideas for this reason.  According to Darwin the ability of living creatures to survive and thrive is based on the adaptability of their beliefs and practices.  If they adopt beliefs that do not work toward survival, they will disappear along with those beliefs.  If circumstances change and they are not willing or able to change with them, they will not survive.  Humans and other living creatures must take a tentative and probabilistic approach to beliefs and practices, willing and able to change them as circumstances require.

Darwin is popularly known for two main ideas, neither of which was his, but which were the foundation of Social Darwinism.  They are the idea of survival of the fittest, and the idea that there are inevitably losers as well as winners in evolution.  The latter idea derives from the population theories of Thomas Malthus.  Malthus claimed that population growth inevitably outpaces resources, and there are not enough resources to satisfy everyone.  In Malthus’ view, it is only through war, disease and famine that the human population has been kept under relative control.  And he opposed charity for the poor because it would only encourage them to have more children.

Malthus’ ideas are the inspiration for what is today known as the “zero-sum” theory of economics.  According to this theory, there is a limited amount of wealth in the world, not enough to make everyone well-off, and if some people get more, others must get less.  Darwin was inspired by Malthus’ population growth theory as an explanation for the rise and fall of the population of some species, but he did not use it as a general explanation of evolution.  Nor did Darwin think that human evolution was inevitably Malthusian.

Survival of the fittest was a term invented by Herbert Spencer.  Spencer had been a devotee of Lamarck’s evolutionary theory, and he believed that fitness was a moral achievement.  Social success, like biological success, was a personal achievement that made a person fit to survive and thrive.  Social failure, according to Spencer, was a sign of genetic unfitness and unfitness to survive.  Darwin adopted the phrase “survival of the fittest” in his later works, but without any of the moral overtones that Spencer gave it.

Fitness did not mean for Darwin that one was the strongest, smartest, most powerful, most socially successful, or best in any other way except that one was able to fit oneself to the environment and fit the environment to oneself.  Spencer became a well-known supporter of Darwin’s biological theories, but used them to support his own so-called Social Darwinian social and economic theories, which neither Darwin nor his actual theories supported.[5]

The Influence of Evolution on Philosophy and Science.  The theory of evolution ushered in a sea change in science from a positivist emphasis on finding absolute natural laws to proposing relativistic and probabilistic theories.  Mendel’s genetic principles in biology, Einstein’s theories of relativity in physics, and Heisenberg’s uncertainty principle in quantum mechanics were among the early-twentieth-century scientific advances that promoted a relativistic approach to truth.  William James’ radical empiricism and John Dewey’s experimentalism were among the philosophical applications of evolutionary theories.  This turn toward relativism on the part of scientists and philosophers generated an emotional reaction against science and philosophy among religious fundamentalists that continues to the present day in the United States.

It is a reaction that is based on misunderstanding.  Relativism does not mean that anything goes, or that there are no standards.  Relativism is not nihilism.  In saying that something is relative, one must always be willing to respond to the question “Relative to what?”, and be able to delineate some stable benchmark that provides a standard for evaluating the relativity of the thing.  In evolutionary theory, for example, survival is the standard by which things are evaluated.  In pragmatist philosophy, whether something works as an answer to a question is the standard.[6]           

The Evolution of Evolutionary Politics: Pragmatist Action, Dogmatist Reaction.  During most of the nineteenth century, liberals and conservatives shared many basic ideas, and their programs often overlapped.  Both liberal and conservative movements were broad-based, with a wide range of beliefs within each movement, and with the left-wing of conservatism shading into liberalism and the left-wing of liberalism shading into socialism.  Both groups had to adapt to the democratic trends of the time, and both hoped to bring order to democracy through the leadership of a meritocratic elite, albeit they had different types of elite in mind.

Conservatives generally looked to the rich to lead society.  Thomas Carlyle, among others, eulogized capitalists as “captains of industry” who ought to take command of society.  Liberals generally focused on education as the primary criterion for leadership, as they for the most part still do today.  John Stuart Mill, the leading liberal of the nineteenth century, advocated that those with more formal education should get more votes than those with less education, and Karl Marx, the leading socialist, promoted leadership by political theoreticians such as himself.

Both liberals and conservatives sought to promote industrialization, but with different emphases on how wealth should be distributed, and what sort of role government should play in the economy.  Both groups believed that government should encourage growth, and discourage corruption and crass exploitation.  Conservatives generally favored government intervention in the economy only if a problem was so severe that it threatened the social system.  Liberals generally supported government action to deal with a wide range of social ills.  Conservatives did, however, support reform on humanitarian grounds.  It was English conservatives in the early nineteenth century who first proposed labor laws to protect working women and children.  And Abraham Lincoln, the ur-Republican, was a corporate lawyer who also supported labor rights as well as an end to slavery.

During the last half of the nineteenth century, economic and political events challenged the ideologies of both liberals and conservatives in the United States.  Economic depressions, violent labor disputes, rampant infectious diseases, overcrowded cities, rising crime rates, and other crises upset the orderly ideas of both groups.  Darwinian ideas of evolution came along at a time when both liberals and conservatives were looking for explanations of what was going on.

Avant garde intellectuals and activists among both liberals and conservatives seized on evolutionary ideas, but with very different applications and very different results.  The application of Darwin’s ideas to politics produced major splits within the ranks of liberals and conservatives, with the old guard in both groups fighting rear-guard actions to the present day.  An ever-widening split also developed between the Darwinian liberals and Darwinian conservatives who increasingly came to dominate the Democratic and Republican parties.

Social Darwinism: Every Man for Himself.  Social Darwinism was adopted by many erstwhile conservatives at the turn of the twentieth century as a rationale for control of society by the wealthy, and as a strategy for convincing the masses to support rule by the rich.  Historians have debated exactly how many people used the term Social Darwinism to describe themselves.  It is clear, however, that the ideas and the strategy represented by the term became increasingly influential among conservatives starting in the late nineteenth century and continuing to the present, even as conservatives increasingly rejected Darwinian theories of evolution.

These ideas can be summed up in two phrases, Malthusian catastrophe and survival of the fittest.  The strategy can be summed up in one word, fear.  A Malthusian catastrophe is when the downtrodden masses rise up and use up all the resources that the rest of us need to thrive, so that we all go down to a hellish existence together.  Malthusianism is the prediction of dystopia unless the masses are kept strictly in check.  It is an idea that gained currency when the closing of the American frontier in the 1890s seemed to presage the closing down of opportunity, and has gained traction in the present day, when globalization seems to have a similar import.

Survival of the fittest means the cultivation of wealth and a cult of the wealthy.  According to this theory, laissez-faire capitalism is the competitive law of nature translated into an economic system, and it is ostensibly the single greatest vehicle for human evolution.  The winners in cutthroat capitalism are the best specimens of humanity, and having won the economic race are the ones who should lead the human race.  The losers in the race should be left behind, lest they become a drag on the rest of us.  This winner-takes-most theory is sometimes rationalized as what has come to be called “trickle-down” economics and culture.  The claim is that when the rich get more of something, some collateral benefits will trickle down to the rest of society.

Fear-mongering was the strategy to implement this theory.  It was a means of convincing those people who have little to support the reign of those people who have a lot in order to protect themselves against those people who have nothing.  Social Darwinism was an ideology and a strategy that allowed conservatives to eschew concern for the welfare of the masses that Burke had considered a matter of honor.  The poor get what they deserve, which is nothing, as do the rich, which is a lot.  Those who have a little bit are frightened into aligning with the rich.

In this theory, the last shall stay last because they chose their own fate.  This view of the poor gave conservatives an even more powerful argument against social reforms than Burke’s concern with unintended consequences.  According to this theory, giving to the poor only wastes precious resources and threatens catastrophe for the rest of us.  As Vice President Spiro Agnew once opined, the downtrodden want what we’ve got, and we’ve got to make sure they don’t get it.  Fear trumps decency, and we have to do unto them before they do unto us, meaning the masses have to be tricked into compliance when possible, repressed into compliance when necessary.[7]

From Herbert Spencer, William Graham Sumner and Andrew Carnegie at the turn of the twentieth century, to William Buckley, Joseph McCarthy, Richard Nixon, and Spiro Agnew in the mid-twentieth century, to George Will, George W. Bush, Dick Cheney, and Donald Trump in the twenty-first century, the proponents of Social Darwinian ideas and strategies have gained increasing prominence among so-called conservatives, and especially within the Republican Party.  Some conservative followers of Ayn Rand, such as Rand Paul and Paul Ryan, have taken to calling themselves libertarians, but they are still Social Darwinians.  All of them should really be called right-wingers or reactionaries, not conservatives in the Burkean sense.

Whatever they call themselves, their ideology is based on the twin principles of zero-sum and laissez-faire economics, and on a strategy of fear.  The strategy promotes nativism, since only those like us can be trusted, and racism, since those unlike us must be feared, especially those who look different.  And Social Darwinian right-wingers are constantly looking for an enemy to fear.  Although Burke and his conservative descendants were by no means loath to use extreme force and fierce repression against those they considered dangers to the social order, they did not work overtime to invent dangers in order to justify their rule, as have generations of Republicans in the United States.

From the swarthy tramps, immigrants and anarchists at the turn of the twentieth century, to the blacks and bearded Communists in the mid-twentieth century, to the blacks, Hispanics, Arabs, Muslims, and olive-skinned immigrants in the early twenty-first century, fear-mongering has increasingly been the primary strategy of Republicans.  The Other is the danger, and repression is the answer.

With the decline and fall in the late twentieth century of the Soviet Union and Communists as threats, conservatives were hard put to find an enemy with which to scare the public.  George H.W. Bush was so desperate that he invaded Panama to overthrow Manuel Noriega, a former CIA operative and well-known drug trafficker, who had somehow become a grave danger to America.  Noriega is still in jail today, and drug trafficking is more widespread than ever.  The desperation implicit in this type of scaremongering demonstrates the depth of the worry among right-wing politicians that without a dangerous Other to fear, the public might no longer support their retrograde policies.  In the same vein, George W. Bush invaded Iraq to destroy weapons of mass destruction that were not there, with disastrous consequences that continue to the present.

The history of the Republican Party during the twentieth century has been the gradual decline, and now almost complete fall, of Burkean conservatives within the party.  This is a development which is popularly characterized as the disappearance of so-called moderate Republicans.  From Teddy Roosevelt, to Wendell Willkie, to Nelson Rockefeller, the Republican Party had for much of the twentieth century a progressive wing that curtailed the extremism of Republican right-wingers, and was willing to work with moderate Democrats toward bipartisan policies.

But with the rise of Newt Gingrich as Speaker of the House of Representatives in the 1990s, who shut down the federal government rather than cooperate with President Bill Clinton, and with the advent of the current Speaker Paul Ryan along with Senate Majority Leader Mitch McConnell, who have stonewalled every proposal of President Obama for the last seven and one-half years, right-wing Social Darwinians have taken over the Republican Party.  The recent nomination of Donald Trump for President only confirms what has been obvious for some time.

Darwinian Pragmatism and Progressivism.  The term Social Darwinism was a misnomer twice over.  It was not a social but an anti-social doctrine, a doctrine of selfish, self-centered individualism.  And it was not a Darwinian but an anti-Darwinian doctrine that ran contrary to Darwin’s conclusion that humans have thrived because of their pro-social tendencies.  The pro-social implication of Darwinism was one of the reasons that conservatives increasingly came to reject Darwin’s actual theories of evolution over the course of the twentieth century, even as they increasingly embraced Social Darwinian ideas and strategies.

Darwin contended that socialization rather than individualism was the key to human success.  It was because of our cooperativeness, not our competitiveness, that we humans have done as well as we have.  And, Darwin complained, it is largely a result of competitiveness and our sometimes selfish individualism that we have frequently done so poorly.  The pro-social implications of Darwinism were first given extensive treatment in 1883 in Lester Frank Ward’s book Dynamic Sociology.  In one of the first texts of the emerging field of sociology, Ward outlined a pragmatically socialist Darwinism as the genuine evolutionary theory.

Pragmatism was one of the outcomes of Darwin’s evolutionary theories, seemingly an unintended consequence, but one that was quite influential and helpful.  Pragmatism is a philosophy that describes the world as a succession of circumstances, actions and consequences, with the consequences of an action becoming the circumstances that lead to the next round of actions.  Pragmatism is a philosophy of action.  Pragmatists focus on the convergence of theory and practice into action, or what is sometimes called praxis, and they explain the world as a confluence of interconnected actions.  Pragmatism is a preeminently pro-social philosophy, and it is an approach that can be applied to almost all human activities and fields of study.

Pragmatism developed from humble beginnings to become a comprehensive philosophy.  The term pragmatism was first proposed in the late nineteenth century by Charles Sanders Peirce as a contribution to lexicology, that is, a theory about the meaning of words.  Peirce claimed that the meaning of a word was our reaction to it and the action which it implies.  That is, what the word does to us and what we do as a result of the word.  A word, according to Peirce, is a call to action.[8]  Others took his concept of pragmatism as a call to action in a widening circle of fields.

William James took up Peirce’s ideas and applied them first to psychology.  His was a psychology of action, interaction and reaction.  Portraying the mind as “a stream of consciousness,” in which thoughts flow from one to the next in a constant interaction with each other and with the world, James claimed that the mind is neither a passive recipient of knowledge from the outer world nor an organ of logical conjugation.  Thinking is a dynamic activity in which the mind reaches out to the world, and interacts with it.  Thinking is a process of action and interaction.

James claimed, in turn, that our personal identities are defined by how we act toward people and things, and how they react to us.  We are our actions and interactions.  Contrary to Descartes’ claim that personal identity results from the reflection that “I think, therefore I am,” James proffered the explanation that “I think, therefore we are.”  That is, the only way I can know that I am, and who I am – the only way I can say “I” and be referring to my singular self — is through comparing and contrasting myself with others.  And the only way I can know who others are is by doing things with them.  Action, interaction and reaction are all we can know of ourselves.

James later extended these ideas to epistemology, that is, into a theory of knowledge.  Rejecting the Enlightenment idea of Reason (with a capital R) that ostensibly produced self-evident truths, he insisted that we know about things only from interacting with them.  We learn through doing, through action and reaction, precipitated by problems that we need to resolve.  Without the prod of problems, we would function solely on the basis of habit, and never think about anything in any significant way.  When problems arise that interfere with our habitual existence, we ask questions of the world, seek answers to those questions by looking for relevant evidence, and then either find answers or not.  Knowledge is a product of problem-solving, and expanding the realm of knowledge is a product of asking bigger questions and making wider and deeper connections among things.[9]

John Dewey took James’ idea of learning through doing and made it the cornerstone of his pedagogical theories.  It is a fact of life, he said, that we learn through what we do.  For example, a student who passively sits and takes notes about a subject in class is going to mainly learn how to sit still and take notes.  He or she is not going to learn very much about the subject.  It is only by actively engaging with the subject, and doing something with it, that the student will learn much of lasting value.  In formulating his educational theories, Dewey did something that pragmatists have frequently tried.  He took a fact of life and derived a proposed reform from it, in this case, a successful educational practice.

Dewey also extended the idea of learning through doing into an ethical theory which essentially embodies the Golden Rule that we should love our neighbors as ourselves, and we should do unto others as we would have them do unto us.  In formulating his educational ideas, Dewey took a fact of life and made it into an admonition.  In his ethical theories, he took an admonition and claimed it was a fact of life.  Dewey claimed that we do, in fact, love our neighbors in the way that we love ourselves.  The problem is that many of us do not think much of ourselves and, as a result, think the same of others.  People who think well of themselves will think well of others, Dewey concluded, and people who think well of others will think well of themselves.

Dewey claimed, in turn, that we do, in fact, treat others as we think they will treat us.  The problem is that many of us are afraid that other people will treat us badly, so we treat them that way first.  Too many people operate under the Social Darwinian principle of “Do unto others before they do unto you” with the meaning that you should get your goods first before others get them.  Dewey would reinterpret that mantra and have us do good unto others before they do anything to us.  People who treat others well will likely be treated well by others, he claims.  He proposes this tactic as a means of establishing a virtuous cycle of people treating each other well, as opposed to the Social Darwinian vicious cycle of people treating each other badly.[10]

Pragmatism was a theory and practice that underlay the emergence of the physical and social sciences at the turn of the twentieth century.  Through most of the nineteenth century, most of what we today call the physical sciences were studied and taught under the umbrella of natural philosophy, and most of the social sciences were studied and taught as moral philosophy.  There was, however, an explosion in the number of academic fields toward the end of the century, with the rise of the multitude of specializations in the physical and social sciences that have produced most of the scientific advances of the twentieth century.  These scientific advances were powered by newly developed experimental and statistical methods, and pragmatism was a driving force in these developments.

Pragmatism was, in turn, a driving force behind the emergence of the Progressive movement in the early twentieth century.  Progressivism was a broad-based and multifarious social movement, encompassing politics, culture, education, and virtually every aspect of modern life, from fashion to the arts to social policy.  It was a movement, not merely a party or a faction, and, as such, it included many different tendencies, and even some conservatives who bowed to its popularity.  In the midst of the swirling trends, Dewey and other pragmatist scholars, journalists and politicians developed a progressive social theory that ran directly counter to the Social Darwinism that was gaining strength among conservatives.

They took as a main theme Hegel’s claim that the self-development of each person is dependent on the self-development of others, and Marx’s formulation of this as “the self-development of each is dependent on the self-development of all,” and vice versa.  That is, a person can only make something worthwhile of him/herself while working with others, so that each of them and the society as a whole prospers.  Social Darwinians claimed that we live in a top-down zero-sum world, and relied on fear to rally support among the masses.  Progressives countered that we live in a cooperative world in which all boats rise together.  They promoted hope as their means to gain popular support.  Theories based on cooperation and strategies based on hope underlay almost all of the progressive social, political, educational, and cultural developments during the twentieth century, and are the gist of pragmatic liberalism to the present.

Act III. The Obsolescence of Conservatism and the Birth of Fascism?

“We build too many walls and not enough bridges.”

            Isaac Newton.

“In republics, the great danger is, that the majority may not sufficiently respect the rights of the minority.”

            James Madison.

“The only thing necessary for the triumph of evil is for good men to do nothing.”

            Edmund Burke.

Where do we go from here?

Fast forward a hundred years from the turn of the twentieth century to the turn of the twenty-first.  Pragmatic liberalism has become the predominant philosophy of the Democratic Party.  The Progressive Era reforms under Woodrow Wilson, the New Deal under Franklin Roosevelt, the Great Society of Lyndon Johnson, and the healthcare reforms of Barack Obama have all been a product of that philosophy.  The fact of the matter is that pragmatic methods, backed by the tools of the social and physical sciences, can make social reform safe and successful.

Social reform in Burke’s time was a blunt instrument.  Social reformers conceived of a reform, and then tried it.  They had little ability to predict the consequences of a reform, or to monitor and revise the reform as it was being implemented.  If it worked, that was fine.  If it didn’t, that was too bad, and people had to live with the negative consequences.  With the specialization of the social and physical sciences that emerged in the late nineteenth century, social reform was revolutionized, and a pragmatic approach to social reform became possible.  Since that time, we have developed statistical methods, social and economic models, testing regimes of all sorts, and a myriad of ways to evaluate whether or not a proposed social change is working.  The development of computers has enormously enhanced our abilities in this regard.  We can monitor the progress of a reform, see whether it is producing unintended negative consequences, and make adjustments accordingly.  We can protect ourselves against most of the unintended negative consequences that might arise from a reform.

The means and methods of pragmatic liberalism have absorbed and resolved the concerns of conservatives and the rationale for conservatism.  The flexibility that the new techniques and technologies bring to the process of social reform has undermined the core concern of conservatives about unintended consequences.  Conservatism has essentially become obsolete, and pragmatic social reform should be the order of the day.  Social reform can and should become the conventional wisdom of our society.  But only if the politics of our society will permit it.

Most of the problems that have developed in social programs over the last century have, in fact, been political problems, the result of either liberal proponents overselling their proposals or right-wing opponents obstructing the programs.  The present-day problems with Obamacare are only the latest example.  Burkean conservatives should have no big problem with Obamacare.  It is a market-based system that is motivated by common decency.  But Republican right-wingers have been determined to wreck the program, regardless of its successes, and irrespective of harm to individuals and society.  The program has, as a result, suffered from right-wing political obstruction, and reformers have been severely hampered in their efforts to revise and reform the program.

The Republican Party has, unfortunately, turned aggressively against the Progressive Republicanism that was promoted by Theodore Roosevelt and Bob La Follette in the early twentieth century, and against the moderate policies of the so-called Rockefeller Republicans of mid-century.  The Party has turned, instead, towards a radical Social Darwinism that is today epitomized by Donald Trump.  Over the last six years, the right-wing Republicans who control Congress have stonewalled every pragmatic proposal from President Obama and obstructed his work at every turn.  Meanwhile, Trump, the Republican presidential candidate, is flirting with fascism in both theory and practice.  We are a long way from the days of Newton, Madison and Burke, but their actions and their words still speak loudly, and they don’t speak well of Trump or the Republican Party.

Postscript.

Not the End of Ideology but the Beginning of Politics: Pragmatic not Technocratic.

In 1960, the sociologist Daniel Bell published a book called The End of Ideology in which he claimed that ideological conflicts were coming to an end, and were being superseded by the technocratic administration of things.  He claimed that the future society would be a managed capitalism, in which technocratic elites would administer things that needed coordinating, and in which conflicts would take place only among experts around the technical edges of things.  The grand battles over ideas and utopias that had previously occupied history were obsolete and over.

In this prediction, Bell, a one-time Marxist, turned one of Marx’s utopian hopes on its head: the hope that once capitalism was overthrown and a communist regime fully implemented, government would wither away, leaving only a minimal, non-coercive administration of things that needed coordinating.  Bell, still a social democrat but no longer a radical, applied the idea to capitalism.

The possibility of a capitalist system managed by a technocratic elite was not a new idea in 1960.  The Comte de Saint-Simon and Auguste Comte had proclaimed similar things during the nineteenth century.  Adolf Berle and Gardiner Means had predicted the evolution of competitive capitalism into managerial capitalism during the twentieth century.  Francis Fukuyama has predicted similar things in more recent years.  I don’t agree, and I think pragmatic politics should not be confounded with technocratic administration.

The gist of my argument in this essay is that the knee-jerk conservative objection to social reform, that we cannot sufficiently predict the unintended consequences of a reform, has lost its legitimacy.  We can sufficiently monitor most social reforms to make sure they are working as they should, and adjust them if they wander off course.  But that does not mean we will be ruled over by apolitical technical experts.

Our ability to plan and monitor social reforms does not mean the end of ideology or politics.  To the contrary, there will always be differences among people as to values and goals.  These will almost inevitably take the form of ideologies, and lead to political debates and struggles.  Rather than ending ideology and politics, the new pragmatic liberalism opens the door to ideologies and politics that are not bogged down by the knee-jerk nay-ism of conventional conservatism.  We should all be pragmatic liberals of one sort or another, but the differences will still be significant.

[1] On the Enlightenment, see Peter Gay. The Enlightenment: An Interpretation. New York: Vintage Books, 1968.

[2] On Isaac Newton, see James Gleick. Isaac Newton. New York: Pantheon Books, 2003.

[3] On James Madison, see Garry Wills. James Madison. New York: Henry Holt and Company, 2002.

On Edmund Burke, see Conor Cruise O’Brien. The Great Melody: A Thematic Biography of Edmund Burke.  Chicago: University of Chicago Press, 1994.

[4] On the coming of the Revolution and the making of the Constitution, see Gordon Wood. The Creation of the American Republic, 1776-1787. Chapel Hill, NC: University of North Carolina Press, 1969.  I have written extensively on whether and how the Revolution and the Constitution may have been based on mistaken analyses and expectations in several posts on this blog and in my book Was the American Revolution a Mistake? Bloomington, IN: AuthorHouse, 2013.

[5] On Charles Darwin, see Loren Eiseley. Darwin and the Mysterious Mr. X. New York: E.P. Dutton, 1979.

[6] For the influence of Darwin on philosophy in general and pragmatism in particular, see John Dewey. The Influence of Darwin on Philosophy. Bloomington, IN: Indiana University Press, 1910.

[7] On Social Darwinism, see Richard Hofstadter. Social Darwinism in American Thought. Boston: Beacon Press, 1955.

[8] On Charles Sanders Peirce and the origins of Pragmatism, see Louis Menand.  The Metaphysical Club: A Story of Ideas in America.  New York: Farrar, Straus and Giroux, 2001.

[9] On William James, see Robert Richardson. William James: In the Maelstrom of American Modernism. Boston: Houghton Mifflin, 2007.

[10] On John Dewey, see Robert Westbrook. John Dewey and American Democracy. Ithaca, NY: Cornell University Press, 1991.

 

Donald Trump and the Contours of American Decision-Making:

Do we suffer from a collective thinking disorder?

And what can we learn from Star Trek?

Burton Weltman

“Insanity is repeating the same mistakes over and over again and expecting different results.”

Attributed to Albert Einstein.

A.  The Irony of American Decision-Making: A History of Self-Defeating Policies.

“It is curious how often you humans manage to obtain that which you do not want.” 

Mr. Spock on Star Trek.

Why, in the name of peace, has the United States been involved in more wars since the founding of our country than any other country during that period of time (we have been at war in over 200 of our 239 years)?[1]  Why, in the name of self-protection, does the United States have the highest rate of gun ownership (some 88.8 guns per 100 people), but also the highest rate of gun deaths of any industrialized country?[2]  Why, in sum, do American policies often seem to produce the things they are supposed to prevent?  Is this crazy, or what?

Take for example the invasion of Iraq in 2003.  The United States conquered Iraq and overthrew its government in order to eliminate weapons of mass destruction that might threaten us.  It was a preemptive strike to eliminate a potential threat.  There was no evidence that the weapons existed, but George Bush, Dick Cheney and Donald Rumsfeld inveigled the mass media, scared the public, and stampeded the Congress into supporting the invasion.  It turned out, of course, that there were no such weapons.  In fact, the patient containment policy of President Clinton during the 1990’s had seemingly led Saddam Hussein to destroy Iraq’s chemical weapons.

The conquest was also supposed to help bring stability to the Middle East.  As a consequence of the invasion, however, Iraq became a haven for terrorists who continue to pose an actual threat to us.  The invasion also ignited a firestorm of violence in the Middle East and around the world that we are still struggling to get under control.  The Iraq invasion stands as one example among many in our history of the irony of provoking violence in the name of preventing it.

Take also the example of America’s gun policies.  Americans are the most highly and widely armed people in the history of the world.  The ostensible goal is for people to be able to protect themselves against violence.  And every time there is a significant incidence of gun violence in the country, people buy even more guns as a preemptive move to protect themselves.  But this is a self-defeating policy both for individual people and the populace as a whole.

The data is clear that people who own guns are more likely to be shot than those who do not.  And the most likely persons to be shot with a gun that you own are you and people you know.  There is almost no chance that you will ever use your gun to protect yourself or anyone else.  Significantly, states within the United States with looser gun policies have higher rates of gun violence than those with tighter gun restrictions.  Nonetheless, the recent trend has been mainly toward even looser gun controls in those states with the greatest gun violence.  The National Rifle Association and other gun groups largely funded by gun manufacturers have manipulated the mass media, made people afraid of each other and of the government, and stimulated a national obsession with owning guns.

The consequence of these policies is that guns are so easily and widely available in the United States that conflicts which in other times and places might be settled with fists or, at worst, clubs and knives, are often settled with automatic weapons.  It also seems to have become the case that guns are so widely possessed that whenever a police officer confronts someone with something in his hand, the officer feels he has to assume it is a gun, and frequently decides he has to shoot.  A wallet, a toy truck, anything can be taken for a gun.  As a result, we are currently experiencing a reign of fear between the police and the people they are supposed to protect, with each group scared of the guns possessed by the other.

Our gun obsession has given the United States the highest rate of gun violence of any industrialized country, all in the name of self-protection.  American gun policies stand as another instance of the irony of trying to prevent violence with violence.[3]  Why do we so often adopt these sorts of self-defeating policies?

B.  Confounded Founding: Was the American Revolution a Mistake?

“In critical moments, men sometimes see exactly what they wish to see.”

Mr. Spock on Star Trek.

Concerns about self-defeating policies are not new in our history.  They date back at least to the founding of the country.  In the 1780’s, after having defeated the most powerful army in the world and gained independence for the United States from England, George Washington expressed the overwhelming sentiment of the Founding Fathers when he complained about the outcome of the Revolution.  “Have we fought for this?!” Washington lamented as he surveyed the social and political landscape of the United States.  The Founders had expected that peace and harmony would reign among Americans after their English overlords had been expelled.

But the country seemed, instead, to be in chaos.  Social classes were in constant conflict with each other, pitting rich against poor, farmers against bankers, cities against the countryside.  Worse still, the populace was refusing to follow the lead of the Founders, who had expected to be recognized as the natural and rightful leaders of the people once the British were gone.  Demagogic upstarts and hucksters were taking center stage, and vying with the Founders for political power.  What had the Founders gotten wrong?  The problem may have been of their own making, and its roots may be seen in the Declaration of Independence.

The Declaration of Independence opens with the words “When in the course of human events it becomes necessary to….”  The document then goes on to make a case for why it was necessary for the American colonists to revolt against English rule.  The argument consists of two main parts.  The first part is a concise statement of the natural rights theories of John Locke and Francis Hutcheson, to the effect that when a government becomes tyrannical, it is not merely the right but the duty of people to overthrow it.

The second part is a lengthy list of grievances against the King of England which purports to demonstrate that Americans have a duty to revolt against him.  This part opens with the words “The history of the present King of Great Britain is a history of repeated injuries and usurpations, all having in direct object the establishment of an absolute Tyranny over these States” (emphasis added).  That is to say, the King was in the midst of a long-term plan to become a tyrant; he was not already a tyrant.  The revolt was a precautionary and preemptive move against a king who intended to become a tyrant.  Consistent with this opening statement, the specific grievances that follow are mainly prospective in nature and effect.  The first is “He has refused his Assent to Laws, the most wholesome and necessary for the public good.”  That is, the King was keeping the colonists from doing what they wanted and, thereby, keeping them from growing in wealth and power; he was not actually abusing them.

Most of the grievances are of this nature.  They complain that the King has ignored requests from the colonists and/or made things inconvenient for them.  The few grievances in the list that alleged actual harm to the colonies were punishments that England had imposed on New England because of the Boston Tea Party and other terrorist actions by the so-called Sons of Liberty.  The Declaration concludes, nonetheless, that the King has demonstrated his intent to become a tyrant, and that the colonies will likely be strangled to death if they do not revolt.

Although the fear expressed in the Declaration is sincere, the document does not make an argument that the colonies are currently being tyrannically oppressed.  England’s North American colonies were, in fact, the freest places for freemen in the world, and the colonists knew this.  The Declaration is an argument that current events indicated the King’s intention to tyrannically oppress the colonies, and that the colonists had better get out while the getting was still possible.  It is essentially an argument to make a preemptive strike for independence before it is too late, before the colonists are enmeshed in tyranny and unable to resist the King.  The Declaration was a powerfully written statement issued by many of the most respected men in the colonies.  It scared enough colonists into supporting a revolution to make that revolution happen.

It was not, however, accurate in the main.  The Founding Fathers were wrong about the King’s intentions and the import of his actions.  The King was not trying to become a tyrant, or to reclaim the powers that English Kings had claimed during earlier centuries.  To the contrary, although George III was an active King, England was evolving into a parliamentary system in which the King had very limited powers.  The Founding Fathers, however, relying to a large extent on misinformation they had gleaned from English radicals, feared for their liberties.  Scorning negotiations that might have produced a peaceful compromise, they rushed into war, risking their lives to save freedoms they were not seriously in danger of losing.[4]

It was a long and brutal war that they eventually won.  But things did not turn out the way the Founders expected.  Instead of peace, they found themselves enmeshed in an unexpected new war of social classes among Americans that upset their plans for the new nation.

The American Revolution was not merely, or even primarily, a movement for national independence.  Most of the revolutionaries did not mind being considered Englishmen.  What they minded was being controlled by the kind of government that ruled England, and that the English were imposing on the colonies.  That is, they were opposed to centralized government, and to government with a strong chief executive that might morph easily into tyranny.  Their goal was to establish a decentralized national government with a weak chief executive.  The Founders were not adherents of small government.  They were adherents of local government, with local government having broad powers of control over the local economy and social life.

This goal was exemplified by the Articles of Confederation, the first constitution of the United States.  The Articles left most governmental power to the states.  The President of the United States under the Articles was essentially the chairman of the meetings of Congress and served for only one year.  But no sooner had the Revolution ended, than Founding Fathers such as George Washington, James Madison, John Adams, and Alexander Hamilton, among others, turned against the decentralized government for which they had been fighting.

The decentralized system of government the Founders had fought so desperately to establish turned out to be unwieldy and unworkable.  They had expected a new regime in which local elites would rule with the consent of the local masses.  But the masses proved to be unruly, and unwilling to defer to the local elites.  The Founders found that in opting for a preemptive strike to prevent the potential of royal tyranny, they had stirred up a hornets’ nest of grievances among ordinary Americans, who saw tyranny in the way local elites were asserting control over things.  Ordinary people wanted to control things, and demagogues vied for their support.  The Founders worried, in turn, that this might lead to a new form of tyranny, the tyranny of the majority.

So, in something of a panic with the way things were going, and too impatient to try to reform the Articles, the Founders moved preemptively and peremptorily to adopt a new Constitution with the very sort of centralized government and strong chief executive that they had fought a war to oppose.  It was a government that defused the influence of ordinary people by allocating most power to officials who were chosen at several removes from the populace.  It also contained checks and balances against the influence of demagogues.  It was a government the Founders believed they could control, and that properly balanced democracy with what we today would call meritocracy.  Unfortunately for them, government and politics under this new Constitution did not work out the way they expected and produced, instead, exactly the sort of political free-for-all they had hoped to avoid.  But that is another story.[5]

The contours of American decision-making seem to be shaped by impatience and impulsivity, tending toward self-defeating preemptive strikes, and often leading to unintended and unwanted results.  As a consequence of this pattern, we seem to make the same types of mistakes over and over again.  Why is this?  Are the Founding Fathers unwittingly to blame?  The Founders were brilliant and heroic.  They were also sincere in their fears and honest in their impulses.  But did the Founders initiate a pattern of national leadership based on fear, and of decision-making based on impulse, that has devolved to permit the very hucksters and demagogues they hoped to prevent to come to the fore?  Are they ultimately responsible for Donald Trump?[6]

C.  Ironies of the American Psyche: Self-Inflicted ADD, DID, and PPD.

“I object to intellect without discipline.”

Mr. Spock on Star Trek.

Donald Trump is not a new or novel phenomenon in American history.  Visitors to the United States during the nineteenth century frequently noted the prominence of self-promoting hucksters and fearmongering politicians, who were full of grandiose promises, bullying bombast, and braggadocio.  These visitors generally bemoaned the publicity that demagogues of this sort received from a mass media that favored sensationalism over facts.  It is a character type that first became prominent during the 1820’s and 1830’s, with the rise of the demagogic President Andrew Jackson and a new national ethos of laissez-faire individualism.

Jackson came to power by stoking fear of Native Americans, blacks, immigrants, bankers and intellectuals.  Jackson, like Trump today, was a colorful figure who was always good for a sensational news story.  And Jackson, like Trump, represented a race to the bottom in American politics that appalled the remaining Founding Fathers.  He was exactly the sort of thing the Constitution was supposed to prevent.  Was this, they complained, what they had fought for?[7]

Charles Dickens was one of the visitors who commented on this development.  A firm believer in the ideals of the American republic, he described the Trump-type American during the 1840’s in these terms: “If I was a painter, and was to paint the American Eagle… I should draw it like a bat, for its short-sightedness; like a bantam, for its bragging; like a Peacock, for its vanity; like an Ostrich, for putting its head in the mud, and thinking nobody sees it.”[8]

Dickens was dismayed by the influence of demagogues and hucksters over the public, and incredulous at the way newspapers encouraged them.  Dangerous frauds, they promoted selfishness in the name of progress, slavery in the name of freedom, and murder in the name of peacekeeping.  Their arguments were invariably ad hominem, and they inevitably portrayed social problems as the result of Others who needed to be attacked and defeated.  How was it that Americans were wont to follow these sorts of people?  Were they a cause or a symptom of America’s problems?  Was this some sort of collective mental illness?

Many commentators and policy analysts, both past and present, have concluded that Americans do suffer from a thinking disorder.  One of these is Professor Tara Sonenshine, a former Undersecretary of State of the United States.  Reviewing our history of preemptive actions, self-defeating decisions, short-sighted policies, and violence, she claimed in 2014 that “American impatience is not a passing fad nor is it minimal in scope.  How to rein in the impulsivity in us is a major task.  It might take national therapy — if we have time and patience to explore it.”[9]  How to find the time and patience to deal with our impulsivity and impatience?  This is an ironic question, but a crucial one if we are to avoid falling prey to Trump-type demagogues.  So, if we think of ourselves as suffering from a thinking disorder, how might we diagnose it?

1. Do we collectively exhibit symptoms of Attention Deficit Disorder (ADD), i.e. chronic impatience, impulsiveness and short attention spans?

The French social critic Alexis de Tocqueville visited the United States during the 1830’s and 1840’s. Like Dickens, Tocqueville believed that American-style democracy was the wave of the future in the world.  But, also like Dickens, he complained that Americans were an impatient and impulsive people, who seemingly could not wait for anything, or see anything through to completion.  As a result, he claimed, American laws were “frequently defective and incomplete” because Americans did not think them through, and “even if they were good, the frequent changes they undergo would be an evil.”[10]  That is, Americans were so impatient and impulsive that even when they stumbled into a good public policy, they abandoned it for another policy if it did not succeed immediately.  The net result was that policymaking in America was erratically and poorly done, and violence was often substituted for reason.

Does the impatience and impulsiveness that Tocqueville and others have noted in Americans constitute ADD?  People diagnosed with ADD are generally incapable of making good decisions.  They are too impatient to go through a whole decision-making process, and consider all aspects of a problem.  They lack the concentration to arrive at a well-reasoned decision that is consistent with their own values and goals.  And they are too impulsive to make a decision that can be explained to others and understood by them.[11]  Is this diagnosis a description of American policy-making?  One might reasonably cite American educational policy over the last one hundred fifty years as an example of collective ADD.

Public education in America first took on its modern form during the 1840’s in Massachusetts, where Horace Mann oversaw the establishment of what were called “common school” methods in the elementary and secondary schools, and “normal school” methods for teacher preparation.  Modeled on the factories of the industrial revolution that was beginning in America at that time, common schools were assembly lines for the mass production of standardized education for children.  Mann and his colleagues invented the idea of grade levels, that is, that children should go through standardized stages of education, with all of them learning the same set of things at each stage.  They invented standardized curricula for each grade level, standardized textbooks and workbooks, and standardized tests to determine if a child was eligible to move along the assembly line to the next stage.  Normal schools were invented to train teachers in standardized teaching methods appropriate to the common schools.  Rote memorization and recitation were the main teaching methods.

The common schools were a leap forward in both the democratization of education and the use of schools for purposes of socialization and social control.  There was widespread concern in the country at this time about the massive immigration of European peasants to America to work in the new factories and live in the burgeoning cities.  Public education was deemed necessary to make their children into more efficient workers and effective citizens.  Common schools were seen as a prescription for making democracy workable.

The common schools were a cheap and efficient way to instill what were called the 4R’s: reading, writing, ’rithmetic, and religion.  But the education they provided was unimaginative, uninteresting and unintellectual.  As a result, no sooner did common schooling become widespread during the mid-to-late nineteenth century, than alternative means and methods were proposed to make education more interesting, effective and intellectual.  Over the course of the next fifty years, two methods gained the most support among educational reformers.

The first is what came to be called Essentialism.  The second is what came to be called Progressivism.  Advocates of both methods rejected the rote teaching methods of common schooling.  Essentialists want schools to focus on teaching the recognized academic disciplines.  They want each of the major subjects taught separately, with the goal of making students into scholars in each of these fields.  Progressives want schools to have interdisciplinary curricula which focus on teaching for real-world problem-solving, with the goal of helping students to become active and effective citizens.  Advocates of common schooling have rejected both methods as frivolous.

The history of American educational reform over the last century and a half has been a struggle among advocates of Essentialism, Progressivism and common schooling, with common schooling as the default position of American public schools.  There has been a pattern to this struggle in which every fifteen to thirty years, there is a call for educational reform, and a major proposal is issued by either the Essentialist camp or the Progressive camp, or both.  The proponents tout their proposals as revolutionary new ideas, ostensibly based on new educational research, although they are, in fact, largely repackaged past proposals.

In any case, the reform proposals get media attention, and are adopted in whole or in part by many schools in the country, with the expectation of revolutionary improvements in public education.  In the course of these events, Essentialists attack any Progressive initiatives, Progressives attack any Essentialist initiatives, and both are attacked by advocates of the common schooling status quo.  The mass media promote the controversy.  The reforms invariably have some limited short-term effects, but do not immediately bring radical improvements in education.  So, they are deemed a failure in the media.

Given the infighting between Essentialists and Progressives, and the inertial appeal of the status quo, the reforms are almost entirely abandoned after a few years, and forgotten.  That is, until they are resurrected the next time.  Meanwhile, common schooling remains, to this day, the predominant method of teaching in our public schools.  Isn’t this ADD among educational policy makers and politicians?[12]

2. Do we as a people suffer from a Paranoid Personality Disorder (PPD), i.e. a Violent Hang-up on Violence?

H. Rap Brown, a leader of the Black Panthers during the 1960’s, once claimed that “Violence is as American as cherry pie.”  If so, then trying to prevent violence is as American as apple pie.  The problem is that Americans all too often choose preemptive violence to prevent violence.  The idea is to accept a small amount of violence in order to avoid a greater amount.  But that lesser violence frequently becomes the greater violence it was meant to prevent.

The problem dates back to the first English settlers in what became the United States.  The settlers moved into what they thought was a wilderness, and were terrified of being overwhelmed by wild Indians and wild animals.  So to cleanse the continent of Native Americans and native animals, they began a series of wars that lasted almost three hundred years.  They initiated a pattern of preemptive violence that has been repeated with dire results throughout our history.

One of the more ironic examples of self-defeating preventive violence is the secession of the southern states in 1861 that led to the Civil War.  It was a suicidal act and an example of snatching defeat from the jaws of victory.  The facts of the matter were that southern slave owners had almost everything going their way in 1861, even with Lincoln’s election as President.  Lincoln got only 40% of the vote, with the other 60% split among competing pro-slavery candidates.  If the pro-slavery forces could have agreed at the next election on a single candidate, Lincoln would almost certainly have been a one-term President.  In any case, Lincoln could not have done anything about slavery, because pro-slavery forces still controlled Congress.  So, Lincoln’s election was at most a temporary political inconvenience.

Meanwhile, the Supreme Court’s Dred Scott decision of 1857 had held as a matter of Constitutional law that a slave owner could take his slaves anywhere in the country.  Slaves were property, and a man could safely take his property wherever he wanted within the United States.  This decision implied that slavery was legal everywhere in the country, and that there was no such thing as a “free state.”  It would have required a Constitutional amendment to change this decision, and there was no way an anti-slavery amendment would ever have gotten the necessary approval from three quarters of the states.  Rather than slave states seceding to protect slavery, it was the northern states that should have seceded if they wanted to avoid slavery in their midst.  The only way in which slavery could have been undermined during the mid-nineteenth century was if southern states seceded, so that northerners could become a majority in Congress and northern states could become a three-quarters majority in the Union.  And that is exactly what happened.

A group of so-called Fire Eaters among southern whites became convinced in the 1850’s that the only way they could save slavery was to take preemptive action to secede from the Union before northerners could become populous and powerful enough to conquer the South.  They were convinced that the South had a military and economic advantage over the North at that time, such that secession could be achieved.  And they were convinced that it was a now-or-never crisis.  They must decide for a little violence now to avoid catastrophic violence later.

The Fire Eaters gradually gained the support of the southern news media during the 1850’s.  With Lincoln’s election in 1860, enough southern whites were convinced by the Fire Eaters so that most slave states were pushed into choosing secession and fighting the Civil War.  As a result, they got the catastrophe they had wanted to avoid.  Ironically, if the South had not seceded in 1861, slavery might still be the law of the land today.[13]

Catastrophic violence has often resulted because Americans have had problems with the idea of compromise, confusing it with appeasement, and have had trouble sustaining strategies of containment, resorting instead to confrontation.  Compromise involves reaching a deal with an adversary in which each side is able to preserve its principles while giving up on some collateral issues.  Compromise is different from appeasement, in which one side gives up on its principles in order to assuage the other side.  Forging compromise generally requires patience and flexibility.  Containment involves accepting in the short run a status quo that includes things to which you are opposed, but applying pressure to attain gradual long term change without resorting to violence.  Economic sanctions against an offending person or country constitute an example of a containment measure.  Containment also requires patience and flexibility.

Americans have repeatedly lacked the patience to work out compromises and wait out containments.  In the cases of the American Revolution, the War of 1812 and the Iraq War, Americans resorted to violence just as economic sanctions seemed on the verge of bringing about the goals we had set.  In the summer of 1776, British negotiators were sailing to America to offer the colonists home rule on terms that were consistent with the demands that the colonists had been making.  The radicals in the Continental Congress forced through a Declaration of Independence just a few weeks before the negotiators arrived, so that all parties were faced with a revolutionary fait accompli that scotched any further negotiations.

Likewise, war was declared by Congress in 1812 just days before news arrived from England that the English were acceding to the Americans’ demands.  Again, war was a fait accompli.  And the economic and military sanctions that President Clinton had placed on Iraq had achieved their intent of essentially disarming and disabling the regime of Saddam Hussein, but that did not stop President Bush from declaring war based on bogus claims.  Each of these is an example of snatching war from the grip of peace.  Does this amount to a collective case of PPD?

3. Do we as a nation exhibit symptoms of Dissociative Identity Disorder (DID), or Split Personality?

Americans frequently have been split into competing political and social groups. The split has often been characterized as between liberals and conservatives, though the definitions of liberal and conservative have evolved over time.  As these terms have been used over the last hundred years or so, liberals are seen as tending to think favorably about government involvement in economic, environmental and social welfare policies.  Conservatives tend to oppose these government activities.  The country has seesawed politically depending on whether liberals or conservatives have had the upper hand.  This split is exemplified today by the battle between so-called Red State conservatives and Blue State liberals.

But Americans have not only been split into different groups, they have also been split as individuals, and have frequently held competing inconsistent political and social positions on issues, often without even realizing it.  For example, Americans have consistently been distrustful of governmental authority, but have also insisted that the government impose law and order on society.  Americans have consistently extolled individual freedom, but expected social conformity.  Americans have generously contributed to private charities for poor people, but often refused to support public programs of welfare for the poor.  Americans have often opposed government programs of economic assistance, but coveted the benefits those programs provide.  It is even common today, for example, to hear the ironic refrain from some conservatives that “I just want to keep the government’s hands off my Medicare.” [14]

This split personality is particularly acute in Red States.  The problem is that many areas of the country are almost totally dependent economically on government expenditures and programs, and these areas are disproportionately in Red States.  To fund these expenditures and programs, the federal government takes in taxes from the country as a whole, which it then doles out to those areas most in need.  By a wide margin, Red States get back in government expenditures more money than they pay in taxes.  That is, Blue State taxpayers are financing Red State recipients, and Red State conservatives generally have no problem with taking money from federal government programs they ostensibly oppose.

Historically, Southern slave owners, for example, claimed to be in favor of “states’ rights,” and condemned as government oppression any attempt to regulate or restrict slavery.  But these same slave owners were overwhelmingly in favor of federal government enforcement of slavery and restrictions on abolitionist campaigns.  They claimed the states’ right to nullify any federal restriction on slavery, but adamantly insisted on the enforcement of federal fugitive slave laws against the nullification of those laws by northern states.  Some might call these inconsistencies mere hypocrisy, but might they not also be examples of an underlying DID?

The ways in which Americans have resolved their social ambivalence and political contradictions in action has frequently turned on how the issue has been framed to them.  Americans have historically responded positively to broad rhetorical appeals to freedom from government control, and to wholesale warnings about possible government oppression.  At the same time, these same Americans have frequently promoted strong government regulations and restrictions with respect to practical matters of interest to them.  It has consistently been the case since public opinion polling first developed during the 1930’s that when a question is asked in broad ideological terms, as in “Do you favor free markets or government economic regulation?”, about two-thirds of the respondents favor free markets.  But when a question is asked in specific pragmatic terms, such as “Do you favor laws that keep dangerous drugs off the market?”, at least two-thirds of respondents answer “Yes.”

As a result, politicians who are against economic regulation — generally conservatives — usually pitch their campaigns in broad ideological terms, while politicians who favor economic regulation — generally liberals — usually tailor their campaigns to specific issues and pragmatic programs of concern to voters.  Likewise, demagogues and hucksters, who are trying merely to manipulate public opinion and have no real solutions to problems, invariably try to frame the discussion in broad generalities, with broad generalizations and stereotypes.  It is generally incumbent on responsible progressive politicians to focus on practicalities of how things might work rather than on sensational generalities.

Fear can easily be generalized.  Hope needs to be particularized.  Trump-type demagogues thrive on appealing to the fears, hatreds, and worse angels of Americans, rather than their hopes, likes and better angels.  They try to resolve the split in our personality in a way that is self-destructive to us.  We need to counter them with pragmatic rationality.[15]

D.  What’s in a Brain and What Can We Learn from Star Trek?

“I find their [humans’] illogic and foolish emotions a constant irritant.”

 Mr. Spock on Star Trek.

So, maybe it is a question of our brains?  Although people are often described as thinking with their hearts, their stomachs or other anatomical parts, we actually think with our brains.  The human brain consists of two key parts, the brain stem and the cerebral cortex, with the cerebral cortex split into a right hemisphere and a left hemisphere. The brain stem is the earliest and least sophisticated portion of the human brain.  We inherited it from our pre-human ancestors.  The brain stem is the locus of the “fright, then fight or flight” reaction of our puny rat-like evolutionary precursors who had to make their way in a world of giant carnivores.  This sort of reaction was apparently a successful survival strategy for helpless mini-mammals.  But it may not be as useful, and may often be counterproductive, in the world of modern humans in which shooting first and asking questions later can lead to unnecessary wars and suffering.

The cerebral cortex evolved later in humanoids, and is the locus of human self-consciousness and critical thinking.  It is in the cerebral cortex that we do our rational thinking.  The cerebral cortex is split into a left hemisphere which is largely responsible for logical thinking and a right hemisphere which is largely responsible for creative thinking and intuition.  Psychologists have described good decision-making as a thought process that combines a person’s brain stem and both hemispheres of a person’s cerebral cortex.  Simplistically put, the brain stem will stimulate the process, the right hemisphere will imagine possible responses, and the left hemisphere will analyze the evidence for them.  A plausible conclusion will then be reached.[16]

There are many formulas for a good decision-making process, but some elements are common to most formulas.  A good decision-making process should be whole, coherent, and transparent.[17]  A process is whole if it is based on significant reflection and discussion, and covers all aspects of the problem under discussion.  A process is coherent if it results in a reasoned decision, the reasons make sense, and the values and goals embedded in the decision are consistent with the actions proposed to deal with the problem.  A process is transparent if the information and reasoning upon which a decision has been made are open to public scrutiny, and the reasoning behind the decision can be replicated by others.  Each of the parts of the brain, the emotional stimulus of the brain stem, the logic of the left hemisphere, and the intuition of the right hemisphere, is crucial to the process.  Making use of the various capacities of the whole brain, with each part checking and balancing the others, is key to an effective decision.[18]  The importance of these elements is illustrated in the 1960’s television series Star Trek.

Star Trek dramatizes the voyages of the spaceship Enterprise.  The three main characters in the crew of the Enterprise represent different personalities and decision-making styles.  There is Mr. Spock, the stoic Vulcan science officer and first mate, who applies cold logic to every situation.  Dr. “Bones” McCoy, the ship’s doctor, is a charming Southern gentleman who is erratically emotional.  And Captain James Kirk is a Western space cowboy who is always ready for action.  In the course of most episodes, these characters bounce off of each other and eventually combine their respective insights to come up with a workable resolution of whatever crisis confronts them.  One of the aims of the TV show seems to be to illustrate the elements of a good decision-making process, and the ways in which disparate personalities can work together.

Using the brain as an analogy, McCoy represents the excessive influence of the brain stem, Kirk a rashly decisive right hemisphere, and Spock an overweening left hemisphere.  Each of them tends to take his tendencies too far and, thus, each of them needs the leavening effects of the others.  And while each is bedeviled by the others, each is also bedeviled by himself.  McCoy must tame his emotional overreactions to operate within the bounds of medical science.  Kirk has to balance his impulse to act rashly on his own with his sense of responsibility as captain to the whole crew of the spaceship.  Spock, who is half-Vulcan and half-human, is frequently torn between his Vulcan rationality and his human emotions and imagination.  But he invariably comes to his senses by focusing pragmatically on whatever is the specific problem at hand.  And that focus on practicality is the key lesson of the show.

McCoy’s hysterical overreactions to problems are almost always wrong.  He repeatedly calls for drastic preemptive actions, and for action based on fright.  Kirk’s initial responses to problems are often unduly aggressive and, thereby, also wrong.  He frequently wants to jump into the middle of things based on insufficient information and reflection, and to act based on courage.  Using the thinking disorder diagnoses, Kirk seems to suffer from ADD (impatience and impulsivity) in his need always to be in motion.  McCoy suffers from PPD (exaggerated fears of others), and is almost always stoking panic.  Spock suffers from a mild case of DID (split personality), but does not exhibit symptoms of either ADD or PPD.

Although Spock’s logic seems alien to his colleagues because it is so cold, his conclusions are almost invariably correct.  Unlike McCoy, who often gets carried away with ideological prejudices and generalized fears, Spock is able to focus on the practicalities of a situation. Without Spock’s Vulcan rationality, the others would often have doomed the Enterprise through their emotional and impulsive decisions.  And the alienness of Spock’s logic seems to be the point that the show is trying to make.  Star Trek seems to be saying that we Americans need a bit more of Spock’s rationality, a trait which is alien to most of us, and a little less of McCoy’s emotionality and Kirk’s impulsivity, which we find more natural.

Star Trek was produced during the height of the Cold War, when “un-American” was a term of the highest opprobrium and to suggest that America was not the best at everything was deemed un-American.  In that climate, the show’s critique was a courageous stand on the part of its producers, a suggestion that could perhaps be safely made to a popular audience only through the guise of science fiction.  Spock, in particular, often said things about other worlds that were pointedly applicable to the United States, as when he said about an alien world that the Enterprise had visited, “This troubled planet is a place of the most violent contrast.  Those who receive the rewards are totally separated from those who shoulder the burdens.  It is not wise leadership.”  He said that in the 1960’s, ostensibly about some other world, but it applied to the United States then and applies all too well to our world today.

E.  A Martian’s Eye View: Positioning oneself in but not of a situation.

“Life and death are seldom logical.”

Dr. McCoy on Star Trek.

“Intuition, however illogical, Mr. Spock, is recognized as a command prerogative.”

Captain Kirk on Star Trek.

“Logic is the beginning of wisdom, not the end.”

 Mr. Spock on Star Trek.

Although the other characters on Star Trek often disparage Spock as being coldly inhuman, Spock is not inhumane.  To the contrary, while he is dispassionate in his reasoning, he is also very compassionate in his responses.  He frequently cites as one of his moral imperatives that “The needs of the many outweigh the needs of the few,” and repeatedly puts himself in danger to save others.  Despite the repeated references in the show to Spock’s logic, his greatest strength, and I think the main point of the show, is his ability to see things from an outsider’s point of view.  I had a history teacher in high school who would often ask the class to imagine what a Martian would think of the events we were discussing.  That is what Spock essentially does.

Spock, as a Vulcan, is literally inhuman and an outsider, but he is also the colleague of a crew of humans on a spaceship.  So, he is the outsider who is an insider.  As such, he is able to appreciate the situation in the way his human colleagues do, but also break out of their cycle of insiders’ thinking, and break away from the pattern of ADD and PPD that McCoy and Kirk represent.  McCoy’s passion and Kirk’s intuition, which are at least partly a function of their being insiders, are important to the decision-making process on the Enterprise.  But Spock is able to devise pragmatic solutions to problems by seeing them from the outside, while the others, seeing problems only from the inside, are overwhelmed by their enormity.  Spock’s compassionate and considerate rationality, an attribute he can bring to the situation in large part because he is an outsider, is the key to the crew’s survival.  And maybe to ours, as well?

If we want to break out of our vicious cycle of self-defeating policies, and our susceptibility to demagogues and hucksters, we need to adopt the position of outsiders, and see ourselves as outsiders might.  To define our thinking disorder in a nutshell, we Americans suffer from too much McCoy and Kirk, and too little Spock.  To describe our current political crisis in a nutshell, Donald Trump is McCoy on steroids pretending to be Kirk.  He is fooling a lot of people by playing on their fears of outsiders, and trapping those people in their mental cages.  Americans need, instead, to welcome outsiders and alternative points of view.

Abraham Lincoln once famously said that “You can fool all of the people some of the time, and some of the people all of the time, but you cannot fool all of the people all of the time.”[19]  Trump is fooling a lot of people.  We will soon find out how many and for how long.  When will we stop fooling ourselves so that we can stop being fooled and made fools of by the likes of Donald Trump?  Where are the Vulcans when we need them?

            July 26, 2016

[1] Alex Jones. “America Has Been At War 93% of the Time – 222 Out of 239 Years – Since 1776.”  www.infowars.com 2/21/15.  See also “List of wars involving the United States.” Wikipedia. 7/19/16.

[2] Jonathan Masters. “U.S. Gun Policy: Global Comparisons.” Council on Foreign Relations. cfr.org 1/12/16.

[3] You can find an extended discussion of the Second Amendment and gun policies in the United States in my blog post on “History as Choice and the Second Amendment: Would you want to keep a musket in your house?”

[4] For the still definitive discussion of the politics of the American Revolution, see Gordon Wood’s The Creation of the American Republic, 1776-1787. University of North Carolina Press: Chapel Hill. 1969.

[5] You can find an extended discussion of how and why the Founders made the Revolution, their expectations for the Revolution, and their disappointments with the outcome and with both the Articles of Confederation and the Constitution in my blog posts on “George III’s Legacy,” “George Washington’s Lament,” “Would it have been better for the colonists and would it be better for us today if the American Revolution had not happened?” “How might things be worse if the American Revolution had not happened,” and in my book Was the American Revolution a Mistake? (AuthorHouse, 2013).

[6] For a Pulitzer Prize winning examination of political and intellectual hucksterism in American history, see Richard Hofstadter, Anti-Intellectualism in American Life. New York: Vintage Books, 1963.  Hofstadter would not be surprised at the rise of Donald Trump.

[7] For a brilliant discussion of early American history with a focus on demagoguery and hucksterism during the Jacksonian era, see Daniel Walker Howe, What Hath God Wrought: The Transformation of America, 1815-1848. New York: Oxford University Press, 2007.

[8] Charles Dickens. Martin Chuzzlewit. New York: Signet, 1965. p.581.

[9] Tara Sonenshine. “The Age of American Impatience: Why It’s a Dangerous Syndrome.”  huffingtonpost.com/tara-sonsenshine/the-age-of-american-impat_b_5916062  Accessed 3/24/15.

[10]  Alexis de Tocqueville. Democracy in America. (New York: Oxford University Press, 1947), 140.

[11] psychcentral.com/disorders/adhd/  Accessed 3/24/15.

[12] You can find an extended discussion of the history of American educational reform in my blog post “Struggling to Raise the Norm: Essentialism, Progressivism and the Persistence of Common/Normal Schooling in America.”

[13] You can find an extended discussion of why the South seceded, why the North did not, and the alternative realities that could have ensued in my three blog posts on “Would the United States still have slavery if the South had not seceded in 1861?”

[14] For an example of a prominent American who exhibited a liberal/conservative split personality, you can find a discussion of James Bryant Conant in my blog post “Progressivism, Postmodernism and Republicanism: The Relevance of James Conant to Educational Theory Today.”

[15] For a conservative view of America’s split personality, see Irwin Stelzer, “Split Personality America,” www.weeklystandard.com 2016.  For a liberal view, see Andrew O’Hehir, “America’s Split Personality: Paranoid Superstate and Land of Equality,” www.salon.com, 2013.

[16]  Jared Diamond. The Third Chimpanzee: The Evolution and Future of the Human Animal. (New York: Harper Perennial, 1993), 220-221.  David Sloan Wilson. Evolution for Everyone. (New York: Delacorte Press, 2007), 51-57.

[17] onlinesuccesscentre.com/2011/04/three-characteristics-of-a-good-decision   Accessed 3/23/15.

[18] John Dewey. How We Think. (Lexington, MA: D.C. Heath, 1933), 96.  Jerome Bruner. “Going Beyond the Information Given,” in Contemporary Approaches to Cognition, J. Bruner, ed. (Cambridge: Harvard University Press, 1957), 66-67.

[19] You can see a discussion of Lincoln’s comment and the power and limits of demagogues in my blog post “Limiting the sum of Lincoln’s ‘Some:’ Democracy, Mobocracy, and Majority Rule.”

 

Progressivism, Postmodernism and Republicanism: The Relevance of James Conant to Educational Theory Today

Burton Weltman

Recovering a long lost era in Republicanism

I am writing this preface during the spring of 2016 in the midst of the Presidential primary election season.  In the context of the Republican Party’s policies and politics of the last eight years, and especially during this primary election cycle, James Conant was a Republican of a sort it is almost impossible to imagine today.  He was a hawk on foreign policy, a Cold Warrior during the 1950’s and 1960’s, and in this he resembled most of the current crop of Republican Presidential candidates.  But he was a social democrat on domestic policy, taking positions not unlike those of the socialist Democratic Presidential candidate today, and a world away from those of the present-day Republican Party.  The gist of this essay is that Conant has a lot to teach self-styled progressives about education, and that progressive educators should acknowledge Conant as one of their own.  But a subordinate thesis is that Conant has a lot to teach Republicans about making humane public policy and behaving in a sane and sensible way.

Recovering James Conant

James Conant was one of the most prominent scientists, political figures and educational leaders of mid-twentieth century America.[i]  A life-long Republican who was seriously considered for the 1952 presidential nomination, Conant was also a precursor of postmodernism, an avowed social democrat, and a professed progressive educator.  A self-proclaimed member of the Power Elite that ostensibly ran the country, he was at the same time a devotee of John Dewey, exclaiming in a parody of Voltaire’s comment about God that “if John Dewey hadn’t existed, he would have had to be invented.”  A traditionalist in science, politics and education at the start of his career, Conant evolved into a self-styled radical for whom “conservative” was a dirty word and who combined liberalism and Republicanism in ways that might seem oxymoronic today.[ii]

Conant was an innovative thinker in science and education who has for too long been a lost figure in progressive educational theory.  Although he was highly regarded by many progressives during the 1950’s and 1960’s for his defense of comprehensive high schools,[iii] Conant’s reputation among progressives has fared poorly since.[iv]  Educationally, Conant is generally portrayed as an elitist who proposed tracking students according to the needs of the military-industrial complex.[v]  Politically, he is derided as either a one-time liberal turned conservative or a life-long conservative who sometimes pretended to liberalism.  Intellectually, he is discounted as an arch-empiricist whose statistical studies of high schools during the 1950’s and 1960’s have little theoretical value.  Conant, who died in 1978, is at this point almost routinely classified as a “conservative” who promoted “traditional schooling” based on disciplinary curricula and social control methods.[vi]

The thesis of this article is that Conant’s ideas have been widely misconstrued by critics who have focused on his later works written in the midst of the Cold War during the 1950’s and 1960’s, and who have failed to place those books in the context of his earlier works from the 1930’s and 1940’s.  Conant’s is a story about the interaction of politics, ambition and theory.  It is in part an all-too-common tale of progressive theory warped by conservative political pressures and personal ambitions.  It is also, however, an example of the importance of theory and the staying power of progressive educational theory.  It is my contention that in the midst of all his personal and political peregrinations, Conant’s core educational theories remained progressive.  The purpose of this article is to examine Conant’s educational theories in the context of his life and times with the goal of demonstrating their importance for educators today.

Conant and His Critics

Conant’s progressive critics have generally leveled three charges against him, which they think demonstrate his anti-progressive and anti-democratic tendencies.  First, they say, Conant was a petty-bureaucrat who sought to consolidate small community-based schools into centralized schools that reduce students and teachers to mere cogs in a giant machine.  Second, he was an elitist who promoted stratified schools that ignore slower students in favor of the faster.  Third, he was a Cold Warrior who favored repressing dissenters and subordinating schools to the military-industrial complex.  Conant’s response to these critics was to plead guilty to their premises – he was a bureaucrat, elitist and Cold Warrior – but to deny their conclusion that he was anti-progressive and anti-democratic.

With respect to school consolidation, Conant argued that community-based schools too often fostered racism and ethnic exclusion, and that small schools did not provide enough ethnic or intellectual diversity.  In his best known educational work, The American High School Today,[vii] published in 1959, Conant called for eliminating almost half of the nation’s high schools as part of what he considered a progressive defense of comprehensive high schools as the cornerstone of a democratic educational system.  Writing in reply to Admiral Hyman Rickover’s highly publicized call for the establishment of separate high schools for high-achieving students as a necessary means to fight the Cold War,[viii] Conant vehemently rejected Rickover’s proposal as undemocratic and unnecessary to achieve educational excellence.[ix]

Responding to Rickover’s attack on the intellectual deficiencies and administrative inefficiencies of the public schools, Conant contended that centralized schools would be more efficient and more likely to have diverse student bodies.  In turn, larger schools would be better able to offer more advanced courses for advanced students but would also be able to offer a greater variety of courses to meet the needs of a diverse student population.  Contrary to Rickover and other conservative critics of comprehensive high schools, Conant rejected any form of tracking students into separate programs according to ability or achievement and any ability grouping in general education classes.  Ability grouping was acceptable to Conant only in the most advanced and specialized courses and this would be achieved largely through self-selection by students for these courses.[x]

With respect to the charge of elitism, Conant contended that democracy required well-educated leaders, and that American schools were not providing enough leadership education.  Conant defined democracy as “government by and for the people” but not of the people,[xi] and believed in what could be called plebiscitary democracy or democracy from the top-down rather than the bottom-up.[xii]  Conant’s ideal was a society in which the best and brightest people propelled themselves to the fore and then pulled the masses along.  He rejected the idea of what today would be called participatory democracy in which leaders are ostensibly pushed to the fore and pushed along by the people.[xiii]  At the same time, he believed that democracy required well educated followers who could check and balance, support but also critique, their leaders.[xiv]  As a result, he advocated not only advanced programs of specialized education for those who would become the scientific and political leaders of the country, but also rigorous programs of general education in which all students would participate together, hoping thereby to establish a common understanding and basis for communication between the elite and the masses.

With respect to the Cold War, Conant argued that public service was the foundation of democracy and the goal of progressive education, and that in times of national crisis, it is necessary for people to support their leaders even if they do not entirely agree with their policies.  Conant had doubts about the Cold War from its inception, but suppressed his doubts to support America’s leaders against what they defined as the menace of Communism.  In so doing, Conant admittedly subordinated some of his progressive ideas to Cold War imperatives.  But he did not abandon them and believed that in supporting some repressive aspects of the anti-Communist crusade, he was preventing it from becoming worse and was saving a place for progressive values.

The premises that led Conant to support centralized schools, elitist programs and Cold War repressions are very different from the more egalitarian, participatory democratic premises of most present-day progressives, including the author of this article.  It is my contention, however, that Conant’s primary concerns – with promoting ways that individuals and institutions can transcend and transform themselves – do not depend on these premises.  In turn, Conant’s core theories – models of science as social studies in a pluralistic universe, politics as social democracy in a multicultural world, and education as social problem-solving in a diverse community – remain relevant to theorists today.

Social Studies and Postmodern Science

Conant was a peripatetic polymorph who took on many different roles, and enjoyed a career that moved successfully from science to administration to politics to educational policy.[xv]  He was an openly ambitious person who sought power and status as a means of doing good for himself and for the world.  Born in 1893 to a middle class family in Dorchester, Massachusetts, Conant attended Roxbury Latin School and Harvard University, gaining a B.A. in 1913 and a Ph.D. in 1916.  He then worked as a chemistry professor at Harvard, becoming President of the University in 1933 and serving in that capacity until 1953.  From 1941 to 1946, Conant was also Chair of the National Defense Research Committee and overall coordinator of the effort to produce an A-bomb.  He continued as an advisor on atomic weapons until 1953.  From 1953 to 1957, he was first the U.S. High Commissioner and then Ambassador to Germany.  From 1957 until his death in 1978, he worked primarily on educational research and writing.

Conant was first and foremost a scientist and continued working on science even as he did other things.  During the 1910’s and 1920’s, he was a laboratory chemist doing pure research.  In the 1930’s, he turned to the history and theory of science.  During the 1940’s, he worked in applied science, mainly on building the first A-bombs and other nuclear weapons projects.  Finally, in the 1950’s and 1960’s, he returned to the history and theory of science.  During his career, Conant’s theories of science evolved from the positivist philosophy that characterized most of his colleagues to what could be described as a post-modern relativism.  Conant described the evolution of his philosophy as “a mixture of William James’ Pragmatism and the Logical Empiricism of the Vienna Circle with at least two jiggers of skepticism thrown in.”[xvi]  Becoming less sure about science as he became more powerful as a scientist, Conant eventually came to the conclusion that scientific theories were influenced by social circumstances as much as by empirical evidence.  And he argued that studying social science was almost as important to understanding physical science as studying physical science itself.[xvii]  The development of Conant’s scientific ideas greatly influenced his educational theories.

Conant’s scientific interests began in his childhood home.  His parents were devotees of the eighteenth-century scientist and theologian Emanuel Swedenborg.[xviii]  Swedenborg argued that matter and mind are two sides of the same spiritual coin.  He sought to extend the physical theories of Newton and the psychological theories of Locke, and to solve the spirit/body problem posed by Descartes, through what was essentially a pantheistic explanation of the universe.  Although ostensibly a Christian, Swedenborg claimed that different cultures have different ways of explaining the universe and each may be valid in its own way.  Rather than demanding doctrinal purity or ritual uniformity, God, according to Swedenborg, wanted humankind to cooperate in socially useful work.[xix]  Although Conant eventually joined the Unitarian Church, his interdisciplinary approach to science, trying to consolidate the various physical sciences and combine the physical and social sciences, and his pluralistic and social democratic approach to society were similar to Swedenborg’s views.

Conant’s interest in science was given direction by his high school chemistry teacher, Mr. Black.  Conant greatly appreciated the teacher’s hands-on methods and personal concern for students, and Mr. Black often served as an example of a good teacher in Conant’s later educational writings.  At Harvard, Conant pursued a double undergraduate major and did a dual doctoral thesis in physical chemistry and organic chemistry, considered an innovative combination at that time.  Mentored by Theodore Richards, who was one of the most prominent chemists of the day and whose daughter Conant later married, Conant was initiated into the American scientific elite at Harvard.[xx]

Upon earning his doctorate, Conant was encouraged by Richards to work in what was then considered an area of vital national interest: developing poison gas.  Making his first essay into weapons of mass killing, Conant worked initially with colleagues on some private research and then spent World War I working for the Army.  He was highly praised for his work and was well regarded within high-level military circles.[xxi]

After the War, Conant returned to Harvard and during the 1920’s undertook “pioneering efforts to apply the techniques of physical chemistry to the study of organic reactions.”[xxii]  In 1933, he published a textbook, The Chemistry of Organic Compounds, which became the “standard work in the field” during the 1930’s and 1940’s.[xxiii]  He was frequently consulted by major corporate and government officials and thereby gained entrée to the industrial and political elites of the day.

Conant left his laboratory in 1933 to become President of Harvard and turned to working in science on a more theoretical level.  Influenced by his contact as President with scholars from the humanities and social sciences, Conant began to develop a humanistic approach to science, taking what was then the radical step of using case histories in teaching physical science courses.  Conant was worried during the 1930’s about what he saw as the debasement of science by ideologically driven scientists, and particularly the environmental genetics being promoted by the Soviet scientist Lysenko and the racist genetics of the Nazis.  He was also concerned that ignorance of how science and scientists actually work left ordinary people open to the pseudo-scientific charlatanism of ideologues such as Lysenko and the Nazis.  Conant hoped that a historical approach to science, one that examined the relationship between social and scientific developments, would help both budding scientists and the general public to appreciate the nature of science and the need for enlightened scientific leadership.[xxiv]

Conant’s work on scientific theory was cut short by the start of World War II and his work on the A-bomb.  It was a project Conant undertook with typical thoroughness but also characteristic ambivalence.  Fearing that nuclear weapons would lead to a nuclear holocaust, Conant fervently prayed, up until the moment the first A-bomb was successfully tested, that it would not work.[xxv]  At the end of World War II, Conant hoped that he could atone for the A-bomb by working on peaceful uses of science and atomic energy.  But with the advent of the Cold War, he was asked to advise the government on building newer and better atomic weapons.  Although he answered what he saw as the call of duty, the nature of the work affected his thinking.  He could no longer accept the positivists’ beliefs in the inevitability of progress through science and science as progress, or in the neutrality of scientists.[xxvi]

Continuing his work on the history of science, Conant developed innovative ideas that anticipated and coincided with the theories of Thomas Kuhn, who worked with Conant as a graduate student at Harvard during the 1940’s and whose work became a foundation of much postmodern theory.  Postmodernism has been described as a revolt against the positivists’ doctrine of res ipsa loquitur, that facts speak for themselves and that the more facts you have the better your conclusions.[xxvii]  Like Kuhn later, Conant claimed that science develops through paradigm shifts rather than incremental changes and that these shifts result mostly from cultural changes rather than new evidence.  New evidence, Conant contended, leads merely to the amendment of old theories.  New theories result from new questions, questions that reflect changes in social structure, problems and philosophies.  In Conant’s view, scientists are “revolutionists” who arise out of the prevailing culture, transcend it, and then pull the culture up with them.  Scientific revolutions, in turn, require a high level of popular education so that the public can intelligently support the work of creative scientists.[xxviii]

In his historical case studies, Conant contrasted the social and intellectual circumstances under which scientists worked before a new scientific discovery and after, focusing particularly on the questions they asked.  Based on his method, the successful overthrow of Aristotle’s theory of motion (that things will stop moving unless force is continuously exerted on them) by Newton’s theory (that things will continue moving forever unless a force is exerted to stop them) could, for example, be explained in part as a result of differing social circumstances: Aristotle lived in a traditional society in which stasis was the norm and the primary question was how anything changed; Newton lived in a dynamic society in which change was the rule and the primary question was how anything stayed the same.  Ancient and modern scientists asked different questions and got different answers, but both were useful to the societies in which they were conceived.  Taking this argument even further, Conant contended that people no longer believe in Homer’s myths because Greek gods are not useful in answering present-day questions, not because the myths are untrue.  Conant claimed that there is no reason to think that Zeus and the other gods did not in some sense actually exist for the people for whom Homer’s myths were useful answers to important questions.[xxix]

Rejecting the idea of a universal science which is good for all people at all times, Conant tended toward what might be called a soft postmodernism grounded in relativism rather than nihilism.[xxx]  Picking up on William James’ notions of an open universe and the effects of theory on reality,[xxxi] he promoted a vision of scientists transcending their cultures and transforming the world thereby.  At the same time, continuing his debate with Lysenko, Conant insisted there is a vital difference between partisanship and objectivity – that while scientists cannot be neutral, they can be objective and need not be mere propagandists.  Scientists can and must fairly consider all of the relevant evidence and pertinent points of view on a subject.  They can and must consider opponents’ arguments in ways that the opponents would recognize, and not merely set up straw men to knock down.  In sum, while there may be more than one right answer to any important question, there are also wrong answers, answers that do not fit the evidence or meet opponents’ arguments.  While there may be no final Truth, scientists must cooperatively strive for the broadest working consensus on what may be right and what is wrong under the prevailing circumstances.[xxxii]  In a pluralistic universe, Conant concluded, the goal of science is not certainty but contingency, not merely answers to questions but also new questions to answer so that the quest for a better life can continue.[xxxiii]

Conant’s iconoclasm extended to rejecting the prevailing notion that the physical sciences were radically different from and inherently superior to the social sciences.  Conant indicated that they were essentially the same and that there were only two main differences between them.  The first difference lay in the range of choices that their subjects enjoy.  Physical scientists study things that have relatively little variation or choice as to what they will do.  An electron might, for example, at any given moment act like a particle or a wave, or go through one hole or another in a screen, so that the individual electron’s behavior cannot be predicted.  But electrons have relatively few options, so that their behavior is for the most part a matter of simple probabilities that can be accurately predicted in the aggregate.  Humans are not so simple and neither their individual nor group behavior is easily predictable.  The choices that humans can make and the variations in their behavior are enormous, making social science more complex but also more important to study than physical science.[xxxiv]

The second difference between the physical and social sciences lay in the role of politics in their workings.  While physical science is fraught with political issues, social science deals with political issues per se.  As a consequence, social scientists have historically been less willing and able than physical scientists to agree upon common frameworks for research and development, and to work cooperatively within those frameworks.  In turn, while clear-cut paradigm shifts have occurred in the physical sciences, so that it is possible, for example, to say that the Copernican view of the universe has replaced the Ptolemaic view, such shifts have not been as clear cut or conclusive in the social sciences.  This greater degree of cooperation among physical scientists – their willingness to work together and to accept each other’s findings and conclusions – is a major reason for the greater success and public acceptance of the physical sciences.  It is something that the social sciences need to develop and that schools could help foster with a more pro-social, social problem-centered curriculum.[xxxv]

Conant’s concerns about the interplay of the social and physical sciences, and the relationship between scientists and the general public, were not merely academic matters for him.  Although he was himself a member of the scientific policy elite, he worried about the tendency of scientists to become “exalted and isolated” to the detriment of democracy and their own best judgments.[xxxvi]  In the wake of the astonishing development of the A-bomb, Conant warned that science was being glorified as magic and scientists as demigods.  He fretted that lay people could not understand the scientific issues of the atomic age and that decisions involving science would by default be made by scientists alone.  Conant worried that unchecked power and popular adulation could corrupt science and scientists.  These concerns are reflected in his curricular proposals for a program of general education that connects scientists and laypeople.[xxxvii]

Conant’s concerns were exacerbated during the Cold War by the unparalleled secrecy imposed by the government on scientists who in any way worked in areas that might have some military application.[xxxviii]  The troublesome consequences were exemplified in Conant’s own flip-flopping positions on the H-bomb.  Conant initially opposed the production of an H-bomb on the moral grounds that it was a genocidal weapon.  He also initially supported his friend Robert Oppenheimer when Oppenheimer spoke out publicly against the H-bomb.  But Conant backed off when Edward Teller and other H-bomb proponents accused Oppenheimer of being a national security risk and effectively destroyed Oppenheimer’s career.[xxxix]  Conant refused to make public either his concerns about the H-bomb or his support for Oppenheimer, seemingly for fear of jeopardizing his own standing within the inner circles of power.

Conant has been accused of hypocrisy and cowardice for these actions,[xl] but I think the roots of his contradictions are more subtle.  On the one hand, Conant was genuinely concerned for the security of the United States if the Soviets forged ahead in the development of nuclear weapons.  On the other hand, he was fearful of what might happen to the world if nuclear militants such as Edward Teller ran things unchecked.  Given what had happened to Oppenheimer when he dared to speak publicly about nuclear policy, Conant decided that the best thing he could do for humankind was to stay silent and stay part of the Power Elite, hoping thereby to exert some salutary influence on American policy.  The exigencies of the Cold War plus his own top-down view of society arguably led him to choose power over principle in these matters.  A prime architect of atomic weapons, Conant could, nonetheless, sincerely yearn for the simpler world of his youth and exclaim: “I do not like the atomic age or any of its consequences.”[xli]

Social Democracy, Republicanism and the Cold War

Conant’s youth was spent in Massachusetts at the turn of the twentieth century, living in a region and household in which the Civil War was still a current event, Republicanism was synonymous with patriotism and progress, and Democrats were considered traitors and reactionaries.  To the young Conant, Republicans stood for enlightenment and industry, Democrats for racism and feudalism.[xlii]

For most of his career, Conant portrayed politics in fairly simple terms: liberals were the good guys and conservatives were not.  Conant rejected what he saw as the conservative ideal of a laissez-faire economy in which every person must fend for him/herself and government exists to protect private property.  He supported, instead, the liberal idea of a regulated economy in which government guarantees each person a decent job and standard of living.  Conant also decried what he saw as the conservative ideal of a traditional culture enforced through censorship to ensure that each generation follows blindly and blandly in the footsteps of the last.  He supported, instead, the liberal idea of a laissez-faire culture in which each generation develops its own way of life and the government encourages diversity and creativity.[xliii]

Citing Jefferson as his mentor, Conant combined meritocratic views of leadership with social democratic views of public policy.  Politics, Conant claimed, is social science in action, a process in which officials experiment with hypotheses as to what will best serve the public interest and the people register their support and dissent at the polls.  Democracy is a form of permanent revolution in which enlightened leaders with the support of educated followers continually transcend the status quo and continuously move the country toward a more creative and cooperative society.[xliv]

In the early stages of his career, Conant found a relatively comfortable home for these political views within a Republican Party that harbored such liberals as Robert La Follette, Sr., Robert La Follette, Jr. and Henry Wallace, Sr.  As time passed, liberals were more likely to be found in the Democratic Party, but Conant stayed a Republican.  He seemed more comfortable with the Republican constituency of business people and others who identified with the upper classes than with the Democrats’ primary constituency of small farmers, workers and those who identified with the lower classes.[xlv]  He sought to fight on behalf of the masses, but wanted to work primarily with his own kind within the elite.[xlvi]  As the Republican Party became more conservative, Conant tried to guide the party to the left while fighting the increasing power of the Right.[xlvii]

As a young man, Conant believed that Weimar Germany might provide a model for American development.  Supported by scientific and educational systems that were the best and most meritocratic in the world, German institutions during the 1920’s were governed by social democrats and led by a technocratic elite.[xlviii]  The rise of the Nazis in Germany was a great shock to Conant.  He reluctantly conceded that science and education do not guarantee political rationality, and concluded that fascism underscored the need for a pro-social democratic education for both the elite leaders and the masses in a liberal society.[xlix]

Conant’s revulsion toward the Nazis led him to buck the prevailing isolationism within the Republican Party during the late 1930’s, and he became a fervent advocate of military preparedness and militant action against fascism.[l]  With the coming of World War II, Conant supported the most vigorous prosecution of the war by any means available, even to the point of suppressing civil liberties at home and using weapons of mass killing abroad.  His view of Nazism as fundamentally evil led him to conclude that in times of war, liberal measures must be suspended: “All war is immoral” and, therefore, all is fair in war.[li]  He later applied this same doctrine to the Cold War.  At the same time, Conant saw the social cooperation required for prosecuting World War II as an opportunity to advance social democracy[lii] and he called for an upsurge of radicalism in the United States.[liii]

As World War II ended, Conant believed that conflict between the United States and the Soviet Union could be avoided.  While he thought the U.S. and U.S.S.R. would inevitably compete, he believed the Soviets wanted to win ideologically and economically, not militarily.  Once again bucking mainstream opinion within the Republican Party, Conant proposed sharing A-bomb secrets with the Russians to forestall a nuclear arms race,[liv] and as late as 1948, he was still forcefully arguing that “there is little or no analogy between the Nazi menace and the Soviet challenge.”[lv]  Conant similarly argued that the challenge of domestic Communism should be met through intellectual competition, rather than repression.  In Conant’s view, Communists were wrong but not evil, their methods misguided but their goals relatively benign.  Conant warned that “reactionaries” will try to use anti-Communism “as an excuse” to attack liberals.  Citing the attacks on Alger Hiss as an example of this tactic, Conant initially came out strongly in Hiss’ defense when Hiss was accused of being a Communist spy.[lvi]

But by the end of 1948, Conant was taking a very different stance.  Under intense pressure from his colleagues within the foreign policy elite, and under the pressure of events such as the Communist coup in Czechoslovakia and the Berlin Blockade, Conant joined the Cold War and anti-Communist crusades with the fervor of a convert.  Seemingly concerned with protecting his status as a member of the Power Elite, he acted like someone who felt the need to prove his loyalty.  Rewriting his own history, Conant retrospectively claimed that he was “one of the first of the Cold War warriors” when in fact he did not join their ranks until late 1948.[lvii]  He also retroactively chastised those who had in 1948 “still clung to the belief in cooperation” with the Soviets, when he had done so himself.[lviii]  In any case, Conant now portrayed Communism as a fundamental evil and significant threat.[lix]

Abandoning his previous analysis, Conant fell into line with the prevailing Cold War analogy between Nazi Germany and Soviet Russia.  While he still did not think the Soviets posed any immediate military threat, neither, he said, had Nazi Germany in the early 1930’s.  Conant concluded that it was the failure of the West to challenge the fascists militarily when they were weak during the 1930’s that had emboldened and enabled them to start World War II.  Elected officials in the West, faced with strong pacifist sentiments amongst the public, had lacked the will to undertake a military buildup, thereby encouraging the fascists.  Similarly, Conant believed, it was necessary to challenge the Soviets militarily before they could move toward conquest.  Toward that end, Conant founded in 1950 the Committee on the Present Danger, an organization of political, business and educational leaders, for the purpose of propagandizing the public and lobbying Congress in favor of a peacetime draft, an expanded nuclear arsenal, and a large-scale military buildup.  Abandoning objectivity as well as neutrality on this issue, Conant joined with other Cold Warriors in deliberately exaggerating the immediate threat from the Soviet Union.  He rationalized this deception on the grounds that scare tactics were necessary to build public support for a show of military force that would forestall the Soviets and prevent another world war in the long run.[lx]

Conant also fell into line on the prevailing opinion of domestic Communism.  In the wake of Alger Hiss’ perjury conviction and Senator Joseph McCarthy’s assault on Communist sympathizers, Conant took a strong anti-Communist stance.  Contradicting his previous statements that attacking Communists would open the door to attacks on liberalism, he rationalized his cooperation with McCarthy and other anti-Communists as a means of protecting liberals from attack.  Despite his previous opposition to loyalty oaths as a violation of free speech, Conant became a firm supporter of loyalty oaths.  And, contrary to his previous support of academic freedom for all political opinions, Conant campaigned for a ban on Communist teachers in the public schools.  While privately dismissing any threat from domestic Communism, he publicly contended that Communists had abdicated their intellectual freedom in becoming mouthpieces for the Communist Party and agents of the Soviet Union, and therefore had no place in a free marketplace of ideas.[lxi]

The Cold War strained Conant’s liberal commitments more than any other crisis in his life.  He seemed at times during the 1950’s and 1960’s to abdicate his own judgment in favor of automatically rejecting anything Communists might support or might be construed as pro-Communist.  While Conant predictably liked Ike but not Goldwater or Nixon, he was a strong supporter of the Vietnam War after most liberals, including many Republicans, turned against it.  Publicly condemning the anti-war movement and the New Left as traitorous, Conant in effect practiced McCarthyism beyond the McCarthy era.[lxii]  Privately, however, he criticized the war and voted for the anti-war presidential candidate George McGovern in 1972.[lxiii]

Conant has been condemned as a hypocrite and coward for his political actions.  From issues of anti-Semitism and free speech for radicals at Harvard during the 1930’s to McCarthyism and the Cold War during the 1950’s to student radicalism and the Vietnam War in the 1970’s, Conant held private views that were more liberal than his public positions.[lxiv]  But Conant’s problem was neither cowardice nor hypocrisy but social theory.  Conant believed that if you were not a member of the Power Elite, your principles were impotent and irrelevant.  This was a pragmatic judgment from one who contended that liberal values could only be implemented from the top-down.  So, when it came to risking his status within the elite for his principles, Conant generally found a way to rationalize them away.

Meritocracy, Democracy and Public Education

Consistent with his training as a scientist during the early twentieth century, Conant began his career as a staunch traditionalist in education, favoring a strictly disciplinary curriculum, teacher-centered teaching methods, and rote learning and testing.  He came to the presidency of Harvard in 1933 with a low opinion of the university’s School of Education as a den of progressive anti-intellectuals.  In Conant’s view, teaching was something any well-educated person could do, and he initially hoped to abolish the School but was convinced otherwise because it was a moneymaker for the University.  He decided, instead, to try to reform the School and in the process was converted to progressivism.[lxv]

Conant described progressive education as a system of student-centered pedagogy with teaching methods that focus on students’ interests and activities; social-centered curricula based on interdisciplinary subjects that focus on social problems of concern to students; and practical forms of evaluation, or what today would be called authentic assessment.  Progressivism was a means of encouraging students to transcend their backgrounds, engage in critical and reflective thinking, and transform themselves and their society.  Consistent with his top-down vision of democracy, Conant promoted a top-down version of progressivism.  He projected four main educational goals: (1) a high level of civic education to prepare every student for the rights and duties of a social democracy; (2) a high level of specialized education for those who will be the elite scientists and leaders of tomorrow; (3) a high level of general education to prepare the masses to evaluate the work of their leaders; and (4) a high level of vocational education to prepare non-elite students for gainful and socially useful employment.[lxvi]

Although Conant is best known for his empirical studies of schools from the 1950’s and 1960’s, he claimed four “inventions” from the 1930’s and 1940’s as his primary contributions to the field of education.  These were: initiating the Master of Arts in Teaching (MAT) degree at Harvard in the mid-1930’s; supporting standardized testing in the late 1930’s, which led to the foundation of Educational Testing Service (ETS); organizing the Harvard report on General Education in a Free Society (GEFS) in the mid-1940’s; and participating in the National Education Association report on Education for ALL American Youth (EAAY) in the late 1940’s.[lxvii]  The MAT, GEFS and EAAY represent progressive innovations that have not yet had the impact for which Conant hoped.  ETS is an innovation that Conant thought would be progressive, later concluded was not, and has had a far greater impact than he desired.

The MAT was for Conant a model of progressive teacher education.  Jointly developed and administered by academic and education professors, it divided prospective teachers’ coursework evenly between academic subjects and pedagogy.  Working on the MAT brought Conant a new respect for teaching as an art that needed to be taught by professional educators.[lxviii]

ETS was for Conant a vehicle for establishing what he thought would be progressive means of assessment.[lxix]  Standardized testing appealed to Conant’s democratic, meritocratic and scientific orientations.  Testing, he claimed, is democratic because it is the same for all.  It is meritocratic because it aims at identifying the best students.  And, it is scientific because it is quantifiable and ostensibly objective.  Conant hoped that standardized testing would undermine the advantages that wealth and cultural background give to students from upper class families, and would open the doors of higher education and higher social position to middle and lower class students.  He also hoped that standardized testing would encourage progressive methods of teaching.  Conant’s support for testing rested, however, on two assumptions that he later questioned: that standardized aptitude tests measure some sort of generalized intelligence common to everyone, and that standardized achievement tests measure genuine knowledge of a subject.[lxx]

By the 1950’s, Conant had concluded that there is no such thing as a singular intelligence or a singular measure of intelligence, but that people are endowed with what today would be called multiple intelligences, and there is no universal way to measure these aptitudes.  He also seemed to conclude that achievement tests are self-defeating and self-invalidating, presaging present-day concerns about standardized testing.  To be valid, an achievement test must be based on a random sample of knowledge from a generalized subject.  Standardized testing, however, leads schools to teach to the test, narrowing their curricula to the questions that are most likely to be asked on the test.  The results are that students no longer get the benefit of a general education and standardized tests no longer measure the general education they were intended to evaluate.  Students end up merely learning how to take the test and the test merely measures that ability.  While continuing to support testing as an adjunct method of evaluation, Conant became a proponent of what would today be called authentic assessment – observing students perform real world activities – as the best measure of aptitude and achievement.[lxxi]

GEFS and EAAY were essays in social democratic curricula.[lxxii]  Although GEFS was produced by an elite corps of professors and EAAY was produced largely by a group of schoolteachers, Conant claimed that the core recommendations of the two reports were essentially similar.[lxxiii]  Both proposed that schools focus on “life education” rather than merely the academic disciplines.  Both proposed that schools develop diversified curricula to meet the needs of diverse students and diversified extra-curricular activities to encourage students toward progressive social change.  And both proposed that the primary goal of education be “cultural literacy,” defining that goal in pluralistic and pragmatic rather than mono-cultural and absolutist terms.[lxxiv]  Cultural literacy is the understanding of different cultures through comparing and contrasting each with the others, transcending your own culture, and working with others toward common social goals.[lxxv]

GEFS argued that schools should help students transcend their everyday experiences and environments, deal with a diverse and changing world, and transform themselves and their society.  The report recommended a curriculum based on the “five fingers of education:” Language Arts, or transcending oneself through communication; Fine Arts, or self-transcendence through self-expression; Mathematics and Science, or transcending common sense through scientific methods; Social Studies, or transcending the here and now through history, geography and the social sciences; and the Vocations, or “putting into practice the bookish theory of the classroom.”[lxxvi]  While rejecting any standardized national curriculum, the report recommended a common core curriculum for students within each school so that every student “should be able to talk with his fellows…above the level of casual conversation” and students will be better able to organize themselves for social action.[lxxvii]

EAAY proposed to supplement the traditional academic curriculum with courses that start with everyday problems and then proceed to more complex intellectual issues, serving as an introduction and inducement to academic work by adapting the academic disciplines to everyday life.  Under EAAY, all students would take a “common learnings core” consisting of cultural education dealing with issues of family life, health, consumerism, and leisure, and citizenship education dealing with social problems, human rights and civic responsibilities.  All students would participate in community service to develop pro-social attitudes, vocational work to explore career choices, and political campaigns to “develop competence in political action.”  EAAY rejected educational tracking and ability grouping of students, proposing, instead, that students be placed in heterogeneous classes in which they would work on individual and group projects that reflect their varying interests and abilities.[lxxviii]

EAAY was intended as a proposal for continuous educational reform.  Calling for a “grass roots approach to improving programs in local schools,” the report proposed an ongoing series of community-school surveys of parents, teachers, students and community members that would help determine how schools operated and what should be included in the school curriculum.[lxxix]  The surveys asked adults what they thought they needed to know to be successful as adults, and asked children what they thought they needed to know to be successful as children.  This procedure was used with success in many school districts during the late 1940’s and early 1950’s.  It was a method of creating what today could be considered an authentic curriculum.[lxxx]

From the mid 1940’s through the early 1950’s, Conant vigorously campaigned in support of the proposals in GEFS and EAAY.  With his friend Dwight Eisenhower as a member of the EAAY board that Conant chaired,[lxxxi] Conant argued that education must focus on “the study of the economic, political, and social problems of the day” and promote the principles of liberal democracy.  To develop a social democracy, Conant insisted, you must have a social democratic educational system.[lxxxii]

With the advent of the Cold War and the McCarthy era in the early 1950’s, progressive educators came under withering attack as part of an overall assault on liberals and liberalism,[lxxxiii] and progressive education was maligned as an anti-intellectual and even subversive scourge on American education.[lxxxiv]  During this period, Conant never went back on his support for GEFS and EAAY and repeatedly cited them as curricular models.  But, under the pressure of the Cold War, he subordinated these proposals to arguments that the first priority of American schools must be “the education of their gifted students,” those who will become the scientists and leaders needed to defeat the Soviets.[lxxxv]

It was in this context that Conant produced his three best known reports on education, The American High School Today in 1959, Slums and Suburbs in 1961, and The Education of American Teachers in 1963.  They are empirical studies of social problems affecting schools – inadequate staffing and curricula in small schools, poverty and racial segregation in inner city schools, and inadequately educated teachers – in which Conant ties social policy recommendations to the progressive educational theories of GEFS and EAAY.  While the reports are distinctly elitist in tone, they emphatically reject the even more elitist proposals that were popular at that time.

In The American High School Today, Conant was concerned that Americans, particularly those in the middle and upper classes, suffered from delusions of self-sufficiency, unable to see the connections between themselves and other people, especially the less affluent.  As a result, Americans were often unable to see why they should support social institutions that benefit others, especially at expense to themselves.  Conant thought that heterogeneous classes and general education courses in comprehensive high schools would help remedy this problem.[lxxxvi]  He naively underestimated the invidious effects of social class and academic competition on school life, and overestimated the democratizing effects of heterogeneous homerooms and general education classes.  While school consolidation was widely undertaken during the 1960’s and 1970’s, few of the newly consolidated schools promoted the pluralism or adhered to the restrictions on ability grouping that Conant proposed.[lxxxvii]

In Slums and Suburbs, Conant rejected the idea that black schoolchildren were genetically inferior to whites in intelligence and called for expanded jobs programs, better social service programs, and greater spending on inner city schools.[lxxxviii]  In The Education of American Teachers, Conant repeated arguments he had previously made in favor of the MAT program, proposing that prospective teachers take a relatively equal number of courses in pedagogy and academic subjects, and insisting that in order to obtain permanent certification, teachers should demonstrate their knowledge of the social and cultural backgrounds of the students in their schools.[lxxxix]

While the tone of Conant’s educational writings changed during the Cold War, the substance did not.  As a top-down democrat, he consistently placed greater emphasis throughout his career on the education of higher achieving students than participatory democrats would, and this emphasis grew even greater during the 1950’s and 1960’s.  But Conant continued during this period to promote progressive principles of interdisciplinary and problem-solving curricula, student-centered teaching methods, pluralistic schools and heterogeneous classrooms, and greater equality in both educational opportunities and outcomes.

Conant’s Educational Legacy

Although Conant was widely considered to be successful at almost everything he did, he did not agree.  Commenting in 1977, a year before his death, on trends in politics and education, Conant complained that “Everything I’ve worked for has been rejected.”[xc]  He had good cause to lament.

Politically, liberal Republicans were a dying breed by the late 1970’s[xci] and Conant was having second thoughts about the Cold War.[xcii]  By his own standards, many of Conant’s actions during the Cold War were not exemplary.  He frequently said one thing privately while publicly doing the opposite, and orchestrated a massive campaign of deception in order to gain popular support for the government’s Cold War policies.  His behavior seems even more reprehensible to those, like the author of this article, who are old enough to have lived through the Cold War and who viewed it as Conant initially did in the 1940’s: that the Soviet Union did not pose a military threat sufficient to warrant an all-out arms race and the militarization of American foreign policy; and that domestic Communists were a relatively benign group – sometimes helpful in the labor and civil rights movements, sometimes harmful, mainly to themselves, in their blind support for the Soviet Union.  To those who see the breakup of the Soviet empire and the historical revelations of the last ten years as further confirmation of this view, the Cold War was a terrible mistake, a mistake for which Conant bears significant responsibility, as I think he began to see toward the end of his life.

In education, Conant was not doing much better.  Shopping mall high schools, racial segregation, standardized testing, traditional curricula and mechanical teaching methods – all of which he had opposed – were the norm in the late 1970’s, as they still are today.

In evaluating Conant’s failure to achieve his goals, or even sometimes to practice what he preached, I think that the major flaw in his theories and practices was his elitist concept of leadership.  Conant’s belief that social transformation comes from the top down, and his determination to stay within the policy elite at almost all costs, forced him into all sorts of theoretical and practical contradictions.  Conant expected from his Power Elite the sort of long-term thinking and pro-social consciousness that may have been realistic to expect from the Bob LaFollettes and Henry Wallaces of his youth, but was not very evident by the 1970’s and has not been since.  Conant’s peers let him down, but so did his premises.  There is strong reason to believe, and I would argue strongly, that the sorts of progressive educational and social reforms Conant wanted can only be achieved from the bottom up.[xciii]

I conclude, nonetheless, that Conant’s ideas can be purged of their elitist undertones and still resonate with progressive theorists today.  Among other things, General Education for a Free Society and Education for All American Youth remain two of the most interesting and promising proposals of the last hundred years.  Moreover, Conant faced during the 1940’s and 1950’s the same sorts of questions about school choice, privatization, tracking, standardized curricula and standardized testing that educators are facing today.  At a time when standardized curricula and testing have become the rage, and technology and quantification have become the standards for all knowledge, having opinions to the contrary from someone like Conant – a founder of ETS, world-renowned scientist, and Republican stalwart – constitutes important support for those who would buck the current trends.

Finally, Conant represents a time, not so long ago, when progressive reform was at least on the left bank of the mainstream, and a broad coalition of educators rallied around a common program of reform, even if from somewhat different perspectives – top-down for those like Conant, bottom-up for others.  A reconsideration of James Conant recovers a time when “radical” was even for many Republicans an ideal, “liberal” a term of praise, “conservative” a dirty word, and social democracy the goal of education.  This should be considered a valuable legacy for educational theorists today.

[i] Gordon Swanson, “The Hall of Shame,” Phi Delta Kappan, Vol.74, 10 (June 1993): 797.

[ii] James Conant, The Child, the Parent and the State (New York: McGraw Hill, 1959), 94; James Conant, Scientific Principles and Moral Conduct (Cambridge: Cambridge University Press, 1967), 37; James Conant, My Several Lives: Memoirs of a Social Inventor (New York: Harper & Row, 1970), 536; James Hershberg, James B. Conant: Harvard to Hiroshima and the Making of the Nuclear Age (New York: Alfred Knopf, 1993), 9, 13, 120, 512.

[iii] Daniel Tanner, Secondary Curriculum (New York: Macmillan Publishing Co., 1971), 17.

[iv] Diane Ravitch, The Revisionists Revisited: A Critique of the Radical Attack on the Schools (New York: Basic Books, 1978); Walter Feinberg, Harvey Kantor, Michael Katz & Paul Violas, Revisionists Respond to Ravitch (Washington, DC: National Academy of Education, 1980).

[v] For example, see Edgar Gumbert & Joel Spring, The Superschool & the Superstate: American Education in the Twentieth Century, 1918-1970 (New York: John Wiley & Sons, 1974), 40, 78-79, 137-139; David Tyack, The One Best System (Cambridge, MA: Harvard University Press, 1974), 276; Walter Feinberg, Reason and Rhetoric: The Intellectual Foundations of 20th Century Liberal Educational Policy (New York: John Wiley & Sons, 1975), 153-155; Paul Westmeyer, A History of American Higher Education (Springfield, IL: Charles Thomas, 1985), 102; Joel Spring, The American High School, 1642-1985 (New York: Longman, 1986), 287; Clarence Karier, The Individual, Society and Education (Urbana: University of Illinois Press, 1986), 255; Swanson, “The Hall of Shame,” 797-798; Peter Hlebowitsh & Kip Tellez, American Education: Purpose and Promise (Belmont, CA: West/Wadsworth, 1997), 257; Dean Webb, Arlene Metha & Forbes Jordon, Foundations of American Education (Upper Saddle River, NJ: Prentice-Hall, 2000), 220; but, to the contrary, some recent appreciations of Conant are: Fred Hechinger, “School for Teenagers: A Historic Dilemma,” Teachers College Record, 94, 3 (Spring 1993): 522-539; Jurgen Herbst, The Once and Future School: Three Hundred and Fifty Years of Secondary Education (New York: Routledge, 1996), 181; John Brubacher & Willis Rudy, Higher Education in Transition (New Brunswick, NJ: Transactions Press, 1997), 424-426.

[vi] Larry Cuban, “Managing the Dilemmas of High School Reform,” Curriculum Inquiry, 30, 1 (Winter 2000): 106.

[vii] James Conant, The American High School Today (New York: McGraw Hill, 1959).

[viii] Hyman Rickover, Education and Freedom (New York: E.P. Dutton & Co., 1959).

[ix] Conant, American High School Today, 37, 63.  He later raised his proposed minimum school population to 750 students in James Conant, The Comprehensive High School (New York: McGraw Hill, 1967), 2.

[x] Conant, American High School Today, 20, 37, 46, 48, 63.

[xi] Conant, Education in a Divided World, 234.

[xii] See Benjamin Barber, A Passion for Democracy (Princeton, NJ: Princeton University Press, 1998), 95-110; Benjamin Barber, Strong Democracy: Participatory Politics for a New Age (Berkeley, CA: University of California Press, 1984).

[xiii] Barber, Strong Democracy, XIV; Barber, A Passion for Democracy, 6, 10.

[xiv] James Conant, Slums and Suburbs (New York: McGraw Hill, 1961), 109.

[xv] Paul Bartlett, “James Bryant Conant,” Biographical Memoirs (Washington, DC: National Academy Press, 1983), 107.

[xvi] Quoted at Hershberg, James B. Conant, 578.

[xvii] James Conant, Two Modes of Thought: My Encounters with Science and Education (New York: Trident Press, 1964), 13-14.

[xviii] Conant, My Several Lives, 10; Hershberg, James B. Conant, 13.

[xix] Gregory Baker, Religion and Science: From Swedenborg to Chaotic Dynamics (New York: Solomon Press, 1992), 13-14, 21-25; Inge Jonsson, Emanuel Swedenborg (New York: Twayne Publishers, 1971), 14, 40, 72, 79.

[xx] Conant, My Several Lives, 15, 19; Hershberg, James B. Conant, 27.

[xxi] Conant, My Several Lives, 44; Hershberg, James B. Conant, 38-39, 48-49.

[xxii]  Martin Saltzman, “James Bryant Conant and the Development of Physical Organic Chemistry,” Journal of Chemical Education, 49, 6 (June 1972): 411; Hershberg, James B. Conant, 55-56.

[xxiii] Harry Passow, American Secondary Education: The Conant Influence (Reston, VA: National Association of Secondary School Administrators, 1977), 3; George Kistiakowsky, “James B. Conant, 1893-1978,” Nature, 273, 5665 (June 29, 1978): 793.

[xxiv] James Conant, Germany and Freedom (Cambridge, MA: Harvard University Press, 1958), 26-27; Conant, My Several Lives, 140-145, 373.

[xxv] James Conant, On Understanding Science: An Historical Approach (New Haven, CN: Yale University Press, 1947), XII; Conant, My Several Lives, 236, 242, 272, 274, 298; Hershberg, James B. Conant, 157, 170, 325.

[xxvi] James Conant, Anglo-American Relations in the Atomic Age (London: Oxford University Press, 1952), 17-18; James Conant, Modern Science and Modern Man (New York: Columbia University Press, 1952), 12-16.

[xxvii] Stefan Morawski, The Trouble with Postmodernism (London: Routledge, 1996), 2; Stanley Grenz, A Primer on PostModernism (Grand Rapids, MI: William B. Eerdmans Publishing Co.), 7, 34, 40-46.

[xxviii] Conant, On Understanding Science, 25, 36, 91; Conant, Science and Common Sense, VIII; Hershberg, James B. Conant, 410, 860-footnote 84; Thomas Kuhn, The Copernican Revolution (New York: Vintage Books, 1957), IX.

[xxix] Conant, On Understanding Science, 11-12; Conant, Science and Common Sense, 8, 10, 15, 25-26; Conant, Modern Science and Modern Man, 19, 22, 23, 54, 58, 62; Conant, Two Modes of Thought, 13, 14,15-17, 18, 83; Conant, Scientific Principles and Moral Conduct, 15-16, 25; Philip Kitcher, “A Plea for Science Studies,” in A House Built on Sand: Exposing Postmodernist Myths About Science, ed. Noretta Koertge (New York: Oxford University Press, 1998), 34, 36; Bruno Latour, Science in Action (Cambridge, MA: Harvard University Press, 1987), 2-4, 12.

[xxx]  S. Morawski, The Trouble with Postmodernism (London: Routledge, 1996), 2; Richard Rorty, Consequences of Pragmatism (Minneapolis: University of Minnesota Press, 1982), XXXIX; P. Feyerabend, Against Method (New York: Verso, 1988), 189.

[xxxi] William James, “The Will to Believe,” in Essays on Faith and Morals, ed. R.B. Perry  (Cleveland: World Publishing Co., 1962), 32-62.

[xxxii] Conant, On Understanding Science, 30; Conant, Science and Common Sense, 10, 17, 30-31; Conant, Modern Science and Modern Man, 82, 88; Conant, Two Modes of Thought, 33; Conant, Scientific Principles and Moral Conduct, 38.

[xxxiii] James Conant, Science and Common Sense (New Haven, CN: Yale University Press, 1951), 25-26; Conant, Modern Science and Modern Man, 54, 62; James Conant, Scientific Principles and Moral Conduct (Cambridge: Cambridge University Press, 1967), 8, 29.

[xxxiv] Conant, Anglo-American Relations in the Atomic Age, 32, 37-38; Conant, Scientific Principles and Moral Conduct, 34-35; John Gribbin, Almost Everyone’s Guide to Science (New Haven, CN: Yale University Press, 2000), 45-47.

[xxxv] Conant, On Understanding Science, 22; Conant, Science and Common Sense, 38-39; Conant, Two Modes of Thought, 82-83; Garvin McCain & Erwin Segal, The Game of Science (Belmont, CA: Brooks/Cole Pub. 1969), 80.

[xxxvi] Conant, Modern Science and Modern Man, 66-67.

[xxxvii] Conant, On Understanding Science, 11; Conant, Modern Science and Modern Man, 66-67.

[xxxviii] Conant, Anglo-American Relations in the Atomic Age, 17-18, 23; Conant, Modern Science and Modern Man, 12-13, 16, 30.

[xxxix] Hershberg, James B. Conant, 466, 474, 482.

[xl] Hershberg, James B. Conant, 82, 93, 325, 404.

[xli] Conant, Modern Science and Modern Man, 6.

[xlii] Conant, My Several Lives, 11; Hershberg, James B. Conant, 14.

[xliii] James Conant, Education in a Divided World: The Function of Public Schools in Our Unique Society  (Cambridge, MA: Harvard University Press, 1948), 30-31, 172-173, 178.

[xliv] James Conant, “Wanted: American Radicals,” The Atlantic Monthly, 171, 5 (May 1943) 43; Conant, Education in a Divided World, 4-7; James Conant, Germany and Freedom (Cambridge: Harvard University Press, 1958), 67-69.

[xlv] Conant, “Wanted: American Radicals;” D.W. Brogan, Politics in America (Garden City, NY: Doubleday & Co., 1960), 37-54; Clinton Rossiter, Parties and Politics in America (Ithaca, NY: Cornell University Press, 1960), 107-151; Joyner, The Republican Dilemma; Rae, The Decline and Fall of the Liberal Republicans.

[xlvi]  Conant, The Child, the Parent and the State, 102.

[xlvii] Conrad Joyner, The Republican Dilemma: Conservatism or Progressivism (Tucson, AZ: University of Arizona Press, 1963); Nicol Rae, The Decline and Fall of the Liberal Republicans: From 1952 to the Present (New York: Oxford University Press, 1989).

[xlviii]  Conant, My Several Lives, 41, 68-69, 71; Hershberg, James B. Conant, 38, 42, 61.

[xlix]  Conant, Germany and Freedom, 4.

[l] Conant, My Several Lives, 212, 308, 320-322.

[li] Conant, My Several Lives, 49; Hershberg, James B. Conant, 120.

[lii] Conant, My Several Lives, 364, 374-381.

[liii] Conant, “Wanted: American Radicals;” James Conant, General Education in a Free Society (Cambridge, MA: Harvard University Press, 1945), 34.  Conant claimed for himself a somewhat larger role in GEFS than some historians have described for him.  For purposes of this article, the exact extent of Conant’s role in producing the report is not as important as the fact that he continuously thereafter supported the recommendations of the report.  Hershberg, James B. Conant, 236.

[liv] Conant, My Several Lives, 300.

[lv] Conant, Education in a Divided World, 21, 24, 218.

[lvi]  Conant, Education in a Divided World, 172-173; Hershberg, James B. Conant, 435.

[lvii]  Quoted at Hershberg, James B. Conant, 322.

[lviii] Conant, The Child, the Parent and the State, 33.

[lix]  Hershberg, James B. Conant, 360, 462.

[lx]  Conant, My Several Lives, 506, 509, 512; Hershberg, James B. Conant, 384, 390, 493, 498, 521, 674.

[lxi]  Conant, My Several Lives, 456; Hershberg, James B. Conant, 431, 435.

[lxii]  Conant, My Several Lives, 640-642; Hershberg, James B. Conant, 746, 751.

[lxiii] Hershberg, James B. Conant, 752.

[lxiv]  Hershberg, James B. Conant, 82, 89, 93, 276, 404.

[lxv] Conant, My Several Lives, 137, 189.

[lxvi]  James Conant, “Education for a Classless Society,” The Atlantic Monthly, 165, 5 (May 1940): 596.

[lxvii] Conant, My Several Lives, XV-XVI.

[lxviii] James Conant, The Education of American Teachers (New York: McGraw Hill, 1963),1-2; Conant, My Several Lives, 181, 185.

[lxix] Conant, My Several Lives, 417, 419.

[lxx] Conant, My Several Lives, 417, 424, 432; Nicholas Lemann, The Big Test: The Secret History of the American Meritocracy (New York: Farrar, Straus & Giroux, 1999), 3.  Conant claimed for himself a bigger role in the founding of Educational Testing Service than described by Lemann in his seminal book.  For purposes of this article, the exact extent of Conant’s role is not as important as the fact that Conant initially supported standardized testing and then questioned it, citing what he considered to be progressive educational principles in both cases.

[lxxi]  Conant, The American High School Today, 62; Conant, My Several Lives, 419; Robert Hampel, “The American High School Today: James Bryant Conant’s Reservations and Reconsiderations,” Phi Delta Kappan (May 1983): 608-609; Lemann, The Big Test, 38, 78-79, 228.

[lxxii] James Conant, “America Remakes the University,” The Atlantic Monthly, 177, 5 (May 1946): 41-45; Patricia Graham (New York: Teachers College Press, 1967), 136.

[lxxiii] Conant, General Education in a Free Society; Educational Policies Commission, Education for ALL American Youth (Washington, DC: National Education Association, 1944); Paul Elicker, Planning for American Youth (Washington, DC: National Association of Secondary School Principals, 1951); Conant, Education in a Divided World, VII.  While some historians have characterized General Education in a Free Society as a conservative defense of the traditional academic disciplines – for example, Paul Westmeyer, A History of American Higher Education (Springfield, IL: Charles Thomas, 1985), 102 – most have described it as a progressive proposal for interdisciplinary and student-centered education – for example, Daniel Tanner & Laurel Tanner, Curriculum Development: Theory into Practice (New York: Macmillan, 1980), 445.

[lxxiv] Conant, General Education in a Free Society, IX, 135; Educational Policies Commission, Education for ALL American Youth, 21, 102, 225-226; Elicker, Planning for American Youth, 19.

[lxxv]  Conant, General Education in a Free Society, 4, 58; Conant, My Several Lives, 366, 368.

[lxxvi]  Conant, General Education in a Free Society, 10, 32, 33, 118, 128, 139, 153, 171.

[lxxvii] Conant, General Education in a Free Society,  33, 77, 114, 171, 192.

[lxxviii] Educational Policies Commission, Education for ALL American Youth, 71-71, 85-87, 234-238, 299; Elicker, Planning for American Youth, 8-9, 19; Harold Hand, “The World Our Pupils Face,” Science Education, 31, 2 (Summer 1947): 55-60; Harold Hand, “The Case for the Common Learnings Course,” Science Education, 32, 1 (Spring 1948): 5-11.

[lxxix] Educational Policies Commission, Education for ALL American Youth: A Further Look (Washington, DC: National Educational Association, 1952), 88-89, 380.

[lxxx] Harold Hand, “Local Studies Lead to Curriculum Change,” Educational Leadership, 8 (January 1951): 240-243; Harold Hand, “Making the Public School Curriculum Public Property,” Educational Leadership, 10 (January 1953), 261-264.

[lxxxi] Educational Policies Commission, Education for ALL American Youth: A Further Look, V.

[lxxxii] Conant, Education in a Divided World, VII, 100, 106, 110.

[lxxxiii] Cremin, The Transformation of the School, 348-351.

[lxxxiv] Mortimer Smith, And Madly Teach (Chicago: Henry Regnery Co., 1949), 90; Albert Lynd, Quackery in the Public Schools (Boston: Little, Brown & Co., 1950), 35; Arthur Bestor, Educational Wastelands (Urbana, IL: University of Illinois Press, 1953), 81-100.

[lxxxv] James Conant, The Citadel of Learning (New Haven, CN: Yale University Press, 1956), V, 40, 42; Conant, The Child, the Parent and the State, 16, 34, 48, 76, 94; Conant, Slums and Suburbs, 136, 140; Conant, The Education of American Teachers, 6; James Conant, Shaping Educational Policy (New York: McGraw Hill, 1964), 4, 21-24.

[lxxxvi]  James Conant, Thomas Jefferson and the Development of American Public Education (Berkeley, CA: University of California Press, 1962), 61; Conant, The American High School Today, 7.

[lxxxvii]  Landon Beyer, “The American High School Today: A First Report to American Citizens,” Educational Studies, 27, 4 (1996-1997): 319-337.

[lxxxviii]  Conant, Slums and Suburbs, 3, 4, 12, 36-37, 39; Hershberg, James B. Conant, 726-727.

[lxxxix] Conant, The Education of American Teachers, 7-8, 15, 71, 113.

[xc] Quoted at Hershberg, James B. Conant, 754.

[xci] Rae, The Decline and Fall of the Liberal Republicans, 155-156.

[xcii] Hershberg, James B. Conant, 752.

[xciii] For example, John Goodlad, Educational Renewal (San Francisco: Jossey-Bass, 1994). Goodlad, nonetheless, regards Conant as “one of my mentors” (29).  Also, Barber, Strong Democracy.

A Note on “History as Choice” Gone Wrong: Taking Choices Out of Context and Blaming the Victim.

Burton Weltman

The idea of looking at history as people making choices, i.e., history as choice, has gained some popularity in recent years as an alternative to conventional historical narratives.  Conventional historical narratives, and especially history textbooks, generally approach history as a process of causation, a series of causes and effects in which one thing seems inevitably to lead to the next and in which human choice seems to have little play.  In this view, people seem to be merely cogs in a big historical machine.

Promoters of history as choice, including myself, seek to humanize history by focusing on the role that people and their choices play.  In emphasizing the drama of people debating options and making decisions, this approach makes history more interesting to students.  In relating the decision-making processes of people in the past to the social problems and choices we face in the present, this approach also makes history more relevant.

But, as is often the case with intellectual and cultural developments when they are popularized, the idea of history as choice has been diluted and misdirected by some of its practitioners.  In turn, there has been some backlash against the idea of history as choice from people who are reacting against the misconceptions of those practitioners.  In particular, representatives of historically oppressed peoples have objected to history as choice as essentially a way of blaming the victims for their oppression.  They complain that the implication of this approach is that, since history is a result of people making choices, oppressed peoples must have chosen to be oppressed or made the choices that led to their oppression.  Oppressed peoples are, thereby, responsible for their own oppression.

The idea that the poor and oppressed are responsible for their own problems is an old one that dates back to ancient times and recurs periodically in history.  In modern times, the idea resurfaced in the population theories of Thomas Malthus during the early nineteenth century and then in the Social Darwinian theories of Herbert Spencer and others during the late nineteenth century.  The idea regained impetus during the late twentieth century in the United States largely through the work of Edward Banfield in his influential book The Unheavenly City Revisited (1974), and through his successors.

Blaming the poor and the oppressed for their problems is not the purpose of approaching history as choice.   Properly understood, this method has the opposite effect.  Approaching history as choice requires one to delineate the feasible options that people had within the circumstances that those people faced.  Oppressed peoples do not have unlimited options and unlimited resources.  Like everyone else, they have to work within the circumstances in which they find themselves.  The key to approaching history as choice is to look at what people did with the options and the resources they had.  This is a means of humanizing them and seeing the oppressed as not merely victims of their circumstances but also creators of culture and history within their circumstances.

One of the most amazing stories in American history is the way in which African-American slaves created a thriving culture within the restricted circumstances in which they lived.  The moral of their story is not that African-Americans were to blame for their enslavement or that their oppression was somehow a good thing, but that they were able to make profound meaning and beauty in spite of their oppression.  They made great choices within the limited range of feasible options and with the limited resources they had.

So, the problem is not with approaching history as choice but rather with failing to consider the context within which people made their choices.  This has been the modus operandi of Social Darwinians and others who blame the poor and oppressed for being poor and oppressed.  And this has been the implication, often unintended, of some of those who have been promoting history as choice.  But this is not the method as it has been intended.  The solution to this problem is not to abandon the method of history as choice but to apply it properly.

Limiting the Sum of Lincoln’s “Some:” Democracy, Mobocracy and Majority Rule.

Burton Weltman

Abraham Lincoln and Democracy as a Fools’ Paradise? 

Abraham Lincoln famously said that you can fool all of the people some of the time and some of the people all of the time, but you cannot fool all of the people all of the time.  This is, I think, a great formulation of political wisdom, and the conclusion that you can’t fool everybody all of the time has been taken as an expression of hope for the world.  Demagogues and shysters may have their way for a while, Lincoln conceded, but we cannot all be fooled by them all of the time.  So, his is a statement of hope for democracy, but it is also a statement of concern.  Lincoln’s underlying concern lies in the word “some” that appears in the first two clauses of his statement, that you can fool everyone some of the time and you can fool some of the people all of the time.  The word “some” in these clauses raises the questions of how long you can fool all of the people, and what proportion of the population can be perpetually fooled.  Most important, Lincoln leaves us with the question of how we can minimize the sum of these “somes.”  How can we minimize the number of people who are fooled and the duration of their being fooled?  That was one of Lincoln’s main concerns for American democracy, and it remains a concern for us today.

To be foolish is to think that you are expert in something that you know little about, and to think that you are wise in matters in which you lack good sense.  It is a failing to which most of us all-too-easily fall prey.  Most people are knowledgeable about things with which they are regularly involved, their jobs for example.  People are generally willing and able to think about such things in complex terms and to reach nuanced conclusions.  At the same time, most people know very little about things with which they are not regularly involved, for example politics and government.  But that does not stop them from thinking that they need to know all about those things, or that they do.  What most people do in that situation is to insist that things about which they are ignorant are really quite simple, and to latch onto some simplistic slogans that supposedly embody the truth about those things.  With respect to government and politics, for example, they may latch onto Fox News talking points.

The tendency of people to think simplistically about politics and government is a problem in a democracy such as ours.  As citizens of a democracy, we are all supposed to participate in choosing our government, deciding what actions it should take, and monitoring its performance.  Few people, however, actually know anything about how government works.  They don’t study it, and they don’t have government jobs that might familiarize them with the workings of government.  Their personal contacts with government are minimal and usually involve unpleasant matters, such as paying taxes and traffic tickets and complaining when something goes wrong.  At the same time, most people take for granted the services that government routinely and regularly provides, such as clean water, paved roads, street lights, etc.  They assume those things are simple, and many don’t even realize they are a product of government.  It is this ignorance of how government works, coupled with the arrogance of thinking that government is simple, that leaves people open to being fooled by demagogues and shysters who preach bumper sticker slogans and sell simplistic half-truths.

How can democracy function with a public that can be fooled in total some of the time and in part all of the time?  That is the gist of Lincoln’s concern.

Edmund Burke and the Conservatives’ Case against Democracy. 

Edmund Burke, who is generally considered the father of modern conservatism, believed that the general public was not capable of playing a constructive role in government.  That is why he favored an aristocratic and elitist form of government over democracy.  Cultivate an elite class of leaders, Burke claimed, give them the reins of government, and all will be well.  Give power to the people, and you will end up with what we might today call a mobocracy: violent and oppressive rule by the ignorant masses.  Burke warned that the ignorance and impatience of the masses would leave them open to demagoguery, the simplistic sloganeering of malicious leaders who appeal to people’s fears and hatreds.  Subservience to demagogues would lead the masses to violence and society to ruin.  In this context, Burke railed against the mobocracy that he claimed was destroying France during the French Revolution of the late eighteenth century.

Burke was particularly opposed to majority rule.  Majority rule, he warned, can quickly devolve into mobocracy in which racial, religious, ethnic and political majorities oppress minorities, impose the majority’s ideas on everyone else, and give no one else a chance for power.  This is a significant issue for us today.  We have seen in recent years many countries around the world where dictatorships have been succeeded by mobocracies which, in turn, may soon be succeeded by new dictatorships.  This was a vicious cycle of dictatorship and mobocracy that worried many of the ancient Greeks, and that worried Burke about democracy.

Burke was also concerned about destructive social changes that he thought would inevitably result from democracy.  The masses, he warned, would be misled by ideologues and demagogues into supporting first one and then another futile radical reform.  This is also a significant issue for us today.  Burke claimed that the ability of humans to predict and control the consequences of their actions was limited, and that the potential for unintended negative consequences outweighed the potential for positive consequences in any radical reform.  Burke was, therefore, unwilling to support radical reforms that proposed to make things better because he believed the resulting unintended negative consequences were likely to make things worse.  He was not, however, opposed to all social change, and he was willing to support modest social reforms that were intended to ameliorate hardship and keep a bad situation from getting worse.

Burke was a pragmatic conservative.  He was not a right-wing ideologue of the sort that today parades as conservative but is actually radically reactionary and regressive.  Self-styled Tea Party conservatives, religious conservatives, and social conservatives of today want to radically change society back to ways that they believe were better in days long past.  They are not conservatives but radical right-wingers, ideologues and demagogues of the sort Burke feared would dominate in a democracy.  Burke would likely cite their popularity as proof of his arguments against democracy.  Burke and conservatives since his time have generally claimed that liberty and equality, and freedom and democracy, are distinct and incompatible.  Liberty, freedom, minority rights and social stability are safe, they say, only in the hands of an entrenched and enlightened elite.  In a democracy, demagogues of the left and right will invariably take advantage of the masses to destroy liberty and to wreak havoc on society.

The Founders’ Democratic Faith.

The Founders of this country agreed with Burke about the dangers of majority rule, but they rejected his dismissal of democracy.  Democracy as they envisioned it, and embodied it in the Constitution, is more properly conceived of as majority rule with minority rights, the most important of these rights being the right of the minority to someday become the majority.  Guaranteeing the right of minorities to function freely and possibly someday become the majority was a key concern of the Founders, and it is the underlying rationale for the Bill of Rights.  While the Founders believed in government by the best and brightest, they also believed that ordinary people could and would recognize and choose the best and brightest as their leaders.  And they believed that the constitutional system they had established provided an institutional framework for successfully combining liberty and democracy.

At the same time, the Founders did not think that the history of democracy in the United States would be an easy progression.  They anticipated a perpetual struggle over the terms and practices of democracy, and they provided in their Constitution for the ways and means of amendment and interpretation that would embody that struggle in what they hoped would be peaceful conflict.  Things did not always work the way the Founders hoped, as exemplified by the Civil War, the various Red Scares and other episodes of intolerance and oppression of minorities in our history.  But as the Founders expected, there has been an ebb and flow of liberty and democracy in the history of this country.

Conventional History, History as Choice and the Case for and against Democracy.

You would not, however, get from conventional histories of the United States the impression that democracy has been a highly contested term and a precarious practice.  To the contrary, while conventional histories generally admit that the country has moved from being less democratic to being more democratic, they generally describe American history as an inevitable march toward democracy.  And in so doing, most histories simplistically define democracy as majority rule, and describe the rise of American democracy simplistically in terms of the rise of majority rule.  This one-dimensional definition and one-dimensional narrative not only make for bad history; they make for undemocratic history.

The problem is that conventional history is essentially winners’ history.  It focuses almost solely on what happened, and leaves out what could have happened.  It provides you with little idea of what the various options were in any given situation, why a given option was chosen over the others, and what might have happened if a different option had been chosen.  The losers are lost in this sort of history.  The minorities who did not prevail are dismissed.  This sort of history does not portray the struggle over democracy, or the struggles within a democratic system.  It provides people with little education in how politics and government actually work, and how they might go about making choices as democratic citizens.  And it plays into the sort of one-dimensional ideological narratives promoted by right-wing and left-wing demagogues.

History as choice resurrects the options, debates, choices, consequences (both intended and unintended), and alternative possibilities of the past, and portrays a multi-dimensional past that looks and feels like the multi-dimensional present.  In so doing, it recognizes the importance of the losers in history, the minorities who may eventually become the majority, sometimes for the better, as in the case of the civil rights advocates who lost in the 1860’s-1870’s but came back to win during the 1960’s-1970’s, and sometimes for the worse, as with laissez-faire economics, which was completely discredited and discarded during the early twentieth century but has regained credibility and influence during the early twenty-first century.  Approaching history in this way helps you to understand how government and politics work, and helps prepare you for the choices that you need to make as a democratic citizen today.

History as Choice and Limiting the Sum of Lincoln’s “Some.” 

But history as choice is not simple or simplistic, and most people do not have the time or energy to engage in complex historical studies.  That brings us back to Burke’s challenge to democracy.  How can we expect ordinary people to be informed and intelligent citizens?  How can we minimize the sum of fooled “somes” that worried Lincoln?  What is to be done?  I can suggest at least three things:

First, I think that those of us who are students and teachers of history have an obligation to approach history in ways that I characterize as approaching history as choice.  This method is by no means unique to me.  Even the phrase “history as choice” is used by many others, and is the way that most students and teachers approach history when they are doing what they consider their best work.  Studying history can and should be practice for living in a democratic society.

Second, I think that we all have to recognize and acknowledge that we cannot know everything about all the things we want and need to know about.  We have to rely for most things in this world on the experience and expertise of others.  That does not mean we have to depend on a permanent class of elite leaders as Burke would have us.  The democratic alternative to elite leadership is revolving leadership in which those in the know on a given issue can and will be allowed to take the lead on that issue.  But those who lead on one issue may not necessarily lead on the next issue.  On the next issue, different leaders may emerge.

In this context, possibly the most important question that we all have to answer in deciding most issues is “Whom do we trust?”  And a key to answering that question is to look for people to trust who neither claim that things are simple nor provide simplistic answers.  You would not rely on a doctor who merely mouthed the advertising slogans for medicines that you see on TV.  So why rely on politicians who do likewise?  We should look, instead, for people who approach problems pragmatically rather than dogmatically, deal with them in depth, and provide more than bumper sticker solutions.  Studying history as choice can help us develop the skills needed to identify those who can be trusted to lead on important issues.

Third, while Franklin Roosevelt may have exaggerated when he claimed that “the only thing we have to fear is fear itself,” the politics of fear and hate is the province of demagogues who appeal to our lowest emotions in order to fool and manipulate us.  We must, instead, look for ideas, policies and leaders that emphasize the politics of hope, inclusion and cooperation.  Studying history as choice can help us to distinguish the politics of hope from the politics of fear.  We must understand how and why bad options are sometimes chosen and good options ignored, and vice versa.  We must understand how and why bad people sometimes flourish, but other times good people do.  We must explore the ways in which losers sometimes become winners, and winners become losers.  This is the ebb and flow of our daily lives and of history, and these studies can help us distinguish between what we should fear and what we can hope for and trust.  Yes, there are things to fear, but fear is the worst of these things, and the thing we should fear the most.  That we can overcome our fears and our ignorance is the hope that Lincoln left us with, and the best way to meet Burke’s challenge to democracy.

What do you think?

Postscript:  For further discussion of history as choice and democracy see my book Was the American Revolution a Mistake? Reaching Students and Reinforcing Patriotism through Teaching History as Choice (AuthorHouse, 2013).

Would the United States still have slavery if the South had not seceded in 1861? Part III. Conclusion: Very likely.

Burton Weltman

Slavery had been on the decline in the Western Hemisphere during the late eighteenth and early nineteenth centuries.  It was abolished in Haiti during the revolution that began there in 1791, and in Canada beginning in 1793.  And as Latin American countries gained their independence from Spain during the early nineteenth century, they abolished slavery: Argentina in 1813; Peru in 1821; Chile, Ecuador, Colombia, Panama, and Venezuela in 1823; Guatemala, El Salvador, Honduras, Nicaragua, and Costa Rica in 1824.  Mexican revolutionaries proclaimed the abolition of slavery in Mexico in 1810, and slavery was officially abolished there in 1829, although the practice continued illegally in the area of Mexico that became Texas.  Britain abolished slavery in her colonies during the 1830’s.

But slavery still thrived during the mid-nineteenth century in Brazil, by far the largest holder of slaves in the New World, and in Cuba.  And slavery expanded in Ecuador, Peru, Colombia and Central America after 1850 in the midst of a boom in those countries in the growing and processing of rubber.  Various forms of involuntary servitude were also widely practiced in India, China and the Middle East throughout the nineteenth century.  When the South seceded from the Union in 1860-1861, slavery was still a going concern in the United States and elsewhere in the world, and it might have persisted longer and spread farther but for the consequences of the Civil War.

The abolition of slavery in the United States had a profound effect on the history of slavery in the world.  If slavery had not been abolished here during the 1860’s, the United States would have emerged during the late nineteenth century as the world’s largest economy, the world’s largest and leading democracy and the world’s leading slave-holding country.  The power and prestige of the United States could have given the institution of slavery a legitimacy and impetus that could have carried the institution into and through the twentieth century.

It cannot be assumed that the development of democracy in the United States during the twentieth century, including the right to vote for women, would somehow have led to the end of slavery.  Slavery has existed alongside democracy in several societies in the world, including ancient Athens as well as the early United States.  It has even been argued that the emergence of democracy in both of those societies was a product of slavery.  Slaves performed the societies’ demeaning tasks which enabled the free men to associate with each other on the relatively equal terms necessary for democracy.

Nor can it be assumed that the industrialization of the North during the late nineteenth century was incompatible with slavery in the South.  The industrialization of the North during the early nineteenth century had been perfectly compatible with slavery in the South and even depended to some extent on slavery.  Southern slaves produced cheap cotton that was manufactured into cloth and clothes by free northern workers.  This sort of division of labor could have continued.  It also seems likely that slaves could have been used as factory labor in an industrializing South and, given the potential effects of the Dred Scott decision which seemed to have opened the whole country to slavery, possibly even in the North.

Nor, finally, can it be assumed that the refinement of morals and manners that occurred in the United States during the twentieth century would somehow have produced an environment incompatible with the continuance of slavery.  Americans and people elsewhere have been all too able to compartmentalize their high-toned feelings and their low-life prejudices.  I am reminded, although it is an extreme case, of the Commandant of Auschwitz, who was able to record the noblest thoughts about his family, friends and flowers in his diary alongside statistics and comments about his day’s work exterminating human beings.

There were thirty-four states in the United States in 1860, of which fifteen were slave states.  It takes the support of three-quarters of the states to approve a Constitutional Amendment.  Eleven southern slave states seceded to form the Confederate States of America.  In their absence, anti-slavery northerners mustered enough votes in Congress and among the remaining states to ratify the 13th Amendment and abolish slavery.  When the Confederacy lost the war, the Confederate states were required to ratify the 13th Amendment as a condition of regaining their rights and powers as members of the Union.

The bottom line is that if slaveholders in the South had not made what was for them a disastrous blunder in seceding from the Union in 1860-1861, the votes in Congress and among the states to abolish slavery would not have been there during the late nineteenth century and might still not be there today.  There are fifty states today and the negative votes of fifteen slave states would still be more than enough to squelch an amendment to abolish slavery.  In any case, the United States would almost certainly have entered the twentieth century as the world’s leading superpower with slavery as a thriving institution in an otherwise democratizing society.  And might still be today.
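The amendment arithmetic behind this conclusion is easy to verify.  Here is a minimal sketch in Python, using only the state counts given above (the function name is mine, for illustration):

```python
import math

def blocking_bloc(total_states, opposed):
    """Ratifying an amendment takes 3/4 of the states, so any bloc
    holding more than 1/4 of them can block ratification."""
    needed = math.ceil(total_states * 3 / 4)   # states required to ratify
    available = total_states - opposed         # most yes-votes possible
    return available < needed                  # True if the bloc blocks

# 1860: 34 states, 15 of them slave states
print(blocking_bloc(34, 15))   # True: 26 needed, only 19 available
# Today: 50 states, with the same 15 voting no
print(blocking_bloc(50, 15))   # True: 38 needed, only 35 available
```

In both cases the fifteen-state bloc holds more than the one-quarter of states needed to block an amendment, which is the essay's point.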

Note: This issue is discussed at greater length with citations and quotations in the chapter entitled “Choice #9: The Coming of the Civil War: Why Didn’t the North Secede and Why Did the South?” of my recently published book Was the American Revolution a Mistake? Reaching Students and Reinforcing Patriotism through Teaching History as Choice (AuthorHouse, 2013).

Would the United States still have slavery if the South had not seceded in 1861? Part II: Why did the South secede?

Burton Weltman

Conventional history has it that secessionist sentiment was rampant in the South during the 1850’s and that the election of Lincoln in 1860 was the straw that broke the camel’s back and led to a secessionist stampede.  This was not so.  Secession was not popular in the South before the attack on Fort Sumter in April, 1861 that began the Civil War.  Prior to that attack, the great majority of slave states had rejected secession and even within those states that had seceded following Lincoln’s election, large minorities of white people, and possibly even majorities, opposed secession.

Most slave owners in the South felt comfortable with the political and economic situation in 1860.  Lincoln had won the presidential election with only 40% of the vote, with 60% going to pro-slavery candidates.  Congress was effectively stalemated between pro-slavery and anti-slavery members.  The Fugitive Slave Act and the Dred Scott decision were the law of the land, and there was very little chance of these laws being changed in the foreseeable future.  This seemed especially the case since “Cotton was King” and the North was economically dependent on Southern trade.  Most slave owners felt that the North needed the slave South economically.  They also felt that the South needed the North to help control and contain the slaves.  For most southern supporters of slavery, including prominent figures such as Alexander Stephens, who later became the vice-president of the Confederacy, the Union was slavery’s best protection.

So, how did it happen that almost the whole slave South seceded by the spring of 1861?

A relatively small but very vocal group of southern “Fire Eaters,” led by Robert Barnwell Rhett and James Hammond of South Carolina and William Lowndes Yancey of Alabama, was convinced that the North was out to abolish slavery and that if the South did not get out of the Union soon, it would be too late.  Comparing their situation to that of the colonies before the American Revolution, and taking a position that mixed overwrought fear with unfounded self-confidence, they promoted secession during the 1850’s, and especially after the election of 1860, as a preemptive strike to forestall the tyranny of the North before it could get started.

As bad as political developments of the 1850’s seemed to anti-slavery northerners, they seemed worse to southern Fire Eaters, almost as though the two groups were living in alternate universes and were not experiencing the same events.  From the Fire Eaters’ perspective, the pattern of significant events of the 1850’s had begun with the acrimonious debate over the Wilmot Proviso, which was intended to prohibit slavery in any new territories, had proceeded with the formation in 1854 of the Republican Party, which was dedicated to restricting and maybe even ending slavery, and had culminated in John Brown’s terrorist raid on Harper’s Ferry in 1859, which was intended to start a bloody slave revolution.  The election of Lincoln in 1860 was seen as a sign they must make a move to save slavery through secession before it was too late.

Although the North had not yet done anything to overturn slavery, and was in no position to do so, the Fire Eaters stirred fears in white southerners that the North was growing faster than the South and would eventually overwhelm it.  They warned that northerners were continuously agitating among the slaves, promoting runaways and provoking rebellions.  They complained about northern assistance to runaway slaves and, ironically, thereby helped publicize the Underground Railroad to potential runaways.  Essentially feeding their own fears while trying to provoke the fears of their southern white compatriots, Fire Eaters reinforced the conclusion with which they had started: that the South must make a pre-emptive move to secede.

Fire Eaters were also afraid of the potential spread of abolitionism among southern whites if they stayed within the Union.  Most southern whites were hurt by the slave system.  Only some 25% of white southerners owned any slaves and fewer than 10% of these owned over 75% of the slaves.  This small minority of large-scale slave owners lived on big plantations and monopolized most of the best land in the South.  Given their use of slave labor and their ownership of the most fertile land, these plantation owners were able to produce larger crops at lower cost than the mass of small farmers.  As a result, small farmers were paid lower prices for their crops and made less money than if they weren’t competing against slave labor.  Similarly, southern white craftsmen and white workers earned less for their labor because they were competing against slave labor.  Southern whites before the Civil War had a lower standard of living and a lower life expectancy than both northern whites and northern blacks.

Fire Eaters countered economic arguments against slavery with racial and cultural appeals.  They stoked fears among whites of blacks taking over the South if slavery was abolished and portrayed abolitionism as a clear and present danger, especially after the election of Lincoln.  They also made the protection of slavery the focal point of a broad-based opposition to what they portrayed as liberal northern attitudes and policies that favored big government, high taxes, wasteful social and economic programs, costly public education, free speech, egalitarian gender relations, and other hot-button political and cultural issues. Fire Eaters portrayed themselves as the protectors of a romantic conservative tradition that was being undermined by northern liberalism, and they portrayed threats to the expansion of slavery as threats to this southern way of life. White people were harangued to support this heroic tradition by defending slavery.

Fire Eaters compounded their assertion of southern cultural superiority with an inflated faith in southern military prowess.  They believed that the South was better prepared militarily than the North, since southerners were a larger percentage of the officer corps of the United States Army and a larger percentage of southerners had guns and used guns both to hunt animals and to defend themselves against slaves.  So, if northerners wanted to fight against southern secession, the South would whip them.  Fire Eaters also believed that the South would get support from England in any war against the North since England was so dependent on southern cotton.

At the beginning of their campaign, Fire Eaters had hoped that the Wilmot Proviso, which prohibited slavery in territories gained in the late 1840’s from Mexico, would be enacted by Congress in 1850 because it might serve as a provocation for southern secession.  Thereafter, they sought to goad South Carolina, historically the most radically pro-slavery colony and state, into secession.  They hoped that this would provoke a northern overreaction similar to the British reaction to the Boston Tea Party and, thereby, provoke a general southern insurrection similar to the American Revolution.  With the election of Lincoln, they hysterically portrayed the situation as a now or never crisis.  This time their cries of “Wolf” worked.

With South Carolina leading the way, seven states seceded in the aftermath of the election of 1860 but even then pro-union southerners such as Senator Crittenden of Kentucky tried to propose a compromise that would bring those states back and keep others from seceding.  Those efforts were thwarted by southern radicals and finally ended with the attack engineered by Fire Eaters in secessionist South Carolina on the federal Fort Sumter.  This attack was ironically portrayed by Fire Eaters as an act of aggression by the North on the South.  With the South ostensibly under attack, other slave states seceded from the Union and the Civil War was on.

As with the American Revolution, the war known in the North as the Civil War but in the South as the War for Southern Independence was the result of an assiduous campaign by a determined minority that believed it knew better than the majority what was best for the country.  But the results of this attempted revolution were very different from those of the earlier one, and the war to save slavery became the war that ended slavery.

Note: This issue is discussed at greater length with citations and quotations in the chapter entitled “Choice #9: The Coming of the Civil War: Why Didn’t the North Secede and Why Did the South?” of my recently published book Was the American Revolution a Mistake? Reaching Students and Reinforcing Patriotism through Teaching History as Choice (AuthorHouse, 2013).

Would the United States still have slavery if the South had not seceded in 1861? Part I: Shouldn’t the North have seceded from the Union instead of the South?

Burton Weltman

Conventional histories invariably portray the secession of the South from the Union as an almost inevitable response to Abraham Lincoln’s election as President in 1860.  In fact, there was a stronger argument for the North to secede in 1861 and very little reason for the South to do so.

The decade of the 1850’s was an almost complete disaster from the point of view of anti-slavery northerners, starting with what they saw as an infamous appeasement of the South in the so-called Compromise of 1850 and ending with a complete capitulation to slavery in the Dred Scott Case of 1857.  As a result of these laws and legal decisions, anti-slavery northerners felt that no one, white or black, was safe from enslavement and no place would be free from slavery.

The Compromise of 1850 both expanded the territory within which slavery could legally exist and contained a Fugitive Slave Act.  This Act provided that anyone could be accused by a slave-catcher of being a fugitive slave and then had to prove that he or she was not a slave.  If the person could not present this proof, he or she could be taken away as a slave.  Since many “black” slaves were the product of sexual relations between white masters and slave women, many “blacks” had complexions that were as light, and even lighter, than those of “whites.”  As a result, a free white person could be accused of being an escaped black slave and if the person could not prove that he or she was not a slave, the person could be taken away as a slave.

The safeguards provided in the Fugitive Slave Act against mistakenly identifying a freeman as a slave were not very safe.  If someone was accused of being a fugitive slave, the person had the right to a hearing in which the person could try to prove that he or she was not a slave.  Those hearings were not, however, conducted in a regular court with a judge but in front of a special United States Commissioner who would be paid five dollars for each case in which a person was found to be a freeman and ten dollars for every case in which a person was found to be a slave.  As such, the system encouraged Commissioners to find that people were slaves.

Finally, under the Fugitive Slave Act, every northern free person was required to help capture fugitive slaves, and was thereby required to be a participant in and a supporter of the slave system.  The law made every northerner a servant of southern slave owners for purposes of keeping the southerners’ slaves in captivity.

The Compromise of 1850 was seen by anti-slavery northerners as the subjugation of the North by the South.  In subjecting white people to the possibility of being taken as fugitive slaves, and making every northerner an accomplice in the slave system, the law was seen by northerners, even by many who were not against slavery, as an incursion of the slave system into the North.

If the Compromise of 1850 represented an incursion of slavery into the free states, the Dred Scott decision of 1857 represented an invasion of slavery into the North and an end to freedom in the United States.  In striking down the Missouri Compromise and holding as a matter of constitutional law that a person may take his property, including his slave property, anywhere in the United States, the Supreme Court effectively held that there was no such thing as a free state.

If, as the Supreme Court held, a southern slave owner could take his slaves into a northern “free” state and retain title and control of them as slaves, then slavery was seemingly legal and protected by the Constitution everywhere in the United States.  In sum, the United States was a slave country in its entirety and only a Constitutional amendment overturning the Dred Scott decision could change the situation.

While the election of Lincoln as President in 1860 was a victory for anti-slavery advocates, it was a hollow victory that could have no effect on the status of slavery in the country and that provided no hope whatsoever that slavery could be limited, let alone eliminated.

Lincoln got only some 40% of the votes in the election of 1860, almost all from the North.  The other 60% of the votes were divided among three pro-slavery candidates.  Since Lincoln’s Republican Party was a regional party that was strong only in the North, there was little hope that it could become a national party that could influence slavery politics in the country as a whole.

The South had a big advantage in national politics because under the Constitution each slave was counted as three-fifths of a person for purposes of allocating members of the House of Representatives and presidential votes in the Electoral College.  Under this system, eight of the first fifteen presidents of the United States were from the South, and the others were essentially elected by the South.  Five of the nine Supreme Court Justices during the 1850’s were southerners, which meant that the interpretation of the Constitution was firmly controlled by proponents of slavery.  Despite Lincoln’s election, there was no reason to believe that this would change.

In any case, a Constitutional amendment affecting slavery seemed foreclosed forever.  A Constitutional amendment must be approved by 2/3 of the House and the Senate and by 3/4 of the states.  Congress in 1860 was about evenly divided between pro-slavery and anti-slavery advocates, which gave no hope of getting the 2/3 majorities in both the House and the Senate needed to propose a Constitutional amendment affecting slavery.  Even more important, fifteen of the thirty-four states in the United States in 1861 were slave states, more than the one-quarter needed to block ratification.  There was no way that a Constitutional amendment limiting or eliminating slavery was going to be approved by 3/4 of the states in 1861 or at any time thereafter.

In the face of these facts, influential anti-slavery northerners such as William Lloyd Garrison, Wendell Phillips, Theodore Parker, Horace Greeley and Ralph Waldo Emerson called for the separation of the North from the South in order for the North to escape what they saw as the stranglehold of “The Slave Power” over the United States.

So why didn’t the North secede?  There was probably a combination of reasons.  One reason was patriotism, the belief in America’s preeminent role in bringing peace, prosperity, liberty and democracy to the world, a belief that surged in the North during the mid-nineteenth century before the Civil War.

Economics was another reason.  Southern and northern economies were intertwined.  Southern cotton fed northern mills and northern food crops fed southern slaves.  Cotton was also the major American export which paid for goods imported from Europe.

Another reason was democratic idealism, which Lincoln would later articulate in his Gettysburg Address: the desire to prove that democracy could work and endure.  The prevailing opinion in Europe at that time was that democracy could not last, that democratic countries would inevitably descend into factional and sectional conflicts and eventually fall apart.  Northerners wanted to prove that theory wrong.

Still another reason was geopolitical.  If the North seceded and the slave South became a separate nation, the South would likely become a dependency and ally of England.  That would leave the North surrounded by English Canada to the north and an England-dependent South to the south.  Since the United States and England were not on friendly terms, the United States having tried to stir up Canadian rebellions for independence during the 1830’s and having engaged in a vehement dispute with England over the boundaries of the Pacific Northwest during the 1840’s, this was not a desirable prospect.

Finally, there were those who did not want to run away from the fight over slavery and thereby leave the southern slaves in the lurch.

Who do you think had the better of the argument?  Should the North have seceded in 1861?

Note: This issue is discussed at greater length with citations and quotations in the chapter entitled “Choice #9: The Coming of the Civil War: Why Didn’t the North Secede and Why Did the South?” of my recently published book Was the American Revolution a Mistake? Reaching Students and Reinforcing Patriotism through Teaching History as Choice (AuthorHouse, 2013).