Article # 4
American and United States History
I. Evolution of Native American Cultures
About 17,000 years ago, with the world
at the height of its last Ice Age, a land bridge across today’s Bering Strait
served as the conduit for people following their game herds from Siberia into
Alaska. From there, they trekked
southward to the Rocky Mountains by about 10,000 years ago and from the Rockies
moved eastward to the Atlantic coast.
These peoples fashioned bows and arrows, war clubs, and spears from wood
and stone. With the woolly mammoth extinct by the time of their migrations
across the plains and prairies of the upper West and Midwest, they hunted
small game, deer, moose, and the American bison (buffalo). By about 1500 BCE,
some of these hunter-gatherers learned to cultivate crops on land along rivers,
settling into small villages and developing distinct cultures.
The Aleut of southwestern Alaska lived
in sod houses, going forth to fish and hunt wildlife, including sea
mammals; women used an original
two-strand twining technique to weave clothes and blankets. The Inuit people spread out from northern
Alaska to Greenland, hunting whale, seal, and caribou; in response to their Arctic climate, they
built igloos of packed snow, constructed the versatile kayak, and wore footwear highly
adapted to ice and snow.
The Ottawa people originally lived
north of the Great Lakes before moving to an inlet (Georgian Bay) of Lake Huron
in southeastern Ontario. The Huron
people also lived in Ontario, as did a portion of the Iroquois confederacy
(comprising the Mohawk, Oneida, Onondaga, Cayuga, and Seneca); the Iroquois
also settled in today’s Quebec Province and New York State. The Narragansett people constructed their
wigwams (domed houses with sapling frames covered with bark and deerskin) in
Rhode Island.
Arapaho, Blackfoot, Comanche, Cree,
Crow, and Flathead tribes lived and hunted in various areas of the Great
Plains; the Cree also lived in the
woodlands along and north of today’s Canadian border. To the northwest (today’s Idaho, Oregon, and
Washington) lived the Nez Perce, while across the expanses of today’s Midwest
lived Cheyenne, Dakota, Kickapoo, Ojibwe, Osage, and Shawnee.
The Cherokee were skillful farmers who
lived in the southern Appalachian Mountains of Alabama, the Carolinas, Georgia,
and Tennessee. The Delaware built their
rectangular, bark-covered houses in the woodlands areas extending inland from the
Atlantic coast. The Miami also were an Eastern
Woodlands people who burned forest land to clear fields and control brush to
abet their agricultural economy; they
also, though, hunted buffalo (unusual for those living so far east). The Shawnee lived in Kentucky and West
Virginia after being pushed southeastward from Ohio.
The eight clans of the Seminole lived
in Florida. The Chickasaw hunted
panther, deer, bear, beaver, and otter in northern Mississippi; males distinctively shaved both sides of the
head, leaving a central crest. In other
areas stretching across the American South were Natchez, Choctaw, and Creek,
the latter composed of 50 distinct bands.
The Kiowa were a highly mobile people, based in Oklahoma but roaming and
raiding far enough south into Mexico to bring back parrots and monkeys. In the American Southwest lived Apache, Hopi,
Navajo, and Ute. The Yaqui were ardent
warriors dominating northern and northwestern Mexico.
Native American societies responded to
their natural environments and local circumstances of life in ways that
produced a variety of cultural traits, economic activities, and artistic
expressions.
The Iroquois used wampum, belts or strings with knots and beaded designs serving as
mnemonic aids for chroniclers of stories and legends; wampum
also served as currency and as a unit of measure. Pueblo tribes crafted items associated with kachina dancers, including masks that
the dancers wore and dolls representing the dancers themselves; they also produced turquoise and shell jewelry
and exquisite pots. In a fascinating
practice comparable to those of Tibetan Buddhists, the Navajo created colorful
sand paintings from sand, charcoal, cornmeal, and pollen to depict religious
symbols and to create a sense of spiritual reverence; then, as a comment on the evanescence of
physical existence and material objects, they scattered the coloring agents
back into nature.
Native American groups made logically
adaptive decisions in matters of diet, physical security, and economy. The Inuit (Eskimo) cured and stored meat and
fish for the winter. Tribes of the
Pacific Northwest fished from 50-foot long dugout canoes that were perfect for
seafaring. The Eastern Woodlands people
used hoes and digging sticks to work fields that yielded maize and
tobacco. The Dakota and other Plains
people adroitly whipped buffalo herds into stampedes and drove them over cliffs. Many peoples of the American Southwest ground
acorns into flour that they made into dough, which they flattened and placed on
heated stones to produce wafer-thin bread.
Certain concepts and practices
undergirded most Native American belief systems, while others were highly
distinctive to particular groups. Most
Native American groups gave high status to a shaman who was perceived to mediate between human beings and the
gods, spirits, and souls of the dead.
The special powers of the shaman
were often associated with a striking physical appearance, including features
that today would be described as disabilities or disfigurements. The shaman
would typically acquire dramatic insight while suffering a physical ordeal
and drifting into a trance. Realization
of her or his powers would come with the sensation of leaving the body to soar
through the realms of the gods and the dead.
The shaman was many things to
her or his people: spirit medium, mystic
seer, wise sage, eloquent poet. The shaman was also a physician who could
both apply curative herbs and oust an offending spirit. Curing disease by
expelling a malevolent spirit would involve an emotional array of
activities: swaying, drumming, chanting,
sighing, groaning, and laughing hysterically---
with the emotional state deepening and the sound rising as the healing
rite moved through successive stages.
Native Americans of the Pacific
Northwest told tales of land and sea, bear and salmon, military victories and
dramatic historical events. Clan-based
totemic societies formed to express the mystic
relationship between one’s group and an emblematic figure.
The clan’s legends and history would be told by masked dancers or master
storytellers among the elders. Totemic
societies performed rites venerating the
Sun, Moon, Sky Being, and Creator (in the form of a Trickster Raven).
The Cree people venerated spirits
associated with the hunt, and they revered an Earth Goddess who gave life and
maternal attention to all animals.
Generations past, present, and future existed in close association: The souls of the ancestors lingered in close
proximity to their living descendants.
Legends featuring talking animals and the Four Directions gave testimony
to Cree belief in the unity of Nature.
The Inuit (Eskimo) conveyed myths of
the whale, the walrus, mysterious ghosts, and fantastic creatures. During long winter months, the Inuit would
often sit waiting for caribou, or they would situate themselves by blowholes
for hunting fish or seal; such scenarios
could animate imaginations, producing visions of wondrous spirits and startling
occurrences. Sitting and peering into
the winter sky, the Inuit would see family and friends in the aurora borealis (Northern Lights)
dancing in a realm beyond Earth, the life to come.
Native Americans on the Great Plains
revered the Spirit of the Buffalo and the Earth Mother. Men organized themselves into ritual
societies that prepared them and sustained them in the activities of
governance, war, and hunting. Frequently
under the counsel of a spirit guide, young people would come of age with a
Vision Quest in which they would dwell under conditions of fasting and physical
seclusion, productive of dream-states giving insights into their future life
missions and roles in family and society.
The Iroquois believed in an impressive
but remote “All Father” who dwelled in every aspect of Nature.
Spirits in the natural world were thought to be more active in daily
life and in annual events, controlling the seasons and animating major
festivals associated with the agricultural calendar.
Pueblo people sat atop mesas, the
rocky tableland of the American Southwest, peering into a world in which
visionary beings brought the blessings of life and received love and veneration
in return. Native Americans living in
Pueblo communities told tales from a vast assortment of myths conveying the
relationship between humankind and the plants and animals of the Natural World.
Among those Native Americans dwelling
and farming in the American South, the Natchez were notable for belief in a
Sacred King. Creeks, Choctaws, and
Chickasaws told tales of a Trickster Spirit, a personified Rabbit, and the
origins of tobacco and maize.
In the American Southwest, shamans communicated intensely with
gods, ancestors, and the spirits of Maize, Rainbow, Sun, and Thunder. Around communal fires, storytellers told of
dramatic events associated with the hunt and the experiences of the ancestors. Boys at eleven or twelve years of age
typically went on the Vision Quest.
The Native American peoples were the
first inhabitants of what we now call the United States. Their logical rhythms of life, intimate
connections to nature and ancestors, and rational economic responses to
environmental circumstances were severely disrupted by the arrival of
Spaniards, British, and French in the course of the 16th and 17th
centuries.
II. Arrival and Colonial Settlement of
Europeans
Between 1492 and 1630, the explorers
Columbus, Dias, Da Gama, Magellan, Cabot, Frobisher, Cartier, Ribault, Cortez, Pizarro, De Leon, De Vaca, Coronado,
De Soto, Hudson, and Champlain opened the great age of European expansion, several of them exploring and creating
pathways for others to follow to what we now call the United States; and the Spaniards explored and established
settlements in the areas now covered by Florida, Texas, New Mexico, Arizona,
Nevada, Utah, and California.
As the Spaniards went busily about
governing Florida and the great expanses of the American Southwest under their
control, the British established settlements all along the Atlantic coast: Virginia (1607), Plymouth (1620),
New Hampshire (1623), Maryland (1634), Connecticut (1636), Rhode Island (1636),
Delaware (1638), North Carolina (1663), South Carolina (1663), New York (1664),
New Jersey (1665), Pennsylvania (1681), and Georgia (1732).
The London Company founded Great
Britain’s first permanent colony along the Atlantic seaboard in Virginia. On 6 May 1607, three ships carrying 100 men
arrived at Chesapeake Bay, moving inland 40 miles to erect a fort, thatched
huts, a storehouse, and a church. Tales
of the Americas providing abundant sources of gold had impelled these settlers
on their journey. When they did not find
the gold they sought, there was a period of disillusionment in which their
lack of woodland skills (many of these people were aristocrats who had little
experience in fishing, hunting, or farming) left them hungry, languid, and
demoralized. Captain John Smith
projected himself as leader and laid down the law that “he that will not work
will not eat.” Jamestown survived
largely because Smith cultivated good trading relations with the indigenous
Powhatan group and roused the community for economically productive activity,
including cultivation of maize in the manner taught them by those native to the
area.
In 1620 a Separatist group of
Protestants, seeking to escape the dictates of the Church of England, journeyed
from England via Holland aboard the Mayflower across the Atlantic with Virginia
as destination. These Pilgrims (as such
religious journeyers to North America came to be known) were blown off course,
landing at Cape Cod and coming ashore in Plymouth Harbor at a site known as
Plymouth Rock. In the absence of any
official governmental jurisdiction, forty-one of these men on 21 November 1620
signed the Mayflower Compact, a
formal agreement to abide by laws as established by their leaders.
At first the settlers suffered terribly from exposure and disease; survival again
came on the strength of skills in maize cultivation taught to them by the
indigenous peoples, in this case the Wampanoag group. A member of this group by the name of Squanto
has loomed particularly large among those Native Americans given credit for
friendship and instruction rendered to the Pilgrims.
In spring 1630 John Winthrop led six
boatloads of settlers ashore to found the Massachusetts Bay Colony and before
year’s end he welcomed voyagers aboard 17 other boats to the colony for which
he served as governor. Winthrop himself
had traveled aboard the Arbella and
while aboard that craft had delivered a ringing sermon proclaiming that he and
his fellow Puritans would build a “city upon a hill” that would do
God’s work on earth. Winthrop did in
fact lead a theocracy in the quest to cleanse the Church of England of the last
vestiges of Catholicism.
But some were not enamored with the
Puritan brand of Protestantism. One
member of the Massachusetts Bay Colony, Roger Williams, recoiled at the fact
that land had been taken from Native Americans without compensation; and he objected to the rigid conformity
imposed by Winthrop and his fellow theocrats.
He readily moved on when ordered to leave in 1635, trekking to
Narragansett Bay to found Providence upon land for which the indigenous people
of the area were given fair return. This became the colony of Rhode Island, the
first among the original thirteen to legislate religious freedom. This sort of freedom in Rhode Island became
a study in contrasts when zealous Puritans in Massachusetts executed twenty
people in the Salem witchcraft trials of 1692.
In 1663 Charles II, the reigning
monarch in the aftermath of the English Civil War, granted the Carolinas to
eight lord proprietors; in 1669 he sent
forth three ships with a total of 100 settlers that arrived at coastal South
Carolina. They moved several miles along
the Ashley River to a site they dubbed Charles Town (later Charleston). The community shifted downstream to Oyster
Point, overlooking Charleston Harbor, in 1680, and in 1719 the colony reverted
from proprietorship to direct control by the royal court. Originally governed in common, the territory
was thereafter divided into the two separate colonies of North Carolina and
South Carolina.
In 1681 William Penn endeavored to
claim and lead an area to which his father had been granted proprietary
rights. Penn was an energetic and
religiously motivated leader who circulated advertisements that attracted 1,000
English settlers to join a population that already included settlers from the Netherlands and
Sweden. Arriving himself aboard a ship
with 100 settlers, Penn established Philadelphia (“City of Brotherly Love”) at
the confluence of the Schuylkill and Delaware Rivers. In 1682 Charles II granted Penn control of
Delaware, which had functioned effectively as a colony since 1638. Under the leadership of Penn and his fellow
Quakers, both Pennsylvania and Delaware were known for religious tolerance, ethical
governance, and good relations with the indigenous people.
General James E. Oglethorpe governed
the last colony established, Georgia (chartered in 1732 and granted to 21
trustees, with settlement beginning in 1733), as a philanthropic experiment
that began with 120 colonists who had been impoverished, persecuted, or
imprisoned. The colony was centered on Savannah, near the
mouth of a river given the same name.
That location made Georgia a buffer against the territorial ambitions of
the Spaniards.
Such a buffer recognized the reality
of competition among European powers on North American soil. Queen Anne’s War (1702-1713) was one of those
extended, multi-participant, territorially extensive wars in which aspiring
imperialists worked out their ambitions in violent conflict. In 1702, forces out of Charleston (assisted
by Yemassee and Creek Native American contingents) attacked St. Augustine, the
first permanent European settlement in what is now the United States and a
hallmark of Spanish rule in Florida, burning the town even though its stone
fortress withstood the siege.
The French and their own Native American supporters raided villages from
Maine to Massachusetts in 1704, sacking several in the latter colony. The British struck back and secured,
according to the terms of the 1713 Treaty of Utrecht, recognition of their
sovereignty over Hudson Bay, Newfoundland, St. Christopher, Acadia (later Nova
Scotia)--- and the Iroquois
Confederacy.
But ambitions near the Canadian border
did not abate. In King George’s War
(1744-1748), the Massachusetts merchant-soldier William Pepperrell superintended victory
over the French at Fort Louisbourg on Cape Breton, but the war ended in a
stalemate. What’s more, as agreed in the
Treaty of Aix-la-Chapelle (1748), the British returned Fort Louisbourg in
exchange for Madras, the city on the coast of India that had been taken by the
French.
Then, in the North American manifestation
of the Seven Years’ War, the French and Indian War (1754-1763), French and
British forces once again squared off, this time in competition over the Ohio
Valley. The French captured Forts
Oswego, George, and Ticonderoga, but then in succession they lost Forts
Louisbourg, Frontenac, and Duquesne. But
most momentously, and at the same time that the war against the British in
India also was going badly for the French, James Wolfe moved on the forces of
General Louis Joseph de Montcalm in Quebec
and after five days of fighting captured Quebec. By the Treaty of Paris (1763), the French
ceded almost all North American and Indian territory to the British, a dramatic
event that forever changed the course of history in the United States, Canada,
and India.
On the cusp of the American
Revolution, a national identity was forming.
Three contending social forces are notable:
The first force was religiosity. Many people who came to the Atlantic coast
were deeply religious. By the 18th
century, pious Americans bristled at the manner in which worldliness and
materialism were encroaching on religiosity.
The pious struck back in the Great Awakening, the spirit of which is
captured in the classic Jonathan Edwards sermon, “Sinners in the Hands of an
Angry God.”
The Great Awakening shook the
religious establishment and rattled the barons of secular society, resulting in
depleted old orders and giving rise to new denominations: Baptist, Presbyterian, Methodist.
The second force was rugged frontier
individualism. Living far from European
homelands, and representing not only British but also Dutch, German, Scottish, Irish,
Welsh, Swiss, French, Danish, Portuguese, Spanish, Italian, Bohemian (Czech),
and Polish national origins, these
people sought freedom to go where they wanted to go, worship as they wanted to
worship, and work in the jobs that could maximize their personal and familial
fortunes.
The third force was the spirit of the
Enlightenment, that great age of science and reason that found experimental ground
on the soil of the emerging young nation.
As that young nation incubated over the course of the 17th
and 18th centuries, John Winthrop, Jr. (1606-1676), a member of the
Royal Society (an association of the scientifically inclined, based in London),
brought the first telescope to the colonies.
John Winthrop IV (1714-1779) introduced calculus to the colonies and
immersed himself in astronomy, geology, chemistry, and electricity.
The latter subject was famously an
interest of Benjamin Franklin (1706-1790), who identified the lightning force
of nature with electricity in his famous kite experiment (1752). Franklin was by trade a printer, publishing
the Pennsylvania Gazette and Poor Richard’s Almanac. He also established a library, a fire department,
the University of Pennsylvania, and a debating society that in time
became the American Philosophical Society.
Franklin brought all of his intellectual acumen to assemblages convened
to consider the break with Great Britain and to construct a government when
that step had been taken.
The construction of a governmental and
societal framework for the United States was one of the most audacious
experiments of the Enlightenment, with the new nation serving as a testing
ground for the ideas of political philosophers Locke, Rousseau, and
Montesquieu. As Benjamin Franklin
entered the productive final decades of his fully-lived life, the forces of
religiosity, individualism, and Enlightenment would blend and contend in the
upstart United States of America.
III. American Revolution and National
Independence
Military victory entails costs in
people and currency. The French and
Indian War was costly, and inasmuch as this was just one geographic
manifestation of the global Seven Years’ War, all sides paid dearly in both
ways. British national debt doubled
in the course of 1755-1763: George III
had bills to pay, so he turned to the colonies via the British Parliament. Five acts of that legislative body raised
high the ire of the colonists:
Proclamation of 1763, forbidding colonists to settle west of the
Appalachian Mountains; Revenue Act of
1764, identifying revenue gathering as the main purpose of the colonies; Currency Act of 1764, criminalizing the
printing of paper currency in the colonies;
Quartering Act of 1765, requiring the colonists to quarter (house) and
feed British troops as needed; Stamp Act
of 1765, imposing a tax on all printed goods.
The last of these acts affected powerful
colonists working in paper-heavy professions:
lawyers, publishers, merchants, shopkeepers, estate owners, speculators,
and tavern-keepers. They resented the
monetary imposition, which required payment in gold and silver. And they further bristled at trials being
made exclusive to vice-admiralty courts---
without benefit of juries.
These acts impelled colonists to boycott
British products: They eschewed tea in
favor of sage and sassafras brews; they
wore homespun garments in patriotic pride.
In October 1765, a Stamp Act Congress convened in New York, with
colonial delegates formulating a moderate message to the British
government. In the Resolutions of the Stamp
Act Congress, delegates expressed loyalty to the British government and
institutions but explained in very measured terms the unacceptability of being
taxed by a body that offered them no representation.
Parliament did rescind the Stamp Act
in 1766 but then passed the Declaratory Act reasserting absolute supremacy over
all matters pertinent to colonial governance.
Then in 1767, Parliament passed the Townshend Acts,
suspending the New York Assembly pending full compliance with the Quartering
Act; placing taxes on glass, lead,
paint, paper, and tea; creating a
Board of Customs Commissioners in Boston to curtail smuggling; and adding four of the already despised
Vice-Admiralty Courts. In response,
colonists continued their boycott and eagerly embraced the message in John
Dickinson’s Letters from a Farmer in
Pennsylvania (1767) conveying acceptance of Parliament’s right to regulate
commerce but strenuously objecting to the levying of taxes on colonists to
boost the coffers of the British government.
In Boston, the situation grew
increasingly tense. Among a citizenry
reliant on trade, resentments ran deep.
In May 1768, a crowd attempted to prevent customs agents from seizing
John Hancock’s ship, the Liberty. Agents requested troops for their protection,
and the following September two regiments of British troops (Redcoats) arrived
for encampment on Boston Common. Time passed.
Tensions rose further. On 5 March
1770, a Bostonian mob began taunting sentries outside the customs office in
Boston. In panic, soldiers fired,
killing five colonists and wounding six in an incident that was promptly dubbed
the “Boston Massacre.”
The colonial population stood at two
million in 1770. The colonists were
dependent on the export of raw materials and importation of manufactured
goods. But acts of Parliament
increasingly favored British imports until the colonists were limited to those
goods alone, curtailing free trade and economic liberty. Trade restrictions and the economic pain felt
from increased taxes energized a movement toward colonial revolt.
In the incident known as the Boston
Tea Party (December 1773), a group of Bostonians in Native American dress
dumped a shipment of East India Company tea into the harbor. British authorities closed the port and
imposed what colonists dubbed the “Intolerable Acts,” restricting
self-governance in Massachusetts. Fifty-five
delegates from all of the colonies but Georgia met as the First
Continental Congress on 5 September 1774, calling for rescission of all
offensive legislation since 1763, protesting punishment of Massachusetts,
vowing to continue collection of taxes but without payment until repeal of the
objectionable acts, and advocating the assemblage of arms for colonial
defense. In a draft of the Declaration
of American Rights, delegates affirmed Parliament’s authority to regulate
commerce but rejected the imposition of taxes.
George III considered colonial convening of the Congress and issuance of
the Declaration to constitute a state of rebellion.
With the incidents of Lexington and
Concord (April 1775), the “Shot Heard Round the World” projected colonists
toward war. But the Second Continental
Congress (May 1775), Battle of Bunker Hill (June 1775), Olive Branch Petition
(July 1775), and Declaration of the Causes and Necessity of Taking Up Arms
(July 1775) ensued without a formal declaration of war. Thomas Paine’s revolutionary pamphlet Common Sense (January 1776), though, stirred patriotic emotions. Fellow revolutionaries prevailed upon Thomas
Jefferson to author the Declaration of Independence, which delegates
reconvening in the Second Continental Congress approved on 4 July 1776.
Jefferson’s exposition restated John
Locke’s compact theory of government as a social contract necessitating
recognition of citizen prerogatives, which Jefferson formulated as the rights
to “life, liberty, and the pursuit of happiness” (the last right altering
Locke’s “life, liberty, and property”).
Jefferson outlined the oppressive acts of the British that had impelled
colonists toward rebellion. In making
these assertions, Jefferson set a new precedent for asserting the right of
people to disavow allegiance to monarchical authority and establish a new and
different form of government.
But though the revolutionary sentiment
was widespread, not all colonists supported the revolutionary cause. Those preferring continuance of British rule
became “Loyalists,” as opposed to “Patriots” supporting revolution. Loyalists typically clustered in
seaports. They included governors,
judges, and royal officials whose livelihood depended on the British
government. Merchants who were not
greatly affected by taxes and trade restrictions might remain supportive of
British rule. Owners of large
plantations had to think hard about a Patriot versus Loyalist stance: They sold agricultural raw materials to
British buyers, making a Loyalist stance appealing; but they also owed debts from which they
could gain relief as Patriots. Backwoods
folks tended toward the conservative Loyalist position.
But Patriot sentiment dominated and
prevailed. When war ended at Yorktown in
1781, the United States of America effectively took its place among the nations
of the world, a status made official by the 1783 Treaty of Paris.
IV. The First Half-Century of the United
States as a Practicum for Enlightenment Thought
The Framing of a Constitution as a
Firm Guide to Centralized National Governance
John Dickinson and a committee
appointed by members of the Second Continental Congress (1775) took the first
crack at devising a constitution for the new nation. Their product was less than stellar. Approved in 1781, the Articles of
Confederation created a governmental framework whereby a unicameral legislature
assumed authority for foreign diplomacy but did not have power over foreign
trade, interstate commerce, or taxation.
Each of the thirteen states was considered sovereign and independent. The national government had no dependable
source of revenue, depending instead upon voluntary contributions from the states. The nation had no president or single figure
of wide-ranging authority: An Executive
Committee took responsibility for superintending the national government and
elected a chairperson to take the lead.
The greatest centralizer in the early
years of the United States, Alexander Hamilton, knew that this document could
never stand as a guide for national governance.
He called delegates from five states together in Annapolis, Maryland, in
September 1786; this group then
prevailed upon a very willing Congress to call a convention for revising the
Articles. But the goal of Hamilton and
other centralizers was not revision but creation: They advocated for writing an entirely new
constitution that could serve as a viable guide to national governance.
Thus it was that delegates to the
Constitutional Convention (May-September 1787) deliberated on the basis of
Hamilton’s advocacy: They would be
creating a new constitution. James
Madison drafted the document, adroitly balancing concerns about popular
participation versus mob rule; the
relative power of states with large versus those with small populations; and
the role of the national government versus states’ roles in taxation. Madison’s Constitution of the United States
of America was based on Montesquieu’s principle, “separation of powers,” so
that executive, legislative, and judicial branches had certain “checks and
balances” on each other. The states
began ratifying the Constitution in December 1787, the required nine states
had ratified by mid-1788, and the new
supreme law of the land went into effect in 1789.
Alexander Hamilton served as the first
Secretary of the Treasury, the department of the executive branch given
authority over national financial matters in the United States
Constitution. Congress passed his
proposals to fund the prevailing $54 million in federal debt and to assume the
$21 million in state debt; to establish
an excise tax on distilled liquor, setting a precedent for federal (national)
revenue-generating tax collection; and
to create a national bank to abet the payment of national financial
obligations and to facilitate any borrowing necessary in meeting the exigencies
of the national budget.
In 1791, the first ten amendments to
the Constitution were ratified; known as the
Bill of Rights, these amendments protect freedom of speech, press, and
religion; guard against unreasonable
search and seizure; and provide for
right to counsel and trial by jury.
The last echoes of the Articles of
Confederation reverberated with an assertion of states’ rights, now in the
context of a much more powerful national government: The Tenth Amendment states that “The powers
not delegated to the United States by the Constitution, nor prohibited by it to
the states, are reserved to the States respectively, and to the people.”
But then, twelve years later, came an
opinion of the Supreme Court that swung the focus back on the importance of the
central government and the Constitution of the United States as the supreme law
of the entire nation, to be contravened neither by laws passed by state
legislatures nor by the United States Congress:
So it was that in 1803 Chief Justice
John Marshall wrote the main opinion in Marbury
v. Madison that strengthened the role of the central government and the
power of his court to interpret the Constitution. This meant that justices of the Supreme Court
must have authority to nullify any acts passed by Congress that run contrary to
the Constitution as supreme law of the land:
It is a
proposition too plain to be contested, that the Constitution controls any
legislative act repugnant to it; or it
is on a level with ordinary legislative acts, and like other acts, is alterable
when the legislature shall please to alter it.
If the former
part of the alternative be true, then a legislative act contrary to the
constitution is not law: If the latter
part be true, then written constitutions are absurd attempts, on the part of
the people, to limit a power in its own nature illimitable…
Certainly all
those who have framed written constitutions contemplate them as forming the
fundamental and paramount law of the nation, and consequently, the theory of
every such government must be that an act of the legislature, repugnant to the
constitution, is void.
Thus, the
particular phraseology of the Constitution of the United States confirms and
strengthens the principle, supposed to be essential to all written
constitutions, that a law repugnant to the Constitution is void; and that courts, as well as other
departments, are bound by that instrument.
Centralization and Party Politics in
the First Presidential Elections
The degree of centralization in the
national government was a key issue in the first presidential elections of the
United States. George Washington, who
was elected first president in 1789, belonged to no party and distrusted the
very idea of political parties, but when his vice-president (John Adams) ran
for president in 1796 he ran as a Federalist.
Federalists favored a strong central government, while Anti-Federalists
touted states’ rights. Adams won the
presidency in 1796 but lost in 1800 to Thomas Jefferson.
Jefferson ran as a
Democrat-Republican, a party that incorporated Anti-Federalists into its ranks and
counseled caution in going too far with centralization. In office, though, Jefferson was vigorous in
asserting the national interest in territorial expansion, overseeing the
Louisiana Purchase in April 1803; and
sending Meriwether Lewis and William Clark to explore the vast region of the
United States beyond the Mississippi River to the Pacific Northwest during
1804-1806. Recent accounts of the Lewis
and Clark Expedition have emphasized that in doing this they obtained
considerable assistance from Sacajawea, a Shoshone woman who served as liaison
and interpreter when they crossed the territory of various Native American
groups, guiding them through present-day North Dakota, Montana, Idaho,
Washington, and Oregon.
So by the early 19th
century the most important national issues required more nuanced views
concerning the prerogatives of the national government versus those of the
states. The Federalists accomplished
much in establishing the foundations for a strong central government with
finances handled via a Central Bank; but
they never won another national election after 1796, although they did run
against Democrat-Republican candidates in 1800, 1804, 1808, 1812, and
1816. The Federalists lost respectively
to Thomas Jefferson (in 1800 and 1804), James Madison (1808 and 1812), and
James Monroe (1816).
The elections of 1820 and 1824 were
entirely intraparty competitions among fellow Democrat-Republicans, won
respectively by James Monroe and John Quincy Adams. Then in 1828 Andrew Jackson won the
presidency under the banner of a party that called itself simply Democrat. Jackson defeated the incumbent John Quincy Adams,
who had prevailed over Jackson in the disputed election of 1824; Adams, who had been a Democrat-Republican as
president, ran in that election of 1828 under the banner of a short-lived party
calling itself National Republican.
From 1832 through 1852, presidential
contests were between Democrats and a party that appeared on the political
scene as the Whigs. Democrats in these
years were generally more populist politicians who appealed to the hard-working
southern farm laborers. But the
Democrats also formulated a program that spoke to the interests of plantation
owners in that same region of the American South. The Whigs cultivated a constituency among the
urban manufacturers and financiers of the Northeast. Jackson won reelection in 1832 and fellow
Democrat Martin Van Buren won in 1836.
In 1840 William Henry Harrison won as a Whig;
Zachary Taylor would record one more victory for the Whigs in 1848,
after they had lost to Democrat James Polk in 1844. The last year of Whig participation in
national elections was in 1852, when they lost to Democrat Franklin Pierce.
Then in 1854 the enduring Republican
Party was formed, with its fundamental initial principle being opposition to
slavery. The Republicans lost to
Democrat James Buchanan in 1856, but Abraham Lincoln initiated a string of
presidential victories for the Republicans in 1860 that he would continue in
1864 and that Ulysses Grant (1868 and 1872),
Rutherford B. Hayes (1876), and James A. Garfield (1880) would continue
well into the last half of the 19th century.
The United States was a unique national
experiment, a testing ground for the ideas of the Enlightenment. There was no guarantee that people considered
citizens of a republic, rather than subjects of a monarchy, would exercise
their civic duties wisely. Power vested
in one person or a few people can be much more efficient and is not necessarily
used for abusive purposes. But rather
than hope that a good ruler landed on a throne, Paine, Jefferson, and Madison
assumed that Locke, Rousseau, and Montesquieu were right about the propensity of
people to draw upon reason if given freedom to do so, acting then as citizens
for human good. And as students of
natural science, Franklin and Jefferson were willing to conduct this grand experiment
in self-government, implying that deductive reasoning based on the
Constitution and inductive reasoning in response to unforeseen issues and
situations would lead over time to improvements in the human condition.
The first half of the 19th
century was a time of huge contradictions, a period when admirable successes
and courageous ventures and adventures ran concurrently with stunning incidents
and abiding situations of cruelty. The
grand experiment in self-government worked and was often conducted well, but
the experimenters were far from perfect.
One must remember that most of those who voted in national elections and
participated civically for many decades into the 19th century were
middle class and wealthier white men.
Women did not secure suffrage until 1920 (19th Amendment).
Thomas Jefferson was until his death
in 1826 a slave owner; he did not think
that his famous assertion of the human right to “life, liberty, and the pursuit
of happiness” applied to slaves, and he must not have considered them “created
equal” to other human beings.
James Madison mostly avoided the issue
of slavery in writing the United States Constitution, thereby tacitly condoning
human servitude. The Constitution’s references to slavery were all
oblique, most famously the provision that membership in the House of
Representatives should
be determined by

adding the Whole number of free persons, including those bound to Service for a Term of Years, and excluding Indians not taxed, three-fifths of all other persons.
Bottom line, a slave--- the economic and social condition of most
African Americans until the 13th Amendment (1865)--- was considered three-fifths of a person.
So active citizenship mostly involved
white males, and during the four decades following the first presidential
election (1789), civically engaged and empowered people tended to be those of
substantial means. But by 1829, a new
sense of democracy flourished among those constitutionally able to participate
and vote. As new states entered the
Union, they produced constitutions amenable to a wider franchise, so that more
adult free males could vote--- and more
of those of humble economic means did so.
Politicians then had to have more mass appeal, so they began to
host barbecues and parties, and to use posters, ribbons, and buttons in their
campaigns.
Parties put forward excellent
candidates in the elections that resulted in the first six presidencies, and
the electoral process produced an interesting and transformative president in
Andrew Jackson; the next six
presidencies were less distinguished, but the sixteenth was arguably the best
of all: Abraham Lincoln--- who did in fact address the issue of slavery
as leader of a party whose initial purpose was opposition to slavery.
So the faith of the Founders in
Enlightenment ideals relevant to natural rights and the power of reason in
political processes seemed evidently sound during the first half of the 19th
century. There was abominable judgment
on the matter of slavery, fathomable only in the social context of the times
and in the very partial evolution of human ethics. But a viable economic and political system
based on participatory republican democracy proved prosperous and generally well-governed.
Successes in many realms provided
justification for faith in liberty, reason, and science.
The litany of technical and
technological breakthroughs and innovations was impressive: the first efficient, productive manufacturing
plant at Slater’s Mill (1791); Eli
Whitney’s cotton gin (1794); Cyrus
McCormick’s reaping machine (1831);
first electric telegraph (Samuel Morse, 1835); Samuel Colt’s revolver (repeat-fire pistol,
1836); development of anesthetic ether
(William Morton, 1846); Isaac Singer’s
sewing machine (1851); invention of the
telephone (Alexander Graham Bell, 1876).
The economy grew at an admirable pace
and, while the domestic market absorbed most production, the growth of foreign
trade was explosive: imports of
manufactured goods grew from about $30 million annually during the 1790s to
$360 million in 1860. Foreign
investors recognized the potential for profit:
Foreign commerce financed many American entrepreneurial ventures during
the first half of the 19th century.
Textiles, shoes, and other products from factories in New England and
the mid-Atlantic states increased by the decade. Sold initially to supply the domestic market
in the South, Midwest, and West, these items also increasingly found markets
overseas: Only 5% of manufactured items
in the United States were exported in 1820, but this figure rose to about 12.5%
by 1850.
This was a time of enormous
geographical expansion. In 1803 came the
massive addition of Louisiana (then encompassing the future states of the Upper
Midwest and West from the Mississippi River to the Rocky Mountains). Representatives
of Spain signed Florida away to the United States in the 1819 Adams-Onís
Treaty. Territorial acquisitions
following victory in the War with Mexico (1846-1848) resulted in confirmed
annexation of Texas by the United States government and added some 500,000 square
miles encompassing California, Nevada, and Utah along with portions of New
Mexico, Arizona, Colorado, and Wyoming; the southern portions of Arizona and New
Mexico came with the Gadsden Purchase (1853), settling remaining boundary
disputes.
Maine and Missouri became states in
the Missouri Compromise (1820), with Missouri gaining entrance as a slave state
and Maine as a free state; the southern
border of Missouri (the 36°30′ line) thereafter served as the dividing line for
the westward extension of slavery and demarcated the slaveholding South from
the North (Missouri itself was the only state north of the line where slavery
was legal). The Oregon Territory was claimed under the
principles of the Monroe Doctrine (1823), stating that United States interests
in the Americas precluded further intrusion by nations external to the Western
Hemisphere; in this case, Britain relented
to United States governance in the 1846 Oregon Treaty.
The American spirit to push westward---
exploring the expanses of the Upper West,
claiming preeminence in the Western Hemisphere, and pressing territorial claims over Spain,
Mexico, and Great Britain--- seemed to United
States Magazine and Democratic Review editor John Louis O’Sullivan to
entail a sense of “Manifest Destiny,” a term he coined in 1845. So startling had the successes of this still
very young nation been that in O’Sullivan’s view citizens had already come to
see their relentless movement westward as
a divine process inspired and justified by the superiority of the United
States.
But treatment of indigenous peoples
and those of African origins represented counterpoints to this talk of divinity
and brought heavy burdens to the people of the United States at
mid-century. Successes in the national
experiment to put the ideals of the Enlightenment into action were abundant but
could not obscure abiding contradictions that caused many to suffer and
threatened the very unity of the states avowedly united in the experiment.
Consequences of the Limits of Reason
The Founders of the United States had
towering intellects, supreme dedication, and enormous foresight. The Constitution that they generated is the
world’s greatest statement of democratic republican governance. The people who dwelled in the United States
during the first seven decades of national existence were energetic,
adventurous, and frequently successful.
But the Founders, the people whom they led, and their guiding
document--- for all of their
greatness--- were deeply flawed.
American society ran roughshod over
Native and African American people.
In 1830, Congress passed the Indian
Removal Act; by 1835, representatives of
the United States government had coerced, cajoled, or convinced Native American
leaders to sign 94 separate treaties. The culturally adaptive and
diplomatically skilled Cherokee filed two Supreme Court cases to contest the
Indian Removal Act: Cherokee Nation v. Georgia (1831) and Worcester v. Georgia (1832).
In the former case, the Justices ruled that the Supreme Court had no
jurisdiction over a “domestic dependent nation”; in the latter, the Justices declared the
Cherokee a “distinct political community” within which state law could not be
enforced. These decisions left some
scope for Cherokee claims to rights as a self-governing (although “dependent”)
community, but President Andrew Jackson and officials in Georgia were inclined
to emphasize Cherokee dependence on their guidance, and that guidance convinced
the Cherokee to capitulate in 1835.
Members of the Cherokee Nation departed for Indian Territory
(Oklahoma) in 1838; they were followed
by Choctaws, Chickasaws, Creeks, and Seminoles in a treaty-induced exit from
the lands of their ancestors and their homes of recent habitation.
But there had already been
considerable armed opposition on the part of Native Americans to encroachment
upon their territory, and so there would be opposition to attempts at removal. During
the War of 1812, the Shawnee leader Tecumseh
(1768-1813) was partially successful in his grand attempt to unite tribes from
Florida to Canada. Seminole leader
Osceola (c. 1804-1838) answered the Indian Removal Act with an uprising
vigorously waged by his people. And
Chiricahua Apache leader Cochise (c. 1812-1874) waged a persistent and
protracted campaign against the policies of the United States government,
doggedly conducting raids on settlements and encampments from the 1850s until
his capture in 1871.
In the years just after the capture of
Cochise, fellow Chiricahua Apache Geronimo (1829-1909) fought federal troops
and settlers in southeast Arizona and New Mexico. Particularly active between 1875 and 1886,
Geronimo finally was cornered by United States troops and surrendered in the
latter year. He spent two years in a
Florida prison, was moved to Alabama, then was transported to Fort Sill in
Oklahoma. He spent his last fourteen
years confined to Ft. Sill, except those times when he made appearances at
events such as the Omaha Exposition (1898), Pan American Exposition in Buffalo
(1901), the St. Louis World’s Fair (1904), and Theodore Roosevelt’s inaugural
parade (1905). By then Geronimo had become a sad caricature. He drank himself into an alcoholic stupor one
cold evening early in 1909, fell from his horse, contracted pneumonia, and died
on 17 February 1909.
Comanche chief Quanah Parker refused
confinement to a reservation according to the terms of the 1867 Treaty of
Medicine Lodge. Over the course of seven
years, he and his band raided sites throughout Texas and Mexico. But in the dogged Red River campaign of
1874-1875, the United States pursued Quanah and the others until they
surrendered in exhaustion at Fort Sill on 2 June 1875.
After gold was discovered on land to
which they had agreed to retreat in 1868, the Lakota chiefs Crazy Horse and
Sitting Bull resisted when the United States army tried to reclaim the property. Their forces annihilated Lieutenant Colonel George Armstrong
Custer and all 264 of his soldiers at the Battle of the Little Bighorn (25 June
1876) but could not prevail: Crazy Horse
was captured in 1877 and killed while in custody.
Opposition on the part of Native
Americans to removal, treaty terms, and treaty violations would continue as an
impediment to United States policy until 1890.
That was the year of the last major confrontation, at Wounded Knee
(South Dakota); when the forces of the
United States Army prevailed in that violent struggle, the Native American heart
for resistance, and the ability to mount it, waned.
During the early 19th
century, agricultural products dominated United States exports. So sought after was American cotton, and
therefore so good the prices, that both Northern and Southern economic
interests induced heavy cultivation of this labor-intensive crop. During 1816-1820 cotton shipments represented
40% of total exports, and in some years thereafter that figure rose to 67%. Capitalists in both Great Britain and New
York lent funds for cotton cultivation to plantation owners in the American
South, promoting a level of specialization that impeded economic
diversification.
Slaves did the work for plantation
owners and by definition, once costs for purchase were recovered, constituted
free labor. Despite the vested interests
on the part of Northern investors and cotton exporters and Southern plantation
owners, by 1817 an abolitionist movement was underway. In that year, a group of abolitionists
organized the American Colonization Society, with the goals of gradual
emancipation, the eventual assemblage of former slaves into colonies, and
monetary compensation for erstwhile plantation owners. Moderate abolitionists of this sort
established the Liberty Party in 1840;
more radical abolitionists, though, were unwilling to settle for
gradualism: They encouraged slaves to
destroy agricultural equipment, feign illness, deliberately slow their pace of
work, run away, or revolt violently as various opportunities materialized.
Slaves were considered portable
property, and the Fugitive Slave Law of 1850 made the return of any fleeing
property a legal imperative. This
statutory law was implicitly given constitutional approval in the 1857 Supreme
Court ruling, Dred Scott v. Sandford,
whereby Chief Justice Roger Taney wrote for the majority that the Negro had
“no rights which the white man was bound to respect.”
Such a ruling could be said to flow
from deductive reasoning upon a racist body of assumptions, but such reasoning
was unreasonable by the loftiest standards of the Age of Reason: The racist assumptions were absurd and ran
counter to human progress. Activist
abolitionists themselves countered by organizing the Underground Railroad,
metaphorically uniting the well-planned escape routes of runaway slaves to safe
houses stretching from the Deep South through Kentucky and Ohio and on to
havens beyond the Canadian border.
Former slave Harriet Tubman (1820-1913) is credited with having helped
more than 300 slaves make such escapes during 1850-1861.
Tubman soon would commit her efforts
in the context of Civil War (1861-1865).
When Abraham Lincoln was elected President in 1860 at the behest of a
Republican Party founded (1854) in opposition to slavery, those atop the power
structure in the American South heard cracks in the foundation.
Seven southern states seceded from the
Union. The first was South Carolina
(December 1860), followed by Alabama, Mississippi, Florida, Georgia, Louisiana,
and Texas (the latter in February 1861).
In the course of 1861, the leaders of these states formed the
Confederate States of America, with President Jefferson Davis heading a
government at Montgomery, Alabama, that would later move to Richmond, Virginia.
On 12 April 1861, Confederate General
P. G. T. Beauregard ordered cannon to fire on federal ships endeavoring to
supply Fort Sumter in Charleston, South Carolina.
President Lincoln put the United States on notice to be ready for armed
conflict. On 17 April the state of
Virginia left the Union, and in the course of the next five weeks the
Confederate States of America was fully formed with the addition of Arkansas,
Tennessee, and North Carolina.
V. Civil War and Reconstruction
Civil War
(1861-1865)
The Civil War harnessed the energies
of those committed to the abolition of slavery, in opposition to those
whose livelihood depended on continuance of what was euphemistically known
in the antebellum South as the “Peculiar Institution.” Matters of livelihood made the conflict about
much more than the ethical drive to eliminate human bondage.
Economics very much undergirded the
conflict. The North still maintained a
predominantly rural population, but this population was mostly organized into
small farms and villages; what’s more,
many communities had factories churning out industrial goods, and the North by
now featured a number of major cities---
New York, Boston, Pittsburgh, Philadelphia, Cleveland--- wherein industry and commerce flourished. The South was overwhelmingly
agricultural. Most farms were small, but
the big planters dominated the economy, driving production toward cotton,
utilizing the labor of some four million slaves, and setting the pace for a
distinctive culture that valued gentlemanly conduct, feminine charm, fancy
balls, fine dining, courtly behavior, and attendance at churches that did not
emphasize the egalitarian features of Christianity nor in any way challenge the
precepts of slavery.
These had become two different
societies and arguably could have been two distinct nations. If they had been allowed to secede and exist
unchallenged as the Confederate States of America, the eleven states of that
national composition might have thrived on the basis of shared culture and
values. But inasmuch as those values
were largely corrupt and atavistic, they would have faced opposition from more
enlightened populations. Economically,
the going would have been tough, given the undiversified nature of the economy.
In any case, Abraham Lincoln was
adamant that the Union should remain intact.
He even eschewed the name, “Confederacy,” and he did not attitudinally
or conversationally acknowledge that the states that assembled under that
appellation had actually seceded. He
regarded those states simply to be in rebellion and his task to be that of pulling
them back into the Union. He kept the
bordering slave states of Kentucky, Maryland, and Missouri in the Union under quick
threat of military action, but he could not persuade the states that had
already seceded to renew acceptance of governance under the United States.
Thus, in spring 1862 Lincoln ordered
Union occupation of 50,000 square miles in Tennessee and the lower Mississippi
Valley, forestalling Confederate action in the places protected by these
troops. But Confederate armies were well
led by General Robert E. Lee, a brilliant tactician. The soldiers of the South had already won
several major battles in 1861 and had continued success in early 1862. Generals Joseph Johnston and P. G. T.
Beauregard had joined forces in a pincer movement to achieve clear victory in the
Battle of Manassas. General Thomas
(Stonewall) Jackson led a force from the Shenandoah Valley to link with General
Lee’s Army of Northern Virginia to fend off a Union attempt to take the Confederate
capital at Richmond in June 1862.
In 1863, the Confederacy achieved
naval success with British-made ships, a technical violation of British
neutrality but an expression of British disgust with a Union blockade that
denied them access to cotton imports that made up 80% of the supply to their
home factories. The C.S.S. Alabama and C.S.S. Florida sank or captured nearly
100 merchant vessels.
Despite adroit leadership, effective
tactics, and high motivation in the early stages of the Civil War, the
challenges facing the Confederacy were daunting. Only $27 million in hard currency was
available, and efforts to secure public loans and to collect income and
commercial taxes proved inadequate. The
solution of simply printing money, without backing in precious metal, and in
the context of low industrial supply of desired and needed goods, logically
engendered greatly inflated
prices for staples such as cloth,
beef, and flour. Confederate military
leaders resorted to appropriation of agricultural crops, livestock, and
machinery from private individuals, but because of shortages they frequently had no choice but to send soldiers forth to march barefoot, stomachs growling from hunger, wearing uniforms of Union origin dyed over from blue to brown.
The Union had more dependable sources
of revenue via taxes and loans. The
Internal Revenue Service placed a tax on incomes of more than $800, the first
such tax in United States history; the
federal government continued to collect the tax until 1872. Like the Confederate government, the United
States government also resorted to currency backed entirely by faith rather than by precious metal. But a greater supply of
goods for which those dollars could be spent kept inflation at lower levels in
the North by comparison with the South;
still, at war’s end Union “greenbacks” were worth only 67 cents on the
dollar.
The North had a huge edge in
factory-produced goods, including those needed for the waging of war. Lincoln also could enforce a blockade of
southern ports, very important when the Confederacy appealed to Great Britain
for assistance; Great Britain and France did try to broker an agreement between Confederate and Union forces, but Lincoln saw no advantage in pursuing discussions unlikely to achieve national reintegration. In
addition to having advantages in industrial production and economic leverage,
the North also could command most of the railroad lines necessary to move
soldiers and materiel expeditiously to highly strategic points over the
expanses of the northern states, the border areas, and into the upper
South. The power that the Union held in
wide-ranging railway access was further symbolized by the formation of the first transcontinental railroad, for which construction began in the midst of the Civil War, in 1863--- to be
completed six years later when the Union Pacific Railroad (moving westward
from Omaha, Nebraska) and the Central
Pacific Railroad (moving eastward from Sacramento, California) linked 1,770
miles of track on 10 May 1869.
In 1863, the forces of Union military
leader Ulysses S. Grant captured a Confederate army contingent at Vicksburg,
Mississippi, and dissolved another at Chattanooga, Tennessee. At the Battle of Gettysburg (Pennsylvania, 1-3 July 1863), Confederate General Robert E. Lee went down to very bloody defeat. Confederate armies still had victories ahead
of them, but the path to winning the larger contest now seemed obscure. Much blood flowed during 1863-1864 as the
conclusion of the war seemed inevitable.
In the course of 1864-1865, General William T. Sherman sent forth his
soldiers on a “March to the Sea,” burning and looting much that lay in their
way.
This was a violent phase in a bloody
war full of brutal encounters in which 620,000 Americans died. The development of the rifled musket in the 1850s made these weapons available just in time for the Civil War. Sharpshooters might hit targets even beyond 400-500 yards, the latter distance eminently possible for the regular infantry soldier. This made heroic assaults on horseback obsolete and reduced face-to-face encounters: shots fired from entrenched positions became
much more common. And Dr. Richard Jordan
Gatling’s invention of the multiple-barrel, revolving machine gun (“Gatling
Gun”) made an even more powerful statement on the battlefield.
Sherman's March to the Sea left the Confederate army exhausted and demoralized. At Appomattox Court House on 9 April 1865, General Lee surrendered to General Grant, effectively ending the Civil War. Confederate President Jefferson Davis was captured in Georgia on 10 May.
Reconstruction
(1866-1877)
On 14 April 1865, a disgruntled actor
and Confederate sympathizer, John Wilkes Booth, fired a bullet into the head of
President Lincoln, who was watching the play, Our American Cousin, with his wife Mary at Ford’s Theater in
Washington, D. C. Lincoln died the
following morning, succeeded in his office by Vice President Andrew Johnson,
the former governor of Tennessee, who entered office as the seventeenth
president.
In Congress, a group known as the
Radical Republicans pressed forward with a Reconstruction program, the purpose
of which was to integrate former slaves into the social and political life of
the nation. This necessitated stationing
federal troops in the South to protect African Americans from physical
violence, and to ensure that they had access to voting booths, schools,
libraries, and public offices where they might receive needed services to meet
life’s exigencies after decades of servitude.
During 1865-1870 Congress momentously passed and the states ratified the 13th, 14th, and 15th Amendments to the United States Constitution, respectively abolishing slavery, providing full rights of citizenship, and guaranteeing voting privileges. The 13th Amendment was passed and ratified in 1865; the 14th Amendment was passed in 1866 and ratified in 1868; and the 15th Amendment was passed in 1869 and ratified in 1870. The Reconstruction program as it went forward
in 1867 required that states write new constitutions that protected African
American rights and included statements ratifying the 13th and 14th
Amendments (the 15th Amendment had not yet been passed by Congress).
The Reconstruction program represented
a real effort on the part of Congress to bring African Americans into the civic
life of the nation. So much work needed to be done that any such effort would have to be a work in progress, but public offices throughout the South offered places where African Americans could inquire about purchasing land and apply for loans that could help them do so. The federal government funded job training
programs that could empower people seeking work other than farm labor. Schools were established for the specific
mission of increasing literacy and providing basic education. Land grants were issued to educators aspiring
to start colleges for African American students; these multiplied and eventually formed the
network of Historically Black Colleges and Universities (HBCUs) that
still play an important educational role today.
Under the protection of federal
troops, hundreds of thousands of African Americans voted for the first time. Many ran for office. Throughout the South, African Americans took
positions as state representatives, county commissioners, city council members,
and mayors. State electorates (which
during Reconstruction excluded the vast majority of the populace that had
supported the Confederacy) made decisions in favor of African American
candidates, placing a governor and two lieutenant governors in office. At the national level, 14 African Americans
won election to the House of Representatives and two gained election as
Senators.
But this hopeful time came crashing to
a halt in 1877. Already, the Republican
party was losing its abolitionist and Reconstructionist zeal, turning to
industrialists and big business interests in the urban North as its prime political constituencies. Then in the presidential election of 1876, the stage was set for the most critical event in African American history since slavery and the Civil War:
The election between Rutherford B.
Hayes (Republican) and Samuel J. Tilden (Democrat) was close. Tilden had the edge in the popular vote, but disputed returns in Florida, Louisiana, and South Carolina left the Electoral College essentially deadlocked. The Democrats offered a deal that in essence went as follows: We'll concede the disputed states and therefore the presidential election to you on condition that you withdraw federal troops from the South.
The Republicans took the deal. Hayes took office as the 19th
president, succeeding Ulysses S. Grant (18th president,
1869-1877). He promptly recalled federal
troops from the South.
With this critical event, the way was
clear for white power establishments to resituate themselves in positions of
power, and for African Americans to endure ordeals that exceeded slavery in
cruelty: lynch mobs, homes put to the
torch, unjust court decisions, confinement to sharecropping and other ill-paid
labor, putative routes to a Promised Land in the cities of the North that
descended into hellish environments at the urban core.
V. United States History During an Age of
Imperialism and Industry
In the years between the Civil War
(1861-1865) and World War I (1914-1918), the United States surged into
international prominence, thrusting into the global ether an image of a dynamic
young nation in which democratic spirit and capitalist energy offered the high
quality of life for which people yearned throughout the world. There was of course an abiding irony in the
seeming triumph of this young exemplar of liberal republican economic and political formation: African Americans and Native Americans were being brutalized, and women could not be full participants in the civic life of the nation.
But if one could stare through or see around that impediment to behold
the idealized image of democratic capitalism working wonders, then there were
discrete examples that could be used to support that view.
Thomas Alva Edison (1847-1931)
received over 1,000 patents for items that included the quadruplex telegraph,
carbon-button telephone transmitter, phonograph, electric light bulb, and an
electrical generation and distribution system.
These devices caught the public imagination as a powerful indication of the United States' standing as an international technological leader, following a pattern
witnessed in the inventive spirits of Slater, Whitney, McCormick, Morse,
Singer, and Bell.
Technology, industry, and commerce
surged as factors in the American economy in the late 19th
century. The Gross National Product
(GNP) increased from $2 billion in 1859 to $9.1 billion in 1870 and to $13 billion in 1899, continuing on a steep upward trajectory into the new century. The total value of exports from the United States rose from $858 million in 1870 to $1.4 billion in 1900, a 63% increase in just 30 years.
Inspired by the views of Alfred Thayer
Mahan at the United States Naval War College, some Americans longed to join the
European powers in the quest for overseas empire. Mahan maintained that in an industrial age
markets for surplus goods produced in the United States must be found
overseas. Having located these markets, private
commercial ships would sail forth on seas that could be dangerous, potentially
necessitating the response of a strong navy.
In the course of the late 19th century, the naval fleet
expanded and the federal government oversaw the construction of numerous new
stations on the islands of many seas for the purpose of refueling and repairing ships.
Thus it was that Hawaii loomed large
in the American imagination. With the
stated immediate motivation of protecting contingents of American missionaries
(who had arrived on the islands as early as the 1820s), the United States
presidential administration of John Tyler invoked the Monroe Doctrine when the
British showed imperialist interest in the Hawaiian Islands in 1842. In the aftermath of an uprising of the
Hawaiian indigenous people in 1875, American plantation owners effectively
governed the islands. This agro-business
elite staged a bloodless revolution in 1893, setting in motion processes by
which Hawaii would be annexed as a United States territory on 7 July 1898 and designated a state in 1959.
At the time of the Hawaiian annexation
the United States government and military were also pursuing imperialist
interests in the Spanish-held territories of Puerto Rico and the
Philippines. The American sugar industry
craved control of these islands for the opportunity to access sugar cane
without the intermediation of Spanish middlemen. On 15 February 1898, an explosion aboard the USS Maine in Havana Harbor
killed 260 American sailors. In the
United States strident newspaper articles and political cartoons played a
significant role in steering public sentiment toward war. The refrain “Remember the Maine!” became
common among war-inclined patriots. On 19 April 1898, Congress authorized the use of force against Spain, and a formal declaration of war followed on 25 April. The war lasted from April into August 1898,
at a cost of 5,000 lives, all but 379 of them due to soldiers’ contraction of
tropical diseases.
According to terms of the treaty that
concluded the war, the United States gained the desired control over Puerto Rico and the Philippines. The Filipino
people, though, had viewed defeat of Spain as a chance to gain independence,
putting them at odds with American imperialist and commercial interests. A violent struggle to resolve the dispute
took 4,300 American and 57,000 Filipino lives.
The United States prevailed and annexed the Philippines as a formal
territory in 1902.
United States imperialist actions and
aggressiveness across the oceans were given presidential justification under
the Roosevelt Corollary (1904), by which Theodore Roosevelt asserted that to
enforce the Monroe Doctrine, the United States must assume “international
police power” when situations arose that had the capacity to provoke European
intervention. The Roosevelt Corollary
could be applied retroactively to explain United States actions in the
Spanish-American War, and it was invoked many times during the years leading up
to 1930 to justify United States involvement in Latin American affairs. At the time that Roosevelt articulated his
corollary, his administration was already superintending construction of the
Panama Canal, with the motivation of moving ships more easily between the
Atlantic and Pacific Oceans.
Quest for overseas empire occurred in
the context of the closing of the American frontier, a phenomenon emphasized by
Frederick Jackson Turner in a paper that he gave at a meeting of the American
Historical Association in 1893. Turner, using data from the United States census of 1890, wrote about the importance
that “the existence of an area of free land, its continuous recession, and
the advancement of American settlement
westward” had played in the nation’s history.
Turner stressed the economic value of American natural resources: the abundance of arable land, the great
endowment in minerals and waterways, and the vast plains for grazing
animals. He also emphasized the
psychological value of having somewhere to go if one’s circumstances provoked a
desire to migrate, a safety valve for when times turned rough or
dangerous. The frontier's closing thus worked a hardship on the American imagination, lending additional force to the impulse for imperial conquest.
In 1908, the very activist President
Theodore Roosevelt moved to conserve large acreages of land that had formerly formed part of that
great frontier of which Turner had written.
Seeking to preserve a generous amount of land for protection of natural
resources and use for public purposes,
Roosevelt empowered Gifford
Pinchot, his chief of the Bureau of Forestry, to establish bureaus and
commissions staffed by geologists, hydrologists, foresters, and engineers. A National Conservation Congress met in 1908
that impelled most states to set up their own conservation commissions.
The restless desire for movement was
abetted by the invention of startling new means of transportation. At Kitty Hawk, North Carolina, in 1903 Orville
Wright rode a flying machine (named
“Flyer”) powered by a twelve
horsepower gasoline engine; the feat, which lasted just 12 seconds, is credited as the first powered human flight in a heavier-than-air machine. Henry Ford
(1863-1947) utilized an assembly line along which workers used interchangeable
parts to boost production of the “Model T” from 10,607 automobiles (costing
$850 each) in 1908 to 730,041 (costing $360 each) in 1916. Ford’s plants began using conveyor belts in
1913, influencing industrialists who were seeking maximum productivity for the
manufacture of many different goods.
Registered automobiles increased from 8,000 to 1.2 million during
1900-1913.
The adult population that could serve
as a market for these vehicles had burgeoned in the course of the late 19th
century: In 1850, the population of the
United States was 23,191,876; the
figures for the next six decades were 31,443,321 (1860), 39,818,449 (1870), 50,155,783 (1880), 62,947,714 (1890), 75,994,575 (1900), and 91,972,266 (1910); hence, the population had almost quadrupled
between 1850 and 1910 and nearly doubled between 1880 and 1910. Immigration was behind much of the population
increase. In 1869, 352,569 people came as immigrants into the United States; by 1884, the comparable figure was 1.5 million. Before 1870 most immigrants
originated in Northern and Northwestern Europe.
During the last three decades of the century, immigrants tended to be
from Southern, Central, and Eastern Europe;
the immigrant population that originated in Asia also was sizable during
these years. Immigrants thought first of
settling in cities, abetting a trend toward urban growth: In 1880, just 26% of the United States
population resided in cities; by 1910,
that figure was 46%.
The rapid pace of change during the
last half of the 19th century and the beginning of the 20th
century brought stresses in American society of many sorts, engendering a
diversity of responses:
This was an era when big business was
getting bigger. Between 1894 and 1904,
mergers of smaller companies produced many corporations the names of which are
still familiar in 2015: DuPont, General
Electric, General Foods, Nabisco, and Westinghouse. In 50 industries, a single holding company
controlled 60 percent of factory production and employed the majority of
workers for certain goods. In 1882, John D. Rockefeller became the titan of the big business era with his establishment of the Standard Oil Trust, pressuring 77 companies into transferring a majority of their stock to the nine-member board of trustees firmly under his control. By the middle 1880s, Standard Oil controlled 90% of oil production in the United States.
In many industries workers (who
totaled 3.2 million in 1900) toiled long hours for low wages in unsanitary
conditions. To advocate for workers, Samuel Gompers formed the American Federation of Labor in 1886. Gompers led his union toward a moderate stance, not agitating for the demise of trusts and monopolies and making only limited use of strikes and boycotts; his
goal was to promote incremental change toward shorter hours, higher wages, and
safer conditions. As unions became a
political force, aspirants to government office and their supporters in the
business community did recognize the wisdom of forestalling more radical
demands, and worker conditions gradually improved.
In 1892, a group of farmers gathered
in Omaha, Nebraska, to found the Populist Party. They issued a manifesto demanding public
ownership of transportation and communications companies that were conspiring
to charge high rates for shipping agricultural goods. The manifesto also called for a secret ballot
in national elections, a graduated income tax, direct election of United States
Senators, and improved labor conditions.
And they demanded “free silver,” a clamor that had arisen during
1870-1890, when silver production had quadrupled with the opening of huge mines
in Nevada; owners of the mines and other
advocates maintained that coins should have a silver to gold ratio of sixteen
to one.
Grover Cleveland was the only
successful presidential candidate for the Democrats in the late-19th
century aftermath of Civil War (1861-1865) and Reconstruction (1866-1877). The Republican James Garfield won in 1880,
Cleveland won in 1884 but lost to Benjamin Harrison in 1888 before mounting a
comeback in 1892 (the only president to serve two nonconsecutive terms). The populists thought they saw a chance for
victory, first on their own and then by joining forces with the Democrats, who
had cultivated a constituency among the farmers in the South. The Populist (People's) Party fielded an
unsuccessful presidential candidate, James Weaver, in 1892. William Jennings Bryan likewise was defeated
as a candidate with a populist following, losing under the banner of the
Democrat-People’s Party in 1896; the
Democrat-Populist Party in 1900; and the
Democrat Party in 1908. By this time,
the populist movement was waning, but Bryan was a dynamic speaker who garnered
respectable totals in the popular vote.
A more middle class response to the
problems engendered by rapid urban growth, industrial expansion, and big
business dominance was the Progressive Movement. Like the populists, the progressives saw
government regulation as key to countering the tremendous power of big
business. But the progressives were more
urban in character than the populists and in general they had greater wealth
and higher levels of educational attainment.
They were patriotic nationalists who believed in America’s potential,
turning to William James as intellectual guide.
James was a Harvard psychology professor who wrote the classic Pragmatism, in which he asserted that
the way to solve social problems was to tackle them in the field of action,
finding what works and discarding what does not.
Progressives counterpoised themselves
to the Social Darwinism of social philosopher Herbert Spencer, who appropriated
the biological theories of Charles Darwin in the interests of big business.
Spencer maintained that natural social processes fail only when a social agent such as the federal government interferes. He and those drawn to his ideas promoted
strict laissez faire economics and
opposed legislation to improve conditions for industrial workers.
Among those who chafed at Social
Darwinist ideas and endeavored to solve social problems in the field of action
were social reformers who founded settlement houses: Jane Addams and Ellen Starr (Hull House,
Chicago, 1889); Robert A. Woods (South
End House, Boston, 1891); and Lillian Wald (Henry Street Settlement, New York,
1895). Much in the mold of progressives,
most settlement house workers were middle class, idealistic young people, often
college-educated women. They worked to
make settlement houses into havens for the downtrodden, often serving
immigrants or others newly arrived to urban areas. They provided childcare, adult education,
recreation, and classes enhancing cultural development (dance, music, visual
art) and practical skills (cooking, sewing, carpentry).
VI. The Eventful First Half of the 20th
Century
During the first half of the 20th
century people in the United States experienced two world wars, and between
those two global conflicts came quarter after quarter, year after year of
relentless economic contraction that is remembered as the Great Depression. Because the United States entered each of the wars later than did the other major participants, and because neither war was fought on American soil, the wars were not as devastating for Americans as they were for Europeans, North Africans, and the Japanese. In particular, United States
soldiers were not subjected to the abject stupidity of trench warfare strategy
that characterized the struggle on the Western Front in World War I. And American forces did not have to face the
brunt of the German blitzkrieg, the
sort of occupation that the French endured, or the intensive bombing campaign
that the Germans waged against the British.
But United States losses were heavy in the Pacific War in which
Americans took the lead for the Allies, and the entire first half of the 20th
century constituted a major test of the nation’s resiliency and
politico-economic foundations.
The United States
Experience in World War I
Even before the United States entered
World War I, American popular sentiment tended toward the Allies. Conflicts with the British in the American
Revolution and the War of 1812 now seemed temporally remote, and
British-American relations were good at the beginning of the 20th
century. And although those conflicts
with the British did seem distant, warm feelings toward the French had lingered
over the century and more since the Revolution for invaluable assistance
rendered during that time when Americans were striving against a great imperial
power to found a new nation. During the
period of American neutrality before 1917, the Lafayette Escadrille and other volunteer units drew many young Americans into the conflict on the side of the French in particular and the Allies in general.
Likewise, the propaganda machine of
the Committee on Public Information, led by George Creel, catalyzed American
energies for the war effort prior to formal participation. Creel superintended the production of a
series of “Hang the Kaiser” films, printing of numerous posters, and placement
of billboards in towns small and large
across the United States. The Committee
printed pamphlets in many different languages, and the Committee enlisted some 75,000 "Four Minute Men" to whip up pro-war sentiment with speeches delivered on street corners. This propaganda could fall heavily on the 10 million immigrants from Germany and Austria-Hungary.
Americans were inclined to regard
Germans as militaristically aggressive on the international scene, arousing
antagonistic feelings that could become xenophobic, as when a 500-strong crowd in Collinsville, Illinois (near St. Louis), stripped a German American, bound him in an American flag, and lynched him. Many large universities
discontinued instruction in the German language. Streets bearing German appellations were
renamed. German measles became “liberty”
measles and a young Dachshund became a “liberty” puppy. Xenophobes and cultural zealots ripped songs
of German origin out of songbooks.
Officials arrested and deported German-born Boston Symphony conductor
Karl Muck when he refused to play "The Star-Spangled Banner" at a 1917 concert;
the following year, the Boston Symphony hired its first American-born
conductor. This was very much in the
spirit of the popular slogan, “100% Americanism,” and followed congressional
restrictions on hiring foreigners.
The years of the World War I were in
general not good for free speech. In an
atmosphere in which Attorney General Thomas W. Gregory could say, “Free
expression of opinion is dangerous to American institutions,” the Wilson
government suppressed statements issued by pacifists and leftist organizations
such as the American Socialist Party and the Industrial Workers of the
World. Federal attorneys prosecuted more
than 1,500 individuals under congressional legislation aimed at treasonous
activity (Espionage Act [1917], Trading with the Enemy Act [1917], Sedition Act [1918]).
The succession of Republican
presidencies had continued with William McKinley's election in 1896; Theodore Roosevelt's assumption of the
presidency after McKinley’s assassination in 1901, followed by reelection in
1904; and William Howard Taft’s election
in 1908. But Democrat Woodrow Wilson
ended the streak in 1912 with his defeat of Taft. Roosevelt actually garnered more votes
than Taft as the candidate for the
Progressive Party (nicknamed the “Bull Moose Party”) in that year, and Eugene
Debs claimed a small percentage of the popular vote as the Socialist Party
candidate. In 1916, Woodrow Wilson won
again, and in so doing touted the slogan, “He kept us out of war,” for his
apparent success in terminating the provocative submarine warfare in which
Germany under Kaiser Wilhelm had been engaged.
But in the aftermath of Wilson’s
reelection, multiple events took place that impelled the United States toward
war. In December 1916 Wilson requested a
statement of goals for the war from both the Allies and the Central
Powers. Germany refused, but Britain and
France issued a clear response expressing their intention to hold Germany
responsible for the costs of the war.
Germany had continued provocative behavior on the seas following the
sinking of the British liners Lusitania
(7 May 1915, 128 Americans killed) and Arabic
(19 August 1915, two Americans killed).
Upon Wilson’s demand, the Kaiser publicly ordered the cessation of
attacks on passenger ships in 1915 but then on 10 February 1916 announced the
resumption of unrestricted submarine warfare.
The Kaiser’s latest stated change of
policy came after the Allies started arming merchant ships and ordering them
to attack German submarines. The German
sinking of the ferry Sussex in the
English Channel (24 March 1916), apparently an accident, nevertheless now
seemed suspicious. But the Kaiser issued
a new statement claiming that no merchant vessels would be sunk without
warning, and the seas were fairly calm through the election of Wilson in
November 1916.
But then in February 1917, British
intelligence intercepted a message from German foreign secretary Arthur Zimmermann to the German ambassador in Mexico, instructing him to offer that nation an alliance with Germany should the United States enter the war. This came just after Wilson had recalled
United States troops (January 1917) under the command of General John J. ("Black Jack") Pershing after a ten-month unsuccessful pursuit of the bandit
Pancho Villa across the border into
Mexico. The Zimmermann Note pledged German assistance
in returning Arizona, New Mexico, and Texas to the map of Mexico, a provocative
promise given the tattered United States-Mexico relationship stemming from the
border raids of Villa and Pershing’s failed pursuit of him into the mountains
of northern Mexico.
On 2 April 1917, Wilson asked Congress for a declaration of war against Germany, citing German violations of neutral rights at sea, the murder of innocent Americans, and the danger posed to United States security by the promises in the Zimmermann Note. Wilson
furthermore charged the “Prussian aristocracy” with having declared war on
humanity. And, in language that would
have enormous appeal for Progressives in their optimistic aspirations for the
perfectibility of the human condition, Wilson stated that "The world must be made safe for democracy" in what many hoped would be "the war to end all wars."
World War I impelled the United States
government toward a much more activist role in the life of the people than had
ever theretofore been the case. The
Wilson administration created almost 5,000 special government agencies to
oversee aspects of the war effort. The
Council on National Defense endeavored to unify business, labor, and government
behind the war effort. Congress designated a seasonal daylight saving time; enforced rationing of coal and oil; and passed the 1917 Selective Service Act, according to which males ages 21 to 30 (a range widened to 18 to 45 in 1918) had to register for the draft.
During 1917-1919, the number of “doughboys” (soldiers) increased from
200,000 to four million. Women entered
the armed forces for the first time and took on many jobs traditionally
occupied by men. These female
contributions to the military effort and to economic production created
momentum for passage of the 19th Amendment (1920) granting women’s
suffrage.
The government spent much more money
than had ever been expended in any military effort
outside the territorial bounds of the
United States. The federal government
had lent $4 million to the Allies even before the United States entered the
war; by war’s end, expenditures had
risen to a total of $26 billion; with payment of veterans' benefits in the aftermath of war, that figure would rise to $112 billion.
Entry of the United States into the
war on the Western Front came as events moved swiftly toward dramatic change in
Russia, with implications for the military chessboard on which German
commanders would make their moves.
Following the Bolshevik Revolution of October 1917, V.I. Lenin and
others in the new government authorized signing of the Treaty of Brest-Litovsk,
thereby removing Russia from involvement in World War I. In this context of reduced pressure to the
East, German
commanders seized the moment to launch
a major military offensive on the Western Front. In March 1918 German troops moved toward the
Marne River, arriving in May.
An advance force then moved forward
toward Paris but, as they arrived at a point just 50 miles from the French
capital, the Germans for the first time faced opposition from the American
Expeditionary Force. Four military units
from the United States assiduously pushed the Germans behind the Marne, then
the Americans held positions in Belleau Wood through three weeks of tough
combat. By mid-July, American troop levels reached one million, bolstering an
offensive in the Meuse-Argonne Forest.
With the German attack stalled and defeat looming, revolution at home forced the Kaiser from the throne; in place of the old monarchy, leaders of the revolt established the Weimar Republic.
With the success of the World War I
effort as background, the Wilson administration sent 8,000 United States troops
to Russia’s Arctic ports; troops landed
first on 2 August 1918, and another group landed in Siberia at the middle of
the same month as part of a 14-nation Allied effort to protect Allied provisions
and lend support to the White Russian opponents of the Red Russian
revolutionaries (Bolsheviks). The
Bolsheviks prevailed, though, their leaders thereafter harboring deep
suspicions of the Western Powers as intractable opponents of the Soviet regime.
Fighting ended on the Western Front on
11 November 1918. On 28 June 1919 came
signing of the Treaty of Versailles in the Hall of Mirrors at the famous palace
where Louis XIV had held court.
Arguments for a punitive treaty overcame most of Woodrow Wilson's rhetorical flourishes on behalf of his Fourteen Points. The idea for a League of Nations was included, though, and in the next few months the League went into operation with the ironic absence of United States participation.
Wilson went on an 8,000-mile, 22-day
campaign across the nation, making 36 speeches to make the case for United
States entry into the League of Nations.
The president collapsed from exhaustion in Colorado and a few days later
a massive stroke left him paralyzed on his left side. He served out his term with the energetic assistance of his wife Edith and his private secretary, Joseph Tumulty. The United States Senate would never approve
the Treaty of Versailles or allow entry of the nation into the League of
Nations; the nation concluded a separate
peace with Germany. Wilson lived until
1924 as a man with broken health and a dimmed vision for making the world safe
for democracy.
The Roaring
Twenties and the Depressed Thirties
Americans looked for fun and profit in
the aftermath of war. During the war,
silent movies had become a force in American life, providing grim newsreels on
the reality of war but also launching the careers of stars such as “America’s
Sweetheart” Mary Pickford and movie cowboy William S. Hart. The musical form of jazz also took off as an
American fascination and cultural export to France. Composers and instrumentalists merged diverse
musical ideas from the surging American blues scene with inspiration from
European classical forms. Chicago,
Kansas City, and New York drew southern musicians northward into clubs that
teemed with physical and cultural excitement.
Newly enfranchised women felt a sense of liberation during the 1920s, although they often expressed it with the shallow, libertine self-absorption that characterized much of American life in the decade. They smoked, drank, hiked up their skirts,
and danced frivolously to the Charleston and other popular dances, often in
“speakeasies.” The latter were saloons
featuring backrooms where bartenders filled the glasses and mugs of patrons,
despite the outlawing of intoxicating beverages in the supreme law of the
land:
In 1919 (a year before passage of the
19th Amendment granting women’s suffrage), the 18th
Amendment gained congressional and state approval for what became known as
Prohibition, forbidding the manufacture, sale, transport, import, and export of
intoxicating alcohol. The Anti-Saloon League lobbied persistently and with ultimate success for this constitutional amendment, but the new law was widely flouted. Profit-seeking criminal organizations led by
ambitious and colorful characters such as Chicago-based Al Capone superintended
the illegal manufacture and sale of liquor.
In 1933 came congressional and state reversal on the issue: The 21st Amendment nullified the
Prohibitionist provisions of the 18th Amendment.
Given the wild and wacky nature of
life in the United States during the 1920s, Warren G. Harding’s slogan pledging
a “return to normalcy” in winning the 1920 presidential election proved to be
ironic. In addition to the departure
from normalcy in social mores during the early years of the 1920s, aspects of
Harding's administration during that same period were decidedly not normal. President Harding governed inattentively, and his administration gained notice for an abnormal level of corruption. The highest profile case of corruption was
the Teapot Dome Scandal: Interior
Secretary Albert Fall was convicted of taking bribes for allowing oil companies
onto land (Teapot Rock) in Wyoming that had been set aside as a naval
reserve. And Harding's death was atypical, inasmuch as he was one of only four presidents to die of natural causes while in the presidency (William Henry Harrison, Zachary Taylor, and Franklin Roosevelt were the others): He died in 1923 of a heart attack. His vice-president Calvin
Coolidge took over and won reelection on his own in 1924. Herbert Hoover made the decade of the 1920s a
sweep for Republican presidential candidates with his victory in 1928.
In 1919, another Hoover--- J. Edgar, a young Justice Department official--- made the first of what were to be many headlines for him, this one in the "Red Scare" episode. When the front porch of Attorney General A.
Mitchell Palmer’s home was destroyed by a mail bomb, he enlisted Hoover’s help
in conducting raids under the guise of searching for Bolsheviks who might be
responsible for Palmer's personal losses, and who might be guilty of more
generally dangerous activity due to their Communist ideology. As a result of the “Palmer Raids,” conducted
in a "Red Scare" atmosphere, some 5,000 suspects were arrested and 249 were
deported. Palmer predicted rioting on a
wide scale for May Day (1 May 1920), but no such outbreaks of public disorder
occurred. The Red Scare abated for three decades, but as director of the Federal Bureau of Investigation (FBI) from 1924 to 1972, Hoover fanned the flames of anti-Communism and helped create an environment in which similar fears pervaded American society during the 1950s.
Labor unrest did in fact make a
generally conservative American society uncomfortable and open to assertions of
Communist infiltration. Numerous strikes ensued in the aftermath of World War I; some four million workers stayed away from work in a multiplicity of locations. Included among these were
controversial work stoppages by public employees, most famously the Boston
Police Strike of 1919.
Massachusetts Governor Calvin Coolidge
launched himself into national prominence with his conservative stance and
clear statement: “There is no right to
strike against the public safety by anybody, anywhere, anytime.” Coolidge called out the National Guard of
Massachusetts to break the strike and maintain order. None of this was surprising for a rising
political star in a Republican Party increasingly focused on its constituency
among corporate leaders. A few years
later Coolidge, known for terse verbal formulations, would state that "the chief business of the American people is business."
American society during the 1920s
featured opposing viewpoints and major tensions that were difficult to
reconcile. African Americans moving from the South in the Great Migration sought safe havens in urban centers that fell very much short of expectations.
Americans who remained embedded in rural life found tales of the city confusing and disturbing; urban society
in turn tended to view such folk as country bumpkins. Rural residents touted their fresh air and
intimate relationship with the land;
urbanites were proud of their bustling enterprises and skyscrapers with
steel girders that allowed them to reach toward the heavens (opened on 1 May 1931, the 102-story Empire State Building would be the world's tallest for roughly four decades). Sinclair
Lewis’s Main Street (1920) was one of
the most famous examples of a literary tendency of the period to focus on the
banality of small town life; small town
folk returned the contempt, viewing urban culture as vice-ridden, corrupt, criminal,
alien--- and too full of aliens.
Urban centers were also the carriers
of strange new ideas. Sigmund Freud’s
theoretical formulations presented the human psyche as formed from past
experiences that lay suppressed for lack of processing; terms bearing his fundamental psychoanalytic
concepts--- libido, inhibition, id, ego,
superego, Oedipus complex, Electra complex, transference, sublimation,
repression, conscious, unconscious, subconscious--- circulated in social conversations. People were less adept at discussing the work
of Albert Einstein, but many fairly well-educated people were at least aware
that his mathematical models led him to question the reliability of Newtonian
physics in predicting pathways for objects moving at great speeds and in the
distant realms of the universe. Thus,
modernity seemed to hold unfathomable mysteries in both the inner world of the
human soul and the outer world of the cosmos.
Thus, relativism captured the
imaginations of intellectuals and artists.
Conflict gained acceptance as a catalyst for change. Abstract painting eschewed representation and
embraced symbolization. Creative writers
declared liberation from established rhyme schemes and organizational
structures: poets wrote in free verse,
novelists in streams of consciousness, dramatists from the whisperings of
interior monologues. Artistic bohemian
communities counterpoised themselves to bourgeois lifestyle and values, taking
residence in New York’s Greenwich Village, Chicago’s South Side, and San
Francisco's Haight-Ashbury.
But mainstream culture was also
undergoing major change. Technological
and commercial innovation resulted in a consumer revolution with an array of
new products as vanguard: matchbooks,
lighters, oil furnaces, wristwatches, antifreeze, reinforced concrete, paint
sprayers, dry ice, Pyrex, panchromatic film, rayon, cellophane. By 1930, Americans were putting an aggregate
of 20,200,000 telephones to use.
Electrical power output reached 117 billion kilowatt hours by 1929; ten holding companies controlled 72% of all electrical power in the United
States. With WWJ in Detroit, KDKA in
Pittsburgh, and WEAF in New York leading the way, 508 radio stations
transmitted programs by 1922.
The National Broadcasting Company
(NBC) commenced linkage of stations in 1926;
the Federal Radio Commission (as of 1934, the Federal Communications Commission) began regulating the communications industry in 1927, the same year that Philo Taylor Farnsworth produced the first all-electronic television image. Movie theaters in both urban and small-town
settings attracted audiences who were mesmerized by the fantasy world of the
big screen. Warner Brothers released Don Juan, the first feature film with synchronized sound, in 1926; a year later came the first true "talkie," the hugely popular The Jazz Singer.
And the revolution in transportation
continued unabated. In 1925, the
factories of the Ford Motor Company churned out a new car every ten
seconds--- over two million automobiles
each year. Aviation technology had
improved rapidly over the two decades since the Wright brothers made their
flight. Very limited use had been made
of airplanes during World War I, but some daredevil pilots had dropped bombs
from these still fragile machines, while others had used dirigible airships and
balloons for the same purpose. By the
1920s, though, pilots were recording many first-ever feats in much more durable
airplanes. Richard Byrd made the first flight over the North Pole in 1926 and accomplished the same feat over the South Pole in 1929. Colonel Charles A.
Lindbergh completed the first solo flight over the Atlantic Ocean in May 1927,
traveling from New York to Paris in 33 hours, 30 minutes. Amelia Earhart flew across the Atlantic in 1928, the first woman to do so. Others gained
reputations among the pioneers of aviation for their skill as pilots; these included Jimmy Doolittle, Wiley Post,
and Howard Hughes.
Dramatic change in American life
impelled many people to seek security in the familiar and the fundamental:
Nativists feared immigration and stigmatized
foreigners. In 1920, police arrested
Italian immigrants and avowed anarchists Nicola Sacco and Bartolomeo Vanzetti
on suspicion of payroll robbery and murder.
The trial judge referred to the defendants as “anarchist bastards”; the verdict was guilty and the sentence
execution. News of the death of Sacco
and Vanzetti circulated internationally and appalled leftists and humanitarians
convinced that the two were convicted for their beliefs rather than for their
alleged crimes.
Religious fundamentalists, convinced
that modernism threatened American spiritual life and values, insisted on their
particular interpretation of the Holy
Bible as paramount over evidence
from biologists and other natural scientists.
They had long opposed the work of Charles Darwin in his On the Origin of Species, with its detailed presentation of evidence garnered from extended observations, including those made on the Galapagos Islands. Darwin's work strongly indicated that animal
and plant species thrive and survive on the basis of adaptation to their
natural environments, evolving into new forms as situations demand. While his work focused most closely on plant
life and reptilian creatures on the Galapagos, he concluded from general
observations and careful reasoning that human beings evolved as highly adaptive
creatures in lineal relationship to the Great Apes. This contravened Fundamentalist belief that
God created the world and all forms of life over a seven-day span.
This issue of human origins gained
great notice in the trial of John Scopes, who accepted a challenge issued by
the American Civil Liberties Union to any teacher who would violate Tennessee
law proscribing curriculum based on evolution.
Agnostic labor lawyer Clarence Darrow defended Scopes. True Believer William Jennings Bryan
prosecuted the case for the state. As
the trial neared conclusion, Bryan appeared as his own expert in biblical
matters to declare that a “big fish” swallowed Jonah, Joshua made the sun stand
still, and God created the world in seven days.
Caustic newspaper columnist H.L. Mencken and other critics of
Fundamentalism derided Bryan’s biblical literalism. The jury found Scopes guilty, but the fine
was assessed at only $100. Bryan’s
oratorical exertions and the high-profile stress of the Scopes “Monkey” Trial
took a physical toll: He died of a heart
attack within days of the last arguments.
The case inspired liberals and civil libertarians to claim academia as a
realm for the free expression of ideas.
Thus the tensions of a fast-changing
society continued to produce stresses and strains within the American
citizenry. The momentum seemed to be
with those who optimistically embraced rapid change and rode swiftly forward on
currents of optimism. But then one kind of optimism got a great comeuppance:
As Gross National Product (GNP)
increased from $88.9 billion in 1920 to $104 billion in 1929, and as per capita income went from $672 to $872 during the same period, the national economy seemed to be following a perpetually upward trajectory. Many investors regarded the stock market and real estate as can't-miss investments.
Speculators invested heavily in Florida real estate. The business-friendly administrations of
Harding, Coolidge, and Hoover freed up funds for investment by cutting
taxes. Many invested with reckless abandon, making down payments and then covering the remainder of the investment with funds borrowed from stockbrokers.
Even as evidence accumulated that
residential construction was exceeding demand, the market for automobiles was
peaking, and consumers were reaching the limits of their ability and
willingness to spend, investment in the stock market continued to follow a
steadily upward curve. During
March-September 1929, the per-share price for stock in Radio Corporation of America
(RCA) rose 600%. But the market peaked
on 3 September 1929 and the bubble of optimism burst on Tuesday, 29 October
1929. Panicked sell-offs resulted in
16.4 million shares trading hands on what became known as “Black Tuesday”; that figure for shares traded was far above
the typical three million for a single day.
The stock market crash caused
widespread misery. Savings and fortunes
evaporated, unemployment climbed precipitously, and total personal income declined from $82 billion to $40 billion during 1929-1932 as 9,000 banks closed. Those operating factories and
mines of many kinds in many places called a halt to production. Thousands of farmers lost their land to foreclosure.
President Hoover had a history as an
economic activist of a certain kind. As
Secretary of Commerce during the Harding and Coolidge administrations, Hoover
promoted standardization in industry and encouraged the trade-association
movement. He added personnel and
functions at the Bureau of Foreign and Domestic Commerce and directed the
Simplified Practices Division of the Bureau of Standards to host conferences on
industrial design, production, and distribution of goods. In 1926, Hoover gained oversight of civil aviation through the Commerce Department's new Aeronautics Branch, and in 1927 Congress established the Federal Radio Commission (FRC).
But Hoover took these actions to
promote and increase efficiency in private enterprise. He did not believe in heavy government
expenditure to provide relief or economic stimulus. When the stock market crashed, Hoover’s first
impulse was to promote voluntarism. Only
reluctantly did Hoover fund a few public works projects, such as the Hoover Dam
(on the Colorado River between Nevada and Arizona). In 1932, the president created the
Reconstruction Finance Corporation (RFC) for dispensation of aid to
institutions and agencies but not to individuals. In that same year, Hoover approved of
Congress’s passage of the Federal Home Loan Bank Act, with the aim of providing
guaranteed loans for those demonstrating the economic potential to make
mortgage payments as homeowners.
As the 1932 presidential election
loomed, an incident occurred that would seal the outcome and change the course
of American history. In 1924 Congress
had passed legislation providing for veterans’ endowment life insurance,
payable in 1945 to those who had served in the armed services.
But as the Depression hit, many
veterans sought to claim their bonuses immediately. In the course of spring 1932, 15,000 veterans
entered Washington, D. C., setting up camp in empty government buildings and in
a shantytown near the White House. After
the United States Senate voted unfavorably on a bill for early bonus payments,
veterans felt dejected but many went home.
President Hoover prevailed upon Congress to buy train tickets for those
who lingered, but 300 stayed for lack of any certain place to call home. An ugly confrontation ensued when police came
to remove those still in the Shantytown.
A shot was fired, provoking a military response from Secretary of War
Patrick J. Hurley. Hurley ordered United
States troops to remove the veterans and clear the area. The troops did so, setting the encampment on
fire and utilizing tear gas to disperse the downcast, weary, and mostly
desperate men who had held out till the end.
The American public blamed Hoover for
the incident and for what people saw as an abiding lack of compassion for the
veterans encamped in the Shantytown.
Hoover’s handling of what came to be known as the “Bonus Army” became
instrumental in his defeat by Democrat Franklin D. Roosevelt (a distant cousin
of Theodore Roosevelt) in the 1932 November election for president of the
United States.
Roosevelt lost no time advancing a much more aggressive federal government agenda to address the problems of Americans
during the Great Depression of the 1930s.
So famous are those first measures that they are recognized as the First
Hundred Days of the New Deal, a sweeping federal government initiative to
combat the economic ills plaguing the people.
Inaugurated on 4 March 1933, Roosevelt within two days closed all banks
in the United States and called members of Congress to Washington for a special
session. On 9 March 1933, Congress passed the Emergency Banking Act providing for Treasury Department inspection
of all banks before they could be reopened.
Within three days, 75% of all banks reopened and one billion dollars in
hoarded currency and gold were redeposited.
The New Deal was an exercise in the
sort of pragmatism associated with William James, much in the Progressive
spirit of the late 19th century and early 20th
century. The idea was to experiment with
various solutions, starting with logical steps to solve problems, then adding
and subtracting programs according to what worked and what did not. The programs included relief (aiding suffering people), recovery (spurring the economy), and reform (preventing another such financial crisis). Particularly important programs included the Agricultural Adjustment Act (AAA, 1933), limiting production of some crops and providing subsidies to farmers; the National
Industrial Recovery Act (NIRA, 1933) to facilitate cooperation between labor
and business; the Federal Emergency
Relief Administration (FERA, 1933); the
Tennessee Valley Authority (TVA, 1933) to generate reasonably priced electrical
power; the Rural Electrification
Administration (REA, 1935), providing power to people living in isolated rural
areas; the Social Security Act (1935),
providing retirement income; the Wagner Act (1935), guaranteeing workers the right to organize and bargain collectively (the eight-hour day and overtime pay for weekly work beyond 40 hours came later, with the Fair Labor Standards Act of 1938); the Works Progress
Administration (WPA, 1935), providing government jobs for many different kinds
of work, including that done by artists, writers, and skilled tradespeople; and the Civilian Conservation Corps (CCC, 1933), another government employment program, providing public works projects
for constructing buildings, roads, bridges, and parks, and for improving
forestry and recreational areas.
The New Deal drew criticism featuring
a variety of perspectives, covering the political spectrum from left to right, and including populist sentiments of murky ideology. Dr. Francis Townsend (an elderly California
physician) organized a movement that pressed for action beyond the initial
programs of 1933 and impelled Roosevelt to support and Congress to pass the
Social Security Act. Father Charles
Coughlin gave weekly radio sermons that could be ideologically erratic
but most compellingly applied pressure for monetary reforms and full
nationalization of the banking system.
Louisiana Senator Huey Long advocated a Share-Our-Wealth program that
would have confiscated surplus wealth for redistribution to the needy. His colorful slogan promised to make "every man a king."
Protestant pastors in general opposed
the New Deal as unscriptural and therefore sinful. When Roosevelt ran for reelection in 1936,
70% of Protestant ministers supported his Republican opponent, Alf Landon. In the course of the 1930s and 1940s, an
emerging coalition of Jews, Roman Catholics, progressives, and southern
populists provided sizable majorities for Roosevelt, who was reelected in 1936, 1940, and 1944--- a record four terms in all.
Under circumstances of economic
depression, the political message of leftists found significant support. In the run-up to the 1932 presidential
election, fifty-three artists and intellectuals signed an open letter endorsing the Communist Party candidate, William Z. Foster. Others on the
left formed John Reed Clubs, honoring and articulating positions consistent
with those of the American journalist who observed and wrote sympathetically
about the 1917 Russian Revolution. By
1935, many leftist groups articulated a “popular front” position for alliance
with socialist and liberal democratic groups in opposition to fascism. Such groups included the League of American
Writers (new incarnations of the John Reed Clubs), American Youth Congress, the
National Negro Congress, and the American League for Peace and Democracy.
The threat of fascism was very real as
the 1930s came to a close. Mussolini and
his Black Shirts stirred up supra-nationalist sentiments and established a
fascist regime in Rome. Francisco Franco launched his movement for the overthrow of the Spanish Republic and the establishment of an authoritarian regime. Chauvinist
militarism took hold in Japan and impelled that nation forward for territorial
acquisition in Asia and the Pacific.
Hitler made his moves on the Rhineland, Austria, Czechoslovakia, and
Poland. When he invaded the latter
nation on 1 September 1939, France and Great Britain declared war on Germany
on 3 September 1939.
World War II had begun. Attention in the United States now shifted
from an economy in depression to an economy in mobilization. The United States Congress passed neutrality
laws forbidding the sale of arms to belligerents and warning citizens to stay
off passenger liners owned by the nations contending on the Atlantic Ocean and
in Europe. But the United States was drawn
inexorably into another world conflict.
Under the Lend-Lease program, and with changes in neutrality laws
to allow transport of war materials, American ships sailed under threat from
German submarines. On 31 October 1941, a
torpedo fired from a German submarine sank the United States destroyer Reuben James; Roosevelt had
already issued a “shoot on sight” order that September, and thus an
undeclared state of naval war now brought the United States and Germany into
confrontation on the high seas.
Formal entry into World War II for the
United States came on 8 December 1941, the day after the 7 December 1941 attack
on Pearl Harbor. British Prime Minister
Winston Churchill flew to the United States later in the month, and on 1
January 1942 he and Roosevelt announced a formal alliance that included the
United States, Great Britain, and the Soviet Union.
Roosevelt’s New Deal had relieved
suffering among the people, put people back to work, and halted the economic
slide. But government and private
spending on wartime production energized the American economy and ended the
Great Depression. Participation in World
War II, though, came with ugly costs in blood and lives that make such
conflicts always regrettable.
World War II
Americans were especially oriented
toward fighting in the Asia-Pacific theater (1942-1945) of the war, but
Roosevelt also committed American forces to energize the
North Africa campaign (1942-1943) and to carry prime responsibility throughout the
Italian campaign (1943-1944). Then
United States forces contributed mightily to victory in both the European and
Asia-Pacific theaters of the war: Dwight
D. Eisenhower directed the D-Day operation, and United States troops were
heavily involved in follow-up battles that compelled the Germans to capitulate
by 8 May 1945 (Victory in Europe [V-E Day]);
and island-hopping maneuvers by United States naval, army, and
amphibious units took back the Pacific from the Japanese, setting up
conventional bombing of Tokyo and the fatal and fateful atomic bombings of
Hiroshima (6 August 1945) and Nagasaki (9 August 1945) that forced admission of
defeat from Emperor Hirohito (14 August 1945) and formal surrender in Tokyo Bay
(2 September 1945).
The United States benefited from the
influx of highly educated Jewish people and other refugees from Nazi
persecution. Most notably, the work of
physicist Albert Einstein (1879-1955) provided the theoretical underpinning for
development of atomic weaponry via the Manhattan Project, and he was among
those who convinced President Roosevelt that the United States must develop an atomic bomb in
order to counter the capability of the Germans to construct such a weapon. Research on atomic fission was conducted at
major universities (Columbia University, Princeton University, the University
of California at Berkeley, and the University of Chicago). In 1943, the Roosevelt administration
established a laboratory under the direction of J. Robert Oppenheimer and
General Leslie R. Groves in Los Alamos, New Mexico, dedicated to the
development of an atomic bomb. Across
the Manhattan Project’s many sites, some 125,000 people contributed their talents to a project the results of
which were manifested on 16 July 1945 in a frighteningly successful test
explosion in the desert outside of Alamogordo, New Mexico.
Despite contributing mightily to the
effort to end the Hitler regime, and benefiting tremendously from the talents
of Jewish refugees and immigrants, the United States was not an entirely
welcoming haven for Jews fleeing Nazi rule.
The ship St. Louis carried
nearly 1,000 escaped German Jews to the Atlantic Coast at Miami, only to be
turned away. The State Department
neglected to use 90% of its legal visa
quota. Allied bombers flew missions
close to the Auschwitz death camp in Poland, but United States military
decision-makers demurred when advocates sought bombing of the death camps, or
at least destruction of rail lines leading to the camps. This policy persisted, despite clear
understanding on the part of the Roosevelt administration that the Nazis were
systematically rounding up Jews--- as
well as Poles, Roman Catholics, Communists, homosexuals, gypsies, and others
deemed undesirable--- for
extermination. The death toll in the
concentration camps in eastern Germany and Poland reached 10 million by war’s
end; six million (6,000,000) of these
were Jews.
Forty-five percent (45%) of the funds
that the United States government spent on World War II were derived from tax
revenues. The remaining 55% came from
bond drives; the 1945 Victory Drive near
war’s end raised $150 billion. Still,
revenues never matched expenditures during the war: Government spending stimulated the economy
but also produced a national debt of $260 billion.
As the first peacetime draft in
American history took effect in September 1940 (fifteen months before the bombing
of Pearl Harbor), Roosevelt and Congress cooperated on a budget that increased
defense spending from $2 billion to $10 billion.
Franklin Roosevelt prevailed upon
Congress to make an initial $7 billion available to those nations whose
existence and economic well-being were vital to American interests. He articulated a role for the United States
as an “arsenal of democracy” via a “Lend-Lease” program providing munitions and
other articles of warfare to the Allies.
This Lend-Lease program, which ultimately delivered $50 billion worth of
military goods to the Allies, had a highly stimulative effect on the United
States economy. An even bigger boost
came with the War Powers Act (1941) and the Second War Powers Act (1942), which
empowered the federal government to direct industrial production as needed for
the war effort. The Roosevelt
administration established the War Production Board in 1942 to direct
industrial conversion, so that auto makers manufactured tanks, textile factories
made mosquito netting, toy factories manufactured hardware, and home appliance
factories produced munitions.
Wartime production boosted the total
value of goods and services, so that the Gross National Product (GNP) more than
doubled, from $100.6 billion in 1940 to $213 billion in 1945. Federal government expenditures increased
from $20 billion in 1941 to $97.2 billion in 1944, and total federal government
expenditures during the period encompassed by 1 July 1940 and 30 June 1946
followed a steep upward trajectory, reaching $337 billion by the latter
date. Of this total, spending for the
war effort took $304 billion, over 90% of all expenditures during the period. The Great Depression ended because the demand
for military goods provided a tremendous boost to industry, driving profits upward,
inducing investment in capital goods for expansion, creating an abundance of
jobs, and putting money in the pockets of workers. The fighting forces themselves
also absorbed a great deal
of previously unemployed and underemployed labor: The
number of service women and men increased steadily year by year, as
follows: 500,000 in 1940; 1.8 million in 1941; 3.8 million in 1942; 9 million in 1943; 11.4 million in 1944; and 12 million in 1945.
The need for workers to fill all of
the new jobs was so great that a special “Rosie the Riveter” campaign,
pictorially featuring a female welder, ensued to propel women into positions
traditionally occupied by men. By 1945,
six million women had entered the workforce, including 24 percent of all married
women. Women took jobs as toolmakers,
machinists, crane operators, lumberjacks, stevedores, blacksmiths, and railroad
workers. Also, a total of 200,000 women
served in the Women’s Army Corps (WAC),
Women Accepted for Volunteer Emergency Service (naval WAVES), and the Marine
Corps, Coast Guard, and Army Air Force.
In an economy weighted toward
production of military goods, shortages of consumer goods created inflationary
pressures. In January 1942, Congress
authorized the Office of Price Administration to establish ceilings for
high-demand goods. According to the
General Maximum Price Regulation (1942), prices were to be frozen at the
highest levels that they had reached during the prior month. This necessitated a rationing system whereby
consumers paid for tires, sugar, coffee, gasoline, and meats with rationing
coupons.
After the Pearl Harbor bombing brought
the United States into World War II, there was a mobilization of the national spirit
as well as the national economy.
President Roosevelt continued the “Fireside Chats” via radio that had
provided consolation and inspiration to people during the Great Depression. His voice was warm and reassuring, his
message unflappably optimistic. When
photographed, his demeanor matched his voice, showing a face ever smiling, with
a touch of mischievousness, conveying courage, generosity of spirit, and faith
in the people and their nation.
Journalists and photographers cooperated in photographing Roosevelt only
in the most favorable
presentation, never revealing the
polio that made him dependent on crutches, the wheelchair, and the arms of
aides. Roosevelt in fact cultivated the
press as an extension of the presidential office: He suggested headlines and
indicated topics that should be covered in media stories.
Roosevelt was also enthusiastic about
the potential of popular movies to convey messages likely to stir enthusiasm
for the war effort. As the film industry
became ever more technically sophisticated, movies, too, served to
elevate the spirit of the American people and inspire them in the war
effort. Notable films of the period with
inspirational wartime themes included Across
the Pacific (1942); Casablanca (1942); Air Force (1943); Sahara (1943);
and They Were Expendable (1945).
Franklin Roosevelt was a physically
sick man by the time he attended the Yalta meeting with Churchill and Stalin in
February 1945. He retreated to Warm
Springs, Georgia, for a last-gasp attempt to recover his health but suffered a
stroke and died there on 12 April 1945.
He was succeeded by Vice-President Harry S. Truman, to whom devolved the
careful weighing of options before deciding to drop the atomic bombs on
Hiroshima (6 August 1945) and Nagasaki (9 August 1945) that brought a statement
declaring cessation of the Japanese military campaign from Emperor Hirohito
on 14 August 1945.
On 2 September 1945, World War II
came to a formal conclusion, and the United States faced a bevy of issues that
still abide in 2015.
VII. Events and Issues, 1946-2015
The world was very different after
World War II, and the United States had a new and much more exalted role on the
international scene. Over the course of
the first years succeeding World War II, the United States emerged as a
Superpower, with its chief contender now the Soviet Union. The success of the 1948 Berlin airlift gave
the United States a sense of self-confidence that its power could be used as
necessary to keep the Soviet Union from abusing its own military might. Under the reality of mutually assured
destruction whereby the aftermath of a World War III was too horrible to
contemplate, the two sides settled into a Cold War full of espionage,
situational stare-downs, and hot
wars fought through proxies.
A New Role in a Changed World, 1946-1965
A few months before the bombing of
Pearl Harbor, in the autumn of 1941, Franklin Roosevelt and Winston Churchill
met off the coast of Newfoundland to discuss war aims, articulating these in
the Atlantic Charter. These aims
included most importantly the freedom of the seas, self-determination for
nations, and a permanent international peace organization. The latter was similar in ideational content
to the League of Nations proposed by Woodrow Wilson but founded in a very
different time, facing substantially different issues, and given a new
name: United Nations, which brought to
its chambers representatives from 51 nations at its founding in 1945; in 2015, the United Nations has a 193-member
constituency.
A bevy of new nations appeared across the
globe within a few years of war’s end, and many more would form during the
entire 1946-2015 period. Two of the most
important new nations were Israel and India, presenting, though, very different
circumstances at their founding:
Israel was born in 1948 as a haven for
the Jewish people from centuries of diaspora
and the horrific experience of the Holocaust perpetrated by the Nazis; the founding of Israel occurred not as a
revolt against imperial rule, although it occupied turf that had been
controlled by Great Britain under a League of Nations mandate: Israel was founded, rather, to protect a
people who felt a biblical attachment to a territory perceived as promised to
them by God (Yahweh), but on which Arabs had also dwelled for centuries in the
land they called Palestine. David Ben-Gurion
became first prime minister.
India gained independence in 1947, a
grand old civilization spread across a vast subcontinent coming together for
the first time as a modern nation-state.
This was the India wherein Mohandas K. Gandhi had launched his satyagraha movement as the main force in
the quest to oust the British.
Jawaharlal Nehru became first prime minister of India; his Congress Party, which was an incarnation
of the Indian National Congress first formed in 1885, dominated India’s
political scene for many years after independence. Pakistan also became an independent nation in
1947, led by Muhammad Ali Jinnah.
These two new nations loomed large in
the consciousness of leaders and many citizens of the United States. Israel received much material and moral
support from Jewish people in the United States, and from both major
parties; over time, Democrats would
especially cultivate a political constituency among the Jewish people. India emerged as a rather messy but genuine
democracy, thereby serving as a bulwark against the Soviets in a world full of
colonized and oppressed people to whom Marxist rhetoric often appealed.
Very disconcerting to the United
States, and among the reasons to celebrate the democratic system in India, was
the founding of the People’s Republic of China.
Mao Zedong for two decades of the postwar period presented the most
authentic vision of the modern Communist state.
He was an avowed anti-imperialist, and his rhetoric expressing antipathy
to the United States as the chief bearer of capitalist values could pierce
American sensibilities. For the most
part, though, he concentrated on development within China, overseeing the
establishment of a modern infrastructure while experimenting with various rural
and urban collectivization schemes. A
great deal of the countryside was gradually collectivized and for a time seemed
to lift production as landowners were ousted and peasants moved to the fore. But Mao overreached in his Great Leap Forward
(1958-1959), which attempted to take collectivization to the level of the
multi-village Commune; and later in his
Great Proletarian Cultural Revolution, which whipped up revolutionary zeal
among young people who were encouraged to criticize anyone (including parents and
family members) who evidenced belief and action contrary to
Marxism-Leninism-Mao Zedong Thought.
These two large political movements
(Great Leap Forward and Cultural Revolution) caused considerable misery and
loss of life, but much of that was hidden until Mao’s death in 1976. The turmoil, though, was palpable and very
frightening to the leadership of the United States. Although Mao broke with the Soviet Union in
1962, to people in the United States, a map across the expanse of Asia from the
Ural Mountains and Asiatic Russia through China appeared far too “Red,” and Mao
seemed far too intent on making a mass-based Communist state a reality.
And this occurred within the context
of a world in which the Iron Curtain had been drawn across Eastern Europe,
communist insurgencies loomed in Southeast Asia, and the Soviets seemed bent on
exporting the tenets of their system to the leadership of any emerging state in
Africa and Latin America that proved ready to listen--- and accept aid, which many did.
In this context went forward the
Marshall Plan (proposed in 1947, enacted as Congressional legislation, and implemented
1948-1951) to rebuild the economies in Europe, including that of West
Germany; and the essentially benign
American Occupation of Japan (1945-1952).
In the latter case, the United States worked with Japanese friendly to
liberal democracy and capitalism to establish a framework for postwar
governance. The constitution went into
effect in 1947, describing the legal tenets for a constitutional monarchy with
a parliamentary system of representation, utilizing the emperor as a symbol of
state and a force for cultural continuity---
but with the renunciation of divinity.
Implementation of the Marshall Plan
and the program of the American Occupation in Japan were among the most
successful international initiatives of the United States in the postwar
period. The economies of Allies Great
Britain and France revived, and a government and economy to American liking went
into action in Japan. The economies of
Japan and West Germany were among the most successful in the world during the
1950s through the 1980s, and their governments were very friendly to the United
States. For the United States, securing
the stability and friendship of these wartime opponents provided an ironic
counterpoint to the antipathy for the American system expressed by the Soviet
Union, their erstwhile ally against Nazi Germany.
The Cold War confrontation permeated
domestic and international affairs. In
1947, George F. Kennan wrote what at the time was a pseudonymous article
(signed simply “X”) for the journal Foreign Affairs that
prescribed the policy that would guide United States policy-makers through the
1980s. He wrote that “It is clear that
the main element in any United States policy toward the Soviet Union must be
that of a long-term, patient but firm and vigilant containment of Russian
expansive tendencies.” Also in 1947,
President Harry S. Truman paired Kennan’s article with his own policy of aid to
Greece and Turkey and efforts to guide the politics of those nations, stating
in what became known as the Truman Doctrine, “I believe that it must be the
policy of the United States to support free peoples who are resisting attempted
subjugation by armed minorities or by outside pressures.”
As the Iron Curtain drew tight, the
Chinese Communist victory was secured (1949), Soviets detonated their own
successful atomic bomb (1949), a Communist government took power in North
Korea, and both the United States and the Soviets exploded hydrogen warheads
(1953-1954), Kennan’s foreign policy imperative was internalized by strategists
and public alike: The Cold War was among
the grimmest realities of postwar life.
As the United States and its allies in Western Europe formed the North
Atlantic Treaty Organization (NATO) in 1949, and the Soviet and Eastern Bloc
Communist states formed the Warsaw Pact in 1955, the two sides squared off for
what was to be four decades of confrontation, propaganda, and maneuvering for
advantageous international position.
A virulent form of the anticommunist
spirit in the United States during the 1950s came to be known as McCarthyism,
for the influence of Wisconsin Senator Joseph McCarthy on the tenor of the
times. During a speech in Wheeling, West
Virginia, in February 1950, McCarthy raised a paper purported to be a list of
205 communists in the State Department.
McCarthy chaired a special subcommittee for investigation of subversion
in public and private organizations.
Hearings drew heavy media coverage, and McCarthy rose to national
prominence. But McCarthy never produced
evidence that any federal employee had communist connections. After he recklessly attacked President Dwight
D. Eisenhower, Secretary of the Army Robert Stevens, and the armed services in
1954, the Senate held the Army-McCarthy
hearings. Among the first to be
televised, these hearings presented to a disgusted public a crude and rude man
who issued brutish accusations and cruel comments that had no substantive
grounding. McCarthy looked starkly
ridiculous. Senators turned on their
colleague in December 1954, voting 67 to 22 to censure him for “conduct
unbecoming a senator.” McCarthy died
three years later of complications related to alcoholism.
The GNP grew from $214 billion in 1946
to $347 billion in 1952. Gone were the
days of 15-25% unemployment during the 1930s:
Economic mobilization for World War II produced a boom that did not
abate during the 1950s and early 1960s, when unemployment stayed at about 5% or
lower.
Suburbs, those residential areas
ringing major urban centers, became much more populous and influential in
American life. Housing construction
boomed and automobile ownership doubled. Per capita income increased more than
20% during 1945-1960, raising familial purchasing power accordingly. By the end of the 1950s, the United States had
achieved a standard of living unprecedented in world
history.
American society was a study in
contradictions during the 1950s. There
was a discernible disconnect between mainstream society and the world of
artists and intellectuals:
Mainstream society sought security and
contentment, goals that seemed eminently reachable in the context of the
postwar economic boom. Rosie the Riveter
gave way to Helen the Homemaker, at least in the popular imagination and
frequent aspiration. Many men identified
as providers in urban settings, often seeking work in corporate America, where
loyalty was rewarded with secure employment throughout the
working life and pensions in retirement.
Labor unions grew and could be contentious with big business but sought
the same essential goals: In time, the
auto factory worker would also achieve middle class status, with good wages,
health care benefits, and security in retirement. The domestic coziness idealized by mainstream
Americans seemed perfectly captured in the Norman Rockwell paintings, many of
them finding their way onto the front covers of the Saturday Evening Post.
But there was another America that
read Jack Kerouac’s On the Road, the
poetry of Allen Ginsberg, and followed the life and novels of William S.
Burroughs along the beatnik trail. This
was an America disgruntled with a perceived bourgeois banality. The beat poets, bohemian beatniks, jazz
innovators, absurdist playwrights (Samuel Beckett, Waiting for Godot) and avant-garde
artists (many of these latter
having come as refugees from Nazism to stimulate Abstract Expressionism, as in
the wildly colorful displays and sprays of Jackson Pollock), all harbored
cultural viewpoints that presaged a shift in the zeitgeist during the 1960s.
African Americans also were manifestly
discontented, as Rosa Parks and Martin Luther King (Montgomery Bus Boycott, 1955)
broadened a pathway to freedom on turf broken painstakingly by A. Philip Randolph
(leader of the Brotherhood of Sleeping Car Porters) and Thurgood Marshall
(chief attorney for the National Association for the Advancement of Colored
People [NAACP] in Brown v. Board of
Education [1954], which ended school segregation as a legal institution). The late 1950s and early 1960s were replete
with major events in a burgeoning effort to overcome the deep injustices of
history wrought by the Compromise of 1877.
Following the Brown v. Board of
Education decision and the Montgomery Bus Boycott, confrontation at Little
Rock’s Central High School (1957), the Emmett Till murder (1955), James
Meredith’s troubled entrance and matriculation at the University of Mississippi
(1962), the murder of four young girls in a Birmingham church (1963), lunch
counter and other sit-ins in the early 1960s, and efforts to highlight the
difficulty of voting in Mississippi and other southern states during the same time span, all created
momentum for the landmark passage of the Civil Rights Act of 1964 and the Voting
Rights Act of 1965.
The death of Joseph Stalin in 1953
cleared a pathway for a more amiable character, Nikita Khrushchev, to attempt
to put a brighter face on communism in power.
Khrushchev liked to travel, and his shaking hands with factory workers
on a trip to Great Britain abetted an image of the new Soviet leader as an
earthy people person. But Khrushchev
also could be very prickly, as when he pounded his shoe at the United
Nations (1960); on an earlier occasion he had declared to Western diplomats, “We will bury you!”
Also the launching of Sputnik (1957), the construction of the Berlin Wall
(1961), and the shakeout from Fidel Castro’s victory in Cuba (1959), gave the
United States leadership a heightened sense of competition with this other
major superpower. The failed United
States effort to undermine the new Castro government in the Bay of Pigs
invasion (1961) and the tense stare-down in the Cuban Missile Crisis (1962)
resulted in major initiatives to outshine the Soviets in matters of science and
technology.
Many an academically talented United
States youth got a free ride to an excellent university for the study of
natural science and engineering via a major federal government scholarship
initiative. And young President John F.
Kennedy gave the go-ahead and cultivated the necessary congressional
cooperation for a full-tilt operation at the National Aeronautics and Space
Administration (NASA) to launch astronauts toward explorations of the universe.
John Kennedy came to office after the
two-term (1953-1961) presidency of Dwight D. Eisenhower,
the former World War II Allied Supreme Commander and major strategist for
D-Day. Eisenhower avowed a preference
for private enterprise over state-directed programs for achieving aims conducive
to national progress, but in practice he was in many ways an activist
president. Eisenhower approved
extensions of Social Security benefits, raised the minimum wage from $0.75 to
$1.00 per hour, and prevailed upon Congress to establish the Department of
Health, Education, and Welfare. He also
lent enthusiastic support for the 1956 Highway Act for the construction of a
41,000-mile interstate system that vastly improved overland travel in the
United States; while serving as a great
boon to the trucking industry, the interstate system also promoted the
development of a car culture that had deleterious effects and impeded the hopes
of those who would have preferred a wide-ranging mass transit system.
When the economic boom that had
characterized most of the 1950s stalled in 1957 and the nation faced a
recession, Eisenhower reluctantly proposed and received congressional
authorization for greater federal spending.
This spending did help to rejuvenate the economy, but it dashed
Eisenhower’s hopes for a balanced budget:
The national deficit when Eisenhower left office in 1961 was $12
billion.
Kennedy prevailed over Vice-President
Richard Nixon in the closely contested presidential election of 1960. Nixon had a reputation as a rabid
anti-Communist and Cold War hawk. Nixon
had served as a high-profile congressperson from California in the United
States House of Representatives and stayed before the public eye as
Eisenhower’s vice-president. Nixon had
great appeal for staunch Republican conservatives but also was able to maintain
support of more moderate types who admired Eisenhower. Kennedy held most of the Roosevelt coalition
that included highly disparate groups:
labor unions, farmers, Northeastern liberals, and most of the
South. But Kennedy had to contend with
bias against his Roman Catholicism; his
choice of Lyndon Baines Johnson of Texas abetted his retention of the vote in
that state and those of the South, and Johnson’s service as Senate Majority
Leader had given him strong ties to many constituencies important to the
Democrats.
What may have been the deciding factor
in the 1960 presidential contest, though, was the first televised debate that
year. Nixon eschewed stage makeup and at
many junctures presented a swarthy and sweaty appearance. Nixon matched Kennedy for skill in asserting
his policy positions in the debate, and many experts in fact thought that Nixon
won strictly on debate points. But the
vibrant image that Kennedy projected was perfect under the gaze of those
peering into these fascinating new television screens, and his cause in the
presidential contest got a mighty boost.
Kennedy proved to be an enormously
inspiring president, for whom the aura of the presidential office and domestic
chambers (shared with wife Jacqueline Kennedy and children Caroline and John)
invited media comparisons to the legendary Camelot of King Arthur. But while Kennedy did successfully overcome
the Soviet threat in the Cuban Missile Crisis and launch Americans toward the
moon, his actual legislative accomplishments were slim. Kennedy had to contend with
a Congress in which a conservative coalition of Republicans and southern Democrats blocked
most of his spending initiatives for health and education. He dubbed his legislative agenda the “New
Frontier,” through which he sought to regain full momentum from the 1957 recession. Kennedy did achieve the needed boost, but in
the form of increased spending for military goods and services, assistance for
nations important in the Cold War chess match, and the space race. His domestic agenda would have to wait for
impulsion from Lyndon Johnson, who took over when Kennedy was assassinated by
Lee Harvey Oswald in Dallas, Texas, on 22 November 1963.
Cataclysmic Social Change and the Limits of National Power, 1965-1980
The decade of the 1960s and the early
years of the 1970s constituted a time of cataclysmic social change in the
United States. When Lyndon Johnson
assumed the presidency upon the death by assassination of John Kennedy in 1963,
the nation soon observed a political operator of extraordinary skill. Johnson spoke in the cadences of the Texan,
one of the dialects of the South, so that he was able to communicate
the need for racial justice with an immediacy and language that his fellow
southerners could understand. Johnson
was also magnificent in calling in political chips, using the political capital
that he had built up in the course of more than 30 years in Congress. It was he who secured passage of the 1964
Civil Rights Act and the 1965 Voting Rights Act envisioned by Kennedy; he went on to prevail upon Congress to enact
legislation to provide for fair hiring practices and nondiscriminatory
standards in the sale of residential housing, and through his Great Society
programs he convinced Congress to create Medicare and Medicaid for the health
care needs of elderly and low income people respectively, and he boosted
education spending via the Elementary and Secondary Education Act.
But Johnson’s exertion of political
energy to get his way on matters of military spending and troop commitments did
not produce the same fortunate effects.
Johnson inherited policies from the Eisenhower and Kennedy administrations
that focused heavily on the situation in Vietnam. After Ho Chi Minh and his nationalists ousted
the French from Vietnam in 1954, the Geneva Accords called for a complete
French exit from Indochina, with elections to be held to determine the
leadership for all of Vietnam. But the
South Vietnamese leader Ngo Dinh Diem, who had ousted the emperor Bao Dai,
dithered in setting a date. He perceived
that Ho Chi Minh would in all likelihood win the election and could not reconcile himself
to that looming democratic verdict. The
Eisenhower and Kennedy administrations did not want a Ho Chi Minh victory,
either, so they exerted no pressure on the Diem regime to move forward with an
election. But the Kennedy administration
did grow weary of Diem’s prevaricating statements and actions, and Kennedy
advisers worried about Diem’s personal unpopularity. Thus, the Kennedy administration supported
the coup that ousted Diem in 1963 and brought to power in South Vietnam a
succession of military governments, eventually headed by Nguyen Van Thieu.
By then, the Truman, Eisenhower, and
Kennedy administrations had set the precedent for involvement in Vietnam. The Truman and Eisenhower administrations
kept that support at moderate levels of investment and low-key in terms of
political and military commitments.
Truman and Eisenhower sent political and military advisers to help guide
the political maneuverings that brought
Diem to power and kept Ho Chi Minh at
bay. Kennedy elevated the number of
advisers to 17,000 by the early 1960s, and some questioned whether the
proximity of United States military personnel to the field of military action
really entailed a merely advisory role.
Johnson moved American policy in the
direction of explicit military involvement.
Seizing on an August 1964 Tonkin Gulf Incident, during which an American
ship was fired upon a few miles off the coast of North Vietnam, Johnson secured
the Tonkin Gulf Resolution from Congress, allowing him to act as necessary to
protect American lives. Johnson used the
resolution, which authorized him to respond with force if necessary to protect
American lives and property, as an effective permit to wage war: In February 1965 came increased air assaults
on North Vietnam, and the first acknowledged American troop commitments went
afield in Vietnam in April 1965.
American troop levels rose year by year until peaking at over 500,000 in
1968. But the going was tough under
conditions of guerrilla warfare, and the capacity of the Viet Cong (allies of
the North Vietnamese government carrying the brunt of the fighting in the
South) to wage the major Tet Offensive at the time of the Vietnamese New Year
in January 1968 led many in the United States to question United States
government reports and those of General Westmoreland (chief of United States
military activity in Vietnam) that the battle against the North Vietnamese
government of Ho Chi Minh was trending toward victory.
On 31 March 1968, President Lyndon
Johnson announced a pause in bombing
and his intention not to seek reelection. An antiwar movement in the United States was
gaining steam and making inroads into key constituencies in mainstream society
that would pose electoral challenges to Johnson. Richard Nixon completed a surprising
political comeback by securing the Republican nomination in 1968, then
defeating Johnson’s vice-president, Hubert Humphrey, in the general
presidential election of 1968. Nixon had
said that he had a plan for ending the war.
That plan turned out to be gradual “Vietnamization” of the war, which
took five years of troop reductions until the United States withdrawal was
complete on 15 August 1973. In the
meantime, the Nixon administration had superintended a bombing campaign of
communist positions in Cambodia and had continued to enlist support of the
Laotian highland Hmong population to engage in reconnaissance and supply missions
on behalf of the United States and its allies in South Vietnam.
The adversaries in the War in Vietnam
signed a peace agreement on 28 January 1973.
But peaceful resolution of the issue of political leadership for all of
Vietnam was not achieved, and over the course of the years 1973-1975 fighting
resumed. With the exit of their United
States military backers, the leaders of South Vietnam were at a loss to mount a
credible military campaign. The North
Vietnamese won a decisive victory in 1975, as all remaining United States
personnel departed in a panic; some of
their more prominent South Vietnamese associates were able to get seats on the
departing planes, but others clung desperately to the doors of planes to no
avail: Many of those South Vietnamese
who had played important parts on the losing side spent time in prison or were
executed. The North Vietnamese set up a
communist government under the appellation of the Socialist Republic
of Vietnam that remains in 2015, albeit amidst improving relations with the
United States and generous incorporation of capitalist strategies in the
program for economic development.
The Vietnam War became one of those
divisive issues that defined life in the United States during the 1960s and
early 1970s. The early 1960s began as an
extension of the 1950s, with the
malcontents from the world of the
beatniks and the avant-garde still on the fringes or underground. In the mid-late 1960s the forces of the
discontented struck with a vengeance, either joining the discussion in
mainstream forums or striking so forcefully at mainstream society as to alter
all that was standing at the middle of the road:
Here came oppositional or invading
feminists, hippies, homosexuals, British musicians, rioters, and revolutionaries
demonstrating clearly and often angrily the degree of their discontent:
In 1963, Betty Friedan’s The Feminine Mystique appeared,
recording the quiet desperation with which many housewives suffered through
their lives. Her book gave voice to
women mired in traditional roles for whom being wife and mother were not enough
to give them fulfillment. In 1966,
second-wave feminists founded the National Organization for Women (NOW), then
in 1972 Gloria Steinem and Dorothy Pitman Hughes published the first
stand-alone issue of Ms. Magazine
(the very first edition had appeared as an insert in New York magazine); founding editors were Patricia Carbine,
Joanne Edgar, Nina Finkelstein, Mary Peacock, Letty Cottin Pogrebin, and Mary
Thom.
Among the many issues covered in Ms. Magazine was the movement to secure
passage of the Equal Rights Amendment (ERA), which had actually been written by
first-wave feminists Alice Paul and Crystal Eastman and introduced into
Congress in 1923. In 1972, the ERA passed
both houses of Congress; but en route to
what seemed like certain ratification by the states, the legislation
encountered conservative activist Phyllis Schlafly and the opposition that she
mobilized: At both the original 22 March
1979 deadline and the 30 June 1982 extended deadline the ERA fell three states short
of the 38 needed for ratification.
Opponents had raised the specter of
unisex restrooms and such, and they argued that under the ERA women might lose
special privileges and protections in the workplace; opponents of women serving in the military
also argued against the ERA on that basis.
But the women’s movement continued to push for equal access to
professional schools and for equal opportunities in all realms of American
life.
Many young people during the mid-late
1960s and early 1970s seemed bent on challenging the mores and values of their
parents. College campuses teemed with
organizations founded on a perceived need to change some aspect of the
political or economic culture of the United States; they
were hugely important in providing the
sheer mass of people power in demonstrations against the War in Vietnam,
organized by such groups as Students for a Democratic Society (SDS) and the
Weathermen. Abbie Hoffman, Jerry Rubin,
and Tom Hayden were among those of the New Left who were seeking revolutionary
change in American society.
Rock music, powered by the “British
invasion” of groups such as the Beatles, the Rolling Stones, and The Who, took on a
lyrical and tonal aura in sync with a time of great change. American musicians with origins in the folk
movement (Bob Dylan, Joan Baez, Pete Seeger) sounded similar sentiments for
change, as did rockers such as Buffalo Springfield and Cream, and rhythm and
blues artists, very prominently Marvin Gaye in his seminal What’s Going On. Near the
end of the turbulent 1960s, an enterprising young rock enthusiast organized the
concert that came to be known as Woodstock
after the town in upstate New York near where the concert was held on a
farm owned by Max Yasgur. The multiday
concert featured a great deal of drug use and promiscuous sexual behavior but
was remarkably peaceful given the number of people in attendance. Some attendees and vicariously thrilled
admirers from afar wanted to feel that the event augured an era of peace and
love, or the Age of Aquarius, as
presented in a popular song and musical of the time.
Some African Americans grew restless
with the nonviolent approach of Martin Luther King, giving support to groups
such as the Nation of Islam (Black Muslims) and Black Panthers. At the urban core, discontent flamed into
violent protest, with major incidents in the course of 1965 through 1967 in places
such as Los Angeles (Watts district), Cleveland, Detroit, Chicago, and
Newark. The Black Power movement
stressed the need for immediate action to address continued inequality of
treatment, with many arguing for the formation of a separate African American
nation within a nation, or for the ultimate overthrow of the United States
government. The Black Culture movement
asserted pride in African roots and African American culture; visual, musical, and literary artists
searched for and presented ethnically authentic artistic expression.
But the Age of Aquarius never
arrived.
The crushing loss of liberal Democrat
George McGovern to conservative Republican Richard Nixon in the presidential
election of 1972 seemed to symbolize the thematic end to the 1960s.
By the mid-1970s, the movements for
dramatic change were losing their former energy. Federal Bureau of Investigation (FBI) efforts
to undermine the Black Panthers, as with the shooting of Fred Hampton in
Chicago in 1969, had an impact on the dissipation of that organization’s
energy. The problems of people at the
urban core grew worse, as both whites and middle-class African
Americans fled the central cities. The
introspective singer-songwriter dominated the popular music charts at the
beginning of the decade, and toward the end of the 1970s disco became a
phenomenon, calling people away from the mean streets of protest to escapist
fancies beneath the strobe light and bodies whirling to sounds of music
explicitly crafted for this type of dance and for this sort of scene. In such a setting of dissipated energy for
movements of change, the very energetic conservative Schlafly was organizing to
put a damper on the ERA initiative.
Americans also were forced to face the
limits of American power and the limits of economic prosperity:
In 1973-1974, the last of the major
Arab-Israeli wars precipitated an embargo by OPEC (Organization of Petroleum
Exporting Countries), sending gasoline prices to record highs and causing many dislocations in national and
international economic life.
At about the same time, the War in
Vietnam (which cost 58,000 American lives, almost half of them lost during the
Vietnamization stage of the last few years) was winding down without
victory.
The second administration of Richard
Nixon fell when evidence made clear that burglars employed by his Committee to
Reelect the President had broken into the headquarters of the Democratic Party
at the Watergate complex in Washington, D. C., during the 1972 campaign--- and
that Nixon had obstructed justice by impeding the investigation of the
burglary; after hearings by the Judiciary
Committee of the House of Representatives recommended articles of impeachment,
Nixon resigned in August 1974.
Vice-President Gerald R. Ford took
over as president and did an admirable job of restoring dignity and effective
governance to the presidential office before losing to Democrat Jimmy Carter in
the 1976 campaign.
But the economy was headed toward
stagflation (a seldom-witnessed phenomenon in United States history, entailing
both stagnant economic growth and inflated prices for goods and services), and
Ford bore some of the damage from the Watergate incident that accrued to
Republicans in general; this became all
the more true when Ford decided to offer Richard Nixon a pardon that obviated
the possibility for a trial. Jimmy
Carter--- engineer, peanut farmer, and
former governor of Georgia--- defeated
Ford in the fairly closely contested election of 1976.
Carter demonstrated in numerous ways
his keen intelligence, never more so than in his adroit handling of
negotiations in 1978 between Anwar Sadat (Egyptian president) and Menachem
Begin (Israeli prime minister) that produced a peace between Egypt and
Israel that has held ever since. He also superintended
normalization of relations with China, so that on 1 January 1979 the nations
exchanged ambassadors for the first time in many decades (only in 1972 had the
United States ceased its opposition to entry of the People’s Republic of China
into the United Nations). But Carter
could not stem the stagflation doldrums, and he acknowledged that the nation
languished in a post-Vietnam,
post-Watergate funk that he as
president had not been able to counter with transformative policies or
inspirational leadership. When beginning
in 1979 Iranians captured and held Americans at the United States embassy in
Tehran for 444 days--- and United
States forces failed in a disastrous rescue attempt--- Carter’s fate in the 1980 election was
sealed: Conservative Republican Ronald
Reagan won decisively in the presidential election of 1980, inaugurating a very
different decade in the life of the nation.
Technological Fascinations and New Malefactions, 1980-2015
Ronald Reagan brought a reputation for
purist conservatism to office in 1981 and maintained that aura over the course
of his tenure. People in the United
States responded gratefully to a president who struck an optimistic tone with
regard to the nation’s future and vowed a tough stance against the Soviet
Union.
Reagan seemed vulnerable on matters of
policy in the presidential election of 1984, when the economy was sluggish and
Democrats asserted that Reagan’s retrenchment on spending for social programs
while increasing defense expenditures represented misplaced priorities; they also maintained that Reagan had
contravened his avowed fiscal conservatism in failing to assure adequate government
revenue while raising defense spending, thus contributing to a rising national
debt. Reagan and his advisers countered
by touting the virtues of their application of supply-side economics as
advocated by conservative economist Arthur Laffer, who maintained that keeping
taxes low would spur investment and production, eventually resulting in a strong market for the products of business and industry,
stimulating economic growth.
In 1983, the Reagan administration
caught biting criticism for its placement of American troops in harm’s way in
Lebanon, where a Muslim terrorist group struck at an American compound and
killed 241 sleeping service members, most of them Marines. Reagan
soon withdrew all United States personnel, which had been placed in Lebanon in
an effort to prevent turmoil as Christian-Muslim tensions had grown worse with
an Israeli invasion to rout forces of the Palestinian Liberation Organization
(PLO).
Despite contentious policies and
international miscues, Reagan seemed to benefit from a kind of “Teflon effect,”
whereby voters overlooked imperfections while embracing the president’s
persona. Reagan’s personality and
conservative message still resonated with most American voters, who returned
him to office with a landslide victory over Democrat Walter Mondale in 1984.
By 1986, some of the bright sheen of
Reagan’s popularity was tarnished in the Iran-Contra affair, whereby members of
his administration were found to have surreptitiously superintended arms sales
to the government of Iran, with some of the profits apparently going to aid the
Contras in their opposition to the Marxist government of Daniel Ortega’s
Sandinistas in Nicaragua. The sales to
Iran were also perceived to have been in exchange for return of American
hostages; for Iran, the main advantage
in striking the deal lay in the acquisition of new weaponry in the war against
Iraq.
Then, on 19 October 1987, the stock
market plunged 508 points in continuation of a slide that had begun in
August. The market did recover half of
its losses by the end of 1987, though.
And the Reagan administration could claim to have dramatically reduced
inflation, which had risen as high as 13.5% during the Carter years; in late 1988, the inflation rate stood at
5.5%. Still, the national debt continued
to rise as the “Laffer Curve” and supply-side economics proved to be much less
successful in practice than theory had projected.
Democrat Michael Dukakis ran a
lackluster campaign and held little personal appeal for voters in the 1988
presidential contest. His rote responses
to questions, seeming to come from some practiced liberal Democratic
sourcebook, followed him to a low point in one of his debates with Republican
George H. W. Bush, at which he was asked about his likely feelings toward a
perpetrator of rape should the victim be his wife, Kitty. Dukakis gave a formalistic reply stressing
his opposition to capital punishment.
This moment in the debate, and a photographed scenario in which Dukakis
seemed hopelessly ill at ease while seated in a tank, complete with a dutifully
but awkwardly donned helmet, presented a candidate whose persona seemed to lack
the genuine humanity that voters had perceived in Ronald Reagan.
George H. W. Bush also drew criticism
for lacking what he himself termed the “vision thing,” but he prevailed in the
election over Dukakis. The first years
of the Bush presidency proved very eventful, with the Eastern European
communist regimes crumbling in 1989 and the Soviet Union itself passing from
history in 1991. And in the meantime,
Saddam Hussein’s invasion of Kuwait brought American coordination of a
coalition that pushed Hussein’s forces out of this tiny nation of enormous oil
wealth positioned on the northeastern coast of the Arabian peninsula. The military response was dubbed Operation
Desert Storm, lasting from 16 January 1991 through 28 February 1991.
This successful military episode,
while drawing criticism from those who would have preferred that Saddam Hussein
be pursued and ousted from power, seemed to suggest an internationally
resurgent United States, in a world in which the Cold War was no more. Bush seemed to have the inside track to
victory in the 1992 presidential election in the immediate aftermath of the
Persian Gulf War, with its favorable juxtaposition to the fall of the Soviet
Union. But Democratic candidate Bill
Clinton followed the instruction of his political adviser James Carville to
remember, “It’s the economy, stupid,”
and in all ways to project himself as the perfect presidential
candidate, possessed of all of the personal appeal that Michael Dukakis had
lacked--- and that George H. W. Bush did
not possess in great abundance. And Bush
was certainly vulnerable on economic matters, since in 1992 unemployment stood
at 7.8% and the poverty rate stood at 14.2%.
The national debt had continued to grow, despite the fact that Bush had
incurred the wrath of fellow conservatives by raising taxes, despite his
frequently recounted “read my lips” vow that he would not do so.
Riding a high wave of expectations,
Clinton largely succeeded. Early in his
presidency, Clinton prevailed upon Congress to end federal prohibition on the
use of fetal tissue for medical research and repealed restrictions on abortion
counseling at federally funded health care facilities. Right away, Clinton appointed numerous women,
African Americans, and people of diverse ethnicities to positions in his
administration. Ever politically astute,
Clinton took an official position yielding to tradition in banning homosexuals
from military service, but he raised no objection when Congress countered with
a formulation rendered as “don’t ask, don’t tell.”
Clinton’s years were not entirely
smooth. He vowed to bring universal
health care to the United States and gave the assignment for making this happen
to his wife, Hillary (who also bowed to tradition in changing the presentation
of her last name to Rodham Clinton after years of simply using her natal
surname, Rodham). The health care
initiative failed. But Clinton’s
otherwise politically canny and perceptibly effective economic policies delivered
on the issues about which voters care the most:
He won easily over Republican candidate Bob Dole in the 1996
presidential contest.
The second term of Bill Clinton’s
presidency was a study in policy effectiveness and personal trial:
Clinton oversaw transformation of the
key welfare program from Aid to Families with Dependent Children (AFDC) to
Temporary Assistance for Needy Families (TANF), which came with a work or
educational advancement requirement for all able-bodied parents after five
years of receiving assistance; Clinton paired this initiative with
successful expansion of the earned-income tax credit for the working poor. He also secured a major tax cut in 1997, even
as he worked with House Speaker Newt Gingrich to get a deficit-reduction package
through Congress. Clinton also greatly
reduced the number of workers on the government payroll.
But when investigators probed allegations
that Bill Clinton and Hillary Rodham Clinton had committed fraud in a deal
related to property traded by the Whitewater real estate firm in Arkansas, the president
was subpoenaed to give a deposition. No
firm proof of fraud in the land deal was found, but in giving his
deposition, Clinton denied having a sexual relationship with White House intern
Monica Lewinsky, a statement that he later admitted was false. Technically this put Clinton in a position to
be tried for perjury, for which the House of Representatives did impeach the
president--- the only president other
than Andrew Johnson to be impeached. But
the Senate did not find Clinton guilty, so that he survived to finish his
second term.
Quite notably, given all of the
negative publicity swirling around him from the Whitewater and Lewinsky
matters, Clinton left office a very popular leader, presiding as he did over a
dynamic economy full of thriving businesses, successful job seekers, and
enthusiastic consumers. Clinton could
also claim that he made no major commitment of troops during the two terms of
his presidency, although lack of engagement in foreign conflicts also had a
downside: Americans largely ignored
atrocities that resulted in as many as one million deaths in the Rwandan genocide
(1994), and forces backed by Serbian president Slobodan Milosevic perpetrated
massacres (1995) against Bosniaks (a mostly Muslim population) and Croats (mostly
Catholic) in the absence of any effective intervention by the United States,
except as a small contingent within NATO forces.
Clinton’s vice-president Al Gore
secured the Democratic nomination in 2000, running against Republican nominee
George W. Bush. Gore ran a thematically
unfocused campaign; he failed to take
several states generally considered winnable for Democrats, and he could not
persuade constituents in his home state of Tennessee to give him a majority of
their votes. He did win the popular
vote, but his loss of several states key to his Electoral College count made
him ultimately dependent on the vote in Florida, where problem-plagued polling
sites produced a dispute that went all the way to the Supreme Court. The Justices ruled 5-4 that an official tally
in Bush’s favor would stand; Gore
declined to mount any further challenge, yielding the presidency to his
opponent.
George W. Bush took office on 20
January 2001. He wanted to concentrate
on a domestic agenda for which he vowed to apply conservative solutions in
compassionate service to impoverished and other challenged populations often
considered more politically responsive to programs of Democratic provenance:
In 2002, Bush oversaw bipartisan
passage of legislation on which personnel at the United States Department of
Education had worked; officially a
renewal of the Elementary and Secondary Education Act first issued in 1965
during the presidency of President Lyndon Johnson, this legislation went by the
more specific appellation of No Child Left Behind. Consistent with President Bush’s
“compassionate conservative” agenda, the explanatory comments in the legislative
document cautioned against the “soft bigotry of low expectations” and put
forward an aggressive program to achieve educational equity by 2014. To attain such a lofty goal, the legislation
mandated the construction of tests (assessments) for evaluation of grade level
performance at Grades 3-8 and the concomitant generation of instruments for
measuring the achievement of high school students in math and
reading. The data were to be
disaggregated to assess achievement of students according to gender, economic
level, and ethnicity, and schools were to be held responsible for grade level
achievement in all demographic categories.
Those schools found to be underserving any of the student
populations were placed on a five-year sequence of warnings, provision of outside
tutoring, and, for persistent failure, restructuring. Over the longer term, control of perpetually
failing schools and school districts by the city, the state, or a private contractor
was to be an option.
At first, Republicans were solidly
behind the legislation, which conveyed a get-tough approach that made them feel
that an old-school return to basics was moving forward. But within a half-decade came the
conservative push-back against a federally mandated program: Republicans began repeating a refrain in
favor of local control, working toward the effective disassembling of the framework
of No Child Left Behind.
As for Democrats, they were subject to
immediate pressure from the education establishment--- especially the National Education Association
(NEA) and the American Federation of Teachers (AFT) and their local school district
teacher union affiliates--- which typically contributed heavily to the
campaigns of Democrats. By the midpoint
of the first decade of the new millennium, most Democrats in Congress and
in state legislatures were working to gut No Child Left Behind.
By the time George Bush left office,
No Child Left Behind had been so seriously undermined
as to have lost any potential for
raising the performance level of all students.
When President Obama took office, his staff at the United States
Department of Education began a new program called “Race to the Top,” which
mandated that states show anew how they were going to address the persistent
gap in performance levels between white and economically well-off students, on
the higher-performing side, and young people of color and low income, on the lower-performing
side. Buying into the
rhetoric and prevarication of both Democrats and Republicans, the Obama
administration approved waivers from the mandates of No Child Left Behind for
states that could construct and articulate the case for a program for achieving
educational equity.
Thus it was that one of the few
promising items from the George W. Bush domestic agenda was gutted by political
forces at left and right--- and now we
still wring our hands over the achievement gap, having, over the course of
about ten years, destroyed the most promising program in American history for
addressing the issue of educational equity.
By the time that Bush secured passage
of the eventually ill-fated No Child Left Behind legislation, the events of 11
September 2001 had altered the focus of his presidency. The three airliner attacks (one striking each of the twin
towers of New York’s World Trade Center, and another the Pentagon), along
with the downing of Flight 93, put the Bush administration on notice that a
threat at least as insidious as that represented by the Soviet Union during the
Cold War must be countered.
Having received intelligence that the
Sunni Muslim terrorist group Al-Qaeda had planned and implemented the attacks
on 9-11, and knowing that Al-Qaeda leader Osama bin Laden organized his
followers in Afghanistan with the cooperation of the Taliban regime (another
terrorist organization with a skewed interpretation of Islam, in power since
September 1996), the Bush administration opted to send bombers from the United
States Air Force to strike Afghan cities on 7 October 2001; Kabul fell on 13 November 2001, and
thereafter the Bush administration sent advisers and ground troops into Afghanistan
that remained throughout the George W. Bush (2001-2009) and into the Barack Obama
(2009-2017) administrations. The effort
was to find Osama bin Laden and either to rout the Taliban or to prevent the
Taliban’s return to power. Special Forces of the
United States did eventually hunt down and kill bin Laden (in 2011), not in Afghanistan but
in neighboring Pakistan. Meanwhile, the
Obama administration sought to stabilize minimally competent and not very
popular regimes in Afghanistan, while having to settle for keeping the Taliban
away from Kabul and on the defensive.
The Bush administration then opted to
deal with the threat of world terrorism in another, quite tangential,
manner--- by invading Iraq. Airstrikes against Baghdad began on 20
March 2003. On the pretext of a search
for Weapons of Mass Destruction (WMD), the administration sent ground forces at
the head of what was officially an international coalition into Iraq that same
month. These forces gained control
of Baghdad on 9 April 2003 and took the last major city to fall, Tikrit, on 13
April 2003. United States forces
occupied a capital city in chaos and placed forces throughout the country for a
stay that lasted until December 2011.
United States forces eventually hunted Saddam Hussein down on 13 December
2003; following imprisonment and trial,
Saddam was executed on 30 December
2006.
Americans are by tradition hesitant to
change governments when facing external threats, so the events of 9-11
probably contributed to Bush’s victory over Democratic nominee John Kerry in
the 2004 presidential election. The
contest was close, though. Bush did win
a small majority in the popular vote this time, but the contest for Electoral
College votes was tight. Kerry, though,
ran a lackluster and unfocused campaign that could not produce a victory in the
winnable state of Ohio, thereby ceding the election to Bush.
The Bush agenda for the second term
continued largely as in the first. The
commitments in Afghanistan and Iraq continued, now under conditions in which
outright victory over adversaries (the Taliban in Afghanistan, Sunni opponents of
the Shiite regime in Iraq) was acknowledged to be impossible, and in which
neither nation seemed capable of generating or supporting a capable domestic
leadership. Even the hawkish Bush
administration, which had adopted a “surge” strategy of major additional troop
placements in 2007 after internecine turmoil in Iraq led many foreign policy
analysts in the United States to criticize the Iraq commitment as wasteful and faulty,
eventually set a timetable for withdrawal of troops from Iraq. But the presence of the United States in
Afghanistan continued unabated and actually increased during the early years of
the Obama administration.
After President Bush successfully
oversaw the passage of No Child Left Behind legislation in 2002, his
distraction with matters pertinent to 9-11 made pressing forward with any other
major domestic program very difficult.
In 2005, early in his second presidential term, Hurricane
Katrina hit with a vengeance along the Gulf coast, wreaking especially severe
damage on the culturally proud city of New Orleans. While many efforts were
made by federal, state, and city officials to deal with the disaster, which
caused almost 2,000 deaths, many analysts and affected citizens thought that
the initial response was too slow and that efforts in the days immediately
following the full-force strike lacked adequate attention to detail and a sense
of the gravity of the crisis. News that
people were still stranded and dying many days after disaster struck was
received by many as a message that the Bush administration lacked the
compassion and intensity of action needed to save people caught in grave
predicaments.
The 9-11 attacks and the
entanglements in Afghanistan and Iraq,
along with the Katrina crisis and the perceived inadequacy of the
presidential response, formed in the
public consciousness a sense that George W. Bush lacked the knowledge base,
objective judgment, and administrative skill required in a president. A precipitous stock market decline during
2007-2008, and a nationwide economic
crisis traceable to ill-considered loans extended by financial institutions to
unqualified borrowers, contributed to
the sense that the Bush administration was inadequate to the economic and
security concerns of the nation.
In this context, Republican John
McCain faced a difficult task as he took on Democrat Barack Obama in the 2008
presidential election. McCain secured
the nomination as someone who had frequently distanced himself from Bush’s
foreign and domestic policies, but the electorate still associated the
Republican candidate with a disparaged president from his own party, and McCain himself often seemed awkward,
lacking the presidential bearing and charisma
to match the great appeal of Barack Obama. As a young African American candidate who
offered the appealing campaign theme of “Yes We Can!”, Obama seemed to many
people to be the right person to shake the nation out of its doldrums and get
the country moving forward toward recovery of spirit and prosperity. Many people looked beyond ideology to vote
for Obama on the basis of his energy, style, oratory, demeanor, and sheer star
quality. He won the presidential contest
decisively.
And Obama hit the ground running with regard
to policy. His chief domestic objective
was the achievement of universal health care.
An avowed advocate of a single-payer, government-run system,
Obama knew that this was not politically attainable in the United States, even
though single-payer and other universal systems had worked efficiently across the globe
(e.g., Germany, France, Great Britain, Canada, the Scandinavian countries, and
several nations of Southeast and East Asia).
So Obama had his team go to work on a system that left Medicare in place
and expanded Medicaid to cover additional people under a broadened definition of
low income, thereby incorporating and building his innovations upon the two
federal government programs that had been in operation since the days of the
Lyndon Johnson presidency.
Then Obama and his advisers worked into their
health care program mandates that everyone have health insurance, through one’s
workplace if offered, and through personal options if not. The personal options came in the form of
health care exchanges, on which were listed the available private insurance
providers for a given state. These
insurance companies listed their prices, health care facilities, and covered
items on the exchange along with competing companies, so that the purchaser
could make an objective, informed decision.
Every state had to have an exchange, either run by the state or by the
federal government. Federal subsidies were
provided on a sliding income scale.
Monetary penalties were established for failing to obtain health care coverage
in one of the available ways. In this manner, the twin objectives of full participation
and universal coverage became viable goals.
Two very appealing aspects of the program
were the extension of coverage to young people under a parent’s plan up to the age of
26, and the prohibition on denial of coverage because of preexisting medical
conditions. But there was a great deal
of opposition to such dramatically new approaches to health care in the United
States. The vote in Congress, then controlled
by Democrats, fell entirely along party lines.
Obama had himself worked hard to get bipartisan participation but in the
end had to settle for pushing the legislation through with the required votes
from his own party. The legislation,
named the Patient Protection and Affordable Care Act, passed in 2010, with major
provisions phasing in over the following years. Legal challenges were posed, with two going
all the way to the Supreme Court, but the new health care law from the Obama
administration survived the Supreme Court decisions. As of 2015, grumbling continues among
Republicans and conservative groups, but the facts indicate great improvement
in the percentage of people covered, coverage for those who have had chronic or
preexisting conditions, and lower costs for most (but not all) people.
During his first term in office, Obama also
pushed through an economic stimulus package that injected federal dollars into
the economy through spending for roads, bridges, and other public works; his administration also extended funds to
save General Motors and Chrysler from collapse and continued a similar bailout for
financial services companies that had made grave misjudgments in lending money
to high-risk clients but were judged by the Obama administration as too big to
fail. The latter decision was especially
controversial, inasmuch as the aid seemed to reward profligacy; the Obama administration countered that the
bailout came with penalties for any further high-risk lending, citing the
practical impact that failure would otherwise have on people whose life savings
were bound up in the companies.
By the time of the 2012 presidential
election, the economy was slowly reviving.
The stock market in particular was registering strong gains. Unemployment, though, proved to necessitate a
longer wait for recovery of acceptable rates;
in 2012, unemployment still hovered at about 8%, and job growth was
sluggish. The Republican candidate Mitt
Romney seemed to make inroads into the popular and Electoral College tallies
after Obama gave what was perceived by many as a lackluster performance in the
first debate; but Obama’s performance
took on more energy and persuasiveness in the second and third debates. Several gaffes on Romney’s part created the
public perception of a candidate who callously dismissed the needs, and even
eschewed the political support, of people of low income and those who accessed
public services such as food stamps and Temporary Assistance for Needy Families
(TANF). By November 2012, Obama’s poll
numbers had risen steadily; the sitting
president commanded the popular vote and won a decisive Electoral College
victory.
Near the end of Obama’s first term as president, he superintended the
completion of the withdrawal of troops from Iraq (December 2011) and set a
timetable for withdrawal of troops from Afghanistan. Within a few months of the Iraq withdrawal,
though, internecine conflicts between Sunnis and Shiites resumed and turned
very violent. Then the group known as
ISIS (Islamic State of Iraq and Syria, also recognized by the name ISIL
[Islamic State of Iraq and the Levant])
went on the offensive in West Asia and North Africa, gaining control of
territory on the border of Syria and Iraq.
In an enormously complicated situation, the Obama administration found
itself working with Syrian President Assad and Iranian President Hassan Rouhani
against the common threat of ISIS/ISIL, even though Assad had committed acts of violence
against those among his own people who opposed his dictatorial regime, and the Iranian president presided over a
leadership that frequently voiced antipathy for the United States and continued
to develop its nuclear program.
By 2015, the United States and Iranian
governments seemed to be seizing the opportunity presented by the common threat
of ISIS/ISIL to forge a new agreement for the suspension of Iranian nuclear
weapons development in exchange for U.S. lifting of economic sanctions. Any cooperation with Assad, though, became
enormously difficult as hordes of refugees fled Syria in such numbers as to
cause a backlash against immigrants in those European nations where Syrians and
others sought safe havens.
Thus, international terrorism had by the
second decade of the 21st century come to pose a problem at least as
grave and seemingly intractable as that presented by the perceived Soviet
threat in the Cold War. In 2015, the
Obama administration was considering some level of reentry into Iraq to
stabilize that nation in the context of the ISIS/ISIL threat; and although most American troops had been
withdrawn from Afghanistan, the United States kept advisers in the nation,
continued to conduct surveillance operations, and maintained a small number of
soldiers in case a crisis demanded more direct military action. Over the course of about ten
years, the United States has lost a total of 6,677 lives in Iraq and
Afghanistan: losses in the former have been
4,448; in the latter, losses have been
2,229.
The economy continued to improve during Obama’s second term in
office. The unemployment rate declined
to about 6.5%, with steady monthly job gains.
Growth in GDP (Gross Domestic Product) proceeded at an annual rate of
about 2.2% on average. All of these
figures presented vast improvements over those inherited by Obama upon taking
office, but they did not satisfy critics on either the right or the left. Obama, whose party had lost control of
Congress in the 2014 elections, also incurred criticism for making generous use
of executive action: deferring deportation for young immigrants who had entered
the United States illegally as children, extending federal recognition to
same-sex marriages, and undertaking other administrative maneuvers that did not
absolutely require the approval or legislative initiative of Congress.
As the presidential election of 2016 approached, Hillary Clinton
seemed to have the inside track on the Democratic nomination but faced a
serious challenge from avowed Democratic Socialist Bernie Sanders of
Vermont. On the Republican side, Jeb
Bush raised a great deal of money from establishment types within his party,
Chris Christie contended for that same political base, Scott Walker of Wisconsin
had a conservative message of some resonance, and Marco Rubio and Ted Cruz also
looked for votes among those farthest on the Republican right; but Donald Trump and Ben Carson were
attracting strong support from those disgruntled with more conventional
candidates, and Carly Fiorina was making a surge that seemed to generate
excitement among both the establishment and the disgruntled Republican
constituencies.
For his part, Obama seemed to have weathered some low points in
approval ratings during 2013-2014 to rise again in the estimation of
voters. The years of his presidency had
certainly been eventful, as he oversaw the inauguration of a bold new health
care plan, a broadening definition of the institution of marriage to include
those of the same sex, belated but definite protections for young people
brought illegally to the United States as children, a dramatically improved
economy, and great efforts to steer governmental policy through hugely
complicated and terror-laden foreign scenes.
United States young people are still giving their lives in
Afghanistan. The Obama administration
pulled troops out of Iraq but is now considering sending some limited number of
troops back into Baghdad as the city and nation continue to be riven with
internecine conflicts, and as the Sunni Muslim group known as the Islamic State
of Iraq and the Levant (ISIL, also known as the Islamic State of Iraq and Syria
[ISIS]) threatens Iraq and all other nations of West Asia and North Africa.
In the course of the years since 1982,
women have in fact begun to serve in combat roles in the military and to assume
professional positions at rates equal to and even exceeding those of men, although
feminists continue to assert that there remains a glass ceiling in corporate
America preventing women from attaining the highest-level positions on a par
with men.