


The United States Becomes a World Power

"A Man, A Plan, A Canal, Panama"

At noon, December 31, 1999, the United States voluntarily gave up the Panama Canal, ending 85 years of control. Prior to the development of the atomic bomb and the landing of astronauts on the moon, the Panama Canal was perhaps this country's signal engineering achievement. Fifty-one miles long, with about $3.5 billion in bases and infrastructure, the canal links the Atlantic and Pacific oceans.

At the end of the twentieth century, the canal was no longer essential to U.S. strategic or economic interests. Aircraft carriers and oil tankers were too large to pass through the canal's locks. Earlier in the century, however, the canal was regarded as a vital national interest. During World War II, the United States stationed 65,000 troops in Panama to protect the canal. A number of U.S. interventions in the Caribbean and Central America were undertaken largely to protect the canal from hostile powers.

The canal's construction was a phenomenal undertaking. In 1850, U.S. interests in Panama built a railroad across the isthmus to transport '49ers to California. In 1879, the French, fresh from their success in building the Suez Canal, started building a canal. Over the next twenty years, between 16,000 and 22,000 workers died from malaria, yellow fever, typhoid, snake bites, and accidents. Torrential rains averaging 200 inches a year washed away much of the work.

America's 1898 war with Spain made a canal seem essential. During the Spanish-American War, the only way for U.S. battleships to sail from the Atlantic to the Pacific Ocean was to make an 8,000-mile journey around Cape Horn at the tip of South America.

The canal was completed in the face of seemingly insurmountable political, medical, and technological obstacles. The Isthmus of Panama was located in Colombia, which had rejected a U.S. proposal to build a canal. ("You could no more make an agreement with them than you could nail currant jelly to a wall," President Theodore Roosevelt said).

A French adventurer, Philippe Bunau-Varilla, and an American lawyer, Nelson Cromwell, conceived of the idea of creating the republic of Panama. They persuaded Roosevelt to support Panama. Bunau-Varilla engineered a revolution and U.S. warships prevented Colombia from stopping Panama's attempt to break away. (In 1921, the U.S. paid an indemnity to Colombia in recognition of the U.S. role in the Panamanian revolution). Bunau-Varilla repaid the United States for its assistance by signing a treaty on behalf of the Panamanians, which gave the United States a zone stretching five miles from each bank of the canal in perpetuity. Within the zone, U.S. laws, police, and courts ruled.

Years later, President Roosevelt said that the people of Panama rebelled against Colombia "literally as one man." A Senator quipped, "Yes, and the one man was Roosevelt." In 1911, Roosevelt said bluntly, "I took the Isthmus, started the canal and then left Congress not to debate the canal but to debate me." In 1906, eager to see the greatest accomplishment of his presidency, he became the first sitting president to travel outside the United States, visiting Panama at the height of the rainy season and taking the controls of a 95-ton steam shovel.

To reduce deaths by disease, William Gorgas, an army physician, oversaw the massive draining of swamps in order to eliminate mosquitoes that carried yellow fever and malaria.

The French had attempted to build a canal at sea level, but grossly underestimated the difficulty of achieving this goal. To allow ships to travel between the oceans, American engineers designed a system of locks capable of raising and lowering ships 64 feet, using the force of gravity and 40-horsepower motors to move the gates. One set of locks used enough concrete to build a wall 8 feet thick and 12 feet high, stretching between Cleveland and Pittsburgh.

At its peak in 1913, the workforce consisted of 44,000 persons. West Indian workers were the canal's unsung heroes. Each day, 200 trainloads of dirt had to be hauled away. More than 25,000 worked as canal diggers--three times the number of Americans who worked on the canal. Between 1904 and 1915, 5,600 lives were lost to disease and accidents. Most of those who died were from Barbados. The quinine used to treat malaria left many workers deaf. In December 1908, 22 tons of dynamite exploded prematurely, killing 23 workers. Black workers' children attended segregated schools that only went up to the eighth grade.

Built at a cost of $387 million over a period of ten years, the Panama Canal was a declaration of America's coming of age in the world.

The United States Becomes a World Power

By 1890, the United States had by far the world's most productive economy. American industry produced twice as much as its closest competitor, Britain. But the United States was not a great military or diplomatic power. Its army numbered less than 30,000 troops and its navy had only about 10,000 seamen. Britain's army was five times the size of its American counterpart, and its navy was ten times bigger. The United States' military was small because the country was situated between two large oceans and was surrounded by weak or friendly nations. It faced no serious military threats and had little interest in asserting military power overseas.

From the Civil War until the 1890s, most Americans had little interest in territorial expansion. William Seward, the secretary of state under presidents Lincoln and Johnson, did envision American expansion into Alaska, Canada, Mexico, Central America, the Caribbean, Iceland, Greenland, Hawaii and other Pacific islands. But he realized only two small parts of this vision. In 1867, the United States purchased Alaska from Russia for $7.2 million and occupied the Midway Islands in the Pacific.

Americans resisted expansion for two major reasons. One was that imperial rule seemed inconsistent with America's republican principles. The other was that the United States was uninterested in acquiring people with different cultures, languages and religions. But where an older generation of moralists thought that ruling a people without their consent violated a core principle of republicanism, a younger generation believed that the United States had a duty to uplift backward societies.

By the mid-1890s, a shift had taken place in American attitudes toward expansion that was sparked partly by a European scramble for empire. Between 1870 and 1900, the European powers seized 10 million square miles of territory in Africa and Asia, a fifth of the world's land mass. About 150 million people were subjected to colonial rule. In the United States, a growing number of policy makers, bankers, manufacturers and trade unions grew fearful that the country might be closed out in the struggle for global markets and raw materials.

A belief that the world's nations were engaged in a Darwinian struggle for survival and that countries that failed to compete were doomed to decline also contributed to a new assertiveness on the part of the United States. By the 1890s, the American economy was increasingly dependent on foreign trade. A quarter of the nation's farm products and half its petroleum were sold overseas.

Alfred Thayer Mahan, a naval strategist and the author of The Influence of Sea Power Upon History, argued that national prosperity and power depended on control of the world's sea-lanes. "Whoever rules the waves rules the world," Mahan wrote. To become a major naval power, the United States began in 1883 to replace its wooden sailing ships with steel vessels powered by coal or oil. But control of the seas would also require the acquisition of naval bases and coaling stations. Germany's Kaiser Wilhelm had copies of Mahan's books placed on every ship in the German High Seas Fleet, and the Japanese government put translations in its imperial bureaus.

During the late nineteenth century, the idea that the United States had a special mission to uplift "backward" people around the world also commanded growing support. The mainstream Protestant denominations established religious missions in Africa and Asia, including 500 in China by 1890.

During the late 1880s, American foreign policy makers began to display a new assertiveness. The United States came close to declaring war against Germany over Samoa in 1889, against Chile in 1891 over the treatment of U.S. sailors, and against Britain in 1895 over a territorial dispute between Venezuela and British Guiana.

American involvement in the overthrow of Hawaii's monarchy in 1893 precipitated a momentous debate over the United States' global role: should the country behave like a great power and seize colonies, or should it remain apart from the scramble for empire?

The Annexation of Hawaii

After a century of American rule, many native Hawaiians remain bitter about how the United States acquired the islands, located 2,500 miles from the West Coast.

In 1893, a small group of sugar and pineapple-growing businessmen, aided by the American minister to Hawaii and backed by heavily armed U.S. soldiers and marines, deposed Hawaii's queen. They subsequently imprisoned the queen, seized 1.75 million acres of crown land, and conspired to annex the islands to the United States.

On January 17, 1893, the conspirators announced the overthrow of the queen's government. To avoid bloodshed, Queen Lydia Kamakaeha Liliuokalani yielded her sovereignty and called upon the U.S. government "to undo the actions of its representatives." The U.S. government refused to help her regain her throne. When she died in 1917, Hawaii was an American territory. In 1959, Hawaii became the 50th state after a plebiscite in which 90 percent of the islanders supported statehood.

The businessmen who conspired to overthrow the queen claimed that they were overthrowing a corrupt, dissolute regime in order to advance democratic principles. They also argued that a western power was likely to acquire the islands. Hawaii had the finest harbor in the mid-Pacific and was viewed as a strategically valuable coaling station and naval base. In 1851, King Kamehameha III had secretly asked the United States to annex Hawaii, but Secretary of State Daniel Webster declined, saying "No power ought to take possession of the islands as a conquest...or colonization." But later monarchs wanted to maintain Hawaii's independence. The native population proved to be vulnerable to western diseases, including cholera, smallpox, and leprosy. By 1891, native Hawaiians were an ethnic minority on the islands.

After the bloodless 1893 revolution, the American businessmen lobbied President Benjamin Harrison and Congress to annex the Hawaiian islands. In his last month in office, Harrison sent an annexation treaty to the Senate for confirmation, but the new president, Grover Cleveland, withdrew the treaty "for the purpose of re-examination." He also received Queen Liliuokalani and replaced the American Stars and Stripes in Honolulu with the Hawaiian flag.

Cleveland also ordered a study of the Hawaiian revolution, which concluded that the American minister to Hawaii had conspired with the businessmen to overthrow the queen and that the coup would have failed "but for the landing of the United States forces upon false pretexts respecting the dangers to life and property." Looking back on the Hawaii takeover, President Cleveland later wrote that "the provisional government owes its existence to an armed invasion by the United States. By an act of war...a substantial wrong has been done."

President Cleveland's recommendation that the monarchy be restored was rejected by Congress. The House of Representatives voted to censure the U.S. minister to Hawaii and adopted a resolution opposing annexation. But Congress did not act to restore the monarchy, and in 1894 Sanford Dole declared himself president of the Republic of Hawaii without a popular vote. The new government found the queen guilty of treason and sentenced her to five years of hard labor and a $5,000 fine. While the sentence of hard labor was not carried out, the queen was placed under house arrest.

The Republican party platform in the presidential election of 1896 called for the annexation of Hawaii. Petitions for a popular vote in Hawaii were ignored. Fearing that he lacked two-thirds support for annexation in the Senate, the new Republican president, William McKinley, called for a joint resolution of Congress (the same way that the United States had acquired Texas). With the country aroused by the Spanish-American War and political leaders fearful that the islands might be annexed by Japan, the joint resolution easily passed Congress. Hawaii officially became a U.S. territory in 1900.

When Capt. James Cook, the British explorer, arrived in Hawaii in 1778, there were about 300,000 Hawaiians on the islands, but infectious diseases reduced the native population. Today, about 20 percent of Hawaii's people are of native Hawaiian ancestry, and only about 10,000 are of pure Hawaiian descent. Native Hawaiians remain poorer, less healthy, and less educated than members of other major ethnic groups on the islands.

Sugar growers, who dominated the islands' economy, imported thousands of immigrant laborers, first from China, then Japan, then Portuguese workers from Madeira and the Azores, followed by Puerto Ricans, Koreans, and most recently Filipinos. As a result, Hawaii has one of the world's most multicultural populations.

In 1993, a joint Congressional resolution, signed by President Bill Clinton, apologized for the U.S. role in the overthrow. The House approved the resolution by voice vote. The Senate passed it 65 to 34.

The Spanish-American War

The debate over America's global role intensified when Cubans began to fight for their independence from Spain in 1895. Americans were sympathetic to Cuba's struggle for independence but were divided about how to help. The Republican speaker of the House did not want "to spill American blood" unless American interests were directly threatened, while Theodore Roosevelt, the Republican assistant secretary of the Navy, pushed for war against Spain.

President William McKinley was deeply ambivalent about war against Spain. The last president to have served in the Civil War said he had seen too much carnage at battles like Antietam to be enthusiastic about war with Spain. "I've been through one war. I have seen the dead piled up, and I do not want to see another."

Ultimately, however, the pressure of public opinion forced McKinley into the war that made the United States an international power. Newspaper publishers like William Randolph Hearst and Joseph Pulitzer worked up war fever among the public with reports of Spanish atrocities against Cuban rebels. Then, Hearst's New York Journal published a leaked letter in which the chief Spanish diplomat in Washington, Enrique Dupuy de Lôme, described President McKinley as "weak" and a "petty politician." Hearst publicized the de Lôme letter under the screaming headline: "WORST INSULT TO THE UNITED STATES IN ITS HISTORY".

Days later an explosion sank the U.S.S. Maine in Havana harbor, Cuba. A naval court of inquiry blamed the explosion on a mine, further inflaming public sentiment against Spain.

Then a respected U.S. Senator, Redfield Proctor, after returning from a visit to Cuba, announced that he had reversed his position from isolationism to intervention "because of the spectacle of a million and a half people, the entire native population of Cuba, struggling for freedom and deliverance."

After ten days of debate, Congress declared war, but only after adopting the Teller Amendment, in which the United States made it clear that it did not harbor imperialist ambitions. The amendment announced that the United States would not acquire Cuba. European leaders were shocked by this declaration. Britain's Queen Victoria called on the European powers to "unite...against such unheard [of] conduct," since the United States might in the future declare Ireland and other colonies independent.

But after the United States defeated Spain, it set up a military government in Cuba and made the soldiers' withdrawal contingent on the Cubans accepting the Platt Amendment, which gave the United States the right to intervene in Cuba to protect "life, property, and individual liberties." The 144-day war also resulted in the United States taking control of the Philippines, Puerto Rico, and Guam.

The Philippines

The twentieth century began with the United States engaged in a bloody but largely forgotten war in the Philippines that cost hundreds of thousands of lives. The Philippine-American War, fought from February 1899 to July 1902, claimed 250,000 lives and helped establish the United States as a power in the Pacific.

Today, few Americans are aware of the Philippine-American War. The conflict was a sequel to the Spanish-American War of 1898, which had been waged, in part, in support of Cubans fighting for independence from Spain. But it was also fueled by American desire to become a world power.

It prompted Mark Twain and other writers and artists to speak out against those who advocated American expansion. It fueled a bitter national debate over U.S. involvement overseas, a precursor to the outcry over the Vietnam War more than half a century later. Some who opposed the occupation were motivated by racism, fearful that annexation of the Philippines would lead to an influx of non-white immigrants. One U.S. Senator warned of the coming of "tens of millions of Malays and other unspeakable Asiatics." Many who considered the occupation immoral and inconsistent with American traditions and values joined Anti-Imperialist Leagues.

The conflict helped popularize the concept of the "white man's burden," the notion that the United States and Western European societies had a duty to civilize and uplift the "benighted" races of the world. A U.S. Senator from Indiana declared: "We must never forget that in dealing with the Filipinos, we deal with children."

It also paved the way for migration from the Philippines. Shortly after the war, Filipino immigrants began arriving in the United States as students, U.S. military personnel, or farm and cannery workers. Today, there are more than 2 million Filipinos and Filipino-Americans in the United States, making them one of the nation's largest Asian communities.

On May 1, 1898, Commodore George Dewey had entered Manila Bay and destroyed the decrepit Spanish fleet. In December, Spain ceded the Philippines to the United States for $20 million. Mark Twain called the $20 million payment an "entrance fee into society--the Society of Scepter Thieves." "We do not intend to free but to subjugate the people of the Philippines," he wrote. "We have pacified some thousands of the islanders and buried them, destroyed their fields, burned their villages, and turned their widows and orphans out of doors."

On June 12, 1898, a young Filipino General, Emilio Aguinaldo, had proclaimed Philippine independence and established Asia's first republic. He had hoped that the Philippines would become a U.S. protectorate. But pressure on President William McKinley to annex the Philippines was intense. After originally declaring that it would "be criminal aggression" for the United States to annex the archipelago, he reversed himself, partly out of fear that another power would seize the Philippines. Six weeks after Dewey defeated the Spanish fleet at Manila Bay, a German fleet sought to set up a naval base there. The British, French, and Japanese also sought bases in the Philippines. Unaware that the Philippines were the only predominantly Catholic nation in Asia, President McKinley said that American occupation was necessary to "uplift and Christianize" the Filipinos.

On February 4, 1899, fighting erupted between American and Filipino soldiers, leaving 59 Americans and approximately 3,000 Filipinos dead. With the Vice President casting a tie-breaking vote, a congressional resolution declaring the Philippines independent was defeated. American commanders hoped for a short conflict, but in the end more than 70,000 American troops would fight in the archipelago. Unable to defeat the United States in conventional warfare, the Filipinos adopted guerrilla tactics. To suppress the insurgency, villages were forcibly relocated or burned. Non-combatant civilians were imprisoned or killed. Vicious torture techniques were used on suspected insurgents, such as the water cure, in which a suspect was made to lie face up while water was poured onto his face. One general declared:

It may be necessary to kill half of the Filipinos in order that the remaining half of the population may be advanced to a higher plane of life than their present semi-barbarous state affords.

The most notorious incident of the war took place on Samar Island. In retaliation for a Filipino raid on an American garrison, in which American troops had been massacred, General Jacob H. Smith told his men to turn the island into a "howling wilderness" so that "even birds could not live there." He directed a marine major to kill "all persons...capable of bearing arms." He meant everyone over the age of 10. Smith was court-martialed and "admonished" for violating military discipline.

Aguinaldo was captured in a raid on his hideout in March 1901. The war was officially declared over in July 1902, but fighting continued for several years. The Philippine war convinced the United States not to seize further overseas territory.

More than 4,000 American soldiers and about 20,000 Filipino fighters died. An estimated 200,000 Filipino civilians died during the war, mainly of disease or hunger. Reports of American atrocities led the United States to turn internal control of the Philippines over to Filipinos in 1907 and, in 1916, to pledge to grant the archipelago independence.

U.S. leaders tried to transform the country into a showcase of American-style democracy in Asia. But there was a strong undercurrent of condescension. U.S. President William Howard Taft, who had served as governor-general of the Philippines, called the Filipinos "our little brown brothers." The Philippines were granted independence in 1946.

Policing the Caribbean and Central America

In 1904, Germany demanded a port in Santo Domingo (now the Dominican Republic) as compensation for an unpaid loan. Theodore Roosevelt, who had become president after William McKinley's assassination, told Germany to stay out of the Western Hemisphere and said that the United States would take care of the problem. He announced the Roosevelt Corollary to the Monroe Doctrine:

Chronic wrongdoing, or an impotence which results in a general loosening of ties of civilized society, may in America, as elsewhere, ultimately require intervention by some civilized nation, and in the western hemisphere, the adherence of the U.S. to the Monroe Doctrine may force the United States, however reluctantly, in flagrant cases of wrongdoing or impotence, to the exercise of international police power.

Several recent developments led Roosevelt to declare that the United States would be the policeman of the Caribbean and Central America. Three European nations had blockaded Venezuela's ports, violating the Monroe Doctrine's unilateral declaration that Europe should not interfere in the Americas. Meanwhile, an international court in The Hague in the Netherlands had ruled that a creditor nation that had used force would receive preference in repayment of a loan. Further, Roosevelt had recently gained the right to build the Panama Canal, and believed that any threat to the canal threatened U.S. strategic and economic interests.

To enforce order, forestall foreign intervention, and protect U.S. economic interests, the United States intervened in the Caribbean and Central America some twenty times over the next quarter century, in Cuba, the Dominican Republic, Haiti, Mexico, Nicaragua, and Panama. Each intervention followed a common pattern: After intervening to restore order, U.S. forces became embroiled in the countries' internal political disputes. Before exiting, the United States would train and fund a police force and military to maintain order and would sponsor an election intended to put into power a strong leader supportive of American interests. Unfortunately, the men who took power in many of these countries, such as Anastasio Somoza in Nicaragua, Rafael Trujillo in the Dominican Republic, and Francois Duvalier in Haiti, established despotic rule.

Intervention in Haiti

In July 1915, a mob murdered Haiti's seventh president in seven years. Vilbrun Guillaume Sam was dragged out of the French legation and hacked to death. The mob then paraded his mutilated body through the streets of the Haitian capital of Port-au-Prince. During the preceding 72 years Haiti had experienced 102 revolts, wars, or coups; just one of the country's 22 presidents had served a complete term, and just four had died of natural causes.

With the European powers engaged in World War I, President Woodrow Wilson feared that Germany might occupy Haiti and threaten the sea route to the Panama Canal. To protect U.S. interests and restore order, the president sent 330 Marines and sailors to Haiti.

This was not the first time that Wilson had sent Marines into Latin America. Determined to "teach Latin Americans to elect good men," he had sent American naval forces into Mexico in 1913 during the Mexican Revolution. American Marines seized the city of Veracruz and imposed martial law.

The last Marines did not leave Haiti until 1934. To ensure repayment of Haiti's debts, the United States took over collection of customs duties. Americans also arbitrated disputes, distributed food and medicine, censored the press, and ran military courts. In addition, the United States helped build about a thousand miles of unpaved roads and a number of agricultural and vocational schools, and trained the Haitian army and police. It also helped replace a government led by blacks with a government headed by mulattoes, and forced the Haitians to adopt a new constitution which gave American businessmen the right to own land in Haiti. While campaigning for Vice President in 1920, Franklin D. Roosevelt, who had served as Assistant Secretary of the Navy in the Wilson Administration, boasted, "I wrote Haiti's Constitution myself, and if I do say it, it was a pretty good little Constitution."

Many Haitians resisted the American occupation. In the fall of 1918, Charlemagne Peralte, a former Haitian army officer, launched a guerrilla war against the U.S. Marines to protest a system of forced labor imposed by the United States to build roads in Haiti. In 1919, he was captured and killed by U.S. Marines, and his body was photographed against a door with a crucifix and a Haitian flag as a lesson to others. During the first five years of the occupation, American forces killed about 2,250 Haitians. In December 1929, U.S. Marines fired on a crowd of protesters armed with rocks and machetes, killing 12 and wounding 23. The incident stirred international condemnation and ultimately led to the end of the American occupation.

By that time, Roosevelt had changed his mind. In 1928, he had criticized the Republican administrations for relying on the Marines and "gunboat diplomacy." "Single-handed intervention by us in the internal affairs of other nations in this hemisphere must end," he wrote. After he became president in 1933, Roosevelt proclaimed a new policy toward Latin America. Under the Good Neighbor policy, he removed American Marines from Haiti, the Dominican Republic, and Nicaragua.

Progressivism

Jane Addams: Champion for the Working Poor

She was a daughter of one of Illinois' richest men, but instead of leading a life of leisure, she dedicated her life to aiding the urban poor. A friend of labor, a proponent of women's suffrage, a foe of city bosses, and an opponent of war, she struggled to make the ideal of civic equality embodied in the Declaration of Independence a reality. Instead of offering charity, she sought to assimilate the immigrant poor into American society and became a pioneer social worker.

Born in 1860, Jane Addams was just two years old when her mother died. Suffering from a severe curvature of the spine, she was coddled by her family. Her Quaker-born father, a banker, mill owner, and Republican politician, sent her to Europe twice and to college at Rockford Female Seminary. But like many other members of America's first generation of college-educated women, she felt deeply torn about what to do with her life. A woman, she wrote, "was practically faced with an alternative of marriage or a career." She could enter teaching, medicine, or missionary work, or else she could marry and devote her life to homemaking.

She enrolled at Philadelphia's Women's Medical College, but suffered a nervous collapse after her father's sudden death. For eight years she suffered excruciating back pain, nervous prostration, and serious depression. She turned to Philadelphia's leading physician, Dr. S. Weir Mitchell, for help, and he prescribed the rest cure, confining her to bed and forbidding all visitors and activities. As he told another patient: "Live as domestic a life as possible. Have but two hours of intellectual life a day. And never touch pen, brush or pencil as long as you live."

Addams spent most of her twenties adrift, undergoing repeated rest cures. Then, suddenly, she found her mission in life. On a trip to England, she was shocked by the squalor of London's East End slums. In response, she and a friend, Ellen Starr, decided to set up an institution to uplift America's urban poor.

In 1889, they moved into Hull House, a decaying mansion in one of Chicago's most destitute neighborhoods, and provided social services to the city's poor. Hull House offered classes that taught cooking, hygiene, and the rights and responsibilities of citizenship. Hull House also provided child care and a kindergarten and built the first playground in Chicago. Each week, 9,000 people, mostly immigrants from 28 different countries, came to Hull House. Her example helped inspire more than 400 other settlement houses around the country.

From Hull House, Addams tirelessly campaigned for an end to sweatshops and a ban on child labor. She convinced many professors at the University of Chicago to produce empirical, social scientific data. She advocated an eight-hour day and legal protections for immigrants, and called for compulsory education, women's suffrage, and improved sanitation. She also sought to organize unions for female workers, establish a state bureau to inspect factories, and create the nation's first juvenile court. She helped create the career of the social worker.

Her memoir, Twenty Years at Hull House, was a best-seller when it appeared in 1910. But in 1915, public opinion began to turn against her when she founded the Women's Peace Party, an international organization dedicated to waging a "women's war" against World War I. Elected president of the Women's International League for Peace and Freedom in 1919, she opposed the peace treaty ending the war as vindictive. In 1931, four years before her death, she won the Nobel Peace Prize.

Progressivism

Few periods in American history have witnessed more ferment than the years between the founding of Hull House and American entry into World War I. The Progressive movement touched every aspect of American life. It transformed government into an active, interventionist entity at the national level, most notably under Presidents Theodore Roosevelt and Woodrow Wilson, but also at the state and local levels. For the first time Americans were prepared to use government, including the federal government, as an instrument of reform.

Progressive reformers secured a federal income tax based on the ability to pay, inheritance taxes, a modern national banking system, and government regulatory commissions to exercise oversight over banking, insurance, railroads, gas, electricity, telephones, transportation, and manufacturing.

Education, too, became a self-conscious instrument of social change. Influenced by the ideas of the educator and philosopher John Dewey, progressive educational reformers broadened school curricula to include teaching about health and community life; called for active learning that would engage students' minds and draw out their talents; applied new scientific discoveries about learning; and tailored teaching techniques to students' needs. Progressive educators promoted compulsory education laws, kindergartens and high schools, and raised the literacy rate of African Americans from 43 to 77 percent.

During the Progressive era, public health officers launched successful campaigns against hookworm, malaria and pellagra, and reduced the incidence of tuberculosis, typhoid and diphtheria. Pure milk campaigns also slashed rates of infant and child mortality.

Urban Progressives created public parks, libraries, hospitals, and museums. They also constructed new water and sewer systems and eliminated "red-light" districts, such as New Orleans' Storyville, in most major cities.

To bridge the gap between capital and labor, Progressives called for arbitration and mediation of labor disputes. Meanwhile, many Progressive businessmen called for a new-style "welfare capitalism" that provided workers with higher wages and pensions.

The Progressive Era was one of the most creative in the realm of culture and the arts. In the hands of Alfred Stieglitz, photography became an art form for the first time. Architects like Frank Lloyd Wright helped create modern architecture. The first exhibition of modern art in the United States, the Armory Show, was held in New York in 1913.

A new vocabulary characterized this era. Americans would speak about a "public interest" that was opposed by "special interests." They would also speak about "efficiency" and "expertise" in government and morality in foreign affairs. For the first time, Americans spoke of "social workers," "muckrakers," "trustbusters," "feminists," "social scientists," and "conservation."

To increase popular control over government, Progressive reformers lobbied successfully for direct primaries, the elimination of boss rule, the direct election of Senators, women's suffrage, and adoption of the referendum, the initiative, and the recall in many states. Reformers also saw the adoption of the first restrictions on political lobbyists and the first regulations on campaign finances.

To modernize government finances, Progressives successfully instituted the income tax and established the Federal Reserve System to oversee the nation's economy. To regulate corporate behavior, Progressives enforced new anti-trust laws and established the country's first effective regulatory commissions. They also established licenses for such professionals as pharmacists, veterinarians, and undertakers. To improve social welfare, they lobbied for workmen's compensation laws, minimum wage laws for women workers, and old age and widow's pensions. To improve public health, Progressive reformers successfully lobbied for water standards, state and local departments of health, sanitary codes for schools, and laws prohibiting the sale of adulterated foods and drugs.

The Progressive era also had a much more negative side. It saw the spread of disfranchisement and segregation of African Americans in the South and even in the federal government. This era also saw the enactment of reforms, such as at-large voting, that lessened the political influence of immigrant groups at a time when city budgets were increasing. Critics frequently condemned Progressives as moralistic, undemocratic, and elitist.

Progressives did not agree on a single agenda. They disagreed vehemently in their attitudes toward such subjects as immigration restriction and prohibition of alcohol. They were a diverse lot that included Republicans and Democrats; Protestants, Catholics, and Jews; urban and rural reformers. Women's organizations stood at the forefront of the social reforms and policy innovations of the Progressive era. Women were especially active in efforts to end child labor and to protest companies that had unsafe working conditions or produced unsafe products. For the most part, Progressives were urban and college-educated and included journalists, academics, teachers, doctors, nurses, and many business people.

A book published in 1913, Benjamin Parke De Witt's The Progressive Movement, argued that three tendencies underlay progressive reforms: the desire to eliminate political corruption, the impulse to make government more efficient and effective, and a belief that government should "relieve social and economic distress." Progressives also believed in efficiency, science, and professional expertise as the best ways to solve social problems. They wanted to apply the techniques of systematization, rationalization, and bureaucratic administrative control developed by business to problems posed by the city and industry.

For all its flaws and limitations, the Progressive era was instrumental in formulating the rationale for much of the welfare state, including Social Security, unemployment insurance, and aid to single parent families.

A New Era

The turn of the twentieth century witnessed a sudden clamor for social, political, and economic reform. Progressives boldly challenged the received wisdom in every aspect of life.

Birth Control

Of all the changes that took place in women's lives during the twentieth century, one of the most significant was women's increasing ability to control fertility. In 1916, Margaret Sanger, a former nurse, opened the country's first birth control clinic in Brooklyn. Police shut it down ten days later. "No woman can call herself free," she insisted, "until she can choose consciously whether she will or will not be a mother." Sanger coined the phrase "birth control" and eventually convinced the courts that the Comstock Act did not prohibit doctors from distributing birth control information and devices. As founder of Planned Parenthood, she supported the work that resulted in the development of the birth control pill, which appeared in 1960.

Civil Rights

The publication of W.E.B. DuBois's The Souls of Black Folk heralded a new, more confrontational approach to civil rights. "The problem of the twentieth century," DuBois's book begins, "is the problem of the color line." In his book, DuBois, the first African American to receive a Ph.D. from Harvard, condemns Booker T. Washington's philosophy of accommodation and his idea that African Americans should confine their ambitions to manual labor. The Nashville Banner editorialized: "This book is dangerous for the Negro to read, for it will only excite discontent and fill his imagination with things that do not exist, or things that should not bear upon his mind." In 1908, after anti-black rioting took place in Springfield, Ill., DuBois and a group of African Americans and whites convened a convention in Harpers Ferry, W.Va., that became the basis for the country's first national civil rights organization, the National Association for the Advancement of Colored People. By 1914, the NAACP had 6,000 members and offices in fifty cities.

Conservation

In 1907, President Theodore Roosevelt said, 

We are prone to speak of the resources of this country as inexhaustible; this is not so. The mineral wealth of the country, the coal, iron, oil, gas, and the like does not reproduce itself, and therefore is certain to be exhausted ultimately; and wastefulness in dealing with it today means that our descendants will feel the exhaustion a generation or two before they otherwise would.

During Roosevelt's presidency, 148 million acres were set aside as national forest lands and more than 80 million acres of mineral lands were withdrawn from public sale.

Government Reform

A Republican governor in Wisconsin, Robert LaFollette, put into effect the "Wisconsin idea," which provided a model for reformers across the nation. It provided for direct primaries to select party nominees for public office, a railroad commission to regulate railroad rates, tax reform, opposition to political bosses, and the initiative and recall, devices to give the people more direct control over government.

Labor Relations

In 1902, President Theodore Roosevelt became the first president to intervene on the side of workers in a labor dispute. He threatened to use the army to run the coal mines unless mine owners agreed to arbitrate the strike. The president handpicked a commission to mediate the settlement.

Medical Education

Abraham Flexner's 1910 study of American medical colleges transformed the training of doctors. His report led to the closing of second-rate medical schools and to sweeping changes in medical curricula and teaching methods.

Philanthropy

John D. Rockefeller revolutionized philanthropy by setting up a foundation staffed by experts to evaluate proposals and support programs to solve critical public problems. His foundation and others funded social surveys--systematic, non-partisan examinations of subjects by experts. 

Radical Trade Unionism

"One Big Union for All" was the goal of the radical labor leaders and Socialists who met in Chicago in 1905 and who formed the International Workers of the world. Rejecting the approach of the American Federation of Labor, which only admitted skilled craft workers to its ranks, the IWW opened its membership to any wage earner regardless of occupation, race, creed, or sex. 

Socialism

A new political party, the American Socialist Party, was founded in 1901. At its peak, in 1912, the party had 118,000 members. The largest socialist newspaper, the Appeal to Reason, published in Girard, Kansas, had a weekly circulation of 761,000. In the 1912 election, Socialist presidential candidate Eugene Debs received 800,000 votes, and Socialists captured 1,200 political offices, including the mayoralties of 79 cities.

Trust-Busting

In 1902, President Roosevelt instructed his attorney general to file suit against Northern Securities, a railroad holding company, and the beef trust in Chicago, for illegal restraint of trade. The U.S. Supreme Court ultimately ruled in the government's favor.

The Roots of Progressivism

The Social Gospel

Religious ideas and institutions have always been one of the wellsprings of the American reform impulse. Progressive reformers were heavily influenced by the body of religious ideas known as the Social Gospel, the philosophy that the churches should be actively engaged in social reform. As elaborated by such theologians as Walter Rauschenbusch, the Social Gospel was a form of liberal Protestantism which held that Christian principles needed to be applied to social problems and that efforts needed to be made to bring the social order into conformity with Christian values. 

Muckrakers

Muckraking reporters, exploiting mass circulation journalism, attacked malfeasance in American politics and business. President Theodore Roosevelt gave them the name "muckrakers," after a character in the book Pilgrim's Progress, "the Man with the Muckrake," who was more preoccupied with filth than with Heaven above.

Popular magazines such as McClure's, Everybody's, Pearson's, Cosmopolitan, and Collier's published articles exposing the evils of American society--political corruption, stock market manipulation, fake advertising, vice, impure food and drugs, racial discrimination, and lynching. Upton Sinclair's The Jungle exposed unsanitary conditions in the meat packing industry. John Spargo's The Bitter Cry of the Children disclosed the abuse of child laborers in the nation's coal mines. Lincoln Steffens's The Shame of the Cities uncovered corruption in city government.

Herbert Croly and The Promise of American Life

If any one book can be said to offer a manifesto of Progressive beliefs, it was Herbert Croly's The Promise of American Life. Croly (1869-1930), a political theorist and journalist who founded The New Republic, was Progressivism's preeminent philosopher. His book, published in 1909, argued that Americans had to overcome their Jeffersonian heritage, with its emphasis on minimal government, decentralized authority, and the sanctity of individual freedom, in order to deal with the unprecedented problems of an urban and industrial age. Industrialism, he believed, had reduced most workers to a kind of "wage slavery," and only a strong central government could preserve democracy and promote social progress.

Croly, like most Progressives, was convinced that only a public-spirited, disinterested elite, guided by scientific principles, could restore the promise of American life. Thus, he called for the establishment of government regulatory commissions, staffed by independent experts, to protect American democracy from the effects of corporate power. He also believed that human nature "can be raised to a higher level by an improvement in institutions and laws." 

Progressivism in Government

The challenge confronting early twentieth-century America, according to Croly, was to respond to the problems that had accompanied the transformation of America from a rural, agricultural society into an urban industrial one. Filled with faith in the power of government, Progressives launched reforms in the areas of public health, housing, urban planning and design, parks and recreation, workplace safety, workers' compensation, pensions, insurance, poverty relief, and health care.

Newsies

In the movies, scrappy urban newsboys hawk papers with screaming headlines, shouting, "Extra! Extra! Read all about it!" Real late nineteenth and early twentieth century newsboys were very different from the Hollywood image of lovable street urchins singing and dancing in the streets.

Newsboys first appeared on city streets in the mid-nineteenth century with the rise of mass circulation newspapers. They were often wretchedly poor, homeless children who shrieked the headlines well into the night and slept on the street.

In 1866, a reformer named Charles Loring Brace described the condition of homeless newsboys in New York City:

I remember one cold night seeing some 10 or a dozen of the little homeless creatures piled together to keep each other warm beneath the stairway of The [New York] Sun office. There used to be a mass of them also at The Atlas office, sleeping in the lobbies, until the printers drove them away by pouring water on them. One winter, an old burnt-out safe lay all the season in Wall Street, which was used as a bedroom by two boys who managed to crawl into the hole that had been burned.

In 1872, James B. McCabe, Jr., wrote:

There are 10,000 children living on the streets of New York.... The newsboys constitute an important division of this army of homeless children. You see them everywhere.... They rend the air and deafen you with their shrill cries. They surround you on the sidewalk and almost force you to buy their papers. They are ragged and dirty. Some have no coats, no shoes and no hat.

In 1899, several thousand newsboys, who made about 30 cents a day, called a strike, refusing to handle the newspapers of William Randolph Hearst and Joseph Pulitzer. Competing papers lavished coverage on the strikers, who were depicted as colorful characters who spoke in an oddly rendered Irish-immigrant dialect and had names like Race Track Higgens and Kid Fish. The news accounts gave much attention to the exhortations of a pint-sized newsboy and strike leader named Kid Blink, so called because he was blind in one eye. The New York Tribune quoted Kid Blink's speech to 2,000 strikers:

Friens and feller workers. Dis is a time which tries de hearts of men. Dis is de time when we'se got to stick together like glue.... We know wot we wants and we'll git it even if we is blind.

The lot of newsboys began to improve as urban child-welfare practices took root and publishers began competing for newsies by giving them prizes and trips.

Municipal Progressivism

Tom L. Johnson represented a model of Progressivism at the local level. He was a four-term mayor of Cleveland, serving from 1901 to 1909. In office, he removed all "Keep Off the Grass" signs from parks and embarked on an aggressive policy of municipal ownership of utilities. He fought the streetcar monopoly, reformed the police department, professionalized city services, and built sports fields and public bathhouses in poor sections of the city. He also coordinated the architecture and placement of public buildings downtown, set around a mall.

James Michael Curley, Boston's mayor, represented the kind of leader that many Progressives opposed. The Boston Evening Transcript called Curley "as clear an embodiment of civic evil as ever paraded before the electorate." Twice sent to prison for fraud, he acquired a 21-room mansion (which had gold-plated bathroom fixtures) paid for by kickbacks from contractors.

The son of an Irish washerwoman, Curley won office by speaking the language of class and ethnic resentment. But Curley also built new schools for the children of working-class Bostonians, tore down slum dwellings, established beaches and parks for the poor, and added an obstetrics wing to the city hospital. He also helped the poor in very direct ways; he provided bail money, funeral expenses, and temporary shelter for those made homeless by fire or eviction. When he died, a million people lined Boston's streets to pay their last respects.

To weaken political machines, municipal Progressives sought to reduce the size of city councils and eliminate the practice of electing officials by ward (or neighborhood). Instead, they proposed electing public officials on a city-wide, or at-large, basis. Candidates from poorer neighborhoods lacked funds to publicize their campaigns across an entire city. Urban Progressives also diminished the influence of machines by making municipal elections non-partisan, prohibiting the use of party labels in local voting. A number of cities attempted to eliminate politics from city government by introducing city managers. Beginning with Staunton, Va., in 1908, a number of cities began to hire professional administrators to run city government.

Many Progressives wanted to improve the quality of urban life. The World's Columbian Exhibition in Chicago in 1893, marking the 400th anniversary of Columbus's first voyage of discovery, was an inspiration to many urban reformers. As a symbol of its recovery from the disastrous Great Fire of 1871, Chicago erected a massive "White City" to hold the event. Chicago's White City demonstrated the value of careful planning and beautification, and provided the impetus for many Progressive efforts to introduce city planning, zoning regulations, housing reform, and slum clearance.

The most far-reaching Progressive effort to transform the city was known as "municipal socialism." Many cities established municipal waterworks, gasworks, and electric and public transportation systems.

State Progressivism

During the Progressive era, the states were "laboratories for democracy," where state governments experimented with a wide range of reforms to eliminate governmental corruption, eliminate unsafe working conditions, make government more responsive to public needs, and protect working people.

The severe depression beginning in 1893 had discouraged states from engaging in policy innovation. Government retrenchment was the watchword of many lawmakers in the 1890s. Most of the reforms undertaken during these years were efforts to eliminate political bossism, corruption, and governmental waste. The depression also encouraged the consolidation of corporations, a development that would make trusts a major issue after the turn of the century. The Spanish-American War had also diverted attention from domestic matters.

During the early twentieth century, many states adopted reforms that had been enacted years earlier in Massachusetts, which, along with Rhode Island, had been among the first states to have a majority of its population living in cities. Many of these reforms involved protections for working people, including:

• compulsory school attendance laws, adopted in every state except Mississippi by 1916;

• laws limiting work hours for women and children in 32 states, and minimum wages for women workers in 11 states;

• workmen's compensation for workers injured on the job, adopted in 32 states.

Other laws established an eight-hour workday for state employees, authorized credit unions, created public utility commissions, established state employee pensions, and instituted a host of health and safety regulations. Several states also passed laws prohibiting children from working at night.

To make the electoral process more democratic, all but three states adopted direct primaries by 1916, which allowed voters to choose among several candidates for a party's nomination. To allow voters to express their dissatisfaction with elected officials, Progressives proposed the recall, which permitted voters to remove officials before the end of their terms of office. To give voters a greater voice in law-making, Progressives proposed the initiative and the referendum. The initiative allowed voters to propose legislation directly, and the referendum permitted them to vote directly on an issue. Oregon, South Dakota, and Utah were the first states to adopt the initiative and referendum.

Beginning in the 1880s Britain, France, Germany, and Scandinavia adopted a series of social welfare programs--unemployment insurance, old age pensions, industrial accident and health insurance. During the Progressive era, many reformers borrowed these ideas and adapted them to meet American circumstances.

Perhaps the most dramatic American innovation was "widow's pensions." Adopted by most states, these programs provided widows with a monthly payment that allowed them to keep their children at home and not have to put them in orphanages or out for adoption.

National Progressivism

On September 6, 1901, President William McKinley was shaking hands with a line of well-wishers at the Pan American Exposition held in Buffalo, N.Y. Fifty soldiers and secret service agents roamed the premises, scrutinizing the crowd. A 28-year-old ex-Cleveland factory worker and farm hand named Leon Czolgosz (pronounced Chol-gots) moved toward the president and drew a .32-caliber revolver from his pocket. He wrapped his right hand and the gun with a large handkerchief.

A secret service agent touched Czolgosz's shoulder. "Hurt your hand?" the agent asked. Czolgosz nodded. "Maybe you better get to the first aid station." Czolgosz replied: "After I meet the President. I've been waiting a long time."

Czolgosz approached McKinley and said, "Excuse my left hand, Mr. President." McKinley shook his hand, and the farm hand moved on. After several more citizens extended their greetings, Czolgosz lunged toward the president. As a secret service agent tried to grab him, Czolgosz fired twice in rapid succession. One bullet was deflected by McKinley's breastbone, but the other ripped through his stomach and lodged in his back. "I done my duty!" Czolgosz cried out. The president died eight days later.

Czolgosz was an anarchist who didn't believe in governments, rulers, voting, religion or marriage. In a handwritten confession, he complained that McKinley had been going around the country shouting about prosperity, when there was no prosperity for the working man.

The McKinley assassination marked the symbolic end of one era in American national politics and the beginning of a new one. In the United States, the 1890s had been a decade of depression, labor strife, and agrarian unrest. Social upheaval was not confined to the United States. The great European powers were struggling to control Africa, the Near East, and the Far East. Attempted revolution took place in Russia. At the turn of the century, six heads of state were assassinated by anarchists.

By 1900, many of the great questions of the nineteenth century seemed to be settled. Corporate enterprise dominated the American economy, justified by Social Darwinism. The United States had decided to join the struggle for world trade and markets. The status of African Americans was going to be largely defined by white Southerners.

But in fact the twentieth century would not be a continuation of the nineteenth. It was obvious from the moment that Theodore Roosevelt became president that new issues would dominate the twentieth century.

Theodore Roosevelt

At the Republican convention in 1900, a Senator warned his colleagues not to make Theodore Roosevelt the party's vice presidential nominee: "Don't any of you realize that there's only one life between this madman and the Presidency?" As New York's governor, Roosevelt had challenged banking and insurance interests, and Republican party boss Tom Platt wanted him out of state affairs.

Born in New York City in 1858, Roosevelt was, in his own words, "nervous and timid" as a youth. He suffered headaches, fevers, and stomach pains. He was so frail and asthmatic that he could not blow out a bedside candle. So he hiked, swam, boxed, and lifted weights to build up his strength and stamina. In 1912, he would be shot in the chest by a deranged man, yet he proceeded to deliver an hour-long speech before seeking treatment.

At 23, he was elected to the New York state legislature. But in 1884, his wife and his mother died on the same day. To distance himself from these tragedies, he went to a 25,000-acre ranch in North Dakota's Badlands and became a cowboy. He wore spurs and carried a pearl-handled revolver from Tiffany, the New York jewelers.

He returned to serve as a U.S. Civil Service commissioner and then as New York City's crusading police commissioner, wearing disguises to root out corruption. He subsequently became Assistant Secretary of the Navy and governor of New York before his election as vice president in 1900.

Lacking any military experience, he served as second-in-command of the Rough Riders, a volunteer cavalry unit that fought in Cuba during the Spanish-American War, wearing a uniform custom-tailored by Brooks Brothers. When William McKinley was assassinated, Roosevelt became, at age 42, the youngest president in American history.

Even those who know nothing about his presidency instantly recognize his image carved on Mount Rushmore, his huge toothy smile and his wire-rimmed glasses. As president, he fought for conservation of natural resources, consumer protection and anti-trust legislation. He forced coal operators to recognize the United Mine Workers.

His life was filled with contradictions. He was a member of one of the country's 20 richest families, yet he denounced business magnates as "malefactors of great wealth." The first president born in a big city, he was a hunter as well as a conservationist; a bellicose man who boxed in the White House, he was also the first American to receive the Nobel Peace Prize, for brokering peace between Russia and Japan.

Incredibly active and energetic, he was "a steam engine in trousers" who somehow found time to write three dozen books on topics ranging from history to hunting and to read in languages ranging from Italian and Portuguese to Greek and Latin. He was the first celebrity president, known simply by his initials. Said a British envoy, "You must always remember the President is about six."

In office, he greatly expanded the powers of the presidency. A bold and forceful leader, he viewed the White House as a "bully pulpit" from which he could preach his ideas about the need for an assertive government, the inevitability of bigness in business, and an active American presence in foreign policy. He broke up trusts that dominated the corporate world and regulated big business. He created the Departments of Commerce and Labor and the U.S. Forest Service. He supported a revolt in a province of Colombia that allowed the United States to build the Panama Canal. He sent the Great White Fleet on an around-the-world voyage to symbolize America's rise to world power. He made a dramatic public statement about race when he invited Booker T. Washington to dine at the White House.

He pushed legislation through Congress authorizing the Interstate Commerce Commission to set railroad rates. In 1904, he won election in his own right by the largest popular majority up to that time. But on election night he announced that he would not run again in 1908, a statement that undercut his influence during his second term. In 1909, he passed the presidency to his hand-picked successor, William Howard Taft, and left to hunt big game in Africa.

Apart from his philosophy of active, interventionist government, Roosevelt's most lasting legacy is that he became the model for a new kind of president: a charismatic, hyperkinetic, heroic leader, who seeks to improve every aspect of society. He made the presidency as large as the problems posed by industrialization and urbanization.

Anti-Trust

One of the most significant issues Roosevelt confronted as president was how best to deal with the growth of corporate power. Between 1897 and 1904, 4,227 firms merged to form 257 corporations. The largest merger combined nine steel companies to create U.S. Steel. By 1904, 318 companies controlled about 40 percent of the nation's manufacturing output. A single firm produced over half the output in 78 industries.

Many Progressives feared that concentrated, uncontrolled corporate power threatened republican government. The public feared that large corporations could impose monopolistic prices to cheat consumers and squash small independent companies.

Roosevelt's Justice Department launched 44 anti-trust suits, prosecuting railroad, beef, oil, and tobacco trusts. Henry Clay Frick, the steel baron, complained, "We bought the son of a bitch and then he didn't stay bought." The most famous anti-trust suit, filed in 1906, involved John D. Rockefeller's Standard Oil Company. It took five years for the government to win its case in the Supreme Court, but in the end Standard Oil was broken into 34 separate companies.

Theodore Roosevelt did not oppose bigness in and of itself. He only opposed irresponsible corporate behavior. He distinguished between "good trusts" and "bad trusts" and advocated regulating big corporations in the public interest by means of a government commission.

Government Regulation

At the beginning of the twentieth century, milk distributors frequently adulterated milk by adding chalk or plaster to improve its color and molasses and water to cut costs. Meatpackers killed rats by putting poisoned pieces of bread on their floors; sometimes, the poisoned rats made their way onto the production lines.

The publication of Upton Sinclair's book The Jungle, exposing unsanitary conditions in the meatpacking industry, generated widespread public support for federal inspection of meatpacking plants. The Department of Agriculture disclosed the dangers of chemical additives in canned foods. A muckraking journalist named Samuel Hopkins Adams uncovered misleading and fraudulent claims in non-prescription drugs.

To deal with these problems the federal government enacted:

• The Meat Inspection Act (1906), mandating government enforcement of sanitary and health standards in meatpacking plants; and

• The Pure Food and Drug Act (1906), prohibiting false advertising and harmful additives in food.

Progressives often portrayed their battles as simply the latest episode in an older struggle: "the people" against business interests, the proponents of democracy against the defenders of special privilege. In fact, this view is quite misleading. Corporate managers were often strong supporters of Progressive reform. During Theodore Roosevelt's presidency, mining companies worked closely with the administration to rationalize the extraction of natural resources. Big meatpackers promoted the Meat Inspection Act of 1906 to prevent smaller packers from exporting bad meat and closing foreign markets to all American meat products.

Conservation

The United States was the first nation in the world to create wilderness parks. Theodore Roosevelt launched conservation as a national political movement. As president, he argued on behalf of conserving natural resources and preserving wild lands, and he set aside the first national monuments and wildlife refuges. In 1906, he signed the Antiquities Act, which enables a president to protect wild lands as national monuments. Among the places he protected under the act was the Grand Canyon.

Taft

Today, William Howard Taft is better known for his weight than for his presidency. The most corpulent president, he stood 6 feet 2 inches tall and weighed 330 pounds. A special bathtub, large enough to accommodate four average-sized adults, was installed in the White House. When he was governor of the Philippines, he referred in a cable to a horseback ride he had taken into the mountains. The reply: "Referring to your telegram--how is the horse?"

He had served as a federal judge and the appointed governor of the Philippines before Roosevelt named him secretary of war. But his talents as an administrator served him poorly as president, and he was perceived, wrongly, as a tool of entrenched interests. In 1921, he was appointed Chief Justice of the U.S. Supreme Court.

As president, Taft had very substantial Progressive accomplishments. He filed twice as many anti-trust suits as Roosevelt, expanded Roosevelt's program of conserving public lands, created a Children's Bureau within the Labor Department, and pushed through Congress the Mann-Elkins Act of 1910, which strengthened the federal government's power to regulate the railroads. He also submitted a proposal for a tax on corporate income and called for a constitutional amendment to permit an income tax. The amendment was ratified in 1913, during the waning days of his administration.

But Taft also fired Roosevelt's trusted lieutenant Gifford Pinchot, who had attacked his conservation policies; and supported the reelection of Joe Cannon, the Republican old guard speaker of the House, in return for conservative support on other issues. He tried to lower tariffs on foreign trade, only to have his proposal gutted by Congress. Said Roosevelt of his successor: 

Taft, who is such an admirable fellow, has shown himself such an utterly commonplace leader, good-natured, feebly well-meaning, but with plenty of small motive; and totally unable to grasp or put into execution any great policy. 

Disenchanted with Taft and missing the glory of the presidency, Roosevelt challenged his successor for the 1912 presidential nomination. "We stand at Armageddon," said Roosevelt in 1912, "and we battle for the Lord."

During the campaign, Roosevelt called the president a "fathead" and a "puzzlewit" who was "dumber than a guinea pig." Roosevelt's remarks deeply embittered Taft. "Even a cornered rat will fight," he reportedly said to a journalist.

Roosevelt won most of the primaries but lost a rules fight at the Republican convention and only won a third of the delegates. Charging Taft with "hijacking" the nomination, Roosevelt launched a third party. As the Progressive party candidate, Roosevelt received 27 percent of the vote, still a record for a third-party presidential candidate. Taft only won 23 percent of the popular vote, partly due to his failure to publicize his progressive achievements.

Income Tax

A federal income tax is a surprisingly recent innovation. The modern income tax was only introduced in 1913. From 1866 to 1893, the federal government ran surpluses, thanks to revenues from tariffs and excise taxes.

Republicans defended protective tariffs as a positive good. They claimed that a high tariff encouraged industrialization and urbanization, generated high wages and profits, and created a rich home market for farmers and manufacturers. Beginning in 1887, the Democrats, led by Grover Cleveland, argued that the tariff was a tax on consumers for the benefit of rich industrialists. They claimed that the tariff raised prices, encouraged foreign countries to retaliate against American farm exports, and encouraged the growth of economic trusts. In fact, there is little evidence that the tariff had much economic significance. Its major beneficiaries were producers of raw materials, especially sugar, wool, hides, and timber.

By the end of the 1890s, revenue from the tariff was declining (since the United States was mainly importing raw materials) as was revenue from federal land sales. Meanwhile, government spending was increasing. By 1905 the expanding U.S. Navy was receiving 20 percent of the federal budget. At the same time Congress expanded pensions for Civil War veterans.

In 1894 the government ran the first deficit since the Civil War and enacted a short-lived income tax, which was declared unconstitutional in 1895. The Supreme Court ruled that it violated a constitutional provision that taxes had to be apportioned among the states. The court reached this decision even though it had earlier upheld an income tax levied during the Civil War.

In April 1909, southern and western congressmen sponsored another income-tax bill, hopeful that a Supreme Court with a new membership might approve it. Their opponents responded by sponsoring a constitutional amendment that would authorize an income tax, which they thought could not be ratified by three-fourths of the states. Congress approved the amendment overwhelmingly. The Senate vote was 77 to 0, the House's 318 to 14.

By the end of 1911, 31 states had approved (including New York and Maryland as well as many southern and western states), five short of the required number. It appeared that the amendment had failed, since no previous amendment had taken so long to be ratified.

But during the 1912 election, Democrat Woodrow Wilson and third-party candidate Theodore Roosevelt rekindled support for the amendment. The amendment went into effect when Wyoming became the 36th state to ratify in February 1913.

The new federal income tax was modest and affected only about one-half of 1 percent of the population. It taxed personal income at one percent and exempted married couples earning less than $4,001. A graduated surtax, beginning on incomes of $20,000, rose to 6 percent on incomes of more than $500,000. The $4,000 exemption expressed Congress' conclusion that such a sum was necessary to "maintain an American family according to the American standard and send the children through college." It was about six times the average male's income. State officials were exempt from paying any taxes, as were federal judges and the president of the United States.
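As a rough illustration of the arithmetic above, the short Python sketch below estimates a married couple's 1913 tax bill: a 1 percent normal tax on income above the $4,000 exemption, plus a graduated surtax that begins at $20,000 and tops out at 6 percent on income over $500,000. The text does not spell out the intermediate surtax brackets, so the schedule used here is an illustrative assumption, not a reconstruction of the statute.

# A minimal sketch of the 1913 income tax for a married couple.
# The 1 percent normal rate and the $4,000 exemption come from the text above;
# the intermediate surtax brackets are assumed for illustration -- only the
# $20,000 starting point and the 6 percent top rate above $500,000 are given.

NORMAL_RATE = 0.01
EXEMPTION = 4_000  # married couples earning less than $4,001 owed nothing

SURTAX_BRACKETS = [  # (lower bound of bracket, marginal surtax rate) -- assumed
    (20_000, 0.01),
    (50_000, 0.02),
    (75_000, 0.03),
    (100_000, 0.04),
    (250_000, 0.05),
    (500_000, 0.06),
]

def tax_1913(income: float) -> float:
    """Estimate a married couple's 1913 federal income tax."""
    normal = NORMAL_RATE * max(income - EXEMPTION, 0)
    surtax = 0.0
    for i, (lower, rate) in enumerate(SURTAX_BRACKETS):
        upper = SURTAX_BRACKETS[i + 1][0] if i + 1 < len(SURTAX_BRACKETS) else float("inf")
        if income > lower:
            surtax += rate * (min(income, upper) - lower)
    return normal + surtax

for income in (3_500, 10_000, 50_000, 1_000_000):
    print(f"income ${income:>9,}: estimated tax ${tax_1913(income):>9,.2f}")

Running the sketch shows, for example, that a couple earning $10,000 would have owed about $60, which helps explain why the tax touched only about one-half of 1 percent of the population.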

American involvement in World War I caused government expenditures to soar and international trade (and tariff revenues) to shrink. By 1919, the minimum taxable income had been reduced to $1,000 and the top rate was 77 percent. As late as 1939, only 3.9 million Americans had to file. But just six years later, 42.6 million did. Tax withholding was introduced in congressional legislation in 1943. President Franklin Roosevelt vetoed this provision, but Congress overrode his veto.

Wilson

The split in the Republican ranks in 1912 enabled Democrat Woodrow Wilson to win the presidency. Despite receiving only 42 percent of the popular vote in 1912, Wilson steered through Congress the creation of the Federal Reserve System, the Federal Trade Commission, tariff reduction, anti-trust legislation, and a graduated income tax.

He began as something of an isolationist in foreign policy. He apologized to Colombia for the U.S. role in Panama's independence and appointed the pacifistic William Jennings Bryan as Secretary of State. But he would later vow to teach Latin Americans lessons in democracy.

Only a week after taking office in 1913, Wilson called upon Mexico's president, Victoriano Huerta, who had seized power after the constitutional president was murdered, to step aside so that elections could be held. When Huerta refused, Wilson used minor incidents--including the arrest of some American sailors in Tampico and the arrival of a German merchant ship carrying supplies for Huerta--as a pretext for occupying the Mexican port of Veracruz. Within months, Huerta was forced to leave the country.

During the conflict, the Mexican revolutionary Pancho Villa made a number of raids into U.S. territory near the Mexican border. Wilson responded by ordering Gen. John J. (Black Jack) Pershing to cross into Mexico.

As president, Wilson also sent American troops to occupy Haiti in 1915 and the Dominican Republic in 1916. A year later, the United States bought the Virgin Islands, thereby gaining control of every major Caribbean island except British Jamaica. He engaged in more military interventions abroad than any other American president.

Thomas Woodrow Wilson was born in Virginia but grew up in Augusta, Ga., where his father was an official of the Southern Presbyterian church. After briefly practicing as a lawyer (he only had two clients, one of whom was his mother), he attended graduate school at Johns Hopkins and taught history and political science at Bryn Mawr, Wesleyan, and Princeton, his alma mater. He wrote several highly acclaimed books, including Congressional Government, which decried the weakening of presidential authority in the United States, and The State, a call for increased government activism.

As Princeton's president, he developed a reputation as a reformer for trying to eliminate the school's elitist system of eating clubs. Professional politicians in New Jersey, thinking wrongly that they could manipulate the politically inexperienced Wilson, helped make him the state's governor and then arranged his nomination for president in 1912 as a way to block another bid by William Jennings Bryan, whose prairie populism had been rejected three times by voters. Before he launched his campaign, Wilson described himself with these words:

I am a vague, conjectural personality, more made up of opinions and academic prepossessions than of human traits and red corpuscles. We shall see what will happen!

With the Republican vote split between Taft and Roosevelt, Wilson became the first southerner to be elected president since the Civil War, carrying 40 states but only 42 percent of the vote. After his election, the moralistic, self-righteous Wilson told the chairman of the Democratic party: "Remember that God ordained that I should be the next president of the United States." Wilson later said that the United States had been created by God "to show the way to the nations of the world how they shall walk in the paths of liberty."

During his first term, he initiated a long list of major domestic reforms. These included:

• The Underwood-Simmons Tariff (1913), which substantially lowered duties on imports for the first time since the Civil War and enacted a graduated income tax;

• The Federal Reserve Act (1913), which established a Federal Reserve Board and 12 regional Federal Reserve banks to supervise the banking system, set interest rates on loans to private banks, and control the supply of money in circulation;

• The Federal Trade Commission Act (1914), establishing the Federal Trade Commission, which sought to preserve competition by preventing businesses from engaging in unfair business practices; and

• The Clayton Anti-Trust Act (1914), which limited the ownership of stock in one corporation by another, restricted non-competitive pricing policies, and forbade interlocking directorates for certain banking and business corporations. It also recognized the right of labor to strike and picket and barred the use of anti-trust statutes against labor unions.

Unlike Roosevelt, who believed that big business could be successfully regulated by government, Woodrow Wilson believed that the federal government should break up big businesses in order to restore as much competition as possible. Other social legislation enacted during Wilson's first term included:

• The Seamen's Act (1915), which set minimum standards for the treatment of merchant sailors;

• The Adamson Act (1916), which established an 8-hour workday for railroad workers;

• The Workingmen's Compensation Act (1916), which provided financial assistance to federal employees injured on the job;

• The Child Labor Act (1916), which forbade the interstate sale of goods produced by child labor; and

• The Farm Loan Act (1916), which made it easier for farmers to get loans.

Following Wilson's election in 1912, four Constitutional Amendments were ratified:

• 16th Amendment (1913) gave Congress the power to impose an income tax;

• 17th Amendment (1913) required the direct election of Senators;

• 18th Amendment (1919) banned the manufacture and sale of alcoholic beverages; and

• 19th Amendment (1920) gave women the right to vote.

Wilson's second term was dominated by American involvement in World War I. At the end of September 1919 Wilson suffered a mild stroke, which was followed, in early October, by a major stroke that almost totally incapacitated him.

More than that of most presidents, Wilson's historical reputation has swung up and down. During the 1920s, he was remembered as a priggish, anti-business president, an impractical visionary and fuzzy idealist who embroiled the United States in a needless war. During the 1940s, in sharp contrast, he was depicted in the Hollywood film Wilson as an idealistic leader struggling to create a new world order based on international law.

Immigration

The Statue of Liberty

It is the tallest metal statue ever constructed and, at the time it was completed, stood taller than any building in New York, the equivalent of 22 stories. The statue stands 151 feet high and weighs 225 tons. Its arm is 42 feet long and its torch is 21 feet in length. Its index finger is eight feet long, and it has a 4-foot 6-inch nose. For people all around the world, the statue symbolizes American freedom, hope, and opportunity.

There may be grander monuments, but this statue is not like the Egyptian pyramids or the Colossus of Rhodes, "the brazen giant of Greek fame."

The statue was originally proposed by Edouard de Laboulaye, a French historian and prominent abolitionist now largely forgotten, and designed by the French sculptor Frederic Auguste Bartholdi. Broken chains lie at the statue's feet.

The Statue of Liberty was a gift from French republicans who wanted to advance their political cause: the replacement of the monarchy of Napoleon III with a republican system of government. It was modeled, in part, on the Roman goddess Libertas, the personification of liberty and freedom in classical Rome, which led some critics to object to a heathen goddess standing in New York harbor. Others derided the statue as a "useless gift," "Neither an object of Art or of Beauty," and it seemed possible that the statue would be placed in Boston or Philadelphia.

The final $100,000 for the statue's pedestal was raised by the Hungarian-born publisher Joseph Pulitzer, who asked New York's poor for contributions. In exchange, he printed their names in his newspaper, The World. One donor wrote: "I am a young girl alone in the world, and earning my own living. Enclosed please find 60 cents, the result of self-denial. I wish I could make it 60 thousand dollars, instead of cents, but drops make the ocean."

Over time, the statue's symbolic meaning has been transformed. It was originally intended to express opposition to slavery. After America's emergence as a world power following its defeat of Spain in the Spanish-American War of 1898, the statue became a symbol of American might. It was not until the twentieth century and massive immigration from eastern and southern Europe that the statue became "a lady of hope" for immigrants and refugees.

Emma Lazarus

On a tablet on the pedestal of the Statue of Liberty is inscribed a poem. Entitled "The New Colossus," it contains the famous words, "Give me your tired, your poor, your huddled masses yearning to breathe free."

These words were not originally attached to the statue. The poem, which was written in 1883 to help raise money for the statue's pedestal, was forgotten until it was rediscovered in a Manhattan used-book store. The text was only placed on the pedestal in 1903, and it transformed the statue's meaning.

Its author, Emma Lazarus, was an American Jew, born in New York City in 1849. She had a privileged upbringing, and wrote a volume of poetry that was privately printed by her father.

In 1881, a wave of anti-Semitism swept across Russia. Soldiers destroyed Jewish districts and burned homes and synagogues. Thousands of Jews set sail for America. Lazarus was shocked by what she saw and devoted herself to helping the refugees.

The final sum needed to complete the pedestal came from an auction of literary works by such authors as Mark Twain and Walt Whitman. Emma Lazarus was asked to contribute a poem. She was reminded of the Colossus of Rhodes, a huge bronze statue of the sun god Helios, one of the wonders of the ancient world. She called her poem "The New Colossus," and it was sold for $1,500. At the time, she was dying of cancer. She was just 38 years old when she died in 1887.

The New Immigrants

Some 334,203 immigrants arrived in the United States in 1886, the year of the Statue of Liberty's dedication. A Cuban revolutionary, Jose Marti, wrote: "Irishmen, Poles, Italians, Czechs, Germans freed from tyranny or want--all hail the monument of Liberty because to them it seems to incarnate their own uplifting."

The immigrants who would catch a glimpse of the statue would mainly come from eastern and southern Europe.

In 1900, 14 percent of the American population was foreign born, compared to 8 percent a century later. Passports were unnecessary and the cost of crossing the Atlantic was just $10 in steerage.

European immigration to the United States greatly increased after the Civil War, reaching 5.2 million in the 1880s and then surging to 8.2 million in the first decade of the twentieth century. Between 1882 and 1914, approximately 20 million immigrants came to the United States. In 1907 alone, 1.285 million arrived. By 1900, New York City had as many Irish residents as Dublin. It had more Italians than any city outside Rome and more Poles than any city except Warsaw. It had more Jews than any other city in the world, as well as sizeable numbers of Slavs, Lithuanians, Chinese, and Scandinavians.

Unlike earlier immigrants, who mainly came from northern and western Europe, the "new immigrants" came largely from southern and eastern Europe. Largely Catholic and Jewish in religion, the new immigrants came from the Balkans, Italy, Poland, and Russia.

Birds of Passage

Many of the millions of immigrants who arrived in the United States in the late nineteenth and early twentieth centuries did so with the intention of returning to their villages in the Old World. Known as "birds of passage," many of these eastern and southern European migrants were peasants who had lost their property as a result of the commercialization of agriculture. They came to America to earn enough money to return home and purchase a piece of land. As one Slavic steelworker put it: "A good job, save money, work all time, go home, sleep, no spend."

Many of these immigrants came to America alone, expecting to rejoin their families in Europe within a few years. From 1907 to 1911, of every hundred Italians who arrived in the United States, 73 returned to the Old Country. For southern and eastern Europe as a whole, approximately 44 of every 100 arrivals returned home.

Some immigrants, however, did not come as "sojourners." In particular, Jewish immigrants from Russia, fleeing religious persecution, came in family groups and intended to stay in the United States from the beginning.

Chinese Exclusion Act

From 1882 until 1943, most Chinese immigrants were barred from entering the United States. The Chinese Exclusion Act was the nation's first law to ban immigration by race or nationality. All Chinese people--except travelers, merchants, teachers, students, and those born in the United States--were barred from entering the country. Federal law prohibited Chinese residents, no matter how long they had legally worked in the United States, from becoming naturalized citizens.

From 1850 to 1865, political and religious rebellions within China left 30 million dead and the country's economy in a state of collapse. Meanwhile, the canning, timber, mining, and railroad industries on the U.S. West Coast needed workers. Chinese business owners also wanted immigrants to staff their laundries, restaurants, and small factories.

Smugglers transported people from southern China to Hong Kong, where they transferred onto passenger steamers bound for Victoria, British Columbia. From Victoria, many immigrants crossed into the United States in small boats at night. Others crossed by land.

The Geary Act, passed in 1892, required Chinese aliens to carry a residence certificate with them at all times upon penalty of deportation. Immigration officials and police officers conducted spot checks in canneries, mines, and lodging houses and demanded that every Chinese person show this certificate.

Due to intense anti-Chinese discrimination, many merchants' families remained in China while husbands and fathers worked in the United States. Since Federal law allowed merchants who returned to China to register two children to come to the United States, men who were legally in the United States might sell their testimony so that an unrelated child could be sponsored for entry. To pass official interrogations, immigrants were forced to memorize coaching books which contained very specific pieces of information, such as how many water buffalo there were in a particular village. So intense was the fear of being deported that many "paper sons" kept their false names all their lives. The U.S. government only gave amnesty to these "paper families" in the 1950s.

Angel Island

It was called the "Ellis Island of the West." Located in San Francisco Bay, Angel Island was also a checkpoint for immigrants in the early years of the twentieth century. But only a small proportion of the 175,000 people who arrived at Angel Island were allowed to remain in the United States. Angel Island served as a detention center for Chinese immigrants and was surrounded by barbed wire.

Thirteen-year-old Jack Moy and his mother sailed to the United States in 1927. The two spent a month in the detention center, separated from one another. Immigration officials asked insulting personal questions, such as whether his mother had bound feet, how many water buffalo a village had, or "who occupies the house on the fifth lot of your row in your native village." Discrepancies in an answer could mean deportation to China. Immigration officials recorded every identifying mark, including scars, boils, and moles.

To join her husband in the United States, Suey Ting Gee had to pretend that she was the wife of another man. Under a U.S. law in effect from 1882 to 1943, the Chinese wives of resident alien laborers could not join them in this country.

Japanese Immigration

Overpopulation and rural poverty led many Japanese to emigrate to the United States, where they confronted intense racial prejudice. In California, the legislature imposed limits on Japanese land ownership, and the Hearst newspapers ran headlines such as "The Yellow Peril: How Japanese Crowd Out the White Race."

The San Francisco School Board sparked an international incident in 1906 when it segregated Japanese students in an "Oriental School." The Japanese government protested to President Theodore Roosevelt, who negotiated a "gentlemen's agreement" restricting Japanese emigration.

Contract Labor

During the nineteenth century, demand grew for manual laborers to build railroads, raise sugar on Pacific islands, mine precious metals, construct irrigation canals, and perform other forms of heavy labor. Particularly in tropical or semi-tropical regions, this demand was met by indentured or contract workers. Nominally free, these laborers served under contracts of indenture that required them to work for a period of time--usually five to seven years--in return for their travel expenses and maintenance. In exchange for nine hours of labor a day, six days a week, indentured servants received a small salary as well as clothing, shelter, food, and medical care.

An alternative to the indenture system was the "credit ticket system." A broker advanced the cost of passage and workers repaid the loan plus interest out of their earnings. The ticket system was widely used by Chinese migrants to the United States. Beginning in the 1840s, about 380,000 Chinese laborers migrated to the U.S. mainland and 46,000 to Hawaii. Between 1885 and 1924, some 200,000 Japanese workers went to Hawaii and 180,000 to the U.S. mainland.

Indentured laborers are sometimes derogatorily referred to as "coolies." Today, this term carries negative connotations of passivity and submissiveness, but originally it was an Anglicization of a Chinese word that refers to manual workers impressed into service by force or deception. In fact, indentured labor was frequently acquired through deceptive practices and even violence.

Between 1830 and 1920, about 1.5 million indentured laborers were recruited from India, one million from Japan, and half a million from China. Tens of thousands of free Africans and Pacific Islanders also served as indentured workers.

The first Indian indentured laborers were imported into Mauritius, an island in the Indian Ocean, in 1830. Following the abolition of slavery in the British Empire in 1833, tens of thousands of Indians, Chinese, and Africans were brought to the British Caribbean. After France abolished slavery in 1848, its colonies imported 80,000 Indian laborers and 19,000 Africans. Dutch Guiana, where slavery ended in 1863, recruited 57,000 Asian workers for its plantations. Although slavery was not abolished in Cuba until 1886, the rising cost of slaves led plantations to recruit 138,000 indentured laborers from China between 1847 and 1873.

Areas that had never relied on slave labor also imported indentured workers. After 1850, American planters in Hawaii recruited labor from China and Japan. British planters in Natal in southern Africa recruited Indian laborers, and those in Queensland in northeastern Australia imported laborers from neighboring South Pacific islands. Other indentured laborers toiled in East Africa, on Pacific islands such as Fiji, and in Chile, where they gathered bird droppings known as guano for fertilizer.

Steam transportation allowed Europeans and their descendants to extract "surplus" labor from overpopulated areas suffering from poverty and social and economic dislocation. In India, the roots of migration included unemployment, famine, the demise of traditional industries, and the demand for cash payment of rents. In China, a society with a long history of long-distance migration, the causes included overpopulation, drought, floods, and political turmoil, culminating in the Opium Wars (1839-1842 and 1856-1860) and the Taiping Rebellion, which may have cost 20 to 30 million lives.

Overwhelmingly male, many indentured workers initially thought of themselves as sojourners who would reside only temporarily in the new society. In the end, however, many remained in the regions where they worked. As a result, by the early twentieth century the descendants of indentured laborers made up a third of the population of British Guiana, Fiji, and Trinidad.

Some societies, such as the United States, passed legislation that hindered the migration of Asian women. In contrast, the British Caribbean colonies required 40 women to be recruited for every 100 men to promote family life.

Immigration Restriction

Gradually during the late nineteenth and early twentieth centuries, the United States imposed additional restrictions on immigration. In 1882, it excluded people likely to become public charges. It subsequently prohibited the immigration of contract laborers (1885) and illiterates (1917). Other acts restricted the entry of certain criminals, people considered immoral, those suffering from certain diseases, and paupers. Under the Gentlemen's Agreement of 1907-1908, the Japanese government agreed to stop issuing passports to laborers bound for the United States, though wives and children of Japanese residents could still enter; and in 1917, the United States barred all Asian immigrants except for Filipinos, who were U.S. nationals. Intolerance toward immigrants from southern and eastern Europe resulted in the Immigration Act of 1924, which placed a numerical cap on immigration and instituted a deliberately discriminatory system of national quotas. In 1965, the United States adopted a new immigration law that ended the quota system.

During the twentieth century, all advanced countries imposed restrictions on the entry of immigrants. A variety of factors encouraged immigration restriction, including concern about the impact of immigration on the economic well-being of a country's workforce and anxiety about the feasibility of assimilating immigrants of diverse ethnic and cultural origins. Especially following World War I and World War II, countries worried that foreign immigrants might threaten national security by introducing alien ideologies.

It was only in the twentieth century that governments became capable of effectively enforcing immigration restrictions. Before then, Russia was the only major European country to enforce a system of passports and travel regulations. During and after World War I, however, many western countries adopted passports and border controls as well as more restrictive immigration laws. The Russian Revolution prompted fear of foreign radicalism, while many countries feared that their societies would be overwhelmed by a postwar surge of refugees.

Among the first societies to adopt restrictive immigration policies were Europe's overseas colonies. Apart from prohibitions on the slave trade, many of the earliest immigration restrictions were aimed at Asian immigrants. The United States imposed the Chinese Exclusion Act in 1882. It barred the entry of Chinese laborers and established stringent conditions under which Chinese merchants and their families could enter. Canada also imposed restrictions on Chinese immigration. It imposed a "head" tax (which was $500 in 1904) and required migrants to arrive by a "continuous voyage."

• Xenophobia: Hatred or fear of foreigners and immigrants.

• Nativism: The policy of favoring native-born inhabitants over immigrants.

Migration and Disease

Throughout history, the movement of people has played a critical role in the transmission of infectious disease. As a result of migration, trade, and war, disease germs have traveled from one environment to others. As intercultural contact has increased--as growing numbers of people traveled longer distances to more diverse destinations--the transmission of infectious diseases has increased as well.

No part of the globe has been immune from this process of disease transmission. In the 1330s, bubonic plague spread from central Asia to China, India, and the Middle East. In 1347, merchants from Genoa and Venice carried the plague to Mediterranean ports. The African slave trade carried yellow fever, hookworm, and African strains of malaria into the New World. During the early nineteenth century, cholera spread from northeast India to Ceylon, Afghanistan, and Nepal. By 1826, the disease had reached the Arabian Peninsula, the eastern coast of Africa, Burma, China, Japan, Java, Poland, Russia, Thailand, and Turkey. Austria, Germany, Poland, and Sweden were struck by 1829, and within two more years cholera had reached the British Isles. In 1832, the disease arrived in Canada and the United States.

Epidemic diseases have had far-reaching social consequences. The most devastating pandemic of the twentieth century, the Spanish Flu epidemic of 1918 and 1919, killed well over 20 million people around the world--many more than died in combat in World War I. Producing such complications as pneumonia, bronchitis, and heart problems, the Spanish Flu had a particularly devastating impact in Australia, Canada, China, India, Persia, South Africa, and the United States. Today, the long-distance transfer of disease continues, most strikingly with AIDS (Acquired Immunodeficiency Syndrome), which many researchers suspect originated in sub-Saharan Africa.

Disease played a critically important role in the success of European colonialism. After 1492, Europeans carried diphtheria, influenza, measles, mumps, scarlet fever, smallpox, tertian malaria, typhoid, typhus, and yellow fever to the New World, reducing the indigenous population by 50 to 90 percent. Measles killed one-fifth of Hawaii's people during the 1850s and a similar proportion of Fiji's indigenous population in the 1870s. Influenza, measles, smallpox, and whooping cough reduced the Maori population of New Zealand from about 100,000 in 1840 to 40,000 in 1860.

Fear of contagious diseases assisted nativists in the United States in their efforts to restrict foreign immigration. The 1890s was a decade of massive immigration from Eastern Europe. When 200 cases of typhus appeared among Russian Jewish immigrants who had arrived in New York on a French steamship in 1892, public health authorities acted swiftly. They detained the 1,200 Russian Jewish immigrants who had arrived on the ship and placed them in quarantine to keep the epidemic from spreading. The chairman of the U.S. Senate Committee on Immigration subsequently proposed legislation severely restricting immigration, including the imposition of a literacy requirement.

Fear that immigrants carried disease mounted with news of an approaching cholera pandemic. The epidemic, which had begun in India in 1881, did not subside until 1896, when it had spread across the Far East, Middle East, Russia, Germany, Africa, and the Americas. More than 300,000 people died of cholera in famine-stricken Russia alone.

To prevent the disease from entering the United States, the port of New York in 1892 imposed a twenty-day quarantine on all immigrant passengers who traveled in steerage. This measure, which did not apply to cabin-class passengers, was designed to halt foreign immigration, since few steamship companies could afford to pay $5,000 a day in port fees. Other cities, including Boston, Chicago, Cleveland, and Detroit, imposed quarantines on immigrants arriving at local railroad stations. In 1893, Congress adopted the Rayner-Harris National Quarantine Act, which set up procedures for the medical inspection of immigrants and permitted the president to suspend immigration on a temporary basis.

A fear that impoverished immigrants would carry disease into the United States recurred throughout the twentieth century. In 1900, after bubonic plague appeared in San Francisco's Chinatown, public health officials quarantined the city's Chinese residents. In 1924, a pneumonia outbreak resulted in the quarantining of Mexican American immigrants. After Haitian immigrants were deemed to be at high risk of AIDS during the 1980s, they were placed under close scrutiny by immigration officials.

The Changing Face of the United States

Today, immigration to the United States is at its highest level since the early twentieth century. Some 10 million legal and undocumented immigrants entered the country during the 1980s, exceeding the previous high of nine million between 1900 and 1910.

Shaped by an unprecedented wave of immigrants from Latin America, Asia, and Africa, the face of the United States has changed in the space of twenty years. In 1996, nearly one in ten U.S. residents was born in another country, twice as many as in 1970. Since 1965, when the United States ended strict national immigration quotas, the number of Hispanics in the United States has tripled and the number of Asians has increased nearly eight-fold.

As recently as the 1950s, two-thirds of all immigrants to the United States came from Europe or Canada. Today, more than 80 percent are Latin American or Asian. The chief sources of immigrants are Mexico, the Philippines, China, Cuba, and India. Nearly half the foreign-born population is Hispanic; a fifth is Asian; a twelfth is black.

As a result of massive immigration, the United States is becoming the first truly multi-racial advanced industrial society in which every resident will be a member of a minority group. California recently became the first state in which no single ethnic group or race makes up half of the population.

Immigration's impact has been geographically uneven, concentrated in distinct parts of the country. Immigrants have been attracted to areas of high growth and high rates of historic immigration. Immigrants are particularly attracted to areas where their countrymen already are. Three-quarters of all immigrants live in six states--California, Florida, Illinois, New Jersey, New York, and Texas--and more than half of these migrants settled in just eight metropolitan areas.

Meanwhile, the native-born population is itself on the move. For every ten immigrants who arrive in the nation's largest cities, nine native-born inhabitants leave for a residence elsewhere. Because most of those leaving metropolitan areas are non-Hispanic whites, the United States population has grown more geographically divided even as it becomes more ethnically diverse.

As in the past, the South remains the region with the fewest foreign immigrants. But it is experiencing a rapid influx of native-born migrants--black and white. Reversing the great northward migration of the early and mid-twentieth century, a significant number of African Americans are abandoning northern industrial cities and are returning to big cities in the South.

Work has always been the great magnet attracting migrants to the United States. Historically, immigrants tackled jobs that native-born Americans avoided, such as digging canals, building railroads, or working in steel mills and garment factories. Today, some immigrants help meet the need for highly skilled professionals, while other, less-educated immigrants find employment as maids, janitors, farm workers, and poorly paid, non-unionized employees.

Each wave of immigrants has also sparked a wave of anti-immigrant sentiment. Since the first wave of mass immigration from Germany and Ireland in the 1840s, nativists have expressed fear that immigrants depress wages, displace workers, and threaten the nation's cultural values and security.

Even though the United States conceives of itself as a refuge for the poor and tempest-tossed, it is also a society that has experienced periodic episodes of intense anti-immigrant fervor, particularly in times of economic and political uncertainty. Unlike nineteenth-century nativists who charged that Catholic immigrants were subservient to a foreign leader, the Pope, or later xenophobes who accused immigrants of carrying subversive ideologies, today's immigration critics are more concerned about immigration's effects on the country's economic well-being.

Many fear that newcomers use services like welfare or unemployment benefits more frequently than natives do. Others argue that the new wave of immigrants is less skilled than its predecessors and is therefore more likely to become a burden on the government. Many worry that the society is being split into separate and unequal societies divided by skin color, ethnic background, language, and culture. Fear that immigrants are attracted to the United States by welfare benefits led Congress in 1996 to restrict non-citizens' access to social services.

Census data present a complicated picture of today's immigrants. They show that many immigrants are better educated than the native born, while others are less educated. Today, about 12 percent of immigrants over the age of 25 have graduate degrees, compared with 8 percent of the native born. Yet 36 percent have not graduated from high school, compared with 17 percent of the native born. Some immigrants have found employment as highly skilled engineers, mathematicians, and scientists, but about a third of immigrants live in poverty. On average, immigrants earn about $8,000 a year, compared with a native-born average of nearly $16,000.

Yet if some Americans express anxiety about immigration, others are hopeful that increasing diversity will teach Americans to tolerate and even cherish the extraordinary variety of their country's people.

Migration Today

At the end of the twentieth century, long-distance migration increasingly involves the movement of people from Third World countries to advanced industrial countries. Contributing to this flow are a growing income gap between richer and poorer countries, Third World populations that are growing faster than their economies, political conflicts that create large numbers of refugees, and improved means of communication and transportation, which alert migrants to opportunities in affluent countries and make it easier to reach them.

Perhaps the most important factor stimulating global migration in the late twentieth century is the advanced countries' need for workers to perform low wage jobs that their own citizens are unwilling to take. A heightened demand for low-wage laborers from underdeveloped areas arose at mid-century. During World War II, the United States instituted the bracero program to bring migrant farmworkers from Mexico. After the war, many Western European countries brought in guestworkers to work in construction, manufacturing, and service occupations. Many of these guestworkers came from North Africa and Eastern and Southern Europe and former European colonies.

During the prolonged period of economic stagnation and inflation that began with the oil price hikes of 1973, immigration became an increasingly contentious political issue. Many European countries encouraged guestworkers to return to their homeland. Across the western world, societies debated whether to restrict immigration.

Due to advances in communications, including the spread of cable television, the development of videocassettes, and the declining cost of long-distance telephone service, migrants are able to maintain contact with their native culture to a much greater degree than in the past.

Evaluating the Economic Costs and Benefits of Immigration

Few subjects arouse more controversy than the economic impact of immigration. Some argue that migration benefits societies economically by providing a pool of young, energetic, reliable workers. Others argue, in contrast, that immigration overloads the labor force, overburdens social services, and overwhelms society's capacity to absorb and assimilate newcomers.

Critics of immigration make several economic arguments. They contend that immigrants take jobs away from low-skilled native-born workers and depress wages. They maintain that immigrants make greater use of public services such as welfare, health services, and public education than do the native born. They also argue that the immigrants currently arriving in countries such as Canada, France, Germany, and the United States are less skilled and less educated than those who came in the past, and that they are more cut off from mainstream culture than earlier arrivals. Such critics argue that restricting immigration would open up job opportunities for many native-born minority workers and reduce tax burdens.

Proponents of immigration respond to such arguments in several ways. For one thing, they argue that in evaluating the costs and benefits of immigration, it is important to recognize the ways that immigrants contribute to living standards, particularly for the middle class. Although low-wage immigrant workers are often blamed for unemployment and depressed wages, in fact they make it cheaper to buy many goods and services--everything from fresh fruit and vegetables to clothing, construction, and childcare. As birth rates fall, immigrants assume many necessary but less desirable jobs, picking crops, washing dishes in restaurants, laundering clothes, staffing hospitals, and running small shops.

Undocumented (or "illegal") immigrants, they maintain, are overwhelmingly employed in sectors of the economy that pay low wages and offer little job security and few or no benefits. Because of the lack of opportunities for advancement, few native-born workers are attracted to these jobs.

Proponents of immigration further argue that immigrants are not simply producers, but consumers as well, who create demand that helps invigorate an economy. They create markets for housing, clothing, and other products and services. In general, immigrants are attracted to areas of high economic growth and labor shortages; as a result, proponents note, immigration has little or no effect on wages or the unemployment rate.

Migration as a Key Theme in U.S. and World History

The massive movement of peoples as a result of voluntary choice, forced removal, and economic and cultural dislocation has been one of the most important forces for social change over the past 500 years. Changes produced by migration--such as urbanization or expansion into frontier regions--transformed the face of the modern world. Migration has also played a pivotal role in the formation of modern American culture. Our most cherished values as well as our art, literature, music, technology, and cultural beliefs and practices have been shaped by an intricate process of cultural contact and interaction. Because ours is a nation of immigrants, drawn from every part of the world, the study of migration provides a way to recognize and celebrate the richness of our population's ancestral cultures.

Historical understanding, however, demands more than a solid grasp of historical events and personalities. It also requires students to understand basic historical and sociological terms and concepts. The following skills-building exercises are designed to increase understanding of the complexity of migration.

The Language of Migration: Key Concepts

Demographers, historians, and sociologists have developed a technical vocabulary that is useful in understanding the nature, varieties, and results of migration.

• Career Migration: The movement of people or households in response to occupational opportunities in business enterprises, government bureaucracies, or the military.

• Chain Migration: The movement of clusters of individuals from a common place of origin to another place. The earlier migrants provide later migrants with aid and information.

• Circular Migration: A well-defined pattern of migration, such as seasonal work or grazing of livestock or sending children temporarily into domestic service in another family's home, in which migrants return to their place of origin.

• Diaspora: The dispersion abroad of a group of people.

• Emigration: The departure of people from their homeland to take up residence in a new place.

• Forced Migration: Migration that takes place when the migrant has no choice about whether or not to move.

• Global Migration: Human movement across continents.

• Immigration: The movement into a country of which one is not a native.

• Impelled Migration: Migration that takes place under great economic, political, or social pressures.

• Internal Migration: The movement of people from one part of a country or region to another.

• Local Migration: Migration within a narrow geographical area, often within a single labor market or agricultural market.

• Long-Distance Movements: The movement of people from one country or region to another.

• Repatriation: The return of migrants or displaced persons to their place of origin or citizenship.

• Repeat Migration: Individuals who repeatedly migrate from and return to their place of origin.

• Seasonal Migration: Migration at a particular time of the year.

Kinds of Migrants

When we think of migrants, one image quickly comes to mind: people who permanently depart their place of birth and travel hundreds and even thousands of miles to make a new home. But this kind of migration represents only one of many forms of migration.

Migration may be voluntary or involuntary. Involuntary migrants are those people who are forced to move--by organized persecution or government pressure. Migration may be temporary or permanent. Approximately a third of the European immigrants who arrived in the United States between 1820 and World War I eventually returned to live in their country of origin. These "birds of passage," as they are known, often returned to the United States several times before permanently settling in their homeland.

Migration may also be short distance or long distance. Short-distance migrants might move from a rural community to a nearby urban area or from a smaller city to a larger one. Migration may be cyclical and repetitive, like the rhythmic migrations of nomadic livestock herders or present day farm laborers. Or it may be tied to a particular stage in the life cycle, like the decision of an adolescent to leave home to go to college.

The motives behind immigration may also vary widely. Migration may occur in reaction to poverty, unemployment, overcrowding, persecution, or dislocation. It may also arise in response to employment opportunities or the prospects of religious or political freedom. In distinguishing between different kinds of migration, it is important to look at:

--The distance traveled;

--The causes of migration;

--Whether migration is temporary, semi-permanent, or permanent;

--Whether the migration is voluntary, involuntary, or the result of pressure.

• "Coolie" Laborer: A pejorative term referring to a contract manual laborer, usually from South or East Asia.

• Displaced Person: A person displaced from a place of residence by war, political strife, or natural catastrophe.

• Nomad: A member of a people who have no fixed place of residence, usually a migratory pastoral people.

• Pioneer: An individual who helps to open up a frontier region to settlement.

• Refugee: A person who flees to a foreign country to escape danger or persecution.

• Slave: A person who is held in servitude as another person's chattel property.

• Sojourner: A temporary resident.

The Stages of Migration

Migration usually involves a series of distinct steps or stages. These include:

• The stimuli that lead individuals to migrate

• Preparations to move

• Departure

• The transit to a new environment

• Arrival

• Acclimation to a new location

• Reception of immigrants into the new environment

• Establishment of a new identity

The Language of Cultural Mixture and Persistence

The study of migration encourages us to think about the process of cultural adjustment and adaptation that takes place after migrants move from one environment to another. In the early twentieth century, Americans commonly thought of migration in terms of a "melting pot," in which immigrants shed their native culture and assimilated into the dominant culture. Today, we are more likely to speak of the persistence and blending of cultural values and practices.

• Assimilation: Absorption into the cultural tradition of another group.

• Creolization: Cultural patterns and practices that reflect a mixture of cultural influences. In terms of language, creolization refers to the way that a subordinate group incorporates elements of a dominant group's language, simplifying grammar and mixing each group's vocabulary.

• Fusion: The melding together of various cultural practices.

• Hybridization: A fusion of diverse cultures or traditions.

• Redefinition: To alter the meaning of an existing cultural practice, tradition, or concept.

• Survival: The persistence of an earlier cultural practice in a new setting.

• Syncretization: The way that a group of people adapts to a changing social environment by selectively incorporating the beliefs or practices of a dominant group.

Music and Migration

As they traveled from one environment to another, immigrant groups carried their musical traditions with them. These musical traditions included ceremonial music, folk music, work songs, dance music, instrumental music, and popular songs as well as distinctive forms of musical instrumentation.

Slaves in the New World, for example, created the banjo, modeled on earlier African musical instruments. West Africa had one of the most complex rhythmic cultures in the world, and in developing musical forms in the New World, African Americans made extensive use of rhythmic syncopation. This musical term refers to temporarily breaking the regular beat in a piece of music by stressing the weak beat and singing and embellishing around the beat. African Americans drew upon these earlier traditions to create music as diverse as the Spiritual (which blended together rhythmic and melodic gestures drawn from African music with white church music), Ragtime (which combined a syncopated melodic line with rhythms drawn from marches), the Blues (songs of lamentation which made extensive use of ambiguities of pitch), and Jazz (an amalgam of blues, ragtime, and Broadway musical forms).

Migration resulted in the creation of new musical hybrids, styles, and genres. The polka, a popular dance of the mid-19th century, represented an American adaptation of German tradition.

One of the earliest forms of commercial popular music was the minstrel song, which accompanied a popular form of 19th-century theatrical entertainment. Minstrel songs represented an adaptation of earlier ethnic and popular musical traditions. The minstrel song--typified by Stephen Foster's Old Folks at Home and his Camptown Races--drew its rhythm partly from the polka and its spicy syncopation from African and African American music.

In Latin America, the music of various ethnic groups blended together to form musical and dance forms that would become recognizable worldwide. Spanish, African, and various Indian musical traditions combined in intricate ways to form the tango, the cha-cha, the mambo, the rumba, and reggae.

The 1920s

The 1920s - An Overview

In 1931, a journalist named Frederick Lewis Allen published a volume of popular history that did more to shape the popular image of the 1920s than any book ever written by a professional historian. Entitled Only Yesterday, it depicted the 1920s as a cynical, hedonistic interlude between the Great War and the Great Depression, a decade of dissipation, jazz bands, raccoon coats, and bathtub gin. Allen argued that World War I shattered Americans' faith in reform and moral crusades, leading the younger generation to rebel against traditional taboos while their elders engaged in an orgy of consumption and speculation.

The popular image of the 1920s as a decade of prosperity and riotous living, of bootleggers and gangsters, flappers and hot jazz, flagpole sitters and marathon dancers, is indelibly etched in the American psyche. But this image is also profoundly misleading. The 1920s was a decade of deep cultural conflict. Unlike the pre-Civil War decades, when the fundamental conflicts in American society involved geographic region, or the Gilded Age, when conflicts centered on ethnicity and social class, the conflicts of the 1920s were primarily cultural, pitting a more cosmopolitan, modernist urban culture against a more provincial, traditionalist, rural culture.

The decade witnessed a titanic struggle between an old and a new America. Immigration, race, alcohol, evolution, gender politics, and sexual morality all became major cultural battlefields during the '20s. Wets battled drys, religious modernists battled religious fundamentalists, and urban ethnics battled the Ku Klux Klan.

The 1920s was a decade of profound social change. The most obvious signs of change were the rise of a consumer-oriented economy and of mass entertainment, which helped to bring about a "revolution in morals and manners." Sexual mores, gender roles, hair styles, and dress all changed profoundly during the 1920s. Many Americans regarded these changes as a liberation from the country's Victorian past. But for others, morals seemed to be decaying and the United States seemed to be changing in undesirable ways. The result was a thinly veiled "cultural civil war."

The Postwar Red Scare

On May 1, 1919--May Day--postal officials discovered 20 bombs in the mail of prominent capitalists, including John D. Rockefeller and J.P. Morgan, Jr., as well as government officials like Supreme Court Justice Oliver Wendell Holmes. A month later, bombs exploded in eight American cities.

On September 16, 1920, a bomb left in a parked horse-drawn wagon exploded near Wall Street in Manhattan's financial district, killing 30 people and injuring hundreds. The suspicion was that the bomb was the work of alien radicals. Authorities came up with a list of suspects and even questioned the man who had recently reshod the wagon's horse. But despite the offer of an $80,000 reward, no one was charged with the crime.

The end of World War I was accompanied by a panic over political radicalism. Fear of bombs, Communism, and labor unrest produced a "Red Scare." In Hammond, Ind., a jury took two minutes to acquit the killer of an immigrant who had yelled "To Hell with the United States." At a victory pageant in Washington, D.C., a sailor shot a man who refused to stand during the playing of the Star-Spangled Banner, while the crowd clapped and cheered. A clerk in a Waterbury, Conn. clothing store was sentenced to jail for six months for remarking to a customer that the Russian revolutionary Lenin was "the brainiest" or "one of the brainiest" world leaders.

In November 1919, in the Washington State lumber town of Centralia, American Legionnaires attacked an office of the Industrial Workers of the World (IWW). Four attackers died in a gunfight before townspeople overpowered the IWW members and took them to jail. A mob broke into the jail, seized one of the IWW members, and hanged him from a railroad bridge. Federal officials subsequently prosecuted 165 IWW leaders, who received sentences of up to 25 years in prison.

Congress and state legislatures joined in the attack on radicalism. In May 1919, the House refused to seat Victor Berger, a Socialist from Milwaukee, after he was convicted of sedition. The House again denied him his seat following a special election in December 1919. Not until he was reelected again in 1922, after the government dropped the sedition charges, did Congress finally seat him. In 1920, the New York State Legislature expelled five members. They were told that they had been elected on a platform "absolutely inimical to the best interests" of New York State.

In 1919 and 1920, President Wilson's Attorney General A. Mitchell Palmer led raids on leftist organizations such as the Communist party and the radical labor union the Industrial Workers of the World. Palmer hoped to use the issue of radicalism as a way to become president in 1920. He created the precursor to the Federal Bureau of Investigation, which collected the names of thousands of known or suspected Communists.

In November 1919, Palmer ordered government raids that resulted in the arrests of 250 suspected radicals in eleven cities. The Palmer Raids reached their height on January 2, 1920, when government agents made raids in 33 cities. Nationwide, more than 4,000 alleged Communists were arrested and jailed without bond. Some 556 aliens were deported, including the radical orator Emma Goldman.

Palmer claimed to be ridding the country of the "moral perverts and hysterical neurasthenic women who abound in communism." But his tactics alienated many, who viewed them as a violation of civil liberties.

Postwar Labor Tensions

The years following the end of World War I were a period of deep social tensions, aggravated by high wartime inflation. Food prices more than doubled between 1915 and 1920; clothing costs more than tripled. A steel strike that began in Chicago in 1919 became much more than a simple dispute between labor and management. The Steel Strike of 1919 became the focal point for profound social anxieties, especially fears of Bolshevism.

Organized labor had grown in strength during the course of the war. Many unions won recognition and the 12-hour workday was abolished. An 8-hour day was instituted on war contract work, and by 1919 half the country's workers had a 48-hour work week.

The war's end, however, was accompanied by labor turmoil, as labor demanded union recognition, shorter hours, and raises exceeding the inflation rate. Over 4 million workers--one fifth of the nation's workforce--participated in strikes in 1919, including 365,000 steelworkers and 400,000 miners. The number of striking workers would not be matched until the Depression year of 1937.

The year began with a general strike in Seattle. Police officers in Boston went on strike, touching off several days of rioting and crime. But the most tumultuous strike took place in the steel industry, where some 350,000 steelworkers in 24 separate craft unions walked out as part of a drive by the American Federation of Labor to unionize the industry. From management's perspective, the steel strike represented the handiwork of radicals and professional labor agitators. The steel industry's leaders regarded the strike as a radical conspiracy to get the company to pay a 12-hour wage for eight hours' work. At a time when Communists were seizing power in Hungary and staging a revolt in Germany, and workers in Italy were seizing factories, some industrialists feared that the steel strike was the first step toward overturning the industrial system.

The strike ended with the complete defeat of the unions. From labor's perspective, the corporations had triumphed through espionage, blacklists, and the denial of freedom of speech and assembly and through the complete unwillingness to recognize the right of collective bargaining with the workers' representatives.

During the 1920s, many of labor's gains during World War I and the Progressive era were rolled back. Membership in labor unions fell from 5 million to 3 million. The U.S. Supreme Court outlawed picketing, overturned national child labor laws, and abolished minimum wage laws for women.

Prohibition

At midnight, January 16, 1920, the United States went dry. Breweries, distilleries, and saloons were forced to close their doors.

Led by the Anti-Saloon League and the Women's Christian Temperance Union, the dry forces had triumphed by linking Prohibition to a variety of Progressive era social causes. Proponents of Prohibition included many women reformers who were concerned about alcohol's link to wife beating and child abuse and industrialists such as Henry Ford who were concerned about the impact of drinking on labor productivity. Advocates of Prohibition argued that outlawing drinking would eliminate corruption, end machine politics, and help Americanize immigrants.

Even before the 18th Amendment was ratified, about 65 percent of the country already banned alcohol. In 1916, seven states adopted anti-liquor laws, bringing to 19 the number of states prohibiting the manufacture and sale of alcoholic beverages. America's entry into World War I made Prohibition seem patriotic, since many breweries were owned by German Americans. Wayne Wheeler, lobbyist for the Anti-Saloon League, urged the federal government to investigate "a number of breweries around the country which are owned in part by alien enemies." In December 1917, Congress passed the 18th Amendment. A month later, President Woodrow Wilson instituted partial prohibition to conserve grain for the war effort. Beer was limited to 2.75 percent alcohol content and production was held to 70 percent of the previous year's production. In September, the president issued a ban on the wartime production of beer.

National Prohibition was defended as a war measure. The amendment's proponents argued that grain should be made into bread for fighting men and not for liquor. Anti-German sentiment aided Prohibition's approval. The Anti-Saloon League called Milwaukee's brewers "the worst of all our German enemies," and dubbed their beer "Kaiser brew."

The brewing industry argued, unsuccessfully, that taxes on liquor were contributing more to the war effort than were liberty bonds. Yet even after Prohibition was enacted, many ethnic Americans viewed beer or wine drinking as an integral part of their culture, not as a vice.

The wording of the 18th Amendment banned the manufacture and sale (but not the possession, consumption, or transportation) of "intoxicating liquors." Many brewers hoped that the ban would not apply to beer and wine. But Congress was controlled by the drys, who advocated a complete ban on alcohol. A year after ratification, Congress enacted the Volstead Act, which defined intoxicating beverages as anything with more than 0.5 percent alcohol. This meant that beer and wine, as well as whiskey and gin, were barred from being legally sold.

Advocates did not believe it would be necessary to establish a large administrative apparatus to enforce the law. The federal government never had more than 2,500 agents enforcing the law. A few states did try to help out. Indiana banned the sale of cocktail shakers and hip flasks. Vermont required drunks to identify the source of their alcohol. The original Congressional appropriation for enforcement was $5 million. Several years later, the government estimated enforcement would cost $300 million.

Enforcing the law proved almost impossible. Smuggling and bootlegging were widespread. Two New York agents, Izzy Einstein and Moe Smith, relied on disguises while staging their raids, once posing as man and wife. But after a raid on New York City's 21 that trapped some of the city's leading citizens, their efforts were halted. In New York, 7,000 arrests for liquor law violations resulted in 17 convictions.

Enforcement of Prohibition was originally assigned to the Internal Revenue Service, which is why the enforcement agents who destroyed moonshine stills were called revenuers. Only in 1930 was enforcement transferred to the Justice Department. After Prohibition, tax collection on liquor was returned to the IRS, which was also charged with the registration of machine guns and sawed-off shotguns and enforcement of taxes on tobacco. These responsibilities were spun off in 1972 to the Bureau of Alcohol, Tobacco, and Firearms.

Prohibition failed because it was unenforceable. By 1925, half a dozen states, including New York, passed laws banning local police from investigating violations. Prohibition had little support in the cities of the Northeast and Midwest.

Prohibition did briefly pay some public health dividends. The death rate from alcoholism was cut by 80 percent by 1921 from pre-war levels, while alcohol-related crime dropped markedly. But within seven years of Prohibition going into effect, total deaths from adulterated liquor reached approximately 50,000, along with many more cases of blindness and paralysis. According to one story, a potential buyer who sent a liquor sample to a laboratory for analysis was shocked when a chemist replied: "Your horse has diabetes."

Prohibition quickly produced bootleggers, speakeasies, moonshine, bathtub gin, and rum runners smuggling supplies of alcohol across state lines. In 1927, there were an estimated 30,000 illegal speakeasies, twice the number of legal bars before Prohibition. Many people made beer and wine at home. Finding a doctor to sign a prescription for medicinal whiskey, which was sold at drugstores, was relatively easy.

Cleveland had 1,200 legal bars in 1919, a year before Prohibition went into effect. By 1923, the city had an estimated 3,000 illegal speakeasies, along with 10,000 stills. An estimated 30,000 city residents sold liquor during Prohibition and another 100,000 made home brew or bathtub gin for themselves and friends.

Prohibition also fostered corruption and contempt for law and law enforcement among large segments of the population. Harry Daugherty, attorney general under Warren Harding, accepted bribes from bootleggers. George Remus, a Cincinnati bootlegger, had a thousand salesmen on his payroll, many of them police officers. He estimated that half his receipts went as bribes. Al Capone's Chicago organization reportedly took in $60 million in 1927 and had half the city's police on its payroll.

Popular culture glamorized bootleggers like Chicago's Capone, who served as the model for the central characters in such films as Little Caesar and Scarface. In rural areas, moonshiners became folk heroes. The fashion of the flapper, dancing the Charleston in a short skirt, was incomplete without a hip flask.

With a huge consumer market left unmet by legitimate business, organized crime filled the vacuum created by the closure of the legal alcohol industry. Homicides increased in many cities, partly as a result of gang wars but also because of an increase in drunkenness.

Prohibition devastated the nation's brewing industry. St. Louis had 22 breweries before Prohibition. When it ended in 1933, only nine reopened. Anheuser-Busch made it through Prohibition by making ice cream, near beer, corn syrup, ginger ale, root beer, yeast, malt extract, refrigerated cabinets, and automobile and truck bodies.

When the country entered the Great Depression, the jobs and tax revenue that a legal liquor industry would generate looked attractive. During his presidential campaign in 1932, New York Governor Franklin D. Roosevelt, who never hid his fondness for martinis, called for Prohibition's repeal.

The noble experiment ended at 3:32 p.m., December 5, 1933, when Utah became the 36th state to ratify the 21st Amendment, repealing Prohibition. By then, even some proponents admitted that the 18th Amendment had resulted in "evil consequences." The Rev. Sam Small, an evangelist and temperance advocate, said that Prohibition had created "an orgy of lawlessness and official corruption." John D. Rockefeller, a teetotaler, observed in 1932, "drinking has generally increased, the speakeasy has replaced the saloon; a vast army of lawbreakers has been recruited and financed on a colossal scale."

Even today, debate about the impact of Prohibition rages. Critics argue that the amendment failed to eliminate drinking, made drinking more popular among the young, spawned organized crime and disrespect for law, encouraged solitary drinking, and led beer drinkers to hard liquor and cocktails. (One wit joked that "Prohibition succeeded in replacing good beer with bad gin.") The lesson these critics draw is that it is counterproductive to try to legislate morality.

Their opponents argue that alcohol consumption declined dramatically during Prohibition, probably by 30 to 50 percent. Deaths from cirrhosis of the liver among men fell from 29.5 per 100,000 in 1911 to 10.7 in 1929.

Was Prohibition a "noble experiment" or a misguided effort to use government to shape morality? Even today, the answer is not entirely clear. Alcohol remains a serious cause of death and disability, and it was not until the 1960s that alcohol consumption returned to its pre-Prohibition levels. Today, alcohol is linked each year to more than 23,000 motor vehicle deaths and more than half the nation's homicides, and it is closely linked to domestic violence.

Race

Some of the most vicious racial violence in American history took place between 1917 and 1923, partly due to dramatic shifts in the demography of race. Black workers who had been historically confined to the South had begun to move north and compete with whites for factory jobs--often as strikebreakers, which was the only way many could get hired. In addition, black veterans returned from World War I insisting on the civil rights that they had fought for in Europe. In Chicago; Longview, Tex.; Omaha, Neb.; Rosewood, Fla.; Tulsa, Okla.; and Washington, D.C., white mobs burned and killed in black neighborhoods.

In Tulsa 40 city blocks were leveled, and 23 African American churches and a thousand homes and businesses were destroyed. In 1921, Tulsa, which was about 12 percent black, had the Southwest's most prosperous African American business community, which Booker T. Washington had called "the black Wall Street."

The incident that led to the violence was the arrest of a 19-year-old African-American bootblack for supposedly assaulting a white teenaged elevator operator. Police later concluded that the young man had stumbled into the woman as he was getting off the elevator. An inflammatory newspaper article that helped touch off the violence was headlined "To Lynch Negro Tonight."

The death toll is in dispute. A government report said 26 blacks and 10 whites had died and another 317 were injured. A recent scholarly study concluded that black deaths approached one hundred and may have been much higher.

Another incident of racial violence took place on New Year's Day in 1923, in the tiny black settlement of Rosewood, Fla. A white mob, which came from as far away as Georgia and was purportedly searching for an alleged rapist, burned the town of 150. Only one structure, a house owned by the community's only white resident, was not destroyed. Newspaper accounts differ on the total number of people killed; one lists seven, another 21. One Rosewood resident, a blacksmith, was hanged.

Lacking hard evidence, historians have had to rely on oral history. One man, who was eleven at the time of the attack, recalled his father's reports of the violence. He described a black man who was forced to dig his own grave, then shot and shoved into it; a man who was hanged from a tree in his front yard after he told a posse that he could not lead them to the alleged rapist; and a pregnant woman who was shot as she tried to crawl under her porch for protection. In 1994, the state of Florida paid $2.1 million in reparations.

The Great Migration

The racial composition of the nation's cities underwent a decisive change during and after World War I. In 1910, three out of every four black Americans lived on farms, and nine out of ten lived in the South. World War I changed that profile. Hoping to escape tenant farming, sharecropping, and peonage, 1.5 million southern blacks moved to cities in the late teens and 1920s. During the 1910s and '20s, Chicago's black population grew by 148 percent, Cleveland's by 307 percent, and Detroit's by 611 percent.

During this massive movement of people, access to housing became a major source of friction between blacks and whites. To keep blacks out of predominantly white neighborhoods, many cities adopted residential segregation ordinances. After the Supreme Court declared municipal residential segregation ordinances unconstitutional in 1917, whites resorted to the restrictive covenant, a formal deed restriction binding white property owners in a given neighborhood not to sell to blacks. Whites who broke these agreements could be sued by "damaged" neighbors. Not until 1948 did the Supreme Court strike down restrictive covenants.

Confined to all-black neighborhoods, African Americans created cities-within-cities during the 1920s. The largest was Harlem, in upper Manhattan, where 200,000 African Americans lived in a neighborhood that had been virtually all-white fifteen years before.

African American Protests

In World War I, a higher proportion of black soldiers than white soldiers had lost their lives: 14.4 percent compared to 6.3 percent. Many African Americans believed that this sacrifice would be repaid when the war was over. In the words of one Texan, "our second emancipation will be the outcome of this war." It was not to be. The federal government denied black soldiers the right to participate in the victory march down Paris's Champs-Elysees boulevard--even though black troops from European colonies did. Ten African American soldiers were among the 70 blacks lynched in 1919. Twenty-five anti-black riots took place that year.

African Americans did not respond passively to these outrages. Even before the war, African Americans had stepped up protests against discrimination. The National Urban League, organized in 1911 by social workers, white philanthropists, and black leaders, concentrated on finding jobs for urban African Americans. The National Association for the Advancement of Colored People (NAACP) won important Supreme Court decisions against the grandfather clause (1915) and residential segregation ordinances (1917). The NAACP also fought school segregation in northern cities during the 1920s and lobbied hard, though unsuccessfully, for a federal anti-lynching bill.

No black leader was more successful in touching the aspirations and needs of the mass of African Americans than Marcus Garvey. A flamboyant and charismatic figure from Jamaica, Garvey rejected integration and preached racial pride and black self help. He declared that Jesus Christ and Mary were black and he exhorted his followers to glorify their African heritage and revel in the beauty of their black skin. "We have a beautiful history," he told his followers, "and we shall create another one in the future."

In 1917, Garvey moved to New York, where he organized the Universal Negro Improvement Association (UNIA), the first mass movement in African American history. By the mid-1920s, Garvey's organization had 700 branches in 38 states and the West Indies and published a newspaper with as many as 200,000 subscribers. The UNIA operated grocery stores, laundries, restaurants, printing plants, clothing factories, and a steamship line.

In the mid-1920s, Garvey was charged with mail fraud, jailed and finally deported. Still, the "Black Moses" left behind a rich legacy. At a time when magazines and newspapers overflowed with advertisements for hair straighteners and skin lightening cosmetics, Garvey's message of racial pride struck a responsive chord in many African Americans.

The Harlem Renaissance

The movement for black pride found its cultural expression in the Harlem Renaissance--the first self-conscious literary and artistic movement in African American history.

For over three decades, African Americans had shown increasing interest in black history and African American folk culture. As early as the 1890s, W.E.B. Du Bois, Harvard's first African American Ph.D., began to trace black culture in the United States to its African roots; Fisk University's Jubilee Singers introduced Negro spirituals to the general public; and the American Negro Academy, organized in 1897, promoted African American literature, arts, music, and history. A growing spirit of racial pride was evident, as a group of talented writers, including Charles Chesnutt, Paul Laurence Dunbar, and James Weldon Johnson, explored life in black communities; as the first Negro dolls appeared; and as all-Negro towns were founded in Whitesboro, New Jersey, and Allensworth, California.

Signs of growing racial consciousness proliferated during the 1910s. Fifty new black newspapers and magazines appeared in that decade, bringing the total to 500. The Associated Negro Press, the first national black press agency, was founded in 1919. In 1915, Carter Woodson, a Harvard Ph.D., founded the first permanent Negro historical association--the Association for the Study of Negro Life and History--and began publication of the Journal of Negro History.

During the 1920s, Harlem, in upper Manhattan, became the capital of black America, attracting black intellectuals and artists from across the country and the Caribbean as well. Soon, the Harlem Renaissance was in full bloom. The poet Countee Cullen eloquently expressed black artists' long-suppressed desire to have their voices heard: "Yet do I marvel at a curious thing: To make a poet black, and bid him sing!"

Many of the greatest works of the Harlem Renaissance sought to recover links with African and folk traditions. In "The Negro Speaks of Rivers," the poet Langston Hughes reaffirmed his ties to an African past: "I looked upon the Nile and raised the pyramids above it." In Cane (1923), Jean Toomer--the grandson of P.B.S. Pinchback, who served briefly as governor of Louisiana during Reconstruction--blended realism and mysticism, poetry and prose, to describe the world of the black peasantry in Georgia and in the ghetto of Washington, D.C. Zora Neale Hurston, a Columbia University trained anthropologist, incorporated rural black folklore and religious beliefs into her stories.

A fierce racial consciousness and a powerful sense of racial pride animated the literature of the Harlem Renaissance. The West Indian-born poet Claude McKay expressed the new spirit of defiance and protest with militant words: "If we must die--oh let us nobly die...dying, but fighting back!"

The Ku Klux Klan

After the Civil War, the Ku Klux Klan, led by former Confederate General Nathan Bedford Forrest, used terrorist tactics to intimidate former slaves. A new version of the Ku Klux Klan arose during the early 1920s, when immigration, fear of radicalism, and a revolution in morals and manners fanned anxiety in large parts of the country. Roman Catholics, Jews, African Americans, and foreigners were only the most obvious targets of the Klan's fearmongering. Bootleggers and divorcees were also targets.

Contributing to the Klan's growth were a postwar depression in agriculture, the African American migration into northern cities, and a swelling of religious bigotry and nativism in the years after World War I. Klan members considered themselves defenders of Prohibition, traditional morality, and true Americanism, and they directed their hostility against African Americans, Jews, Catholics, and immigrants.

In 1920, two Atlanta publicists took over an organization with 3,000 members and in three years built it into a national organization with three million members. Edward Clarke, a former Atlanta journalist, and Bessie Tyler, a former madam, had formed the Southern Publicity Association in 1917 to promote World War I fund drives. After the war, they built up membership in the Klan by giving Klansmen part of the $10 induction fee of every new member they signed up.

During the early 1920s, the Klan helped elect 16 U.S. Senators and many Representatives and local officials. By 1924, when the Klan had reached its peak in numbers and influence, it claimed to control 24 of the nation's 48 state legislatures. That year, it succeeded in blocking the nomination of Al Smith, a New York Catholic, at the Democratic National Convention.

The three million members of the Klan after World War I were quite open in their activities. Many were small-business owners, independent professionals, clerical workers, and farmers. Members marched in parades, patronized Klan merchants, and voted for Klan-endorsed political candidates. The Klan was particularly strong in the Deep South, Oklahoma, and Indiana. Historians once considered the Ku Klux Klan a group of marginal misfits, rural traditionalists unable to cope with the coming of a modern urban society. But recent scholarship shows that Klan members were a cross-section of native Protestants and that many were women and many came from urban areas.

The leader of Indiana's Klan was David Curtis Stephenson, a Texan who had worked as a printer's apprentice in Oklahoma before becoming a salesman in Indiana. Given control of the Klan in Indiana in 1922 and the right to organize in 20 other states, he soon became a millionaire from the sale of robes and hoods. A crowd estimated at 200,000 attended one Klan gathering in Kokomo, Ind. in 1923.

A public defender of Prohibition and womanhood, Stephenson was, in private, a heavy drinker and a womanizer. In 1925 he was tried, convicted, and sentenced to life in prison for kidnapping and sexually assaulting 28-year-old Madge Oberholtzer, who ran a state program to combat illiteracy. Stephenson's downfall, which was followed by the indictment and prosecution of many Klan-supported politicians on corruption charges, led members to abandon the organization in droves. Within a year, the number of Klansmen in Indiana fell from 350,000 to 15,000. By 1930, the Klan had just 45,000 members in the nation as a whole.

Sacco and Vanzetti

During the twentieth century, a number of trials have excited widespread public interest. One of the first causes célèbres was the case of Nicola Sacco, a 32-year-old shoemaker, and Bartolomeo Vanzetti, a 29-year-old fish peddler.

They were accused of double murder. A paymaster and a payroll guard had been shot to death during a payroll heist in Braintree, Mass., near Boston. About three weeks later, Sacco and Vanzetti were charged with the crime. Their trial aroused intense controversy because of a widespread belief that the evidence against the men was flimsy and that they were being prosecuted for their immigrant background and their radical political beliefs. Sacco and Vanzetti were Italian immigrants and avowed anarchists who advocated the violent overthrow of capitalism.

It was the height of the post-World War I Red Scare, and the atmosphere was seething with anxieties about Bolshevism, aliens, domestic bombings, and labor unrest. Revolutionary upheavals had been triggered by the war, and one-third of the U.S. population consisted of immigrants or the children of immigrants.

U.S. Attorney General A. Mitchell Palmer had ordered foreign radicals rounded up for deportation. Just three days before Sacco and Vanzetti were arrested, one of the people seized during the Palmer raids, an anarchist editor, had died after falling from a 14th floor window of the New York City Department of Justice office. The police, judge, jury, and newspapers were deeply concerned about labor unrest.

No witnesses had gotten a good look at the perpetrators of the murder and robbery. The witnesses described a shootout in the street and the robbers escaping in a Buick, scattering tacks to deter pursuers. Anti-immigrant, anti-radical sentiment led the police to focus on local anarchists.

Sacco and Vanzetti were followers of Luigi Galleani, a radical Italian anarchist who had instigated a wave of bombings against public officials just after World War I. An Italian anarchist had blown himself up while trying to plant a bomb at Attorney General Palmer's house.

Sacco and Vanzetti acted nervously and the arresting officer testified that they were reaching for weapons when they were apprehended. But neither man had a criminal record and a criminal gang had been carrying out a string of armed robberies in Massachusetts and Rhode Island.

Police linked Sacco's gun to the double murder, the only piece of physical evidence that connected the men to the crime. But the defense argued that the link was overstated.

In 1921, they were convicted in a trial that was marred by prejudice against Italians, immigrants, and radical beliefs. The evidence was ambiguous as to the pair's guilt or innocence, and the trial was a sham. The prosecution played heavily on the pair's radical beliefs. The men were kept in an iron cage during the trial. The jury foreman muttered unflattering stereotypes about Italians. In his instructions to the jury, the presiding judge urged the jury to remember their "true American citizenship."

The pair were electrocuted in 1927. As the guards adjusted his straps, Vanzetti said in broken English:

"I wish to tell you I am innocent and never connected with any crime... I wish to forgive some people for what they are now doing to me."

Their execution divided the nation and produced an uproar in Europe. Harvard Law Professor and later U.S. Supreme Court justice Felix Frankfurter condemned the prejudice of the presiding judge (who reportedly said in 1924, "Did you see what I did with those anarchistic bastards the other day?") and procedural errors during the trial. These errors included the prosecution's failure to disclose eyewitness evidence favorable to the defense. A commission that included the presidents of Harvard and MIT defended the trial's fairness.

Many historians now believe that Sacco was probably guilty and Vanzetti innocent, but that the evidence was insufficient to convict either one.

Immigration Restriction

Before World War I, American industry, steamship companies, and railroads promoted immigration and financed groups opposed to immigration restriction. While the United States did institute registration and literacy requirements for immigrants, opponents of restriction succeeded in blocking efforts to establish immigration quotas.

World War I revealed that the economy could function effectively without foreign immigration, and opposition to restriction withered away. Not only had World War I demonstrated that immigrants had become "Americanized," but with the establishment of new European nation-states, interest in European politics faded away. While some opponents of immigration argued that it threatened the nation's culture, most of the arguments advanced against immigration were economic. Among the chief proponents of immigration restriction were the unions of the American Federation of Labor. Organized labor feared that American workers' wages would decline if unskilled immigrant workers flooded the labor market. Meanwhile, many businessmen feared dangerous foreign radicals.

During the twenties, most ethnic groups agreed that the overall volume of immigration should be reduced. The only issue was how to distribute the immigration quotas. A compromise was easily reached: to make the quotas proportionate to the current population, so that future immigration would not change the balance of ethnic groups.

In 1924, Congress reduced the number of immigrants allowed into the United States each year to two percent of each nationality group counted in the 1890 census. It also barred Asians entirely.

Fundamentalism and Pentecostalism

Religion was a pivotal cultural battleground during the 1920s. The roots of this religious conflict were planted in the late nineteenth century. Before the Civil War, the Protestant denominations were united in a belief that the findings of science confirmed the teachings of religion. But during the 1870s a lasting division had occurred in American Protestantism over Charles Darwin's theory of evolution. Religious modernists argued that religion had to be accommodated to the teachings of science, while religious traditionalists sought to preserve the basic tenets of their religious faith.

As an organized movement, Fundamentalism can be said to have started with a set of twelve pamphlets entitled The Fundamentals: A Testimony, published between 1909 and 1912. Financed by two wealthy laymen, the pamphlets were to be sent free to "every pastor, evangelist, missionary, theological student, Sunday school superintendent, Y.M.C.A. and Y.W.C.A. secretary in the English speaking world." Eventually, some three million copies were distributed. The five fundamentals testified to in these volumes were the infallibility of the Bible literally interpreted and the actuality of the virgin birth, the atonement, the resurrection, and the second coming of Christ.

Another current in Protestant revivalism is known as Pentecostalism, a movement which began on New Year's Day in 1901, when a female student at Bethel Bible College in Topeka, Kans., began speaking in tongues, unintelligible speech that accompanies religious excitation. To many evangelicals, speaking in tongues was evidence of the descent of the Holy Spirit into a believer.

Pentecostals rejected the idea that the age of miracles had ended. During the 1920s, many Americans became aware of Pentecostalism as a result of the activities of charismatic faith healers who claimed to be able to cure the sick and allow the crippled to throw away their crutches. Pentecostalism spread particularly rapidly among lower-middle-class and poorer Protestants who sought a more spontaneous and emotional religious experience than that offered by the mainstream religious denominations. The most prominent of the early Pentecostal revivalists was Aimee Semple McPherson.

The Fundamentalist and Pentecostal movements arose in the early twentieth century as a backlash against modernism, secularism, and scientific teachings that contradicted their religious beliefs. Early fundamentalist doctrine attacked competing religions--especially Catholicism, which it portrayed as an agent of the Antichrist--and insisted on the literal truth of the Bible, a strict return to fundamental principles, and a thoroughgoing rejection of modernity.

Between 1921 and 1929, Fundamentalists introduced 37 anti-evolution bills into twenty state legislatures. The first to pass was a law in Tennessee.

During the summer of 1925, a high school teacher in Dayton, Tennessee, was tried for violating the prohibition on the teaching of evolution in tax-supported schools. The statute forbade the teaching in public schools of any scientific theory that denied the literalness of the Biblical account of creation. The legal issue that the Scopes case raised was the validity of a law that seemed to violate the constitutional separation of church and state.

The Scopes Trial

It is one of the best known trials in American history because it symbolizes the conflict between science and theology, faith and reason, individual liberty and majority rule. Publicized intensely at the time as a clash between urban sophistication and rural fundamentalism, the trial was further popularized by the 1955 play Inherit the Wind, which became a hit film in 1960 and cast the trial as a struggle of truth and freedom against repression and ignorance.

In the summer of 1925, a young schoolteacher named John Scopes stood trial in Dayton, Tenn., for violating the state law against the teaching of evolution. Two of the country's most famous attorneys faced off in the trial. William Jennings Bryan, 65 years old and a three-time Democratic presidential nominee, prosecuted; 67-year-old Clarence Darrow, a staunch agnostic who had defended Nathan Leopold and Richard Loeb the year before, represented the defense. Bryan declared that "the contest between evolution and Christianity is a duel to the death."

The five-year-old American Civil Liberties Union had taken out newspaper advertisements offering to defend anyone who flouted the Tennessee law. George Rappelyea, a Dayton, Tenn., booster, realized that the town would get enormous attention if a local teacher was arrested for teaching evolution. He enlisted John Scopes, a science teacher and football coach, who arranged to teach from George Hunter's Civic Biology, a high school textbook promoting Charles Darwin's arguments in The Descent of Man.

The trial was marked by hoopla and a carnival-like atmosphere. Thousands of visitors swelled the town of a thousand residents. For 12 days in July 1925, 100 reporters sent dispatches.

On the trial's seventh day, the defense team called Bryan to testify as an expert on the Bible, since the trial judge had prohibited the defense from using scientists as witnesses. Darrow subjected Bryan to a withering cross-examination. He got Bryan to say that Creation was not completed in a week, but over a period of time that "might have continued for millions of years."

The play Inherit the Wind would caricature Bryan as a Bible-thumping buffoon, but in actuality Bryan's position was complex. He opposed the mandated teaching of evolution in public schools because he thought the people should exercise local control over school curricula. He also opposed Darwin's theory of evolution by natural selection because these ideas had been used to defend laissez-faire capitalism on the grounds that a perfectly free market promotes the "survival of the fittest." As early as 1904, Bryan had denounced social Darwinism as "the merciless law by which the strong crowd out and kill off the weak." 

In addition, he opposed Darwinism as a justification for war and imperialism. In The Descent of Man, Darwin had argued that "at some future period, not very distant as measured by centuries, the civilized races of man will almost certainly exterminate and replace the savage races." The textbook that Scopes taught from, Civic Biology, identified five "races of man" ("Ethiopian, Malay, American Indian, and Mongolian") and "finally, the highest type of all, the Caucasians, represented by the civilized white inhabitants of Europe and America." Bryan was also unhappy with Darwin's assumption that the entire evolutionary process was purposeless and not the product of a larger design.

Not a Biblical literalist, Bryan was aware of serious scientific difficulties with Darwinism, such as Darwin's theory that slight, random variations were enough to generate life from non-life and to produce a vast array of biological species. But Bryan mistook the lack of consensus about the mechanisms that Darwin advanced to explain the evolutionary process for a lack of scientific support for the concept of evolution itself.

The day after this exchange, Darrow asked the jury to return a verdict of guilty so that the case could be appealed. Scopes was convicted and fined $100, but the conviction was thrown out on a technicality by the Tennessee Supreme Court (the judge, and not the jury, had set the $100 fine). Tennessee repealed its anti-evolution law in 1967, and the following year the U.S. Supreme Court struck down a similar Arkansas statute for violating the Constitution's prohibition against the establishment of religion.

Five days after the trial's conclusion, Bryan died of apoplexy. The journalist H.L. Mencken wrote of Bryan: "He came into life a hero, a Galahad, in bright and shining armor. He was passing out a poor mountebank." As for Scopes, he left teaching and became a chemical engineer in the oil industry. He died at age 70 in 1970.

The Scopes trial left two enduring conclusions: that legislatures should not restrain the freedom of scientific inquiry and that society should respect academic freedom.

Leopold and Loeb

It was the first "Crime of the Century." It took place in 1924. Two teenagers with every social advantage kidnapped, killed, and mutilated a 14-year-old neighbor.

Nathan Leopold and Richard Loeb came from highly privileged Chicago families. At 19, Leopold was already a University of Chicago graduate and spoke 14 languages. Eighteen-year-old Loeb was the youngest graduate in the history of the University of Michigan. Leopold would describe the two as evil geniuses who were above ordinary standards of morality.

Theirs was a thrill killing, which included various sexual perversions with their victim's body. They even mutilated the boy's genitals with acid. And yet they were not executed. Their defender, the attorney Clarence Darrow, introduced the psychiatric defense into the legal system. He claimed that the youths had been sexually abused by their governess and scarred by feelings of physical inferiority. He maintained that Leopold had been traumatized by his mother's death and that Loeb had been pushed into extreme academic overachievement. In addition, Leopold and Loeb had indulged in extreme sexual fantasies.

Before the 1920s, the dominant view of violent juveniles emphasized deficiency and deprivation. Juvenile killers were generally thought of as subnormal in intelligence, and the conventional view was that delinquents had been neglected by their families and deprived of education. But the Leopold and Loeb case challenged that view. The case was interpreted to mean that any parent could have raised these two youthful murderers. Said a prominent judge:

Let no parent flatter himself that the Leopold-Loeb case has no lesson for him....It is more than the story of a murder. It is the story of modern youth, of modern parents, of modern economic and social conditions, and of modern education.

Rather than blaming the young men's parents, the judge and the press accepted Clarence Darrow's argument that society, schools, and violent social conditions were largely to blame for the crime. Darrow also succeeded in putting the morality of the death penalty on trial. He acknowledged his clients' guilt and admonished the court to hate the sin but not the sinner. He succeeded in persuading the judge to sentence the two murderers to life imprisonment rather than death.

Politics During the 1920s

The expansion of government activities during World War I was reversed during the 1920s. Government efforts to break up trusts and regulate business practices gave way to a new emphasis on partnerships between government and business.

In 1920, an Ohio political operator named Harry Daugherty offered a prediction about what would happen at that year's Republican presidential nominating convention:

The convention will be deadlocked, and after the other candidates have gone their limit, some 12 or 15 men, worn out and bleary eyed for lack of sleep, will sit down about 2 o'clock in the morning around a table in a smoke-filled room in some hotel and decide the nomination. When the time comes, [Warren] Harding will be nominated.

Daugherty was right. In 1920, a divided Republican convention selected Harding, a U.S. Senator from Ohio, as its presidential nominee.

Harding's presidency is best known for a series of scandals that marred his time in office. But he also had some genuine accomplishments. He pardoned the imprisoned Socialist party leader, Eugene Debs, and persuaded the steel industry to end the 12-hour day and replace it with an 8-hour day. He also called an international disarmament conference in Washington which slowed down the arms race. At the end of the conference, a treaty was signed providing that for every five battleships the United States and Britain were each allowed to build, the Japanese could build three and the Italians and the French could build one-and-three-quarters each.

The son of a poor Ohio farmer, he spent two years at a rural academy that called itself Ohio Central College, and received a diploma at the age of 16. For several years, he taught school and sold insurance before buying a local newspaper. He guaranteed its success by mentioning every town resident in the paper at least twice a year. Harding described his editorial policy as "inoffensivism." He later entered Republican politics, rising from lieutenant governor to U.S. Senator before being nominated for the presidency.

Harding made few major pronouncements during the campaign. The Republican party followed an associate's advice: "Keep Warren at home. If he goes out on tour, somebody's sure to ask him questions, and Warren's just the sort of damned fool that will try to answer them." Harding largely confined his speeches to uncontroversial platitudes about the need to avoid moral crusades and return to "normalcy":

America's present need is not heroics but healing; not nostrums, but normalcy; not revolution, but restoration; not agitation, but adjustment; not surgery, but serenity....

Harding had few illusions about his qualifications for the presidency. "I am a man of limited talents from a small town," he said. He appointed a number of sleazy and corrupt officials to office, and his administration was marred by scandals involving bribes and kickbacks at the Justice Department and the Veterans Bureau. After his sudden death from a stroke in 1923, his administration's biggest scandal, known as Teapot Dome, was revealed. His Interior Secretary, Albert B. Fall, was sent to prison for accepting $360,000 in bribes for leasing U.S. naval oil reserves in Wyoming to private oil operators, ostensibly in exchange for the construction of above-ground petroleum storage facilities. Private oil companies were also draining oil from federal lands.

Harding's death made Vice President Calvin Coolidge president. Coolidge had come to national attention in 1919, when, as governor of Massachusetts, he broke the Boston police strike after declaring that "there is no right to strike against the public safety by anybody, anywhere, any time."

His well-deserved nickname was "Silent Cal." Some acquaintances wagered that they could make him say more than two words. His answer: "You lose." During the 1924 presidential race, reporters asked him whether he had any statement about the campaign. "No," he replied. He was then asked whether he had anything to say about the world situation. "No," he answered. Did he have anything to say about Prohibition? "No." Then he told the reporters, "Now remember, don't quote me." Asked at the end of his presidency whether he had a farewell message for the American people, he paused and said, "Good-bye."

Coolidge slept ten hours a night, napped every afternoon, and seldom worked more than four hours a day. He spoke out ardently on behalf of the nation's business culture. "The man who builds a factory builds a temple," said Coolidge. "The man who works there, worships there." The president was convinced that the formula for economic prosperity was simple: "The chief business of the American people is business." If government kept its hands off the economy, business would prosper.

Among the most notable acts of his presidency were vetoes of bills to assist farmers and to develop government power plants along the Tennessee River. The best-known accomplishment of his presidency was the Kellogg-Briand Pact, an international agreement outlawing the use of force to settle international disputes. Embodying the anti-war sentiment of the 1920s, the agreement lacked any method of enforcement.

The Democratic Convention of 1924

In 1924, Democratic prospects in the upcoming presidential election seemed promising. The Republican administration had been rocked by the Teapot Dome scandal, a holdover from the Harding years, which involved the secret leasing of the navy's oil reserves to private businesses.

But the Democratic Party was deeply divided, an uneasy coalition of diverse elements: Northerners and Southerners, Westerners and Easterners, Catholics and Jews and Protestants, conservative landowners and agrarian radicals, progressives and big-city machines, urban cosmopolitans and small-town traditionalists. On one side were defenders of the Ku Klux Klan, prohibition, and fundamentalism. On the other side were northeastern Catholics and Jewish immigrants and their children. A series of issues that bitterly divided the country during the early 1920s were on display at the 1924 Democratic Convention in New York, including prohibition and religious and racial tolerance. The Northeasterners wanted an explicit condemnation of the Ku Klux Klan, but the plank naming the Klan was defeated by a vote of 546.15 to 542.85.

The two leading candidates symbolized a deep cultural divide. Al Smith, New York's governor, was a Catholic and an opponent of prohibition and was bitterly opposed by Democrats in the South and West. Former Treasury Secretary William Gibbs McAdoo, a Protestant, defended prohibition and refused to repudiate the Ku Klux Klan, making himself unacceptable to Catholics and Jews in the Northeast.

Newspapers called the convention a "Klanbake," as pro-Klan and anti-Klan delegates wrangled bitterly over the party platform. The convention opened on a Monday and by Thursday night, after 61 ballots, the convention was deadlocked. The next day, July 4, 20,000 Klan supporters held a picnic in New Jersey, wearing white hoods and robes. One speaker denounced the "clownvention in Jew York." They threw baseballs at an effigy of Al Smith. The event culminated with a cross-burning.

Al Smith and William Gibbs McAdoo withdrew after the 99th ballot. On the 103rd ballot, the weary convention nominated John Davis of West Virginia, a former ambassador to Britain. The nomination proved worthless. Liberals deserted the Democrats and voted for Robert La Follette, a third party candidate. Apathy and disgust kept many home. Just half of those eligible went to the polls. The Democrat got 8 million votes. The Republican candidate, incumbent president Calvin Coolidge, 15 million.

The Election of 1928

At stake in the presidential election of 1928 was whether or not the United States could elect a Catholic president.

In 1928, the Democrats were resolved to avoid a repeat of the divisions of 1924. The convention was held in Houston. The party platform stood in favor of aid to farmers and workers, collective bargaining, abolition of labor injunctions, and stricter regulation of power companies. Al Smith was nominated for president; a southern advocate of prohibition was chosen as his running mate.

Al Smith, born in 1873, was an Irish Catholic from New York's Lower East Side. For almost a decade as governor of the nation's largest state, he had made New York a model for efforts to use government to improve the public's well-being. Under his leadership, New York limited women's work week to 48 hours and instituted the nation's first public housing program. He also established state parks and a system of public hospitals.

Smith doubled the Democratic vote of 1924. But it was not enough. He lost New York and much of the South. There was no single cause for the defeat. Economic prosperity and Prohibition played a part. But it appears that anti-Catholicism did him in. Ministers called New York Satan's seat; Smith was attacked as the candidate of the saloon, prostitution, and gambling. Smith refused to pretend that he was anything but what he was: a Catholic, who kept a picture of the Pope over his desk.

But Smith did make gains. He carried Massachusetts, the first Democrat to do so since the Civil War. He awakened a great army of immigrant voters in the big cities--Italian, Jewish, Polish, as well as Irish. And he helped shift the African American vote toward the Democrats.

Herbert Hoover

Smith was defeated by Herbert Hoover, a wealthy mining engineer who had served as Commerce Secretary for both Presidents Harding and Coolidge.

He had headed European relief efforts after World War I. "He is certainly a wonder," exclaimed the young Franklin D. Roosevelt, "and I wish we could make him president of the United States. There couldn't be a better one."

As Secretary of Commerce, Hoover proposed to eliminate destructive economic competition through the establishment of trade associations working in cooperation with government. By 1929 more than 2,000 trade associations had been created.

When he campaigned for the presidency, the country seemed headed toward ever greater economic prosperity. In a campaign speech, he said:

We in America today are nearer to the final triumph over poverty than ever before in the history of any land. We shall soon with the help of God be in sight of the day when poverty will be banished from this nation.

Among his first steps in office was to create a Federal Farm Board to increase farmers' income. He also called for a system of private, voluntary old age pensions. But the collapse of the stock market also shattered his vision of a private economy operating largely free from government intervention.

The Consumer Economy and Mass Entertainment

By the end of the 1920s, many of the bitter cultural tensions that had divided Americans had begun to subside, overwhelmed by the rise of a modern consumer culture. The growth of exciting new opportunities to buy cars, appliances, and stylish clothing made the country's cultural conflicts seem less significant. The collapse of the new economy at the decade's end would generate economic debates as intense as the cultural conflicts of the early and mid-1920s.

Americans in the 1920s were the first to wear ready-made, exact-sized clothing. They were the first to play electric phonographs or use electric vacuum cleaners or to listen to commercial radio broadcasts or drink fresh orange juice year-round. In countless ways, large and small, American life was transformed during the 1920s, at least in urban areas. Cigarettes, cosmetics, and synthetic fabrics such as rayon became staples of American life. Newspaper gossip columns, illuminated billboards, commercial airplane flights--all were novelties during the 1920s. In that decade, the United States became a consumer society.

Two automotive titans--Henry Ford and Alfred Sloan--symbolized the profound transformations that took place in American industry during the 1910s and '20s. In 1913, 50-year-old Ford had revolutionized American manufacturing by introducing the moving assembly line. By using conveyor belts to bring automobile parts to workers, he reduced the assembly time for a Ford car from 12 1/2 hours in 1912 to just 1 1/2 hours in 1914. Declining production costs allowed Ford to cut prices--six times between 1921 and 1925, reducing a new Ford's cost to just $290. This was less than three months' wages for an average American worker, and it made cars affordable for the average family. To lower employee turnover and raise productivity, Ford also introduced a minimum wage of $5 a day in 1914--twice what most workers earned--and shortened the workday from 9 hours to 8. Twelve years later, Ford reduced his work week from six days to five. Ford demonstrated the dynamic logic of mass production: that expanded production allows manufacturers to reduce costs and therefore increase the number of products sold; and that higher wages allow workers to buy more products.

Alfred Sloan, the president of General Motors from 1923 to 1941, built his company into the world's largest automaker not by refining the production process but by adopting new approaches to advertising and marketing. Sloan summed up his philosophy with these blunt words: "The primary object of the corporation was to make money, not just to make cars." Unlike Ford, a farmer's son who wanted to produce an inexpensive, functional vehicle with few frills (he said that his customers could have any color that they wanted as long as it was black), Sloan was convinced that Americans were willing to pay extra for luxury and prestige. He advertised his cars as symbols of wealth and status, and in 1927 he introduced the yearly model change, to convince motorists to trade in old models for newer ones with flashier styling. He also developed a series of divisions, differentiated by status, price, and level of luxury, with Chevrolets less expensive than Buicks or Cadillacs. To make his cars affordable, he set up the nation's first national consumer credit agency in 1919. If Henry Ford demonstrated the efficacy of mass production, Sloan revealed the importance of merchandising in a modern consumer society.

Cars were the symbol of the new consumer society that emerged in the 1920s. In 1919, there were just 6.7 million cars on American roads. By 1929, there were more than 27 million--or nearly a car for every household in the United States. In that year, one American in every five had a car--compared to one in every 37 English and one in every 40 French. With car manufacturers and banks encouraging the public to buy the car of their dreams on credit, the American love affair with the car could truly begin. In 1929, a quarter of all American families purchased a car. About 60 percent bought cars on credit, often paying interest rates of 30 percent or more.

Cars revolutionized the American way of life. Enthusiasts claimed the automobile promoted family togetherness through evening rides, picnics, and weekend excursions. Critics decried squabbles between parents and teenagers over use of the automobile, and an apparent decline in church attendance resulting from Sunday outings. Worst of all, charged critics, automobiles gave young people freedom and privacy, serving as "portable bedrooms" that couples could take anywhere.

The automobile also transformed the American landscape, quickly obliterating all traces of the horse and buggy past. During the 1920s, the country doubled its system of roads and highways. The nation spent over $2 billion annually building and maintaining roads and by 1929, there were 852,000 miles of roads in the United States, compared to just 369,000 in 1920. The car also brought with it pollution, congestion, and nearly 30,000 traffic deaths a year.

The automobile industry provided an enormous stimulus for the national economy. By 1929, the industry produced 12.7 percent of all manufacturing output, and employed 1 out of every 12 workers. Automobiles in turn stimulated the growth of steel, glass, and rubber industries, along with the gasoline stations, motor lodges, camp grounds, and hot dog stands that dotted the nation's roadways.

Alongside the automobile, other emblems of the consumer economy were the telephone and electricity. By 1930, two-thirds of all American households had electricity and half had telephones. As more and more of America's homes received electricity, new appliances followed: refrigerators, washing machines, vacuum cleaners, and toasters quickly took hold. Advertisers claimed that "labor saving" appliances would ease the sheer physical drudgery of housework, but they did not shorten the average housewife's work week. Women had to do more because standards of cleanliness kept rising. Sheets had to be changed weekly; the house had to be vacuumed daily. In short, social pressure expanded household chores to keep pace with the new technology. Far from liberating women, appliances imposed new standards of cleanliness.

Ready-to-wear clothing was another important innovation in America's expanding consumer economy. During World War I, the federal government defined standard clothing sizes to help the nation's garment industry meet the demand for military uniforms. Standard sizes meant that it was now possible to mass produce ready-to-wear clothing. Since there was no copyright on clothing designs until the 1950s, garment manufacturers could pirate European fashions and reproduce them using less expensive fabrics.

Even the public's eating habits underwent far-reaching shifts, as Americans began to consume fewer starches (like bread and potatoes) and more fruit and sugar. But the most striking development was the shift toward processed foods. Instead of preparing food from scratch at home (plucking chickens, roasting nuts, or grinding coffee beans), an increasing number of Americans purchased foods that were ready-to-cook. Important innovations in food processing occurred during World War I, as manufacturers learned how to efficiently produce canned and frozen foods. Processed foods saved homemakers enormous amounts of time in peeling, grinding, and cutting.

Accompanying the rise of new consumer-oriented businesses were profound shifts in the ways that business operated. To stimulate sales and increase profits, businesses expanded advertising, offered installment credit, and created the nation's first regional and national chains.

The nation's first million-dollar advertising campaign--for Uneeda Biscuits in a waterproof box--demonstrated advertising's power. Before the 1920s, most advertisements consisted of vast expanses of print. Absent were brand names, pictures, or catch phrases. During the 1920s, advertising agencies hired psychologists (including John B. Watson, the founder of behaviorism, and Edward Bernays, Sigmund Freud's nephew) to design the first modern campaigns. They touted products by building up name-brand identification, creating memorable slogans, manipulating endorsements by doctors or celebrities, and appealing to consumers' hunger for prestige and status. By 1929, American companies spent $3 billion annually to advertise their products, five times more than in 1914.

Installment credit soared during the 1920s. Banks offered the country's first home mortgages, while manufacturers of everything from cars to irons allowed consumers to pay "on time." About 60 percent of all furniture and 75 percent of all radios were purchased on the installment plan. In contrast to a Victorian society that had placed a high premium on thrift and saving, the new consumer society emphasized spending and borrowing.

A fundamental shift took place in the American economy during the 1920s. The nation's families spent a declining proportion of their income on necessities--food, clothing, and utilities--and an increasing share on appliances, recreation, and a host of new consumer products. As a result, older industries, such as textiles, railroads, and steel, declined, while newer industries, such as appliances, automobiles, aviation, chemicals, entertainment, and processed foods, surged ahead rapidly.

During the 1920s, the chain store movement revolutionized retailing. Chains of stores multiplied across the country, like Woolworth's, the five-and-dime chain. The largest grocery chain, A&P, had 17,500 stores by 1928. Alongside drug store and cigar store chains, there were also interlocking networks of banks and utility companies. These banks and utilities played a critical role in promoting the financial speculation of the late 1920s that would be one of the causes for the Great Depression.

The Formation of Modern American Mass Culture

Many of the defining features of modern American culture emerged during the 1920s. The record chart, the book club, the radio, the talking picture, and spectator sports all became popular forms of mass entertainment. But the primary reason the 1920s stand out as one of the most important periods in American cultural history is that the decade produced a generation of artists, musicians, and writers who were among the most innovative and creative in the country's history.

Mass Entertainment 

Of all the new appliances to enter the nation's homes during the '20s, none had a more revolutionary impact than radio. Sales soared from $60 million in 1922 to $426 million in 1929. The first commercial radio station began broadcasting in 1920, and during the 1920s the nation's airwaves were filled with musical variety shows and comedies.

Radio drew the nation together by bringing news, entertainment, and advertisements to more than ten million households by 1929. Radio blunted regional differences and imposed similar tastes and lifestyles, and no other medium had the power to create heroes and villains so quickly. When Charles Lindbergh became the first person to fly solo nonstop across the Atlantic from New York to Paris in 1927, radio brought this incredible feat into American homes and transformed him into a celebrity overnight.

Radio also disseminated racial and cultural caricatures and derogatory stereotypes. The nation's most popular radio show, "Amos 'n' Andy," which first aired on Chicago's WMAQ in 1928, spread vicious racial stereotypes into homes whose white occupants knew little about African Americans. Other minorities fared no better. The Italian gangster and the tightfisted Jew became stock characters in radio programming.

The phonograph was not far behind the radio in importance. The 1920s saw the record player enter American life in full force. Piano sales sagged as phonograph production rose from just 190,000 in 1923 to 5 million in 1929. The popularity of jazz, blues, and "hillbilly" music fueled the phonograph boom. The novelist F. Scott Fitzgerald called the 1920s the "Jazz Age"--and the decade was truly jazz's golden age. Duke Ellington wrote the first extended jazz compositions; Louis Armstrong popularized "scat" (singing of nonsense syllables); Fletcher Henderson pioneered big band jazz; and trumpeter Jimmy McPartland and clarinetist Benny Goodman popularized the Chicago school of improvisation.

The blues craze erupted in 1920, when a black singer named Mamie Smith released a recording called "Crazy Blues." The record became a sensation, selling 75,000 copies in a month and a million copies in seven months. Recordings by Ma Rainey, the "Mother of the Blues," and Bessie Smith, the "Empress of the Blues," brought the blues, with their poignant and defiant reaction to life's sorrows, to a vast audience.

"Hillbilly" music broke into mass culture in 1923, when a Georgia singer named "Fiddlin' John" Carson sold 500,000 copies of his recordings. Another country artist, Vernon Dalhart, sold 7 million copies of a recording of "The Wreck of Old 97." "Country" music's appeal was not limited to the rural South or West -- city folk, too, listened to country songs, reflecting a deep nostalgia for a simpler past.

The single most significant new instrument of mass entertainment was the movies. Movie attendance soared from 50 million a week in 1920 to 90 million weekly in 1929. According to one estimate, Americans spent 83 cents of every entertainment dollar going to the movies, and three-fourths of the population went to a movie theater every week.

During the late teens and 1920s, the film industry took on its modern form. In cinema's earliest days, the film industry was based in the nation's theatrical center -- New York. By the 1920s, the industry had relocated to Hollywood drawn by cheap land and labor, the ready accessibility of varied scenery, and a climate ideal for year-round filming. (Some filmmakers moved to avoid lawsuits from individuals like Thomas Edison who owned patent rights over the filmmaking process.) Each year, Hollywood released nearly 700 movies, dominating worldwide film production. By 1926, Hollywood had captured 95 percent of the British market and 70 percent of the French.

A small group of companies consolidated their control over the film industry and created the "studio system" that would dominate film production for the next thirty years. Paramount, Fox, MGM, and other studios owned their own production facilities, ran their own worldwide distribution networks, and controlled theater chains committed to showing their companies' products. In addition, they kept stables of actors, directors, and screenwriters under contract.

The popularity of the movies soared as films increasingly featured glamour, sophistication, and sex appeal. New kinds of movie stars appeared: the mysterious sex goddess, personified by Greta Garbo; the passionate hotblooded lover, epitomized by Rudolph Valentino; and the flapper, with her bobbed hair and skimpy skirts. New film genres also debuted including swashbuckling adventures, sophisticated sex comedies, and tales of flaming youth and the new sexual freedom. Americans flocked to see Hollywood spectacles such as Cecil B. DeMille's Ten Commandments with its "cast of thousands" and dazzling special effects. Comedies, such as the slapstick masterpieces of Charlie Chaplin and Buster Keaton enjoyed great popularity as well.

Like radio, movies created a new popular culture, with common speech, dress, behavior, and heroes. And like radio, Hollywood did its share to reinforce racial stereotypes by denigrating minority groups. The radio, the electric phonograph, and the silver screen both molded and mirrored mass culture.

Spectator Sports 

Spectator sports attracted vast audiences in the 1920s. The country yearned for heroes in an increasingly impersonal, bureaucratic society, and sports provided them. Prize fighters like Jack Dempsey became national idols. Team sports flourished, but Americans focused on individual superstars, people whose talents or personalities made them appear larger than life. Knute Rockne and his "Four Horsemen" at Notre Dame spurred interest in college football. Professional football began during the 1920s. In 1925, Harold "Red" Grange, the "Galloping Ghost" halfback from the University of Illinois, attracted 68,000 fans to a professional football game at New York's Polo Grounds.

Baseball drew even bigger crowds than football. The decade began with the sport mired in scandal. In 1920, three members of the Chicago White Sox told a grand jury that they and five other players had thrown the 1919 World Series. As a result of the "Black Sox" scandal, eight players were banished from the sport. But baseball soon regained its popularity, thanks to George Herman ("Babe") Ruth, the sport's undisputed superstar. Up until the 1920s Ty Cobb's defensive brand of baseball, with its emphasis on base hits and stolen bases, had dominated the sport. Ruth transformed baseball into the game of the home-run hitter. In 1921, the New York Yankee slugger hit 59 home runs -- more than any other team. In 1927, the "Sultan of Swat" hit 60.

Lowbrow and Middlebrow Culture

"It was a characteristic of the Jazz Age," the novelist F. Scott Fitzgerald wrote, "that it had no interest in politics at all." What, then, were Americans interested in? Entertainment was Fitzgerald's answer. Parlor games like Mah Jong and crossword puzzles became enormously popular during the 1920s. Contract bridge became the most durable of the new pastimes, followed closely by photography. Americans hit golf balls, played tennis, and bowled. Dance crazes like the fox trot, the Charleston, and the jitterbug swept the country.

New kinds of pulp fiction found a wide audience. Edgar Rice Burroughs' Tarzan of the Apes became a runaway best-seller. For readers who felt concerned about urbanization and industrialization, the adventures of a lone white man in "dark Africa" revived the spirit of frontier individualism. Zane Grey's novels, such as Riders of the Purple Sage, enjoyed even greater popularity, using the tried but true formula of romance, action, and a moralistic struggle between good and evil, all in a western setting. Between 1918 and 1934, Grey wrote 24 books and became the best-known writer of popular fiction in the country.

Other readers wanted to be titillated, as evidenced by the boom in "confession magazines." Urban values, liberated women, and Hollywood films had all relaxed Victorian standards. Confession magazines rushed in to fill the vacuum, purveying stories of romantic success and failure, divorce, fantasy, and adultery. Writers survived the censors' cut by placing moral tags at the end of their stories, advising readers to avoid similar mistakes in their own lives.

Readers too embarrassed to pick up a copy of True Romance could read more urbane magazines such as The New Yorker or Vanity Fair, which offered entertainment, amusement, and gossip to those with sophisticated tastes. They could also join the Book of the Month Club or the Literary Guild, both of which were founded during the decade.

The Avant-Garde

Few decades have produced as many great works of art, music, or literature as the 1920s. At the decade's beginning, American culture stood in Europe's shadow. By the decade's end, Americans were leaders in the struggle to liberate the arts from older canons of taste, form, and style. It was during the twenties that Eugene O'Neill, the country's most talented dramatist, wrote his greatest plays, and that William Faulkner, Ernest Hemingway, F. Scott Fitzgerald, and Thomas Wolfe published their first novels.

American poets of the 1920s such as Hart Crane, E. E. Cummings, Countee Cullen, Langston Hughes, Edna St. Vincent Millay, and Wallace Stevens experimented with new styles of punctuation, rhyming, and form. Likewise, artists like Charles Demuth, Georgia O'Keeffe, and Joseph Stella challenged the dominant realist tradition in American art and pioneered non-representational and expressionist art forms.

The 1920s marked America's entry into the world of serious music. The decade witnessed the founding of fifty symphony orchestras and three of the country's most prominent music conservatories -- Juilliard, Eastman, and the Curtis Institute. It also produced America's first great classical composers--including Aaron Copland and Charles Ives--and saw George Gershwin create a new musical form by integrating jazz into symphonic and orchestral music.

World War I left many American intellectuals and artists disillusioned and alienated. Neither Wilsonian idealism nor Progressive reformism appealed to America's postwar writers and thinkers, who believed that the crusade to end war and to make the world safe for democracy had been a senseless mistake. "Here was a new generation," wrote the novelist F. Scott Fitzgerald, "...grown up to find all Gods dead, all wars fought, all faiths in man shaken."

During the 1920s, many of the nation's leading writers exposed the shallowness and narrow-mindedness of American life. The United States was a nation awash in materialism and devoid of spiritual vitality, a "wasteland," wrote the poet T.S. Eliot, inhabited by "hollow men." No author offered a more scathing attack on middle-class boorishness and smugness than Sinclair Lewis, who in 1930 became the first American to win the Nobel Prize for Literature. In Main Street (1920) and Babbitt (1922) he satirized the narrow-minded complacency and dullness of small-town America, while in Elmer Gantry (1927) he exposed religious hypocrisy and bigotry.

As editor of the American Mercury, H.L. Mencken wrote hundreds of essays mocking practically every aspect of American life. Calling the South a "gargantuan paradise of the fourth rate" and the middle class the "booboisie," Mencken directed his choicest barbs at reformers, whom he blamed for the bloodshed of World War I and the gangsters of the 1920s. "If I am convinced of anything," he snarled, "it is that Doing Good is in bad taste."

The writer Gertrude Stein defined an important group of American intellectuals when she told Ernest Hemingway in 1921, "You are all a lost generation." Stein was referring to the expatriate novelists and artists who had participated in the Great War only to emerge from the conflict convinced that it was an exercise in futility. In their novels, F. Scott Fitzgerald and Hemingway pointed toward a philosophy now known as "existentialism," which maintains that life has no transcendent purpose and that each individual must salvage personal meaning from the void. Hemingway's fiction lionized toughness and "manly virtues" as a counterpoint to the softness of American life. In The Sun Also Rises (1926) and A Farewell to Arms (1929) he emphasized meaningless death and the importance of facing stoically the absurdities of the universe. In the conclusion of The Great Gatsby (1925), Fitzgerald gave pointed expression to an existentialist outlook: "so we beat on, boats against the current, borne back ceaselessly into the past."

The New Woman

In 1920, after 72 years of struggle, American women received the right to vote. After the 19th Amendment passed, reformers talked about female voters uniting to clean up politics, improve society, and end discrimination.

At first, male politicians moved aggressively to court the women's vote, passing legislation guaranteeing women's right to serve on juries and hold public office. Congress also passed legislation to set up a national system of women's and infant's health care clinics as well as a constitutional amendment prohibiting child labor, a measure supported by many women's groups.

But the early momentum quickly dissipated, as the women's movement divided within and faced growing hostility from without. The major issue that split feminists during the 1920s was a proposed Equal Rights Amendment to the Constitution outlawing discrimination based on sex. The issue pitted the interests of professional women against those of working class women, many of whom feared that the amendment would prohibit "protective legislation" that stipulated minimum wages and maximum hours for female workers.

The women's movement also faced mounting external opposition. During the Red Scare following World War I, the War Department issued the "Spider Web" chart which linked feminist groups to foreign radicalism. Many feminist goals went down to defeat in the mid-1920s. Opposition from many southern states and the Catholic church defeated the proposed constitutional amendment outlawing child labor. The Supreme Court struck down a minimum wage law for women workers, while Congress failed to fund the system of health care clinics.

Women did not win new opportunities in the workplace. Although the American work force included eight million women in 1920, more than half were black or foreign-born. Domestic service remained the largest occupation, followed by secretaries, typists, and clerks--all low-paying jobs. The American Federation of Labor (AFL) remained openly hostile to women because it did not want them competing for men's jobs. Female professionals, too, made little progress. They consistently received less pay than their male counterparts. Moreover, they were concentrated in traditionally "female" occupations such as teaching and nursing.

During the 1920s, the organized women's movement declined in influence partly as a result of the rise of the new consumer culture, which made the suffragists and settlement house workers of the Progressive era seem old-fashioned. Advertisers tried self-consciously to co-opt many of the themes of pre-World War I feminism, arguing that the modern economy was filled with exciting and liberating opportunities for consumption. To popularize smoking among women, advertisers staged parades down New York's 5th Avenue, imitating the suffrage marches of the 1910s, in which young women carried "torches of freedom" -- cigarettes.

The Great Depression

Charles Ponzi

Few people ever see their name enter the English language, but Charles Ponzi did. A "Ponzi scheme" has become synonymous with wild speculation. In September 1919, Ponzi was a 42-year-old former vegetable dealer with just $150 to his name. He promised to return $15 to anyone who lent him $10 for 90 days. His plan, he explained, was to buy foreign currencies at low prices and sell them at higher prices. After newspapers reported his scheme, dollars began to pour in--$1 million a week. An admirer described him as the greatest Italian of all time: "Columbus discovered America...but you discovered money."

It was too good to be true. Ponzi took in $15 million in eight months--and less than $200,000 was ever returned to investors.

Speculative Manias 

Ponzi symbolized the "get-rich-quick" mentality that infected the public during the 1920s. A vivid example was the Florida land boom. During the 1920s, sun-worshipping northerners discovered Florida's warm winter climate and its sun-drenched beaches. Could there be a safer investment? Real estate promoters--including the former presidential candidate William Jennings Bryan--offered seafront lots to investors for 10 percent down. Investors snapped up the properties--many of which turned out to be swamp and scrub land. Prices skyrocketed. A lot 40 miles from Miami sold for $20,000. A beach lot sold for $75,000. Ponzi himself sold lots "near Jacksonville"--actually 65 miles west of the city. He divided each acre into 23 lots.

In the fall of 1926 the bubble burst. Two hurricanes ripped through Florida, killing more than 400 people. Property valued at $1 billion in 1925 dropped to $143 million in 1928.

A wave of stock swindles and business frauds took place during the 1920s. But the most striking manifestation of the decade's speculative frenzy was the stock market boom of 1928 and 1929. After rising steadily during the 1920s, stock prices began to soar in March 1928. Between March 3, 1928, and September 3, 1929, AT&T rose from 179 1/2 to 335 5/8, General Motors from 139 3/4 to 181 7/8, and Westinghouse from 91 5/8 to 313. By the beginning of the fall of 1929, stock prices were four times higher than five years before.

Brokerage houses lured investors into the market by selling stock on margin, requiring investors to put down only 10 or 20 percent of a stock's price in cash and borrow the rest. By 1929, 1.5 million Americans had invested in securities.

Boosters like John Jacob Raskob, the chairman of the Democratic Party, encouraged ordinary people to invest in stocks. In an article in the Ladies' Home Journal entitled "Everybody Ought to be Rich," he explained that a person who invested $15 a month in the stock market for 20 years would have a nest egg of $80,000. Leading economists encouraged investors to believe that the stock market would continue to rise. Irving Fisher of Yale University announced: "Stock prices have reached what looks like a permanently high plateau." At the end of October, 1929, the seemingly endless surge in stock prices came to a crashing halt.

The Market Crashes

On Thursday, October 24, 1929, an unprecedented wave of sell orders shook the New York Stock Exchange. Stock prices tumbled, falling $2, $5, and even $10 between trades. As prices fell, brokers required investors who had bought stock on margin to put up money to cover their loans. To raise money, many investors dumped stocks for whatever price they could fetch. During the first three hours of trading, stock values plunged by $11 billion.

At noon, a group of prominent bankers met at the offices of J.P. Morgan and Co. To stop the hemorrhaging of stock prices, the bankers' pool agreed to buy stocks well above the market. At 1:30 p.m. they put their plan into action. Within an hour, U.S. Steel was up $15 a share, AT&T up $22, General Electric up $21, Montgomery Ward up $23.

Even though the market recovered its morning losses, public confidence was badly shaken. Rumor spread that eleven stock speculators had killed themselves and that government troops were surrounding the exchange to protect traders from an angry mob. President Hoover sought to reassure the public by declaring that the "fundamental business of the country...is on a sound and prosperous basis."

Prices held steady on Friday, and then slipped on Saturday. Monday, however, brought fresh disaster. Eastman Kodak plunged $41 a share. AT&T went down $24; New York Central Railroad, $22. The worst was yet to come. It occurred on Black Tuesday, October 29, the day the stock market experienced the greatest crash in its history.

As soon as the stock exchange's gong sounded a mad rush to sell began. Trading volume soared to an unprecedented 16,410,030 shares and the average price of a share fell 12 percent. Stocks were sold for whatever price they would bring. White Sewing Machine had reached a high of $48 a share. One purchaser--reportedly a messenger boy--bought a block of the stock for $1 a share.

The bull market of the late 1920s was over. By 1932, the index of stock prices had fallen from a 1929 high of 210 to 30. Stocks were valued at just 12 percent of what they had been worth in September 1929. Altogether, between September 1929 and June 1932, the nation's stock exchanges lost $179 billion in value.

The great stock market crash of October 1929 brought the economic prosperity of the 1920s to a symbolic end. For the next ten years, the United States was mired in a deep economic depression. By 1933, unemployment had soared to 25 percent of the workforce, up from just 3.2 percent in 1929. Industrial production declined by fifty percent. In 1929, before the crash, investment in the U.S. economy totaled $16 billion. By 1933, the figure had fallen to $340 million, a decrease of 98 percent.

Why It Happened

Why did the seemingly boundless prosperity of the 1920s end so suddenly? And why, once an economic downturn began, did the Great Depression last so long?

Economists have been hard pressed to explain why "prosperity's decade" ended in financial disaster. In 1929, the American economy appeared to be extraordinarily healthy. Employment was high and inflation was virtually non-existent. Industrial production had risen 30 percent between 1919 and 1929, and per capita income had climbed from $520 to $681. The United States accounted for nearly half the world's industrial output. Still, the seeds of the Depression were already present in the "boom" years of the 1920s.

For many groups of Americans, the prosperity of the 1920s was a cruel illusion. Even during the most prosperous years of the Roaring Twenties, most families lived below what contemporaries defined as the poverty line. In 1929, economists considered $2,500 the income necessary to support a family. In that year, more than 60 percent of the nation's families earned less than $2,000 a year, the income necessary for basic necessities, and over 40 percent earned less than $1,500 annually. Although labor productivity soared during the 1920s because of electrification and more efficient management, wages stagnated or fell in mining, transportation, and manufacturing. Hourly wages in coal mines sagged from 84.5 cents in 1923 to just 62.5 cents in 1929.

Prosperity bypassed specific groups of Americans entirely. A 1928 report on the condition of Native Americans found that half owned less than $500 and that 71 percent lived on less than $200 a year. Mexican Americans, too, failed to share in the prosperity. During each year of the 1920s, 25,000 Mexicans migrated to the United States. Most lived in conditions of extreme poverty. In Los Angeles, the infant mortality rate was five times that for Anglos, and most homes lacked toilets. A survey found that a substantial minority of Mexican Americans had virtually no meat or fresh vegetables in their diet; 40 percent said that they could not afford to give their children milk.

The farm sector had been mired in depression since 1921. Farm prices had been depressed ever since the end of World War I, when European agriculture revived and grain from Argentina and Australia entered the world market. Strapped with long-term debts, high taxes and a sharp drop in crop prices, farmers lost ground throughout the 1920s. In 1910, a farmer's income was 40 percent of a city worker's. By 1930, it had sagged to just 30 percent.

The decline in farm income reverberated throughout the economy. Rural consumers stopped buying farm implements, tractors, automobiles, furniture, and appliances. Millions of farmers defaulted on their debts, placing tremendous pressure on the banking system. Between 1920 and 1929, more than 5,000 of the country's 30,000 banks failed.

Because of the banking crisis, thousands of small businesspeople failed because they could not secure loans. Thousands more went bankrupt because they had lost their working capital in the stock market crash. A heavy burden of consumer debt also weakened the economy. Consumers built up an unmanageable amount of consumer installment and mortgage debt, taking out loans to buy cars, appliances, and homes in the suburbs. To repay these loans, consumers cut back sharply on discretionary spending. Drops in consumer spending led inevitably to reductions in production and worker layoffs. Unemployed workers then spent less and the cycle repeated itself.

A poor distribution of income compounded the country's economic problems. During the 1920s, there was a pronounced shift in wealth and income toward the very rich. Between 1919 and 1929, the share of income received by the wealthiest one percent of Americans rose from 12 percent to 19 percent, while the share received by the richest five percent jumped from 24 percent to 34 percent. Over the same period, the poorest 93 percent of the non-farm population actually saw its disposable income fall. Because the rich tend to spend a high proportion of their income on luxuries--such as large cars, entertainment, and tourism--and save a disproportionately large share of their income, there was insufficient demand to keep employment and investment at a high level.

Even before the onset of the Depression, business investment had begun to decline. Residential construction boomed between 1924 and 1927, but in 1929 housing starts fell to less than half the 1924 level. A major reason for the depressed housing market was the 1924 immigration law that had restricted foreign immigration. Soaring inventories also led businesses to reduce investment and production. During the mid-1920s, manufacturers expanded their productive capacity and built up excessive inventories. At the decade's end, they cut back sharply, directing their surplus funds into stock market speculation.

The Federal Reserve, the nation's central bank, played a critical, if inadvertent, role in weakening the economy. In an effort to curb stock market speculation, the Federal Reserve slowed the growth of the money supply, then allowed the money supply to fall dramatically after the stock market crash, producing a wrenching "liquidity crisis." Consumers found themselves unable to repay loans, while businesses did not have the capital to finance business operations. Instead of actively stimulating the economy by cutting interest rates and expanding the money supply--the way monetary authorities fight recessions today--the Federal Reserve allowed the country's money supply to decline by 27 percent between 1929 and 1933.

Finally, Republican tariff policies damaged the economy by depressing foreign trade. Anxious to protect American industries from foreign competitors, Congress passed the Fordney-McCumber Tariff of 1922 and the Hawley-Smoot Tariff of 1930, raising tariff rates to unprecedented levels. American tariffs stifled international trade, making it difficult for European nations to pay off their debts. As foreign economies foundered, those countries imposed trade barriers of their own, choking off U.S. exports. By 1933, international trade had plunged 30 percent.

All these factors left the economy ripe for disaster. Yet the depression did not strike instantly; it infected the country gradually, like a slow-growing cancer. Measured in human terms, the Great Depression was the worst economic catastrophe in American history. It hit urban and rural areas, blue-and white-collar families alike. In the nation's cities, unemployed men took to the streets to sell apples or shine shoes. Thousands of others hopped freight trains and wandered from town to town, looking for jobs or handouts.

Unlike most of Western Europe, the United States had no federal system of unemployment insurance. The relief burden fell on state and municipal governments working in cooperation with private charities, such as the Red Cross and the Community Chest. Created to handle temporary emergencies, these groups lacked the resources to alleviate the massive suffering created by the Great Depression. Poor southerners, whose states had virtually no relief funds, were particularly hard hit.

Urban centers in the North fared little better. Most city charters did not permit public funds to be spent on work relief. Adding insult to injury, several states disqualified relief clients from voting, while some cities forced them to surrender their automobile license plates. "Prosperity's decade" had ended in economic disaster.

The Great Depression in Global Perspective

Unlike previous economic downturns, which generally were confined to a handful of nations or specific regions, the Great Depression was a global phenomenon. Africa, Asia, Australia, Europe, and North and South America all suffered from the economic collapse. International trade fell 30 percent, as nations tried to protect their industries by raising tariffs on imported goods. "Beggar-thy-neighbor" trade policies were a major reason why the depression persisted as long as it did. By 1932, an estimated 30 million people were unemployed around the world.

Also, in contrast to the relatively brief economic "panics" of the past, the Great Depression dragged on with no end in sight. As the depression deepened, it had far-reaching political consequences. One response to the depression was military dictatorship--a response that could be found in Argentina and in many countries in Central America. Western industrialized countries cut back sharply on the purchase of raw materials and other commodities. The price of coffee, cotton, rubber, tin, and other commodities dropped 40 percent. The collapse in raw material and agricultural commodity prices led to social unrest, resulting in the rise of military dictatorships that promised to maintain order.

A second response to the depression was fascism and militarism--a response found in Germany, Italy, and Japan. In Germany, Adolf Hitler and his Nazi Party promised to restore the country's economy and rebuild its military. After becoming chancellor in 1933, Hitler outlawed labor unions, restructured German industry into a series of cartels, and, after 1935, instituted a massive program of military rearmament that ended high unemployment. In Italy, fascism arose even before the depression's onset under the leadership of the dictator Benito Mussolini. In Japan, militarists seized control of the government during the 1930s. In an effort to relieve the depression, Japanese military officers conquered Manchuria, a region rich in raw materials, in 1931 and invaded coastal China in 1937.

A third response to the depression was totalitarian communism. In the Soviet Union, the Great Depression helped solidify Joseph Stalin's grip on power. In 1928, Stalin instituted a planned economy. His first Five-Year Plan called for rapid industrialization and "collectivization" of small peasant farms under government control. To crush opposition to his program, which required peasant farmers to give their products to the government at low prices, Stalin exiled millions of peasants to labor camps in Siberia and instituted a program of terror called the Great Purge. Historians estimate that as many as 20 million Soviets died during the 1930s as a result of famine and deliberate killings.

A final response to the depression was welfare capitalism, which could be found in countries including Canada, Great Britain, and France. Under welfare capitalism, government assumed ultimate responsibility for promoting a reasonably fair distribution of wealth and power and providing security against the risks of bankruptcy, unemployment and destitution.

Compared to other industrialized countries, the economic decline brought on by the Depression was steeper and more protracted in the United States. The unemployment rate rose higher and remained higher longer than in any other western society. While European countries significantly reduced unemployment by 1936, as late as 1939, when World War II began in Europe, the American jobless rate still exceeded 17 percent. It did not drop below 14 percent until 1941.

The Great Depression transformed the American political and economic landscape. It produced a major political realignment, creating a coalition of big-city ethnics, African Americans, and Southern Democrats committed, to various degrees, to interventionist government. It strengthened the federal presence in American life, producing such innovations as national old age pensions, unemployment compensation, aid to dependent children, public housing, federally subsidized school lunches, insured bank deposits, the minimum wage, and stock market regulation. It fundamentally altered labor relations, producing a revived labor movement and a national labor policy protective of collective bargaining. It transformed the farm economy by introducing federal price supports and rural electrification. Above all, the Great Depression produced a fundamental transformation in public attitudes. It led Americans to view the federal government as the ultimate protector of public well-being.

The Human Toll

After more than half a century, images of the Great Depression remain firmly etched in the American psyche--breadlines, soup kitchens, tin-can shanties and tar paper shacks known as "Hoovervilles," penniless men and women selling apples on street corners, and gray battalions of Arkies and Okies packed into Model A Fords heading to California.

The collapse was staggering in its dimensions. Unemployment jumped from less than 3 million in 1929 to 4 million in 1930, 8 million in 1931, and 12 1/2 million in 1932. In that year, a quarter of the nation's families did not have a single employed wage earner. Even those fortunate enough to have jobs suffered drastic pay cuts and reductions in hours. Only one company in ten failed to cut pay, and in 1932, three-quarters of all workers were on part-time schedules, averaging just 60 percent of the normal work week.

The economic collapse was terrifying in its scope and impact. By 1933 average family income had tumbled 40 percent, from $2,300 in 1929 to just $1,500 four years later. In the Pennsylvania coal fields, three or four families crowded together in one-room shacks and lived on wild weeds. In Arkansas, families were found inhabiting caves. In Oakland, California, whole families lived in sewer pipes.

Vagrancy shot up as many families were evicted from their homes for nonpayment of rent. The Southern Pacific Railroad boasted that it threw 683,000 vagrants off its trains in 1931. Free public flophouses and missions in Los Angeles provided beds for 200,000 of the uprooted.

To save money, families neglected medical and dental care. Many families sought to cope by planting gardens, canning food, buying day-old bread, and using cardboard and cotton for shoe soles. Despite a steep decline in food prices, many families did without milk or meat. In New York City, milk consumption declined by a million gallons a day.

President Herbert Hoover declared, "Nobody is actually starving. The hoboes are better fed than they have ever been." But in New York City in 1931, there were 20 known cases of starvation; in 1934, there were 110 deaths caused by hunger. There were so many accounts of people starving in New York that the West African nation of Cameroon sent $3.77 in relief.

The Depression had a powerful impact on families. It forced couples to delay marriage and drove the birthrate below the replacement level for the first time in American history. The divorce rate fell, for the simple reason that many couples could not afford to maintain separate households or pay legal fees. But rates of desertion soared. By 1940, 1.5 million married women were living apart from their husbands. More than 200,000 vagrant children wandered the country as a result of the breakup of their families.

The Depression inflicted a heavy psychological toll on jobless men. With no wages to reinforce their authority, many men lost their standing as the family's primary decision maker. Large numbers of men lost self-respect, became immobilized and stopped looking for work, while others turned to alcohol or became self-destructive or abusive to their families.

In contrast to men, many women saw their status rise during the Depression. To supplement the family income, married women entered the work force in large numbers. Although most women worked in menial occupations, the fact that they were employed and bringing home paychecks elevated their position within the family and gave them a say in family decisions.

Despite the hardships it inflicted, the Great Depression drew some families closer together. As one observer noted, "Many a family has lost its automobile and found its soul." Families had to devise strategies for getting through hard times because their survival depended on it. They pooled their incomes, moved in with relatives in order to cut expenses, bought day-old bread, and did without. Many families drew comfort from their religion, sustained by the hope things would turn out well in the end, while others placed their faith in themselves, in their own dogged determination to survive that so impressed observers like Woody Guthrie. But many Americans no longer believed the problems could be solved by people acting alone or through voluntary associations. Increasingly, they looked to the federal government for help.

The Dispossessed

Economic hardship and loss visited all sections of the country. One-third of the Harvard class of 1911 confessed that they were hard up, on relief, or dependent on relatives. Doctors and lawyers saw their incomes fall 40 percent. But no groups suffered more from the depression than African Americans and Mexican Americans.

A year after the stock market crash, 70 percent of Charleston's black population was unemployed and 75 percent of Memphis's. In Macon County, Alabama, home of Booker T. Washington's famous Tuskegee Institute, most black families lived in homes without wooden floors or windows or sewage disposal and subsisted on salt pork, hominy grits, corn bread, and molasses. Income averaged less than a dollar a day.

Conditions were also distressed in the North. In Chicago, 70 percent of all black families earned less than $1,000 a year, far below the poverty line. In Chicago and other large northern cities, most African Americans lived in "kitchenettes." Six-room apartments, previously rented for $50 a month, were divided into six kitchenettes renting for $32 a month, assuring landlords a windfall of an extra $142 a month. Buildings that previously held 60 families now contained 300.

The depression hit Mexican American families especially hard. Mexican Americans faced serious opposition from organized labor, which resented competition from Mexican workers as unemployment rose. Bowing to union pressure, federal, state and local authorities "repatriated" more than 400,000 people of Mexican descent to prevent them from applying for relief. Since this group included many United States citizens, the deportations constituted a gross violation of civil liberties.

Private and Public Charity 

The economic crisis of the 1930s overwhelmed private charities and local governments. In south Texas, the Salvation Army provided a penny per person each day. In Philadelphia, private and public charities distributed $1 million a month in poor relief. But this provided families with only $1.50 a week for groceries. In 1932, total public and private relief expenditures amounted to $317 million--$26 for each of the nation's 12 1/2 million jobless.

Franklin D. Roosevelt

In June 1932, Franklin D. Roosevelt received the Democratic presidential nomination. At first glance he did not look like a man who could relate to other peoples' suffering, for Roosevelt had spent his entire life in the lap of luxury. No fewer than 16 of his ancestors had come over on the Mayflower. A fifth cousin of Teddy Roosevelt, he was born in 1882 to one of New York's wealthiest families. Roosevelt enjoyed a privileged youth. He attended Groton, an exclusive private school, then went to Harvard University and Columbia Law School. After three years in the New York state senate, Roosevelt was tapped by President Wilson to serve as assistant secretary of the navy in 1913. His status as the rising star of the Democratic party was confirmed when James Cox chose Roosevelt as his running mate in the presidential election of 1920.

The Election of 1932 

Handsome and outgoing, Roosevelt seemed to have a bright political future. Then disaster struck. In 1921 he was stricken with polio. The disease left him paralyzed from the waist down and confined to a wheelchair for the rest of his life. Instead of retiring, however, Roosevelt labored diligently to return to public life. "If you had spent two years in bed trying to wiggle your toe," he later declared, "after that anything would seem easy."

Buoyed by an exuberant optimism and devoted political allies, Roosevelt won the governorship of New York in 1928, one of the few Democrats to survive the Republican landslide. Surrounding himself with able advisors, Roosevelt labored to convert New York into a laboratory for reform, involving conservation, old age pensions, public works projects, and unemployment insurance.

In his acceptance speech before the Democratic convention in Chicago, Roosevelt promised "a New Deal for the American people." Although his speech contained few concrete proposals, Roosevelt radiated confidence, giving many desperate voters hope. He even managed during the campaign to turn his lack of a blueprint into an asset, offering instead a policy of experimentation. "It is common sense to take a method and try it," he declared, "if it fails, admit it frankly and try another."

The First 100 Days

The nation's plight on March 4, 1933, the day Franklin Roosevelt assumed the presidency, was desperate. A quarter of the nation's workforce was jobless. A quarter million families had defaulted on their mortgages the previous year. During the winter of 1932-33, 1.2 million Americans were homeless. Scores of shantytowns, called Hoovervilles, sprouted up.

About 9,000 banks, holding the savings of 27 million families, had failed since 1929, with 1,456 failing in 1932 alone. Farm foreclosures were averaging 20,000 a month. The public was desperate for action. Hamilton Fish, a conservative Republican Congressman, promised the president that Congress would "give you any power that you need."

Shortly before Roosevelt took office, Giuseppe Zangara, a mentally ill bricklayer, tried to assassinate the president-elect in Miami. Chicago's mayor, Anton Cermak, was fatally wounded, but Roosevelt miraculously escaped injury. In his inaugural address, Roosevelt expressed confidence that his administration could end the depression. "The only thing we have to fear," he declared, "is fear itself."

The president promised decisive action. He called Congress into special session and demanded "broad executive power to wage a war against the emergency, as great as the power that would be given me if we were in fact invaded by a foreign foe." In his first hundred days in office, Roosevelt pushed 15 major bills through Congress, reshaping every aspect of the economy, from banking and industry to agriculture and social welfare.

He attacked the bank crisis first, declaring a national bank holiday, which closed all banks. In just four days, his aides drafted the Emergency Banking Relief Act, which permitted solvent banks to reopen under government supervision and allowed the Reconstruction Finance Corporation (RFC) to buy the stock of troubled banks and keep them open until they could be reorganized. The law also gave the president broad powers over the Federal Reserve System. The law radically reshaped the nation's banking system; it passed Congress in 8 hours.

To generate support for his program, Roosevelt appealed directly to the people. On March 12 he conducted the first of many radio "fireside chats." Using the radio the way later presidents exploited television, he explained what he had done in plain, simple terms and told the public to have "confidence and courage." When the banks reopened the following day, people demonstrated their faith by making more deposits than withdrawals. One of Roosevelt's key advisors did not exaggerate when he later boasted, "Capitalism was saved in eight days."

The president quickly pushed ahead on other fronts. The Federal Emergency Relief Act pumped $500 million into state-run welfare programs. The Home Owners' Loan Act provided the first federal mortgage financing and loan guarantees; by the end of Roosevelt's first term it had provided more than 1 million loans totaling $3 billion. The Glass-Steagall Act provided a federal guarantee of all bank deposits under $5,000, separated commercial and investment banking, and strengthened the Federal Reserve's ability to stabilize the economy.

In addition, Roosevelt took the nation off the gold standard; devalued the dollar; and ordered the Federal Reserve System to ease credit. Other important laws passed during the 100 days included the Agricultural Adjustment Act, the nation's first system of agricultural price and production supports; the National Industrial Recovery Act, the first major attempt to plan and regulate the economy; and the Tennessee Valley Authority Act, the first direct government involvement in energy production.

The Farmers' Plight

Roosevelt moved aggressively to address the crisis facing the nation's farmers. No group was harder hit by the depression than farmers and farm workers. At the start of the depression, a fifth of all American families still lived on farms, but they were in deep trouble. Farm income fell by a staggering two-thirds during the depression's first three years. A bushel of wheat that sold for $2.94 in 1920 dropped to $1 in 1929 and 30 cents in 1932. In one day, a quarter of Mississippi's farm acreage was auctioned off to pay for debts.

The farmers' problem, ironically, was that they grew too much. Worldwide crop production soared--a result of more efficient farm machinery, stronger fertilizers, and improved plant varieties--but demand fell, as people ate less bread, Europeans imposed protective tariffs, and consumers replaced cotton with rayon. Too much was being grown, and the glut caused prices to fall. To meet farm debts in 1932, it was necessary to grow 2.5 times as much corn as in 1929, 2.7 times as much wheat, and 2.4 times as much cotton.

As farm incomes fell, farm tenancy soared; two-fifths of all farmers worked on land that they did not own. The Gudgers, a white southern Alabama sharecropping family of six, illustrate the plight of tenants, who were slipping deeper and deeper into debt. Each year, their landlord provided them with 20 acres of land, seed, an unpainted one-room house, a shed, a mule, fertilizer, and $10 a month. In return, they owed him half their corn and cotton crop and 8 percent interest on their debts. In 1934 they were $80 in debt; by 1935, their debts had risen another $12.

Nature itself seemed to have turned against farmers. In the South, the boll weevil devoured the cotton crop, while on the Great Plains, the topsoil literally blew away, piling up in ditches like "snow drifts in winter." The Dust Bowl produced unparalleled human tragedy, but it had not occurred by accident. The Plains had always been a harsh, arid, inhospitable environment. Nevertheless, a covering of tough grass roots called sod permitted the land to retain moisture and support vegetation. During the 1890s, however, overgrazing by cattle severely damaged the sod. Then, during World War I, demand for wheat and the use of gasoline-powered tractors allowed farmers to plow large sections of the prairie for the first time. The fragile skin protecting the prairie was destroyed. When drought struck, beginning in 1930, and temperatures soared (to 108 degrees in Kansas for weeks on end), the wind began to blow the soil away. One Kansas county, which produced 3.4 million bushels of wheat in 1931, harvested just 89,000 bushels in 1933.

Tenant farmers found themselves evicted from their land. By 1939, a million Dust Bowl refugees and other tenant farmers had left the Plains to work as itinerant produce pickers in California. As a result, whole counties were depopulated. In one part of Colorado, 2,811 homes were abandoned, while another 1,522 simply disappeared.

The New Deal attacked farm problems through a variety of programs. Rural electrification programs meant that for the first time Americans in Appalachia, the Texas hill country, and other areas would have the opportunity to share in the benefits of electric light and running water. As late as 1935 more than six million of America's 6.8 million farms had no electricity. Unlike their sisters in the city, farm women had no washing machines, refrigerators, or vacuum cleaners. Nor did private utility companies intend to change things. Private companies insisted that it would be cost prohibitive to provide electrical service to rural areas.

Roosevelt disagreed. Settling on the 40,000 square mile valley of the Tennessee River as his test site, Roosevelt decided to put the government into the electric business. Two months after he took office Congress passed a bill creating the Tennessee Valley Authority (TVA). The TVA was authorized to build 21 dams to generate electricity for tens of thousands of farm families. In 1935 Roosevelt signed an executive order creating the Rural Electrification Administration (REA) to bring electricity generated by government dams to America's hinterland. Between 1935 and 1942 the lights came on for 35 percent of America's farm families.

Nor was electricity the only benefit the New Deal bestowed on farmers. The Soil Conservation Service helped farmers battle erosion; the Farm Credit Administration provided some relief from farm foreclosures; and the Commodity Credit Corporation permitted farmers to use stored products as collateral for loans. Roosevelt's most ambitious farm program, however, was the Agricultural Adjustment Act (AAA).

The AAA, led by Secretary of Agriculture Henry Wallace, sought a partnership between the government and major producers. Together the new allies would raise prices by reducing the supply of farm goods. Under the AAA, the large producers, acting through farm cooperatives, would agree upon a "domestic allotment" plan that would assign acreage quotas to each producer. Participation would be voluntary. Farmers who cut production to comply with the quotas would be paid for land left fallow.

Unfortunately for its backers, the AAA got off to a horrible start. Because the 1933 crops had already been planted by the time Congress established the AAA, the administration ordered farmers to plow their crops under, paying them over $100 million to plow under 10 million acres of cotton. The government also purchased and slaughtered six million pigs, salvaging only one million pounds for the needy. The public neither understood nor forgave the agency for destroying food while jobless people went hungry.

Overall, the AAA's record was mixed. It raised farm income, but did little for sharecroppers and tenant farmers, the groups hardest hit by the agricultural crisis. Farm incomes doubled between 1933 and 1936, but large farmers reaped most of the benefits. Many large landowners used government payments to purchase tractors and combines, allowing them to mechanize farm operations, increase crop yields, and reduce the need for sharecroppers and tenants. One Mississippi planter bought 22 tractors with his payments and subsequently evicted 160 tenant families. An unintended consequence of New Deal farm policies was to force at least 3 million small farmers from the land. For all its inadequacies, however, the AAA established the precedent for a system of farm price supports, subsidies, and surplus purchases that continues more than half a century later.

The National Recovery Administration

To help industry and labor, Congress established the National Recovery Administration (NRA), which sought to revive industry through rational planning. The idea behind the NRA was simple: representatives of business, labor, and government would establish codes of fair practices that would set prices, production levels, minimum wages, and maximum hours within each industry. The NRA also supported workers' right to join labor unions. By ending ruinous competition, overproduction, labor conflicts, and price deflation, the NRA sought to stabilize the economy.

Led by General Hugh Johnson, the new agency got off to a promising start. By midsummer 1933, over 500 industries had signed codes covering 22 million workers. In New York City, burlesque show strippers agreed on a code limiting the number of times that they would undress each day. By the end of the summer the nation's ten largest industries had been won over, as well as hundreds of smaller businesses. All across the land businesses displayed the "Blue Eagle," the insignia of the NRA, in their windows. Thousands participated in public rallies and spectacular torchlight parades.

The NRA's success was short-lived. Johnson proved to be an overzealous leader who alienated many businesspeople. Instead of creating a smooth-running corporate state, Johnson presided over a chorus of endless squabbling. The NRA boards, which were dominated by representatives of big business, drafted codes that favored their interests over those of small competitors. Moreover, even though they controlled the new agency from the outset, many leaders of big business resented the NRA for interfering in the private sector. Many quipped that the NRA stood for "national run-around."

For labor the NRA was a mixed blessing. On the positive side, the codes abolished child labor and established the precedent of federal regulation of minimum wages and maximum hours. In addition, the NRA boosted the labor movement by drawing large numbers of unskilled workers into unions. On the negative side, however, the NRA codes set wages in most industries well below what labor demanded, and large occupational groups, such as farm workers, fell outside the codes' coverage.

Jobs Programs

Harry Hopkins, one of Roosevelt's most trusted advisors, asked why the federal government could not simply hire the unemployed and put them to work. Reluctantly, Roosevelt agreed.

The first major program to attack unemployment through public works was the Public Works Administration (PWA). It was supposed to serve as a "pump-primer," providing people with money to spend on industrial products. In six years the PWA spent $6 billion, building such projects as the port of Brownsville, Texas, the Grand Coulee Dam, and Chicago's sewer system. Unfortunately, the man who headed the program, Harold Ickes, was so concerned about potential graft and scandal that the PWA did not spend enough money to significantly reduce unemployment.

One of the New Deal's most famous jobs programs was the Civilian Conservation Corps (CCC). By mid-1933, 300,000 jobless young men between 18 and 25 were hired to work in the nation's parks and forests. For $30 a month, CCC workers planted saplings, built fire towers, restocked depleted streams, and restored historic battlefields. Workers lived in wilderness camps, earning money which they passed along to their families. By 1942, when the program ended, 2.5 million men had served in Roosevelt's "Tree Army." Despite its immense popularity, the CCC failed to make a serious dent in depression unemployment. It excluded women, imposed rigid quotas on blacks, and offered employment to only a minuscule number of the young people who needed work.

Far more ambitious was the Civil Works Administration (CWA), established in November 1933. Under the energetic leadership of Harry Hopkins, the CWA put 2.6 million men to work in its first month. Within two months it employed four million men building 250,000 miles of road, 40,000 schools, 150,000 privies, and 3,700 playgrounds. In March 1934, however, Roosevelt scrapped the CWA because he (like Hoover) did not want to run a budget deficit or create a permanent dependent class.

Roosevelt badly underestimated the severity of the crisis. As government funding slowed down and economic indicators leveled off, the depression deepened in 1934, triggering a series of violent strikes, which culminated on Labor Day, 1934, when 500,000 textile workers launched the single largest strike in the nation's history. All across the land, critics attacked Roosevelt for not doing enough to combat the depression, charges that did not go unheeded in the White House.

Following the congressional elections of 1934, in which the Democrats won 13 new House seats and 9 new Senate seats, Roosevelt abandoned his hopes for a balanced budget, deciding that bolder action was required. He had lost faith in government planning and the proposed alliance with business, which left only one other road to recovery--government spending. Encouraged by the CCC's success, he decided to create more federal jobs for the unemployed.

In 1935 Congress funded the Works Progress Administration (WPA), Roosevelt's program to employ 3.5 million workers at a "security wage"--twice the level of welfare payments but well below union scales. To head the new agency, Roosevelt again turned to Harry Hopkins. Since the WPA's purpose was to employ men quickly, Hopkins opted for labor-intensive tasks, creating jobs that were often makeshift and inefficient. Jeering critics said the WPA stood for "We Piddle Along," but the agency built many worthwhile projects. In its first five years alone the WPA constructed or improved 2,500 hospitals, 5,900 schools, 1,000 airfields (including New York's LaGuardia Airport), and nearly 13,000 playgrounds. By 1941 it had pumped $11 billion into the economy.

The WPA's most unusual feature was its spending on cultural programs. About five percent of the WPA's spending went to the arts. While folksingers like Woody Guthrie honored the nation in ballads, other artists were hired to catalog it, photograph it, paint it, record it, and write about it. In photojournalism, for example, the Farm Security Administration (FSA) employed scores of photographers to create a pictorial record of America and its people. Under the auspices of the WPA, the Federal Writers' Project sponsored an impressive set of state guides and dispatched an army of folklorists into the backcountry in search of tall tales. Oral historians collected slave narratives, and musicologists compiled an amazing collection of folk music. Other WPA programs included the Federal Theatre Project, whose productions offered a live running commentary on everyday affairs, and the Federal Art Project, which decorated the nation's libraries and post offices with murals of muscular workmen, bountiful wheat fields, and massive machinery.

Valuable in their own right, the WPA's cultural programs had the added benefit of providing work for thousands of writers, artists, actors, and other creative people. In addition, these programs established the precedent of federal support to the arts and the humanities, laying the groundwork for future federal programs to promote the life of the mind in the United States.

In 1939, a Gallup Poll asked Americans what they liked best and what they liked worst about Franklin Delano Roosevelt's New Deal. The answer to both questions was the same: the WPA.

Work crews were criticized for spending days moving leaf piles from one side of the street to the other. Unions struck to protest the program's refusal to pay wages equal to those of the private sector. Yet Ronald Reagan, later a president and staunch critic of large-scale government programs, was one of the WPA's defenders. "Some people," he said, "have called it boondoggle and everything else. But having lived through that era and seen it, no, it was probably one of the social programs that was most practical in those New Deal days."

The WPA's arts programs counted among their alumni the writers Saul Bellow, John Cheever, Ralph Ellison, and Richard Wright; the painter Jackson Pollock; and the actor and director Orson Welles.

The WPA was not especially efficient. In Washington, D.C., construction costs typically ran three to four times the cost of private work. But this was intentional. The WPA avoided cost-saving machinery in order to hire more workers. At its peak, the WPA spent $2.2 billion a year, or approximately $30 billion annually in current dollars.

Roosevelt's Critics

By 1935, Roosevelt's programs were provoking strong opposition. Many conservatives regarded his programs as infringements on the rights of the individual, while a growing number of critics argued that they did not go far enough. Three figures stepped forward to challenge Roosevelt: Huey Long, a Louisiana senator; Father Charles Coughlin, a Catholic priest from Detroit; and Francis Townsend, a retired California physician.

Of the three, Huey Long attracted the widest following. Ambitious, endowed with superhuman energy, and totally devoid of scruples, Long was a fiery, spellbinding orator in the tradition of southern populism. As governor and then U.S. senator, he ruled Louisiana with an iron hand, keeping a private army equipped with sub-machine guns and a "deduct box," where he kept funds deducted from state employees' salaries. Yet the people of Louisiana loved him because he attacked the big oil companies, increased state spending on public works, and improved public schools. Although he backed Roosevelt in 1932, Long quickly abandoned the president and opposed the New Deal as too conservative.

Huey Long was immensely popular, especially among the poor. Part of his appeal lay in his style; he dressed in white suits the color of vanilla ice cream and called himself "the Kingfish," after a character in "Amos 'n' Andy." By playing up his country origins and ridiculing the rich, he became a popular legend. In one incident, he issued a "budget" showing how millionaires could economize by living on $10,000 a day.

Early in 1934 Long announced his "Share Our Wealth" program. Vowing to make "Every Man a King," he promised to soak the rich by imposing a stiff tax on inheritances over $5 million and by levying a 100 percent tax on annual incomes over $1 million. The confiscated funds, in turn, would be distributed to the people, guaranteeing every American family an annual income of no less than $2,000, in Long's words more than enough to buy "a radio, a car, and a home." By February 1935 Long's followers had organized over 27,000 "Share Our Wealth" clubs. Roosevelt had to take him seriously, for a Democratic poll revealed Long could attract three to four million voters to an independent presidential ticket.

Like Long, Father Charles Coughlin was an early supporter who turned sour on the New Deal. For about sixteen years, from the mid-twenties until the United States entered World War II, Father Charles Coughlin was probably the most influential religious figure in the United States. His radio program, "The Golden Hour of the Shrine of the Little Flower", had a weekly audience of 16 million. His parish in suburban Detroit had to build a post office to handle his mail.

Coughlin blamed the depression on greedy bankers and challenged Roosevelt to solve the crisis by nationalizing banks and inflating the currency. When Roosevelt refused to heed his advice, Coughlin broke with Roosevelt and in 1934 formed the National Union for Social Justice. The National Union's weekly newspaper serialized "The Protocols of the Elders of Zion," an anti-Semitic forgery.

Father Coughlin helped to invent a new kind of preaching that made effective use of the microphone and radio. Coughlin exemplified what historian Richard Hofstadter called the "paranoid style." He believed that Jews and Communists, in league with bankers and capitalists, were out to get the little man.

Roosevelt's least likely critic was Dr. Francis Townsend, a California public health officer who found himself unemployed at the age of 67, with only $100 in savings. Seeing many people in similar or worse straits, Townsend embraced old age relief as the key to ending the depression. In January 1934 Townsend announced his plan, demanding a $200 monthly pension for every citizen over the age of 60. In return, recipients had to retire and spend their entire pension every month within the United States. Younger Americans would inherit the jobs vacated by senior citizens, and the economy would be stimulated by the increased purchasing power of the elderly. Although critics lambasted the Townsend plan as ludicrous, several million Americans found his plan refreshingly simple.

The Wagner Act

In 1932, George Barnett, a prominent economist and president of the American Economic Association, forecast a bleak future for organized labor. "The changes, occupational and technological, which checked the advance of unionism in the last decade, appear likely to continue in the same direction," he intoned.

In 1930, only 3.4 million workers belonged to labor unions--down from 5 million in 1920. Union members were confined to a few industries, such as construction, railroads, and local truck delivery. The nation's major industries, like autos and steel, remained unorganized.

In 1935, Congress passed the landmark Wagner Act, which spurred labor to historic victories, including a sit-down strike by auto workers in Flint, Michigan in 1937, which led General Motors to recognize the United Automobile Workers. Union membership soared from 3.4 million in 1932 to 10 million in 1942 and 16 million in 1952.

As the depression dragged on, bitter labor-management warfare erupted. In 1934, 1.5 million workers went on strike. Auto and steel workers and longshoremen became involved in violent strikes. In Minneapolis police shot 67 striking Teamsters. In August, textile workers staged the largest strike the country had ever seen. 110,000 workers struck in Massachusetts, 60,000 in Georgia--a total of 500,000 workers in 20 states. While some of the strikes aimed at higher wages, fully a third demanded union recognition.

Labor unrest forced the federal government to step into labor relations and forge a compromise between management and labor. Under the Wagner Act of 1935 (the National Labor Relations Act), the federal government guaranteed the right of employees to form unions and bargain collectively. It also set up the National Labor Relations Board (NLRB), which had the power to prohibit unfair labor practices by employers.

During the mid-1930s, a bitter dispute broke out within labor's ranks. It involved an issue that had been simmering for half a century: Should labor focus its efforts on unionizing skilled workers, or should it unionize all workers in an industry, regardless of skill level? The country's major labor federation, the American Federation of Labor, consisted of craft unions, organized by occupation. In late 1935, a group of union leaders--including John L. Lewis of the United Mine Workers, Sidney Hillman of the Amalgamated Clothing Workers, and David Dubinsky of the International Ladies' Garment Workers--formed the Committee for Industrial Organization (CIO) to organize unskilled workers in America's mass production industries. The CIO formed unions in the auto, glass, radio, rubber, and steel industries, and by the end of 1937, it had more members than the AFL--3.7 million against 3.4 million.

A 44-day sit-down strike in Flint, Michigan, in 1937 forced General Motors to recognize the United Auto Workers. A few weeks later, U.S. Steel accepted unionization without a strike, but the "Little Steel" companies--Bethlehem, Inland, National, Republic, and Youngstown Sheet & Tube--vowed to resist the steel workers union. 75,000 workers walked out and violence flared. In May 1937, police in South Chicago opened fire on marchers at the Republic mill, killing ten. Soon after, the strike was broken, but in 1941, the National Labor Relations Board ordered "Little Steel" to recognize the United Steelworkers of America and reinstate all workers fired for union activity.

Social Security

A goal of reformers since the Progressive Era, the 1935 Social Security Act aimed to alleviate the plight of America's visible poor--the elderly, dependent children, and the handicapped. A major political victory for Roosevelt, the Social Security Act was a triumph of social legislation. It offered workers 65 or older monthly stipends based on previous earnings, and it gave the indigent elderly small relief payments, financed by the federal government and the states. In addition, it provided assistance to blind and handicapped Americans, and to dependent children who did not have a wage-earning parent. The act also established the nation's first federally sponsored system of unemployment insurance. Mandatory payroll deductions levied equally on employees and employers financed both the retirement system and the unemployment insurance.

While conservatives argued that the Social Security Act placed the United States on the road to socialism, the legislation was also profoundly disappointing to reformers, who demanded "cradle to grave" protection as the birthright of every American. The new system authorized pitifully small payments; its retirement system left huge groups of workers uncovered, such as migrant workers, civil servants, domestic servants, merchant seamen, and day laborers; its budget came from a regressive tax scheme that placed a disproportionate tax burden on the poor; and it failed to provide health insurance.

Despite these criticisms, the Social Security Act introduced a new era in American history. It committed the government to a social welfare role by providing for elderly, disabled, dependent, and unemployed Americans. By doing so, the act greatly expanded the public's sense of entitlement, and the support people expected government to give to all citizens.

African Americans and the New Deal

Until the New Deal, blacks had shown their traditional loyalty to the party of Lincoln by voting overwhelmingly Republican. By the end of Roosevelt's first administration, however, one of the most dramatic voter shifts in American history had occurred. In 1936, 75 percent of black voters supported the Democrats. Blacks turned to Roosevelt in part because his spending programs gave them a measure of relief from the depression and in part because the GOP had done little to repay their earlier support.

Still, Roosevelt's record on civil rights was modest at best. Instead of using New Deal programs to promote civil rights, the administration consistently bowed to discrimination. In order to pass major New Deal legislation, Roosevelt needed the support of southern Democrats. Time and time again, he backed away from equal rights to avoid antagonizing southern whites, although his wife, Eleanor, did take a public stand in support of civil rights.

Most New Deal programs discriminated against blacks. The NRA, for example, not only offered whites the first crack at jobs but authorized separate and lower pay scales for blacks. The Federal Housing Administration (FHA) refused to guarantee mortgages for blacks who tried to buy in white neighborhoods, and the CCC maintained segregated camps. Furthermore, the Social Security Act excluded those job categories blacks traditionally filled.

The story in agriculture was particularly grim. Since 40 percent of all black workers made their living as sharecroppers and tenant farmers, the AAA acreage reduction hit blacks hard. White landlords could make more money by leaving land untilled than by putting land into production. As a result, the AAA's policies forced more than 100,000 blacks off the land in 1933 and 1934. Even more galling to black leaders, the president failed to support an anti-lynching bill and a bill to abolish the poll tax. Roosevelt feared that conservative southern Democrats, who had seniority in Congress and controlled many committee chairmanships, would block his bills if he tried to fight them on the race question.

Yet the New Deal did record a few gains in civil rights. Roosevelt named Mary McLeod Bethune, a black educator, to the advisory committee of the National Youth Administration (NYA), and thanks to her efforts, blacks received a fair share of NYA funds. The WPA was colorblind, and blacks in northern cities benefited from its work relief programs. Harold Ickes, a strong supporter of civil rights who had several blacks on his staff, poured federal funds into black schools and hospitals in the South. Most blacks appointed to New Deal posts, however, served in token positions as advisors on black affairs. At best they achieved a new visibility in government.

Mexican Americans

In February 1930 in San Antonio, Texas, 5,000 Mexicans and Mexican Americans gathered at the city's railroad station to depart the United States for settlement in Mexico. In August, a special train carried another 2,000 to central Mexico.

Most Americans are familiar with the forced relocation in 1942 of 112,000 Japanese Americans from the West Coast to internment camps. Far fewer are aware that during the Great Depression, the Federal Bureau of Immigration (after 1933, the Immigration and Naturalization Service) and local authorities rounded up Mexican immigrants and naturalized Mexican American citizens and shipped them to Mexico to reduce relief rolls. In a shameful episode, more than 400,000 repatriados, many of them citizens of the United States by birth, were sent across the U.S.-Mexico border from Arizona, California, and Texas. Texas's Mexican-born population was reduced by a third. Los Angeles also lost a third of its Mexican population. In Los Angeles, the only Mexican American student at Occidental College sang a painful farewell song to serenade departing Mexicans.

Even before the stock market crash, there had been intense pressure from the American Federation of Labor and municipal governments to reduce the number of Mexican immigrants. Opposition from local chambers of commerce, economic development associations, and state farm bureaus stymied efforts to impose an immigration quota, but rigid enforcement of existing laws slowed legal entry. In 1928, United States consulates in Mexico began to apply with unprecedented rigor the literacy test legislated in 1917.

After President Hoover appointed William N. Doak as secretary of labor in 1930, the Bureau of Immigration launched intensive raids to identify aliens liable for deportation. The secretary believed that removal of undocumented aliens would reduce relief expenditures and free jobs for native-born citizens. Altogether, 82,400 were involuntarily deported by the federal government.

Federal efforts were accompanied by city and county pressure to repatriate destitute Mexican American families. In one raid in Los Angeles in February 1931, police surrounded a downtown park and detained some 400 adults and children. The threat of unemployment, deportation, and loss of relief payments led tens of thousands of people to leave the United States.

The New Deal offered Mexican Americans a little help. The Farm Security Administration established camps for migrant farm workers in California, and the CCC and WPA hired unemployed Mexican Americans on relief jobs. Many, however, did not qualify for relief assistance because as migrant workers they did not meet residency requirements. Furthermore, agricultural workers were not eligible for benefits under workers' compensation, Social Security, and the National Labor Relations Act.

Native Americans

The so-called "Indian New Deal" was the only bright spot in the administration's treatment of minorities. In the late nineteenth century, American Indian policy had begun to place a growing emphasis on erasing a distinctive Native American identity. To weaken the authority of tribal leaders, Congress in 1871 ended the practice of treating tribes as sovereign nations. To undermine older systems of tribal justice, Congress, in 1882, created a Court of Indian Offenses, to try Indians who violated government laws and rules. Indian schools took Indian children away from their families and tribes and sought to strip them of their tribal heritage. School children were required to trim their hair and speak English and were prohibited from practicing Indian religions.

The culmination of these policies was the 1887 Dawes Act, which allocated reservation lands to individual Indians. The purpose of the act was to encourage Indians to become farmers, but the plots were too small to support a family or to raise livestock. Government policies reduced Indian-owned lands from 155 million acres to just 48 million acres in 1934.

When Roosevelt became president in 1933, he appointed John Collier, a leading reformer, commissioner of Indian affairs. At Collier's request, Congress created the Indian Emergency Conservation Program (IECP), a CCC-type project for the reservations which employed more than 85,000 Indians. Collier also made certain that the PWA, WPA, CCC, and NYA hired Native Americans.

Collier had long been an opponent of the 50-year-old government allotment program which broke up and distributed tribal lands. In 1934 he persuaded Congress to pass the Indian Reorganization Act, which terminated the allotment program of the Dawes Severalty Act of 1887; provided funds for tribes to purchase new land; offered government recognition of tribal constitutions; and repealed prohibitions on Native American languages and customs. That same year, federal grants were provided to local school districts, hospitals, and social welfare agencies to assist Native Americans.

The New Deal in Decline

In 1936, President Roosevelt was overwhelmingly reelected. He carried every state but Maine and Vermont, easily defeating the Republican candidate Governor Alf Landon of Kansas. Democrats won an equally lopsided victory in the congressional races: 331 to 89 in the House and 76 to 16 in the Senate.

In his second inaugural address in early 1937, Franklin Roosevelt promised to press for new social legislation. "I see one-third of a nation ill-housed, ill-clad, ill-nourished," he told the country. Yet instead of pursuing new reforms, he allowed his second term to bog down in political squabbles. He wasted his energies on an ill-conceived battle with the Supreme Court and an abortive effort to purge the Democratic Party.

Court Packing

On "Black Monday," May 27, 1935, the Supreme Court struck down a basic part of Roosevelt's program of recovery and reform. A kosher chicken dealer sued the government, charging that the NRA was unconstitutional. In its famous "dead chicken" decision, Schechter v. the U.S., the court agreed, declaring that Congress had delegated excessive authority to the president and had improperly involved the federal government in regulating intrastate commerce. Complained Roosevelt, "We have been relegated to the horse-and-buggy definition of interstate commerce."

In January 1936, the court ruled another of the measures enacted during the 100 days--the Agricultural Adjustment Act--unconstitutional. Then, six months later, the high court declared a New York state minimum wage law invalid. Roosevelt was aghast. The court, he feared, had established a "'no-man's land' where no Government--State or Federal--can function."

Roosevelt feared that every New Deal reform--such as the prohibition on child labor or the regulation of wages and hours--was at risk. In 1936, his supporters in Congress responded by introducing over a hundred bills to curb the judiciary's power. After his landslide reelection, Roosevelt proposed a controversial plan to reorganize the Supreme Court--the "court-packing scheme." In an effort to make his opponents on the Court resign so he could replace them with justices more sympathetic to his policies, he announced a plan to add one new member to the Supreme Court for every justice who had reached the age of 70 without retiring (six justices were over 70). To offer a carrot with the stick, Roosevelt also outlined a generous new pension program for retiring federal judges.

The court-packing scheme was a political disaster. Conservatives and liberals alike denounced Roosevelt for attacking the separation of powers and critics accused him of trying to become a dictator. Fortunately, the Court itself ended the crisis by shifting ground. In two separate cases the Court upheld the Wagner Act and approved a Washington state minimum wage law, furnishing proof that it had softened its opposition to the New Deal.

Yet Roosevelt remained too obsessed with the battle to realize he had won the war. He lobbied for the court-packing bill for several months, squandering his strength on a struggle that had long since become a political embarrassment. In the end, the only part of the president's plan to gain congressional approval was the pension program. Once it passed, Justice Willis Van Devanter, the most obstinate New Deal opponent on the Court, resigned. By 1941 Roosevelt had named five justices to the Supreme Court. Few legacies of the president's leadership proved more important, for the new "Roosevelt Court" significantly expanded the government's role in the economy and in civil liberties.

The Depression of 1937

The sweeping Democratic electoral victory in 1936 was followed by a deep economic relapse known as the "Roosevelt Recession." In just a few months, industrial production fell by 40 percent; unemployment rose by 4 million; stock prices plunged 48 percent.

Several factors contributed to the "little depression." Reassured by good economic news in 1936, Roosevelt slashed government spending the following year. The budget cuts knocked the economy into a tailspin. Roosevelt's virulent attacks on "economic royalists" also undermined business confidence.

By the end of 1938 the reform spirit was gone. A conservative alliance of southern Democrats and northern Republicans in Congress blocked all efforts to expand the New Deal. In the congressional elections of 1938, Roosevelt campaigned against five conservative senators who opposed the New Deal. All won reelection. Roosevelt's failures showed conservative Democrats that they could defy the president with impunity.

Popular Culture During the Great Depression

The popular culture of the 1930s was fraught with contradictions. It was, simultaneously, a decade of traditionalism and of modernist experimentation, of sentimentality and "hard-boiled" toughness, of longings for a simpler past and fantastic dreams of the future.

It was a decade in which many Americans grew increasingly interested in tradition and folk culture. Under the leadership of Alan Lomax, the Library of Congress began to collect folk songs, while folk singers like Woody Guthrie and Pete Seeger attracted large audiences.

Henry Ford, who had revolutionized the American landscape through the mass production of cars, devoted his energies and fortune to a new project: Greenfield Village, a collection of historic homes and artifacts, located near Detroit. At the same time, the Rockefeller family restored colonial Williamsburg in Virginia.

Seeing modern society as excessively individualistic and fragmented, many prominent intellectuals looked to the past. Eleven leading white southern intellectuals, known as the Southern Agrarians, issued a manifesto titled I'll Take My Stand, urging a return to an agrarian way of life. Another group of distinguished intellectuals known as the New Humanists, led by Irving Babbitt and Paul Elmer More, extolled classical civilization as a bulwark against modern values. One of the decade's leading social critics was Lewis Mumford. In volumes like Technics and Civilization (1934), Mumford examined how the values of a pre-machine culture could be blended into modern capitalist civilization.

And yet, for all the emphasis on tradition, the '30s was also a decade in which modernism in architecture and the arts became increasingly pronounced. Martha Graham developed American modern dance. William Faulkner experimented with "stream-of-consciousness" in novels like As I Lay Dying (1930), while John Dos Passos's avant-garde U.S.A. trilogy combined newspaper headlines, capsule biographies, popular song lyrics, and fiction to document the disintegration of depression-era society. The architect R. Buckminster Fuller and the industrial designer Walter Dorwin Teague employed curves and streamlining to give their projects a modern appearance. Nothing better illustrated the concern with the future than the 1939 New York World's Fair, the self-proclaimed "Fair of the Future," which promised to show fairgoers "the world of tomorrow."

Beset by deep anxieties and insecurities, many Americans in the 1930s hungered for heroes. Popular culture offered many: superheroes like Superman and Batman, who appeared in the new comic books of the '30s; tough, hard-boiled detectives in the fiction of Dashiell Hammett and Raymond Chandler; and radio heroes like "The Lone Ranger" or "The Shadow."

The depression was, in certain respects, a powerful unifying experience. A new phrase, "the American way of life," entered the language--as did public opinion polls and statistical surveys that gave the public a better sense of what the "average American" thought, voted, and ate. The new photojournalism that appeared in new magazines like Life helped to create a common frame of reference. Yet regional, ethnic, and class differences occupied an important place in the literature of the 1930s. The great novels of the decade successfully combined social criticism and rich detail about the facts of American life in specific social settings. In his novels of fictional Yoknapatawpha County, William Faulkner explored the traditions and history of the South. James T. Farrell's Studs Lonigan trilogy (1932-1935) analyzed the impact of urban industrial decay on Catholic youth, while Henry Roth's Call It Sleep (1934) analyzed the assimilation of Jewish youth to American life. John Steinbeck's The Grapes of Wrath (1939) examined the struggle of a poor Oklahoma farming family migrating to California. Richard Wright's classic Native Son (1940) analyzed the ways that poverty and prejudice in Chicago drove a young African American to crime.

Hollywood during the Great Depression

During the Great Depression, Hollywood played a valuable psychological role, providing reassurance to a demoralized nation. Even at the depression's depths, 60 to 80 million Americans attended movies each week.

During the depression's earliest years, movies reflected a despairing public's mood, as Tommy-gun toting gangsters, haggard prostitutes, and sleazy backroom politicians and lawyers appeared on the screen. Screen comedies released in these years expressed an almost anarchistic disdain for traditional institutions and values. The Marx Brothers spoofed everything from patriotism to universities; W.C. Fields ridiculed families; and Mae West used sexual innuendo to poke fun at the middle class code of sexual propriety.

A renewed sense of optimism generated by the New Deal combined with industry self-censorship to produce new kinds of films during the depression's second half. G-men, detectives, western heroes, and other defenders of law and order replaced gangsters. Audiences enjoyed Frank Capra comedies and dramas in which a little man stands up against corruption and restores America to itself. A new comic genre arose--the screwball comedy--which presented a world in which rich heiresses wed impoverished young men, keeping alive a vision of America as a classless society.

In the face of economic disaster, the fantasy world of the movies sustained a traditional American faith in individual initiative, in government, and a common American identity transcending social class.

Legacy of the New Deal

The New Deal did not end the Depression. Nor did it significantly redistribute income. But it did provide Americans with economic security they had never known before. Its legacies include unemployment insurance, old age insurance, and insured bank deposits. The Wagner Act reduced violence in labor relations. The Securities and Exchange Commission protected the stock market investments of millions of small investors. The Federal Housing Administration and Fannie Mae enabled a majority of Americans to become homeowners.

But the New Deal's greatest legacy was a shift in government philosophy. As a result of the New Deal, Americans came to believe that the federal government has a responsibility to ensure the health of the nation's economy and the welfare of its citizens.
