
About the United States

GEOGRAPHY

The vast and varied expanse of the United States of America stretches from the heavily industrialized, metropolitan Atlantic seaboard, across the rich flat farms of the central plains, over the majestic Rocky Mountains to the fertile, densely populated west coast, then halfway across the Pacific to the semi-tropical island-state of Hawaii. Without Hawaii and Alaska the continental U.S. measures 4,505 kilometers from its Atlantic to Pacific coasts, 2,574 kilometers from Canada to Mexico; it covers 9,372,614 square kilometers. In area, it is the fourth largest nation in the world (behind the Soviet Union, Canada and China).

The sparsely settled far-northern state of Alaska is the largest of America's 50 states, with a land mass of 1,477,887 square kilometers. Alaska is nearly 400 times the size of Rhode Island, the smallest state; yet Alaska, with 521,000 people, has only half the population of Rhode Island.

Airlines serve 817 cities throughout the country. A flight from New York to San Francisco takes five-and-a-half hours. Train service is also available: The most frequent service is between Washington, D.C., New York and Boston in the East; St. Louis, Chicago and Milwaukee in the Midwest; and San Diego, Los Angeles and San Francisco in the West. A coast-to-coast trip by train takes three days. The major means of intercity transportation is the automobile. Motorists can travel over an interstate highway system of 88,641 kilometers, which feeds into another 6,365,590 kilometers of roads and highways connecting virtually every city and town in the United States. A trip by automobile from coast to coast takes five to six days.

America is a land of physical contrasts, including the weather. The southern parts of Florida, Texas and California, and the entire state of Hawaii, have warm temperatures year round; most of the United States is in the temperate zone, with four distinct seasons and varying numbers of hot and cold days each season, while the northern tier of states and Alaska have extremely cold winters. The land varies from heavy forests covering some 210.4 million hectares, to barren deserts; from high-peaked mountains (Mount McKinley in Alaska rises to 6,193.5 meters), to land below sea level (Death Valley in California lies 86 meters below sea level).

The United States is also a land of bountiful rivers and lakes. The northern state of Minnesota, for example, is known as the land of 10,000 lakes. The broad Mississippi River system, of great historic and economic importance to the U.S., runs 5,969 kilometers from Canada into the Gulf of Mexico—the world's third longest river after the Nile and the Amazon. A canal south of Chicago joins one of the tributaries of the Mississippi to the five Great Lakes—making it the world's largest inland water transportation route and the biggest body of fresh water in the world. The St. Lawrence Seaway, which the U.S. shares with Canada, connects the Great Lakes with the Atlantic Ocean, allowing seagoing vessels to travel 3,861 kilometers inland, as far as Duluth, Minnesota, during the spring, summer and fall shipping season.

America's early settlers were attracted by the fertile land along the Atlantic coast in the southeast and inland beyond the eastern Appalachian mountains. As America expanded westward, so did its farmers and ranchers, cultivating the grasslands of the Great Plains and finally the fertile valleys of the Pacific Coast. Today, with some 121.4 million hectares under cultivation, American farmers plant spring wheat on the cold western plains; raise corn, wheat and fine beef cattle in the Midwest, and rice in the damp heat of Louisiana. Florida and California are famous for their vegetable and fruit production, and the cool, rainy northwestern states are known for apples, pears, berries and vegetables.

Underground, a wealth of minerals provides a solid base for American industry. History has glamorized the gold rushes to California and Alaska and the silver finds in Nevada. Yet America's yearly production of gold ($2,831,000,000) is far exceeded by the value of its petroleum, natural gas, clays, phosphates, lead and iron, even its output of sand, cement and stone for construction. Production value of crude oil alone is about $4.2 thousand million annually, pumped from petroleum reserves that range from the Gulf of Mexico to Alaska's North Slope.

POPULATION TRENDS

America has long been known as an ethnic "melting pot." Its current population is 252.5 million, made up of immigrants or their descendants from virtually every country in the world. It is believed that the first people to arrive—from Siberia, more than 10,000 years ago—were the Native Americans or the American Indians. Today, nearly 1.5 million American Indians and Eskimos live in the United States, many on tribal lands set aside for them in 31 states.

Europe, the major source of U.S. immigration, began sending colonists to America in the early 17th century, primarily from northern and western Europe. Immigration peaked in the period from 1880 to 1920, when tens of millions of immigrants entered the United States, with the largest percentage during that period coming from southern and eastern Europe.

Black Americans, who today number 30.79 million, constitute the largest single ethnic minority in the country. They were first brought to the New World as slaves in the 17th, 18th and early 19th centuries. In the 20th century large numbers of blacks, who historically lived in the South, migrated to the large industrial cities of the North in search of jobs and a better way of life. Hispanics, who number 20.5 million and live primarily in the Southwest, are the next largest ethnic minority group in the United States. Sixty percent are Mexican-Americans, with the remainder from Central and South America. The Hispanic community is extremely varied, and includes large Puerto Rican populations in many eastern cities as well as a growing Cuban-American presence in Miami, Florida. The United States' population has also absorbed nearly 6.5 million Asians (from China, Hong Kong, Japan, Laos, the Philippines, Vietnam, South Korea, Cambodia and Thailand). Many Asian Americans live in Hawaii, where more than two-thirds of the population claim an Asian or Polynesian heritage.

Once a nation of farmers, the United States has become increasingly urban since the turn of the century. Today, 77 percent of the population lives in or near cities, and only 1.9 percent of the population lives on farms. In 1988, the United States counted 10 metropolitan areas of over one million people, and 175 cities with 100,000 or more people.

Since 1930, suburbs have grown faster than the cities (as middle-class residents have left the crowded living conditions of most large cities). Suburbs are defined as residential areas within commuting distance to large cities. Most people who live in suburbs own their own homes and commute to work in the city, or they work in nearby offices and factories that have relocated to the suburbs.

Americans as a nation tend to be quite mobile. Over a five-year period, one family in 10 moves to a new state. In general, the population currently is shifting south and westward. California has passed New York as the most populous state, although the metropolitan area of New York City (population: 18.1 million) remains the nation's largest, with Los Angeles second (13.7 million) and Chicago third (8.181 million).

During the period from 1945 to 1964, the number of children born in the United States increased dramatically; a total of 76 million babies were born during this period. This sharp increase became known as the "baby boom." As this group, known as the baby boomers, has grown to adulthood, it has brought significant economic, cultural and social changes to the American population.

POLITICAL SYSTEM

The nation's capital, Washington, D.C., has the 10th largest metropolitan population in the country, with a population of over 3.9 million. Laid out by the French architect Pierre L'Enfant in the late 18th century, it was the world's first city especially planned as a center of government.

The city of Washington, in the District of Columbia along the Potomac River, is the capital of a federal union of 50 states. When the United States declared its independence from Great Britain on July 4, 1776 (now celebrated as a national holiday), there were 13 original states—each one sovereign, each wanting to control its own affairs. The states tried to keep their sovereignty and independence within a loose confederation, but their attempt proved ineffectual. Therefore, in 1789, they adopted a new Constitution establishing a federal union under a strong central government.

The original 13 states were grouped along the Atlantic Coast. As the frontier moved westward, large areas of what is now the continental United States were added by purchase, treaty and annexation. As each state was settled, governments were first organized as territories and later entered the Union as states when their territorial legislatures petitioned the Congress for admission. There are now 50 states. Alaska and Hawaii, the last states to enter the Union, did so in 1959.

Under the Constitution, the states delegated many of their sovereign powers to this central government in Washington. But they kept many important powers for themselves. Each of the 50 states, for example, retains the right to run its own public school system, to decide on the qualifications of its voters, to license its doctors and other professionals, to provide police protection for its citizens and to maintain its roads.

In actual practice, and in line with the American tradition of keeping government as close to the people as possible, the states delegate many of these powers to their political subdivisions—counties, cities, towns and villages. Thus, at the lowest political level, residents of small American communities elect village trustees to run their police and fire departments, and elect a board of education to run their schools. On the county level, voters elect executives who are responsible for roads, parks, libraries, sewage and other services, and elect or appoint judges for the courts. The citizens of each state also elect a governor and members of the state legislature.

In addition to the 50 states and the District of Columbia, citizens of the Commonwealth of Puerto Rico, the Commonwealth of the Northern Mariana Islands, Guam, the Virgin Islands and American Samoa vote in federal elections. United States possessions include the Pacific islands of Wake, Midway, Jarvis, Howland, Baker, Johnston Atoll and Kingman Reef. The United States administers the Republic of Palau under United Nations auspices. Two entities, the Federated States of Micronesia and the Republic of the Marshall Islands, have become sovereign self-governing states in free association with the United States.

Under the Constitution, the federal government is divided into three branches, each chosen in a different manner, each able to check and balance the others.

The Executive Branch is headed by the President, who, together with the Vice President, is chosen in nationwide elections every four years (in every year divisible by four). The elective process for a U.S. President is unique. Americans vote for slates of presidential electors equal to the number of Senators and Representatives each state has in Congress (a total of 538 persons, including three electors for the District of Columbia). The candidate with the highest number of votes in each state wins all the electoral votes of that state. The presidential candidate needs 270 electoral votes to be elected; if no candidate has a majority, the House of Representatives makes the decision. (In all other state and local elections, voters cast their votes directly for the candidate or referendum on that particular ballot.) Any natural-born American who is 35 years old or older may be elected to this office. The President proposes bills to Congress, enforces federal laws, serves as Commander-in-Chief of the Armed Forces and, with the approval of the Senate, makes treaties and appoints federal judges, ambassadors and other members of the Executive Departments (the Departments of State, Defense, Commerce, Justice, etc.). Each Cabinet head holds the title of Secretary and together they form a council called the Cabinet.

The Vice President, elected from the same political party as the President, acts as chairman of the Senate, and in the event of the death or disability of the President, assumes the Presidency for the balance of his term.

The Legislative Branch is made up of two houses: the Senate and the House of Representatives. The 435 seats in the House of Representatives are allocated on the basis of population, although every state has at least one representative. Each state elects two members of the 100-member Senate; a Senator's term of office is six years.

Both houses must approve a bill for it to become law, but the President may veto or refuse to sign it. If so, Congress reconsiders the bill. If two-thirds of the members of both houses then approve it, the bill becomes law even without the President's signature.

The Judicial Branch is made up of Federal District Courts (at least one in every state), 11 Federal Courts of Appeals and, at the top, the Supreme Court. Federal judges are appointed by the President with the approval of the Senate; to minimize political influences, their appointments are for life. Federal courts decide cases involving federal law, conflicts between states or between citizens of different states. An American who feels he has been convicted under an unjust law may appeal his case all the way to the Supreme Court, which may rule that the law is unconstitutional. The law then becomes void.

In order to amend the Constitution, Congress must pass the proposed amendment by a two-thirds majority vote in each house, and three-fourths of the states must concur. In more than 195 years, the Constitution has been amended 26 times. The first 10 Amendments— the Bill of Rights—guarantee individual liberties: freedom of speech, religion and assembly, the right to a fair trial, the security of one's home. Later amendments chronicle America's struggle for equality and justice for all of its people. These amendments abolish slavery, prohibit any denial of rights because of race, grant the vote to women and to citizens of the District of Columbia and allow citizens to vote at age 18.

ECONOMY

The American economy is a free enterprise system that has emerged from the labors of millions of American workers; from the wants that tens of millions of consumers have expressed in the marketplace; from the efforts of thousands of private business people; and from the activities of government officials at all levels who have undertaken the tasks that individual Americans cannot do.

The nation's income and productivity have risen enormously over the past 70 years. In this period, the money for personal consumption tripled in real purchasing power. The gross national product per capita quadrupled, reflecting growth in worker productivity.

Together, all sectors of the American economy produce almost $4,000 thousand million worth of goods and services annually, and each year they turn out almost $190,000 million more. The consumption of these goods and services is spread widely. Most Americans consider themselves members of the middle economic class, and relatively few are extremely wealthy or extremely poor. According to U.S. Census Bureau figures, 9.6 percent of all American families make more than $50,000 a year, and 7.7 percent of all American families have incomes less than $10,000; the median annual income for all American families is $28,906.

Americans live in a variety of housing that includes single detached homes (62 percent), with a median cost of $112,500. They also live in apartments, town-houses and mobile homes. Three-fourths of all married couples own their own homes. Dwelling units have grown in living space: the median number of rooms occupied in each dwelling unit has increased from 4.9 rooms in 1960 to 5.2 rooms today, despite the shrinking family size. About 3.6 percent of all Americans live in public (government-supplied or subsidized) housing.

The government plays an important role in the economy, as is the case in all countries. From the founding of the Republic, the U.S. federal government has strongly supported the development of transportation. It financed the first major canal system and later subsidized the railroads and the airlines. It has developed river valleys and built dams and power stations. It has extended electricity and scientific advice to farmers, and assures them a minimum price for their basic crops. It checks the purity of food and drugs, insures bank deposits and guarantees loans.

America's individual 50 states have been most active in building roads and in the field of education. Each year the states spend some $33.31 thousand million on schools and provide a free public education for 29.1 million primary-school pupils and 11.4 million youths in secondary schools. (In addition, 8.3 million youths attend private primary and secondary schools.) Approximately 60 percent of the students who graduate from secondary schools attend colleges and universities, 77.2 percent of which are supported by public funds. The U.S. leads the world in the percentage of the population that receives a higher education. Total enrollment in schools of higher learning is 13.4 million.

Despite the fact that the United States government supports many segments of the nation's economy, economists estimate that the public sector accounts for only one-fifth of American economic activity, with the remainder in private hands. In agriculture, for example, farmers benefit from public education, roads, rural electrification and support prices, but their land is private property to work pretty much as they desire. Some 86.7 percent of America's 2,088,000 farms are owned by the people who operate them; the rest are owned by business corporations. With increasingly improved farm machinery, seed and fertilizers, more food is produced each year, although the number of farmers decreases annually. There were 15,669,000 people living on farms in 1960; by 1989 that total had decreased to 4,801,000. Farm output has increased dramatically: just 50 years ago a farmer fed 10 persons; today the average farmer feeds 75. America exports some $40.9 thousand million worth of farm products each year. The United States produces as much as half the world's soybeans and corn for grain, and from 10 to 25 percent of its cotton, wheat, tobacco and vegetable oil.

The bulk of America's wealth is produced by private industries and businesses—ranging from giants like General Motors, which sells $96,371 million worth of cars and trucks each year—to thousands of small, independent entrepreneurs. In 1987, some 233,710 small businesses were started in the U.S. Yet by one count, some 75 percent of American products currently face foreign competition within markets in the United States. America has traditionally supported free trade. In 1989, the U.S. exported $360,465 million in goods and imported $475,329 million.

GEOGRAPHY AND ENVIRONMENT

By Michael Cusack

(Senior Editor, Scholastic Magazine)

Lonely and isolated, early settlements of northern Europeans in the New World huddled near the Atlantic coastline. The Spanish missions on the Pacific coast of the continent and in the area of what is now Mexico, 3,000 miles (4,800 kilometers) distant, were unknown to them. To the east, 3,000 miles of angry ocean separated the settlers from their former homelands. Inland, they faced a vast, frightening, unknown and mostly forested wilderness.

LURE OF THE WILDERNESS

Settlers viewed the wilderness both as a source of danger and a source of wealth. Dangerous beasts posed a serious threat anywhere outside the settlements. The native peoples were numerous, and their reaction to the sudden appearance of new settlers unpredictable. On occasion, scouts and explorers vanished into the immensity of the forests.

Though the wilderness awed the early settlers, it also attracted them. It provided a wealth of timber for building and use as fuel. In many areas, game animals were plentiful. Beavers, foxes and other small animals provided valuable furs for sale to Europe.

Historic first photograph of the geyser known as "Old Faithful," located in Yellowstone National Park. Yellowstone is a one-million-hectare wilderness area in the state of Wyoming that was established as the nation's first national park in 1872. (Eastman House)

Explorers who did return from the continent's interior told stories of high mountains, great fertile valleys, grassy plains, mighty rivers and lakes as big as inland seas. To the people who told the stories and to the people who heard them, the wealth and resources of the New World appeared limitless. All that seemed necessary was courage and hard work to create a paradise on Earth.

NEW FRONTIER

People made the difficult and dangerous voyage to America for many different reasons. Some sought adventure. Others wanted gold and silver. Many made the voyage to escape oppression or to be free to practice their religion. Beyond these reasons was the additional drive for living space. Very few of the settlers could have hoped to own land in Europe. But in America, the land seemed to be there for the taking.

Waves of land-hungry settlers established farms and homesteads in the primeval forest. The forest was so vast and overpowering that each clearing was viewed as a victory in "taming the wilderness."

In some places, however, after settlers cut away the trees and removed the brush, they found the soil to be rocky or poor in nutrients. Many areas of New England—the region now made up of Massachusetts, Maine, Connecticut, Rhode Island, New Hampshire and Vermont—have shallow, coarse soil. The winters are harsh and the growing seasons are short.

Under these conditions, pioneer farming in much of New England and parts of New York, New Jersey and Pennsylvania proved to be difficult and disappointing. After years of struggle, some people sold or abandoned their farms and moved westward in search of more fertile land.

Farther south, in what are now the states of Delaware, Maryland, Virginia, North and South Carolina, and Georgia, the soil was generally richer. Except for some swampy coastal areas, the soil is mostly red-yellow clay. And in early colonial times, that soil was very fertile. The long growing season, abundant rainfall, warm climate and relatively flat land made the southern coastal region ideal for certain cash crops. At various times, these included tobacco, rice, sugarcane, corn and cotton. By the mid-1600s, it was clear that these crops could be grown most economically on large landholdings—plantations— worked by slaves.

Many small farms in the South were sold to help make up the large plantations, and their former owners moved west in search of fertile land. They were joined by new settlers from Europe, who bypassed the settled plantation areas near the coast.

Most soil in the South was originally very fertile, but the continuous growing of demanding crops such as tobacco and cotton took nutrients from the soil. In addition, the frequent heavy rains of the region tended to erode—wear away—exposed topsoil. In many areas, this led to a decline in the yield of crops per hectare. Plantation owners often dealt with this problem of worn-out soil by expanding—by buying and using more land for their cash crops. Plantations spread to the west.

SPIRIT OF THE FRONTIER

The European population of what is now the United States was scattered over a large area. The population density was very low. Farms, towns and villages were spaced far apart. Also, farms tended to be much larger than farms in Europe, partially because the yield per hectare was lower in America.

Except in the area near the seacoast, communication links between early American settlements were very poor. Roads were few and far between; those that existed were usually in terrible shape. To an extent, rivers served as communication links, but waterfalls and rapids often limited their usefulness.

As one traveled inland, the isolation of settlements increased. In search of fertile land, groups of settlers often bypassed large areas that they considered to be wilderness. As a result, a small settlement might be hundreds of miles from other settlements. A family might be a day's journey from another family. This pattern of settlement created frontier communities that had to rely completely on their own resources. Almost everything they used they had to make themselves. They developed their own music, entertainment, folklore, art and forms of religious worship.

In this setting, a frontier spirit developed. It was marked by toughness, independence, self-reliance and caring for others, but also by a suspicion of outsiders. There was also a restlessness and a sense of curiosity in the frontier spirit.

At any given time during the long westward flow of American settlement (from the early 1600s to the late 1800s), the number of people on the frontier was tiny compared to the number of people "back East" in the settled areas. Yet, the frontier spirit has always had an enormous influence on the entire nation. Politicians have praised the frontier life. Songs and stories have described it in glowing terms. Such frontier heroes as Daniel Boone, Davy Crockett and Jim Bowie have been admired by generations of Americans.

Writing in the 1890s, historian Frederick Jackson Turner claimed that the frontier experience shaped the American character for all time. In his opinion, the geography and environment of America— particularly the westward expansion and the availability of free land—shaped American attitudes and institutions.

Turner wrote: "This perennial (enduring) rebirth, this fluidity of American life, this expansion westward with its new opportunities, its continuous touch with the simplicity of primitive society, furnish the forces dominating American character." Not every modern historian agrees with all of Turner's ideas, but most historians agree that the westward expansion of the frontier has had great significance in American history.

Many values and attitudes—good and bad—of present-day America can be traced to the frontier experience. The westward expansion stressed values of ruggedness, resourcefulness, self-reliance and comradeship. There was always a greater sense of equality on the frontier than in long-settled areas. After the American Civil War (1861-1865), many black Americans who had been recently freed from slavery moved west in search of equal opportunities. Many of these black frontiersmen gained some fame and fortune as cowboys, miners and prairie settlers.

In 1869, the western territory (later the state) of Wyoming became the first place in the world where women could vote and hold elected office.

Because the resources of the West seemed limitless, people developed wasteful attitudes and practices. Herds of buffalo (American bison) were slaughtered. Dry, flat grasslands (prairies) were badly farmed, and in years of drought much of the exposed soil blew away as dust. Open mines were used and abandoned. The western frontier was so large, and people there were so few, that it seemed that the natural resources could never be used up or destroyed. In more recent years, Americans have tried to conserve their resources better.

CROSSING BARRIERS

Westward expansion of European settlements in North America was not a steady movement. It took place as a series of uneven spurts and pauses. Several barriers—geographical, social and political—slowed the westward movement at various times.

In 1700, European settlements in English-speaking North America stretched along the Atlantic coastline from southern Maine to South Carolina. While most settlements were less than 50 miles (80 kilometers) from the coast, a few were located further inland along tidal rivers. This was called "the tidewater phase" of settlement.

Over the next 50 years, the fertile river valleys in New England were settled; so was the valley of the Mohawk River in upstate New York. Settlers moving west from Philadelphia cleared oak forests of central Pennsylvania and produced an area of hilly green farms.

Settlements also spread westward along river valleys in Virginia and, to a lesser extent, in the Carolinas and Georgia. By the 1760s, the westward movement reached the first major barrier—the Appalachian Mountain Range. This mountain range stretches from the northeast to the southwest, roughly paralleling the Atlantic coastline.

When they reached the foothills of the Appalachians, settlers found that most rivers flowing from west to east were blocked by waterfalls or rapids.

For a number of years, westward expansion was blocked. Then in 1775 explorer Daniel Boone (1734-1820) and a party of axmen cut the Wilderness Road through the forested Cumberland Gap, a natural pass in the Appalachians. The road through the Cumberland Gap enabled settlers to move with mules, horses and cattle into the fertile lands that now make up the states of Kentucky and Tennessee.

From 1776 to 1783, Britain's 13 American colonies formed the United States of America and fought a War for Independence. The war was ended by the Treaty of Paris in 1783. The treaty set the western boundary of the United States at the Mississippi River, which flows south from near the Canadian border to the Gulf of Mexico at the port of New Orleans.

Peace led to a great westward movement of people into the new American territories between the Appalachian Mountains and the Mississippi River. The Ohio River, flowing from Fort Pitt (now Pittsburgh) westward to the Mississippi, served as a major highway for the settlers and their goods.

BEYOND THE MISSISSIPPI

For a time, the wide Mississippi and the vast, mostly unexplored Louisiana Territory beyond blocked American expansion to the west. Louisiana had belonged to France in the 17th and early 18th centuries. It was then owned by Spain in the late 18th century and returned to France in 1800.

In 1803, agents of United States President Thomas Jefferson negotiated the purchase of the vast territory from the French emperor Napoleon. The Louisiana Purchase, stretching north from the Gulf of Mexico to the Canadian border and westward to the Rocky Mountains, almost doubled the land area of the United States.

All or parts of the present states of Louisiana, Arkansas, Oklahoma, Kansas, Missouri, Colorado, Nebraska, Iowa, North Dakota, South Dakota, Minnesota, Wyoming and Montana were acquired by the Louisiana Purchase.

SEA OF GRASS

The Louisiana Purchase also gave the United States a very large, very distinct geographic region known as the Great Plains, or Western Prairies. Around 1803, this was a region of flat or gently rolling land covered with tall grass. There were practically no trees, bushes or exposed rocks. Early travelers through the region called it "a sea of grass."

In general, the Great Plains region is drier than the land east of the Mississippi. Rainfall ranges from around 40 inches (102 cm) a year on the eastern rim of the Great Plains to less than 18 inches (46 cm) a year in the western portion. Summers on the Great Plains can be very hot, about 110 degrees Fahrenheit (43 degrees Celsius), and dry. Summer rain, when it comes, is usually in the form of fierce thunderstorms. Both droughts and floods are common in some parts of the vast region.

On the Great Plains, the seasons of spring and autumn tend to be brief. Winters, particularly in Montana, the Dakotas, Wyoming, Nebraska, Iowa and Minnesota, can be very cold. Temperatures often dip as low as -40 degrees Fahrenheit (also -40 degrees Celsius). Fierce, windy snowstorms, or blizzards, are not uncommon.

At the time of the Louisiana Purchase, relatively few people lived in the Great Plains region. A number of Indian tribes—primarily Sioux, Pawnee, Comanche and Cheyenne—hunted on the Great Plains.

Unlike American Indians to the east, south and far west, these tribes did not live in permanent year-round settlements, nor did they grow crops. They survived by hunting, particularly the buffalo. And they moved with the buffalo herds.

For over 40 years after the Louisiana Purchase, white Americans did not move into the Great Plains in large numbers. During the first half of the 19th century, most westward-bound settlers viewed the Great Plains region as a place to cross on their way to more attractive lands on North America's Pacific Coast.

FROM SEA TO SHINING SEA

Long before the Great Plains region was known to white people, there were European settlements along the California coast. They were created by Spanish missionaries, soldiers, traders and settlers moving north from below what is now the border with Mexico. They included San Diego, Los Angeles, Santa Barbara, San Luis Obispo, Monterey and, finally, San Francisco.

The Spaniards were not the only explorers of America's Pacific Coast region. British, Russian and, later, American seafarers explored the coastline. They marveled at the beautiful harbors, high mountains, very fertile valleys and almost perfect climate.

In 1778, British sea captain James Cook (1728-1779) explored the Pacific coastline north of San Francisco to Alaska. Some of Cook's sailors discovered that sea otter skins bought for around $2 each from the Indians of the American Northwest could be sold in China for around $100 each. This prompted the British as well as Americans to set up a number of trading posts in the region.

In addition to these British and American interests, Russia had claims on the region. In 1741, Vitus Bering, a Danish explorer employed by the Russian emperor, explored the coastal areas of Alaska and the Aleutian Islands. These explorations led to the creation of Russian settlements in Alaska. From these posts, Russian ships sailed up and down the coast from Alaska to California hunting seals and sea otters.

The Pacific coastal region stretching from California to Alaska and inland through the Rocky Mountains became known as the Oregon Country. In 1803, it was claimed by Spain, Russia, the United States and Britain.

Though Russia controlled Alaska and Spain was in possession of California, the British and American claims on Oregon were strongest. Both countries had trading posts on the coast. Then, traveling overland from Canada, British explorer Alexander Mackenzie reached the Oregon coast in 1793. This and other overland explorations opened the interior of the Oregon Country to the British.

The Americans were not far behind. In November 1805, United States explorers Meriwether Lewis (1774-1809) and William Clark (1770-1838) reached the Oregon Coast after a two-and-a-half year journey from the Mississippi River.

A number of American trappers and hunters followed the example of Lewis and Clark and explored new trails to the West. Because they spent much time exploring passes through the Rocky Mountains in California and the Oregon Country, they became known as the "mountain men." They played a big role in the westward expansion of the United States.

Early in the 19th century, Spain and Russia gave up their claims on the Oregon Country. Then in 1818, Britain and the United States agreed to share the vast territory. Also in 1818, the boundary between the United States and British North America (now Canada) was established along the 49th parallel of latitude from the Great Lakes to the Oregon Country (the eastern slopes of the Rocky Mountains).

In the following year, 1819, a treaty with Spain set the boundary between the United States and the Spanish possession of Mexico. At that time, Mexico included what were later to become the American states of Texas, Arizona, Utah, Nevada, New Mexico, California and part of Colorado.

The westward movement of Americans was spurred on by the idea that it was the "manifest destiny" of the United States to span the continent from the Atlantic Ocean to the Pacific Ocean. But California was firmly under Mexican control. Therefore, many Americans felt that the easiest way to expand the United States to the Pacific was by gaining sole control of the Oregon Country.

During the 1820s and '30s, British settlements and trading posts in the Oregon Country greatly outnumbered those of the Americans. As a result, many American political leaders feared that the British would gain sole control of the Oregon Country. A great effort was made to encourage American settlement in Oregon.

The first Americans going to Oregon traveled by ship from the United States' east coast, around South America, through the stormy Strait of Magellan, and up along the Pacific Coast. It was a difficult, dangerous and expensive journey that lasted months. Starting in 1832, groups of settlers traveled overland to Oregon. Usually, the settlers started from Independence, Missouri, and followed a winding trail for over 2,000 miles (3,200 kilometers) to Oregon. The overland route to Oregon became known as the Oregon Trail, but it was never a single marked trail. It was a general direction across the Great Plains with some known river-crossing points and passes through the mountains. It was a very difficult, dangerous trail. Floods, droughts, blizzards, prairie fires, accidents, disease and hostile Indians took a high toll of the would-be settlers.

In 1843, "Oregon fever" gripped many parts of the United States. People in many areas sold or abandoned their worn- out farms, packed all their belongings, and headed west. Soon, the American settlers in the Oregon country outnumbered the British.

Some Americans were prepared to settle for half the Oregon Country—south of latitude 49 degrees. This would extend the border between the United States and British North America all the way to the Pacific. Many other Americans, however, demanded the entire Oregon Country—all the way north to latitude 54 degrees 40 minutes. They spread the slogan "54-40 or fight." Then, in 1844, a man committed to "manifest destiny," James K. Polk, was elected president of the United States.

In his inaugural speech in 1845, President Polk said that the American claim "to the whole of Oregon is clear and unquestionable." For a time, war between the United States and Britain seemed likely. Then, in 1846, the British foreign secretary Lord Aberdeen offered the Americans the portion of the Oregon Country south of latitude 49. President Polk, faced with the likelihood of war with Mexico, and not wishing to lead the United States into war with two adversaries at the same time, agreed. On June 15, 1846, the southern portion of the Oregon Country (made up of the present states of Washington, Oregon and Idaho with parts of Montana and Wyoming) became part of the United States.

TEXAS AND THE SOUTHWEST

By the 1830s, American settlers in the large Mexican province of Texas outnumbered Mexicans. They also talked about independence from Mexico. This desire for Texas to become independent sharply increased in 1833 when General Antonio Lopez de Santa Anna overthrew the Mexican government and set himself up as dictator of all Mexico. Santa Anna cut off all new American migration to Texas and increased taxes on Americans already living there. In response, the Texans revolted in October 1835 and proclaimed the Lone Star Republic.

At the beginning of the revolt, the Texans suffered several defeats. But they reorganized and formed a small, skilled army. Finally, on April 21, 1836, the Texans under the command of Sam Houston defeated a much larger Mexican army at the battle of San Jacinto. General Santa Anna was captured, and the independence of Texas was assured.

Many Texans didn't want independence; they wanted their land to be part of the United States. Several requests were made to have the United States annex (take over) the Lone Star Republic. These requests were politely refused. As a result, the government of Texas started showing increased friendship for Britain. This caused some Americans to worry that Texas might become linked to British North America. Finally, in 1845, Texas became a state of the United States of America.

Mexico refused to recognize this action. This led to numerous raids and small battles along the disputed boundary between Mexico and Texas. On April 24, 1846, 1,600 Mexican soldiers surrounded and killed 63 Americans on land claimed by Texas. President Polk asked Congress for a declaration of war against Mexico. He got it.

Mexico was defeated in the war and its capital, Mexico City, was occupied. Some Americans talked about taking over all of Mexico. But President Polk rejected that view. He wanted acceptance of Texas as part of the United States and the purchase of California and the New Mexico regions.

A peace treaty between Mexico and the United States ended the war on February 2, 1848. It set the boundary between Texas and Mexico along the Rio Grande. For a payment of $18,250,000, Mexico turned over the immense California and New Mexico regions to the United States. These regions include the present states of California, Nevada, Utah, Arizona and New Mexico.

Beyond the continental United States, Alaska was purchased from Russia in 1867, and the Hawaiian Islands were annexed to the United States in 1898. Neither region became a state until 1959.

FARMERS AND CATTLEMEN

In 1848, gold was discovered in California, though most Americans did not hear about it until the beginning of 1849. When they did, a massive "gold rush" occurred. The population of California increased from 15,000 in 1848 to about 260,000 in 1852.

Though the United States stretched from ocean to ocean in the late 1840s, the vast region between the Mississippi Valley and the western side of the Rocky Mountains was almost unoccupied. The thousands of people streaming west to California and Oregon saw it as just a difficult, dangerous place to pass through. They certainly did not view it as a desirable place to stay.

This thinking changed in the 1860s. A railroad was being pushed westward to span the continent. Some people realized that they could raise cattle cheaply on "the sea of grass"—the Great Plains—and use the railroad to ship cattle to markets in the eastern states.

Cattle were allowed to graze freely on the plains and were rounded up once a year by cowboys. Ranches were marked off and a few towns were built along the railroad route. In the process, the great buffalo herds were killed off.

Of course, the Plains Indians bitterly resented the coming of the white people and the loss of the buffalo herds. So they fought back. Though the Indians won some battles, it was, in the long run, a losing cause. By the end of the 1800s, the tribes were scattered, living on government reservations.

As the Indians and the buffalo disappeared from the Great Plains, another group, made up mostly of small "homestead" farmers, moved in to compete with the cattlemen.

Starting in 1862 and continuing to 1900, the United States government offered 160 acres (65 hectares) of land to each family who would live on the land for five years and improve it. A small fee for registration of the land was also charged. Each lot of 160 acres was called a homestead and the farmers living on such a homestead were called homesteaders.

From the beginning, cattle ranchers and homesteaders were in conflict. They represented different ways of life. Cattle ranchers and cowboys viewed the homestead farms as a waste of good rangeland. At first, the homesteaders' crops were often eaten or trampled by free-roaming cattle, so the homesteaders started fencing their land with barbed wire. This increased conflicts with the cattlemen, particularly when a source of water was fenced in.

In time, the ranchers and farmers learned to live side by side. But their combined use of the land almost destroyed it. Unlike the buffalo herds of the past, cattle grazed the grass close to the soil. Farming was even more difficult on the prairie land. Plowing exposed the loose soil, and in periods of drought, the strong winds of the plains lifted the dry, powdery soil into the air. After a long period of drought in the 1930s, dust storms were frequent.

In an old frontier tradition, many people from the southern plains states abandoned their farms and moved west. But the age of the frontier was over. There was no more free land. Many of the people who moved to California during the "dust bowl" years had to work picking fruit on other peoples' farms.

NEED FOR CONSERVATION

The frontier experience of moving westward and breaking new ground gave Americans several traditions. One was of independence, self-reliance and resourcefulness. Another, unfortunately, was a tradition of wasteful uses of natural resources. Land, water, timber and wild animals seemed so plentiful that people on the frontier thought these resources would never run out. For a long time it was easier and cheaper to abandon worn-out farmland than to nurse it back to productive use.

However, not all Americans felt this way. From the early colonial period, many Americans have argued for the preservation of forests, lakes and rivers and the careful use of farmland. Even the young George Washington, as a Virginia surveyor, was an early pioneer in conservation.

By the mid-1800s, the need to conserve natural resources had acquired a special urgency. The buffalo herds were rapidly disappearing. So were many other wild animals, including wolves, passenger pigeons, fur seals and sea otters. Forests were being destroyed by logging and forest fires. Rivers and lakes were being clogged and polluted with the waste of logging and mining.

Several naturalists called for action by the American people and government to save the nation's natural heritage. Chief among them was John Muir (1838-1914). Muir, who was born in Scotland, roamed through the West studying and describing the natural wonders of his adopted land. He also campaigned vigorously for a national effort to save those natural wonders for future generations. It was largely through his efforts that wilderness lands were set aside as public parks. The first of these was the Yosemite Grant in California. This consists of a beautiful valley surrounded by cliffs and pinnacles. Giant Sequoia trees and other rare plants grow there.

Yosemite was made a national park in 1890, but it wasn't the first. That honor went to Yellowstone, a 2.25 million-acre (about 910,000 hectares) tract of wilderness land established as a national park in 1872.

President Theodore Roosevelt, who knew and loved the vast, unspoiled beauty of the American West, began fighting for conservation as soon as he came into office in 1901. Sweeping provisions made to conserve the natural resources of the nation were among the most important achievements of the Roosevelt administration.

Many national parks and national forests were set aside as reserves after 1901, and a National Park Service was set up to administer them in 1916. The National Park system is an American example of conservation that has since been imitated by many countries around the world.

Private groups and government agencies came into existence to regulate and restore wildlife, to conserve soil and water, and to manage fishery resources. However, though most Americans have become committed to conservation, a legitimate debate continues over setting a proper balance between conservation and the development of America's natural resources for the sake of promoting national economic welfare and energy self-sufficiency.

Starting around 1880, a number of programs were set up to reclaim eroded land. Farmers were encouraged to buy or build small windmills to pump irrigation water out of deep wells. Later, rivers were dammed and irrigation canals built to provide additional water. Farmers introduced new types of wheat which could resist cold winters and hot, dry summers, and experimented with contour plowing and crop rotation methods. More recently, agricultural researchers have developed a method of planting without plowing. Known as conservation tillage, it involves leaving the previous crop's residue on the surface to lessen soil erosion. Then, instead of scarring the soil with plow blades, rows of tiny holes are punched in the soil to accept the new seeds.

The westward flow of settlement across the United States first led to wasteful attitudes and practices. Later there developed a popular grass-roots concern for natural resources that gains strength year by year. Americans have pioneered many conservation efforts. In the creation of large national parks and forests, they've set an example for the world.

HISTORY: LEIF ERICSON TO 1865

NEW LAND

In July 1776, members of the Second Continental Congress sign the Declaration of Independence, which proclaims that "all men are created equal," and that they possess "certain unalienable rights, that among these are life, liberty and the pursuit of happiness." (Library of Congress)

Around the year 1000, a party of Icelandic Vikings under Leif Ericson sailed to the eastern coast of North America. They landed at a place they called Vinland. Remains of a Viking settlement have been found in the Canadian province of Newfoundland. The Vikings may also have visited Nova Scotia and New England. They failed, however, to establish any permanent settlements, and they soon lost contact with the new continent.

Five hundred years later, the need for increased trade and an error in navigation led to another European encounter with America. In late 15th-century Europe, there was a great demand for spices, textiles and dyes from Asia. Christopher Columbus, a mariner from Italy, mistakenly believed that he could reach the Far East by sailing 4,000 miles (6,400 kilometers) west from Europe. In 1492, he persuaded the king and queen of Spain to finance such a voyage. Columbus sailed west, but he did not reach Asia. Instead he landed on one of the Bahama Islands in the Caribbean Sea.

Columbus eventually explored most of the Caribbean area. He never reached the Far East; but he did return home with some gold, and within 40 years treasure-hungry Spanish adventurers had conquered a huge empire in South and Central America. The Spanish also established some of the earliest settlements in North America—St. Augustine in Florida (1565), Santa Fe in New Mexico (1609) and San Diego in California (1769).

The Europeans were initially drawn to the New World in search of wealth. When Columbus and later Spanish explorers returned to Europe with stories of abundant gold in the Americas, each European sovereign hastened to claim as much territory as possible in the New World— along with whatever wealth might be extracted from it.

Enforcing these claims could only be accomplished by establishing settlements of Europeans on the territory. This requirement— combined with the zeal of Spanish priests to convert the indigenous inhabitants of the Americas to Christianity, the need of European religious and political dissenters for refuge from persecution in their homelands, and the thirst for adventure of some individuals—fueled the drive for the establishment of colonies.

ENGLISH SETTLEMENTS

The first successful English colony in the Americas was founded at Jamestown, Virginia, in 1607. The settlement was financed by a London company which expected to make a profit from the settlement. It never did. Of the first 105 colonists, 73 died of hunger and disease within seven months of their arrival. But the colony survived and eventually grew and became wealthy. The Virginians discovered a way to earn money by growing tobacco, which they began shipping to England in 1614.

In New England, the northeastern region of what is now the United States, several settlements were established by English Puritans. These settlers believed that the Church of England had adopted too many practices from Roman Catholicism, and they came to America to escape persecution in England and to found a colony based on their own religious ideals. One group of Puritans, called the "Pilgrims," crossed the Atlantic in the ship Mayflower and settled at Plymouth, Massachusetts in 1620. A much larger Puritan colony was established in the Boston area in 1630. By 1635, some settlers were already migrating to nearby Connecticut.

The Puritans hoped to build "a city upon a hill"—an ideal community. Since that time, Americans have viewed their country as a great experiment, a worthy model for other nations. New England also established another American tradition—a strain of often intolerant moralism. The Puritans believed that governments should enforce God's morality. They strictly punished drunks, adulterers, violators of the Sabbath and heretics. In the Puritan settlements the right to vote was restricted to church members, and the salaries of ministers were paid out of tax revenues.

One Puritan who disagreed with the decisions of the community, Roger Williams, protested that the state should not interfere with religion. Forced to leave Massachusetts in 1635, he set up the neighboring Rhode Island colony, which guaranteed religious freedom and the separation of church and state. The colonies of Maryland, settled in 1634 as a refuge for Roman Catholics, and Pennsylvania, founded in 1681 by the Quaker leader William Penn, were also characterized by religious toleration. This toleration, in its turn, attracted further groups of settlers to the New World.

Over time, the British colonies in North America were also occupied by many non-British national groups. German farmers settled in Pennsylvania, Swedes founded the colony of Delaware, and African slaves first arrived in Virginia in 1619. In 1626, Dutch settlers purchased Manhattan Island from local Native American, or "Indian," chiefs and built the town of New Amsterdam; in 1664, the settlement was captured by the English and renamed New York.

COLONIAL ERA

To the foreign visitor, America has always appeared to be not one culture, but a mixture of different cultures. In the colonial period, this mixture of contrasting traditions was already taking shape. The narrow idealism of Massachusetts existed beside the more tolerant idealism of Rhode Island, the ethnic variety of Pennsylvania and the practical commercial agriculture of Virginia. Most American colonists worked on small farms. In the southern colonies of Virginia, North Carolina and South Carolina, landowners carved large tobacco and rice plantations out of fertile river basins. These were worked by Africans under the system of slavery, which had evolved slowly since 1619, or by free Englishmen who contracted to work without pay for several years in return for their passage to America.

By 1770, several small but growing urban centers had emerged, each supporting newspapers, shops, merchants and craftsmen. Philadelphia, with 28,000 inhabitants, was the largest city, followed by New York, Boston and Charleston, South Carolina. Unlike most other nations, the United States never had a feudal aristocracy. Land was plentiful and labor was scarce in colonial America, and every free man had an opportunity to achieve economic independence, if not prosperity.

All of the colonies shared a tradition of representative government. The English king appointed many of the colonial governors, but they all had to rule in cooperation with an elected assembly. Voting was restricted to landowning white males, but most white males owned enough property to vote. Britain could not exercise direct control over her American colonies. London was too far away, and the colonists were too independent-minded.

By 1733, English settlers had occupied 13 colonies along the Atlantic coast, from New Hampshire in the north to Georgia in the south. The French controlled Canada and Louisiana, which included the entire Mississippi watershed—a vast empire with few people. Between 1689 and 1815, France and Britain fought several wars, and North America was drawn into every one of them. By 1756, England and France were fighting the Seven Years' War, known in America as the French and Indian War. William Pitt, the British prime minister, invested soldiers and money in North America and won an empire. British forces captured the Canadian strong points of Louisburg (1758), Quebec (1759) and Montreal (1760). The Peace of Paris, signed in 1763, gave Britain title to Canada and all of North America east of the Mississippi River.

Britain's victory led directly to a conflict with its American colonies. To prevent fighting with the Native Americans, known as Indians to the Europeans, a royal proclamation denied colonists the right to settle west of the Appalachian mountains. The British government began punishing smugglers and charged new taxes on sugar, coffee, textiles and other imported goods. The Quartering Act forced the colonies to house and feed British soldiers; and with the passage of the Stamp Act, special tax stamps had to be attached to all newspapers, pamphlets, legal documents and licenses.

These measures seemed quite fair to British politicians, who had spent large sums of money to defend their American colonies during and after the French and Indian War. Surely, they reasoned, the colonists should pay a part of those expenses. But the Americans feared that the new taxes would make trading difficult, and that British troops stationed in the colonies might be used to crush the civil liberties which the colonists had heretofore enjoyed. Overall, these fears were quite groundless, but they were precursors of what have become ingrained traditions in American politics. Americans distrust the power of "big government"; after all, millions of immigrants came to this country to escape political repression. Americans also have always insisted on exercising some control over the system of taxation which supports their government.

Speaking as freeborn Englishmen, colonial Americans insisted that they could be taxed only by their own colonial assemblies. "No taxation without representation" was their rallying cry. In 1765, representatives from nine colonies met as the "Stamp Act Congress" and spoke out against the new tax. Merchants refused to sell British goods, mobs threatened stamp distributors and most colonists simply refused to use the stamps. The British Parliament was forced to repeal the Stamp Act, but it enforced the Quartering Act, enacted taxes on tea and other goods and sent customs officers to Boston to collect those tariffs. Again the colonists refused to obey, so British soldiers were sent to Boston.

Tensions eased when Lord North, the new British chancellor of the exchequer, removed all the new taxes except that on tea. In 1773, a group of patriots responded to the tea tax by staging the "Boston Tea Party": Disguised as Indians, they boarded British merchant ships and tossed 342 crates of tea into Boston harbor. Parliament then passed the "Intolerable Acts": The independence of the Massachusetts colonial government was sharply curtailed, and more British soldiers were sent to the port of Boston, which was now closed to shipping. In September 1774, the First Continental Congress, a meeting of colonial leaders opposed to what they perceived to be British oppression in the colonies, met in Philadelphia. These leaders urged Americans to disobey the Intolerable Acts and to boycott British trade. Colonists began to organize militias and to collect and store weapons and ammunition.

REVOLUTION

On April 19, 1775, 700 British soldiers marched from Boston to forestall a rebellion of the colonists by capturing a colonial arms depot in the nearby town of Concord. At the village of Lexington, they confronted 70 militiamen. Someone—no one knows who—fired a shot, and the American War of Independence began. The British easily captured Lexington and Concord, but as they marched back to Boston they were harassed by hundreds of Massachusetts volunteers. By June, 10,000 American soldiers had besieged Boston, and the British were forced to evacuate the city in March 1776.

In May 1775, a second Continental Congress had met in Philadelphia and begun to assume the functions of a national government. It founded a Continental Army and Navy under the command of George Washington, a Virginia planter and veteran of the French and Indian War. It printed paper money and opened diplomatic relations with foreign powers. On July 2, 1776, the Congress finally resolved "That these United Colonies are, and of right ought to be, free and independent states." Thomas Jefferson of Virginia, assisted by others, drafted a Declaration of Independence, which the Congress adopted on July 4, 1776.

The Declaration presented a public defense of the American Revolution, including a lengthy list of grievances against the British king, George III. Most importantly, it explained the philosophy behind the revolution—that men have a natural right to "Life, Liberty and the pursuit of Happiness"; that governments can rule only with "the consent of the governed"; and that any government may be dissolved when it fails to protect the rights of the people. This theory of politics came from the British philosopher John Locke, and it is central to the Anglo-Saxon political tradition.

At first, the war went badly for the Americans. The British captured New York City in September 1776, and Philadelphia was captured a year later. The tide turned in October 1777, when a British army under General John Burgoyne surrendered at Saratoga, in northern New York. Encouraged by that victory, France seized the opportunity to humble Britain, her traditional enemy. A Franco-American alliance was signed in February 1778. With few provisions and little training, American troops generally fought well, but they might have lost the war if they had not received aid from the French treasury and the powerful French Navy.

After 1778, the fighting shifted largely to the south. In 1781, 8,000 British troops under General Charles Cornwallis were surrounded at Yorktown, Virginia, by a French fleet and a combined French-American army under George Washington's command. Cornwallis surrendered, and soon afterward the British government asked for peace. The Treaty of Paris, signed in September 1783, recognized the independence of the United States and granted the new nation all the territory north of Florida, south of Canada and east of the Mississippi River.

DEVISING A CONSTITUTION

The 13 colonies were now "free and independent states"—but not yet one united nation. Since 1781, they had been governed by the Articles of Confederation, a constitution that set up a very weak central government. The American people had just rebelled against a parliament in distant London, and they did not want to replace it with a tyrannical central authority at home. Under the Articles of Confederation, Congress, composed of representatives of the people, could not make laws or raise taxes. There was no federal judiciary and no permanent executive. The individual states were almost independent: They could even set up their own tax barriers.

In May 1787, a convention met in Philadelphia with instructions to revise the Articles of Confederation. The delegates— among whom were George Washington, Benjamin Franklin and James Madison—went beyond their mandate and drafted a new and more workable Constitution. It established a stronger federal government empowered to collect taxes, conduct diplomacy, maintain armed forces, and regulate foreign trade and commerce among the states. It provided for a Supreme Court and lesser federal courts, and it gave executive power to an elected president. Most importantly, it established the principle of a "balance of power" to be maintained among the three branches of government—the executive, the legislative and the judicial. Under this principle, each branch was provided the independent means to exercise checks on and to balance the activities of the others, thus guaranteeing that no branch could exert dictatorial authority over the workings of the government.

The Constitution was accepted in 1788, but only after much bitter debate. Many Americans feared that a powerful central government would trample on the liberties of the people, and in 1791, 10 amendments—the Bill of Rights— were added to the Constitution. This document guaranteed freedom of religion, a free press, free speech, the right of citizens to bear arms, protection against illegal house searches, the right to a fair trial by jury and protection against "cruel and unusual punishments."

The Constitution and the Bill of Rights thus struck a balance between two conflicting but fundamental aspects of American politics—the need for a strong, efficient central authority and the need to ensure individual liberties. America's first two political parties divided along those ideological lines. The Federalists favored a strong president and central government; the Democratic Republicans defended the rights of the individual states, because this seemed to guarantee more "local" control and accountability. This party appealed to small farmers; the Federalist party was the party of the prosperous classes, and it would die out by 1820.

NEW NATION

As the first president of the United States, George Washington governed in a Federalist style. When Pennsylvania farmers refused to pay a federal liquor tax, Washington mobilized an army of 15,000 men to put down the "Whiskey Rebellion." Under his Secretary of the Treasury, Alexander Hamilton, the federal government took over the debts of the individual states and set up a national bank. These fiscal measures were designed to encourage investment and to persuade business interests to support the new government.

In 1797, Washington was succeeded by another Federalist, John Adams, who became involved in an undeclared naval war with France. In an atmosphere of war hysteria, the Federalist-controlled Congress passed the Alien and Sedition Acts in 1798. These measures permitted the deportation or arrest of "dangerous" aliens, and they prescribed fines or imprisonment for publishing "false, scandalous, and malicious" attacks on the government. Ten Republican editors were convicted under the Sedition Act, which was bitterly denounced by Virginia lawyer and main author of the Declaration of Independence, Thomas Jefferson.

The repression which occurred under the Alien and Sedition Acts ended in 1801, when Thomas Jefferson was elected president. As a Republican, Jefferson was an informal, accessible chief executive. Although he wanted to limit the power of the president, political realities forced Jefferson to exercise that power vigorously. In 1803, he bought the huge Louisiana territory from France for $15 million: Now the United States would extend as far west as the Rocky Mountains. When North African pirates attacked American ships, Jefferson sent a naval expedition against the state of Tripoli.

Meanwhile, the Supreme Court, under Chief Justice John Marshall, was asserting its own authority. In the 1803 case of Marbury v. Madison, Marshall affirmed that the Court could declare void any act of Congress "repugnant to the Constitution." That ruling established the most fundamental idea in American constitutional law—that the Supreme Court makes the final decision in interpreting the Constitution and can, if the justices determine a law to be unconstitutional, declare the law void, although it was enacted by the Congress and signed by the president.

During the Napoleonic Wars, British and French warships harassed American merchant ships. Jefferson responded by banning American exports to Europe, but New England merchants protested that their trade was ruined by the embargo, which Congress repealed in 1809. In 1812, however, President James Madison went to war with Britain over this issue.

During the War of 1812, American warships had some impressive victories, but the vastly superior British Navy blockaded American ports. Attempts to invade British Canada ended in disaster, and British forces captured and burned Washington, the nation's new capital city. Britain and the United States agreed on a compromise peace in December 1814; neither side won any concessions from the other. Two weeks later, General Andrew Jackson routed a British assault on New Orleans. News of the peace treaty had not yet reached the soldiers.

After the war, the United States enjoyed a period of rapid economic expansion. A national network of roads and canals was built, steamboats traveled the rivers, and the first steam railroad opened in Baltimore, Maryland, in 1830. The Industrial Revolution had reached America: There were textile mills in New England and iron foundries in Pennsylvania. By the 1850s, factories were producing rubber goods, sewing machines, shoes, clothing, farm implements, guns and clocks.

The frontier of settlement was pushed west to the Mississippi River and beyond. In 1828, Andrew Jackson became the first man born into a poor family and born in the West, away from the cultural traditions of the Atlantic seaboard, to be elected president. Jackson and his new Democratic party, heirs to the Jeffersonian Republicans, promoted a creed of popular democracy and appealed to the humble members of society—the farmers, mechanics and laborers. Jackson broke the power of the Bank of the United States, which had dominated the nation's economy. He rewarded inexperienced but loyal supporters with government jobs. He made land available to western settlers—mainly by forcing Indian tribes to move west of the Mississippi.

SECTIONAL CONFLICT

The Jacksonian era of optimism was clouded by the existence in the United States of a social contradiction—increasingly recognized as a social evil—that would eventually tear the nation apart: slavery. The words of the Declaration of Independence—"that all men are created equal"—were meaningless for the 1.5 million black people who were slaves. Thomas Jefferson, himself a slaveowner, recognized that the system was inhumane and wrote an attack on slavery into the Declaration, but Southern delegates to the Continental Congress forced him to remove the passage. The importation of slaves was outlawed in 1808, and many Northern states moved to abolish slavery, but the Southern economy was based on large plantations, which used slave workers to grow cotton, rice, tobacco and sugar. Still, in several Southern states, small populations of free blacks also worked as artisans or traders.

In 1820, Southern and Northern politicians disputed the question of whether slavery would be legal in the western territories. Congress agreed on a compromise: Slavery was permitted in the new state of Missouri and the Arkansas territory, and it was barred everywhere west and north of Missouri. But the issue would not go away: Some Americans, primarily in the North, organized themselves into abolitionist societies, while Southern whites defended slavery with increasing ardor. The nation was also split over the issue of high tariffs, which protected Northern industries but raised prices for Southern consumers.

Meanwhile, thousands of Americans had been settling in Texas, then a part of Mexico. The Texans found Mexican rule under General Santa Anna increasingly oppressive, and in 1835 they rebelled, defeated a Mexican army and set up the independent Republic of Texas. In 1845, the United States annexed Texas, and Mexico suspended diplomatic relations. President James K. Polk ordered American troops into disputed territory on the Texas border. After a battle between Mexican and American soldiers in May 1846, Congress declared war on Mexico.

An American army landed near Vera Cruz in March 1847 and captured Mexico City in September. In return for $15 million, Mexico was forced to surrender an enormous expanse of territory—most of what is today California, Arizona, Nevada, Utah, New Mexico and Colorado.

In 1846, by settling a long-standing border dispute with British Canada, the United States had acquired clear title to the southern half of the Oregon Country—the present states of Oregon, Washington and Idaho. Thus America became a truly continental power, stretching from the Atlantic to the Pacific.

The acquisition of these new territories revived a troubling question: Would newly acquired territories be open to slavery? In 1850, Congress voted another compromise: California was admitted as a free state, and the inhabitants of the Utah and New Mexico territories were allowed to decide the issue for themselves. Congress also passed the Fugitive Slave Act, which helped Southerners to recapture slaves who had escaped to the free states. Some Northern states did not enforce this law, however, and abolitionists continued to assist fleeing blacks. Harriet Beecher Stowe of Massachusetts wrote Uncle Tom's Cabin, a sentimental but powerful anti-slavery novel which converted many readers to the abolitionist cause. The issue of slavery became, in American politics, economics and cultural life, the central point of contention.

In 1854, Senator Stephen Douglas of Illinois persuaded Congress to allow the inhabitants of the Kansas and Nebraska territories to resolve the question of slavery within their own borders—which voided the Missouri Compromise of 1820. In Kansas, the result was a violent feud between pro-slavery and anti-slavery settlers. In 1857, the Supreme Court handed down the Dred Scott decision, which held that blacks had no rights as American citizens and that Congress had no authority to bar slavery in the Western territories.

In 1858, when Senator Douglas ran for reelection, he was challenged by Abraham Lincoln and the Republican party (a new anti-slavery party unrelated to Jefferson's Republican party). In a series of historic debates with Douglas, Lincoln demanded a halt to the spread of slavery. He was willing to tolerate slavery in the Southern states, but at the same time he affirmed that "this government cannot endure permanently half slave and half free."

CIVIL WAR

Lincoln lost the senatorial race, but in 1860 he and Douglas faced each other again—as the Republican and Democratic candidates for president. By now the tension between North and South was extreme. In 1859, John Brown, an abolitionist zealot, had tried to begin a slave rebellion in Virginia by attacking an army munitions depot. Brown was quickly captured, tried and hanged, whereupon many Northerners hailed him as a martyr. Southern whites, however, now believed that the North was preparing to end slavery by bloody warfare. Douglas urged Southern Democrats to remain in the Union, but they nominated their own separate presidential candidate and threatened to secede if the Republicans were victorious.

The majority in every Southern and border state voted against Lincoln, but the North supported him and he won the election. A few weeks later, South Carolina voted to leave the Union. It was soon joined by Mississippi, Florida, Alabama, Georgia, Louisiana, Texas, Virginia, Arkansas, Tennessee and North Carolina. These 11 states proclaimed themselves an independent nation—the Confederate States of America—and the American Civil War began.

Southerners proclaimed that they were fighting not just for slavery; after all, most Confederate soldiers were too poor to own slaves. The South was waging a war for independence—a second American Revolution. The Confederates usually had the advantage of fighting on their home territory, and their morale was excellent. They had superb soldiers, cavalrymen and generals, but they were greatly outnumbered by Union (Northern) forces. The Southern railroad network and industrial base could not support a modern war effort. The Union navy quickly imposed a blockade, which created serious shortages of war materiel and consumer goods in the Confederacy. To fight the war, both sides suspended some civil liberties, printed mountains of paper money and resorted to conscription.

Lincoln's two priorities were to keep the United States one country and to rid the nation of slavery. Indeed, he realized that by making the war a battle against slavery he could win support for the Union at home and abroad. Accordingly, on January 1, 1863, he issued the Emancipation Proclamation, which granted freedom to all slaves in areas still controlled by the Confederacy.

The Southern army (Confederates) won some victories in the early part of the war, but in the summer of 1863 their commander, General Robert E. Lee, marched north into Pennsylvania. He met a Union army at Gettysburg, and the largest battle ever fought on American soil ensued. After three days of desperate fighting, the Confederates were defeated. At the same time, on the Mississippi River, Union General Ulysses S. Grant captured the important city of Vicksburg. Union forces now controlled the entire Mississippi Valley, splitting the Confederacy in two.

In 1864, a Union army under General William T. Sherman marched across Georgia, destroying the countryside. Meanwhile, General Grant relentlessly battled Lee's forces in Virginia. On April 2, 1865, Lee was forced to abandon Richmond, the Confederate capital. A week later he surrendered to Grant at Appomattox Court House, and all other Confederate forces soon surrendered. On April 14, Lincoln was assassinated by the actor John Wilkes Booth.

The Civil War was the most traumatic episode in American history. Even today, the scars have not entirely healed. All of America's later wars would be fought well beyond the boundaries of the United States, but this conflict devastated the South and subjected that region to military occupation. America lost more soldiers in this war than in any other—a total of 635,000 dead on both sides.

The war resolved two fundamental questions that had divided the United States since 1776. It put an end to slavery, which was completely abolished by the 13th Amendment to the Constitution in 1865. It also decided, once and for all, that America was not a collection of semi-independent states, but a single indivisible nation.

HISTORY: 1865 TO 1929

By Jonathan Rose (Drew University)

Though the victory of the North in the American Civil War assured the integrity of the United States as an indivisible nation, much was destroyed in the course of the conflict, and the secondary goal of the war, the abolition of the system of slavery, was only imperfectly achieved.

At Promontory Point, Utah, workers celebrate completion of the first transcontinental railroad in 1869. The railroad was a key element in the transformation of the United States from a predominantly agricultural to an industrial nation. (Photo: Union Pacific Railroad)

RECONSTRUCTION

The defeat of the Confederacy (Southern states) left what had been the country's most fertile agricultural area economically destroyed and its rich culture devastated. At the same time, the legal abolition of slavery did not ensure equality in fact for former slaves. Immediately after the Civil War, legislatures in the Southern states—fearful of the way in which former slaves might exercise the right to vote, and eager to salvage what they could of their former way of life—attempted to block blacks from voting.

They did this by enacting "black codes" to restrict the freedom of former slaves. Although "radical" Republicans in Congress tried to protect black civil rights and to bring blacks into the mainstream of American life, their efforts were opposed by President Andrew Johnson, a Southerner who had remained loyal to the Union during the Civil War. Johnson had served as Lincoln's vice president and was elevated to the presidency upon Lincoln's assassination.

In March 1868, the House of Representatives responded to Johnson's opposition to radical solutions by attempting to remove him from office. The charges against him were groundless, and the motion to convict the president was defeated in the Senate. Johnson had, in the opinion of many people, been too lenient toward former Confederates, but his acquittal was an important victory for a central principle of American government: the separation of powers among the legislative, executive and judicial branches. The acquittal helped preserve the delicate balance of power between the president and Congress.

Congress nevertheless was able to press forward with its program of "Reconstruction," or reform, of the Southern states, occupied after the war by the army of the North. By 1870, Southern states were governed by groups of blacks, cooperative whites and transplanted Northerners (called "carpetbaggers"). Many Southern blacks were elected to state legislatures and to the Congress. Although some corruption existed in these "reconstructed" state governments, they did much to improve education, develop social services and protect civil rights.

Reconstruction was bitterly resented by most Southern whites, some of whom formed the Ku Klux Klan, a violent secret society that hoped to protect white interests and advantages by terrorizing blacks and preventing them from making social advances. By 1872, the federal government had suppressed the Klan, but white Democrats continued to use violence and fear to regain control of their state governments. Reconstruction came to an end in 1877, when new constitutions had been ratified in all Southern states and all federal troops were withdrawn from the South.

Despite Constitutional guarantees, Southern blacks were now "second-class citizens"—that is, they were subordinated to whites, though they still had limited civil rights. In some Southern states, blacks could still vote and hold elective office. There was racial segregation in schools and hospitals, but trains, parks and other public facilities could still generally be used by people of both races.

Toward the end of the century, this system of segregation and oppression of blacks grew far more rigid. In the 1896 case of Plessy v. Ferguson, the United States Supreme Court ruled that the Constitution permitted separate facilities and services for the two races, so long as these facilities and services were equal. Southern state legislatures promptly set aside separate— but unequal—facilities for blacks. Laws enforced strict segregation in public transportation, theaters, sports, and even elevators and cemeteries. Most blacks and many poor whites lost the right to vote because of their inability to pay the poll taxes (which had been enacted to exclude them from political participation) and their failure to pass literacy tests. Blacks accused of minor crimes were sentenced to hard labor, and mob violence was sometimes perpetrated against them. Most Southern blacks, as a result of poverty and ignorance, continued to work as tenant farmers. Although blacks were legally free, they still lived and were treated very much like slaves.

MOVING WEST

In the years following the end of the Civil War in 1865, Americans settled the western half of the United States. Miners searching for gold and silver went to the Rocky Mountain region. Farmers, including many German and Scandinavian immigrants, settled in Minnesota and the Dakotas. Enormous herds of cattle grazed on the plains of Texas and other western states, managed by hired horsemen (cowboys) who became the most celebrated and romanticized figures in American culture. Many of these horsemen were former Southern soldiers or former slaves, both of whom headed west after the defeat of the South. The cowboy was America's hero: He worked long hours on the open range for low wages. He was not nearly as violent as movies later represented him to be.

Settlers and the United States Army fought frequent battles with Indians, upon whose lands the stream of white settlers was encroaching, but here, too, the bloodshed has been exaggerated. A total of perhaps 7,000 whites and 5,000 Indians were killed in the course of the 19th century. Many more Indians died of hunger and disease caused by the westward movement of settlers. White men forced the Indians from their land and nearly destroyed all of the buffalo, the main source of food and hides for the tribes of the Great Plains.

INDUSTRIAL GROWTH

During this period, the United States was becoming the world's leading industrial power, and great fortunes were made by shrewd businessmen. The first transcontinental railroad was completed in 1869. Between 1860 and 1900, total rail mileage increased from 31,000 to almost 200,000 miles (50,000 to 322,000 kilometers)—more than in all of Europe. To encourage this expansion, the federal government granted loans and free land to western railroads.

The petroleum industry prospered, dominated by John D. Rockefeller's giant Standard Oil Company. Andrew Carnegie, who came to America as a poor Scottish immigrant, built a vast empire of steel mills and iron mines—which he sold in 1901 for nearly 500 million dollars. Textile mills multiplied in the South, and meatpacking plants sprang up in and around Chicago. An electrical industry was created by a series of inventions—the telephone, the phonograph, the light bulb, motion pictures, the alternating-current motor and transformer. In Chicago, architect Louis Sullivan used steel-frame construction to develop a peculiarly American contribution to the cities of the world—the skyscraper.

Nineteenth-century Americans pointed with pride at these accomplishments—and with good reason. The United States has always been hospitable to inventors, experimenters and entrepreneurs. The freedom to develop new enterprises largely accounts for the vitality of the American economy.

But unrestrained economic growth created many serious problems. Some businesses grew too big and too powerful. The United States Steel Corporation, formed in 1901, was the largest corporation in the world, producing 60 percent of the nation's steel. To limit competition, railroads agreed to mergers and standardized shipping rates. "Trusts"— huge combinations of corporations—tried to establish monopoly control over some industries, especially oil.

These giant enterprises could produce goods efficiently and sell them cheaply, but they could also set prices and destroy smaller competitors. Farmers in particular complained that railroads charged high rates for hauling produce. Most Americans, then as now, admired business success and believed in free enterprise; but they also believed that the power of monopolistic corporations had to be limited to protect the rights of the individual.

One answer to this problem was government regulation. The Interstate Commerce Commission was created in 1887 to control railroad rates. In 1890, the Sherman Antitrust Act banned trusts, mergers and business agreements "in restraint of trade." At first, neither of these measures was very effective, but they established the principle that the federal government could regulate industry for the common good.

LABOR, IMMIGRANTS, FARMERS

Industrialization brought with it the rise of organized labor. The American Federation of Labor, founded in 1886, was a coalition of trade unions for skilled workers. It agitated not for socialism, but for better wages and shorter working hours. Around 1900, the average unskilled laborer worked 52 hours a week for a wage of nine dollars. In the 1890s, discontent over low wages and unhealthful working conditions triggered a wave of industrial work stoppages, some of them violent. Several workers and company guards were killed during an 1892 strike at the Carnegie Steel plant in Homestead, Pennsylvania. In 1894, Army troops were sent to Chicago to end a strike of railroad workers.

Many of the workers in these new industries were immigrants. Between 1865 and 1910, 25 million people came to the United States, many of them settling in large enclaves in major American cities. At the insistence of laborers who feared Asian immigrants because of their willingness to accept low wages for unskilled work, federal legislation barred the entry of Chinese in 1882. The Japanese were largely excluded in 1907, but most other arrivals were free to enter the United States. Immigrants often encountered prejudice from native-born Americans—who, of course, were themselves descended from immigrants. Still, America offered the immigrants more religious liberty, more political freedom and greater economic opportunities than they could find in their native lands. The first-generation immigrant usually had to struggle with poverty, but his children and grandchildren could achieve affluence and professional success. Since the founding of Jamestown, the first permanent European settlement in North America, in 1607, the United States has accepted two-thirds of all the world's immigrants—a total of 50 million people.

For American farmers, the late 19th century was a difficult period. Food prices were falling, and the farmer had to bear the cost of high railroad shipping rates, expensive mortgages, high taxes and tariffs on consumer goods. Several national organizations were formed to defend the interests of small farmers—the Grange in 1867, the National Farmers' Alliance in 1877 and the Populist Party in the 1890s. The Populists demanded nationalization of the railroads, a progressive income tax and monetary reform. In 1896, they supported the Democratic presidential candidate, William Jennings Bryan of Nebraska. A great orator, Bryan conducted an active national campaign, denouncing the trusts, the banks and the railroads. Bryan won the votes of the agricultural states of the South and the West, but he lost the election to William McKinley, a conservative Republican.

OVERSEAS EXPANSION

With the exception of the purchase of Alaska from Russia in 1867, American territorial expansion had come to a virtual standstill in 1848. However, in about 1890, as many European nations were expanding their colonial empires, a new spirit entered American foreign policy, largely following northern European patterns. Politicians, newspaper editors and Protestant missionaries proclaimed that the "Anglo-Saxon race" had a duty to bring the benefits of Western civilization to the peoples of Asia, Africa and Latin America.

At the height of this period (1895), a revolt against Spanish colonialism erupted in Cuba. The Spanish army herded Cuban civilians into camps, where as many as 200,000 people died of disease and hunger. In the United States, newspaper owners such as William Randolph Hearst and Joseph Pulitzer published lurid accounts of Spanish atrocities and stirred up popular sentiment for America to liberate the island.

The United States had by now built a modern navy, and in January 1898 the battleship Maine was sent on a visit to Havana, Cuba. On February 15, a mysterious explosion sank the Maine in Havana harbor. It is not clear who or what caused the disaster, but most Americans were convinced at the time that Spain was responsible. The United States demanded that Spain withdraw from Cuba and started mobilizing volunteer troops. Spain responded by declaring war on the United States.

American troops landed in Cuba, and the United States Navy destroyed two Spanish fleets—one at Manila Bay in the Philippines (a Spanish possession at the time), the other at Santiago in Cuba. In July, the Spanish government asked for peace terms. The United States acquired much of Spain's empire—Cuba, the Philippines, Puerto Rico and Guam. In an unrelated action, the United States also annexed the Hawaiian Islands.

In comparison to the empire building of the European powers, America's acquisitive period was limited in scope and of short duration. After the Spanish-American War, Americans justified their actions to themselves on the grounds that they were preparing underdeveloped nations for democracy. But could Americans be imperialists? After all, they had once been a colonial people and had rebelled against foreign rule. The principle of national self-determination was written into the Declaration of Independence. In the Philippines, insurgents who had fought against Spanish colonialism were soon fighting American occupation troops. Many intellectuals, such as the philosopher William James and Harvard University president Charles Eliot, denounced these actions as a betrayal of American values.

Despite the criticisms of the anti-imperialists, most Americans believed that the Spanish conflict had been appropriate, and they were eager to assert American power. President Theodore Roosevelt proposed to build a canal in Central America, and in 1903 he offered to buy a strip of land in what is now Panama from the Colombian government. When Colombia refused Roosevelt's offer, a rebellion broke out in the area designated as the canal site. Roosevelt supported the revolt and quickly recognized the independence from Colombia of Panama, which sold the canal zone to the United States a few days later. In 1914, the Panama Canal was opened to traffic.

American troops left Cuba in 1902, but the new republic was required to grant naval bases to the United States. Also, until 1934, Cuba was barred from making treaties that might bring the island into the orbit of another foreign power. The Philippines were granted limited self-government in 1907 and complete independence in 1946. In 1952, Puerto Rico became a self-governing commonwealth within the United States, and in 1959 Hawaii was admitted as the 50th state of the Union.

PROGRESSIVE MOVEMENT

While Americans were venturing abroad, they were also taking a fresh look at social problems at home. Although the economy was booming and prosperity was spreading, up to half of all industrial workers still lived in poverty—and many of those workers were women and children. New York, Boston, Chicago and San Francisco could now boast impressive museums, universities, public libraries—and crowded slums. Before 1900, the prevailing economic dogma had been laissez-faire—the idea that government should interfere with business as little as possible. After 1900, the fashionable ideology was "Progressivism"—a movement to reform society and individuals through government action.

Social workers now went into the slums to set up settlement houses, which provided health services and recreational facilities for the poor. Prohibitionists demanded an end to the sale of liquor— partly to prevent the suffering that alcoholic workers could inflict on their spouses and children. In the cities, reform politicians fought corruption, regulated public transportation, built municipally owned utilities and reduced taxes through more efficient government. Many states passed laws restricting child labor, protecting women workers, limiting work hours and providing workmen's compensation. Women agitated for the right to vote, and by 1914 several states had granted that right.

Popular magazines published sensational articles by "muckrakers"— investigative journalists who exposed shady business practices, corruption in government and poverty in the cities. In 1906, Upton Sinclair attacked the meatpacking industry in his novel The Jungle. Middle-class readers were appalled to learn what went into their breakfast sausages, and a federal meat-inspection statute was soon enacted. The Pure Food and Drug Act (1906) curbed the sale of adulterated food and fraudulent patent medicines; and the Harrison Act (1914) imposed the first effective federal controls on narcotics.

President Theodore Roosevelt strengthened federal regulation of the railroads and enforced the Sherman Antitrust Act against several large corporations, including the Standard Oil Company. In 1902, Roosevelt ended a coal strike by threatening to send in troops—not against the workers, but against uncooperative mine owners. This was a turning point in American industrial policy: No longer would the government automatically side with management in labor disputes. The Roosevelt administration also promoted conservation. Vast reserves of forest land, coal, oil, minerals and water were saved for future generations. The Progressive Movement was primarily a movement of economists, sociologists, technicians and civil servants—social engineers who believed that scientific and cost-efficient solutions could be found to all political problems.

Some Americans favored more radical ideologies. The Socialist party, under Eugene V. Debs, advocated a peaceful, gradual, democratic transition to a state-run economy. The Industrial Workers of the World (or "Wobblies") called for a general strike to overthrow the capitalist system. The IWW never gained a very large following, however, and virtually ceased to exist by 1920. Some Socialists were elected to local offices, but their party could not win more than six percent of the vote in any presidential race. Socialism has never had much appeal in the United States, where economic debates have generally concentrated on the question of whether, and to what extent, the government should regulate private enterprise.

Woodrow Wilson, a Democrat elected president in 1912, believed that the federal government had a responsibility to protect small businesses against large corporations. As part of his "New Freedom" program, Wilson enacted a personal income tax, toughened antitrust laws against huge corporate mergers and created the Federal Trade Commission to police unfair business competition. The 1913 Federal Reserve Act created a government-controlled system of 12 regional reserve banks, which strengthened public regulation of the nation's credit. Wilson also passed laws restricting child labor, granting low-cost loans to farmers and setting a maximum eight-hour working day for railroad workers.

WAR AND PEACE

When the First World War erupted in Europe in August 1914, Wilson urged a foreign policy of strict neutrality. But many Americans were outraged by Germany's invasion of Belgium, and the press published reports (often exaggerated) of German atrocities against Belgian civilians. Americans were also incensed when, in May 1915, a German submarine sank the British liner Lusitania, killing 128 American passengers. In January 1917, Germany declared unrestricted submarine warfare against all ships bound for Allied ports, including neutral merchant vessels. In February, Wilson learned that if Germany and America went to war, the German foreign minister planned to offer an alliance to Mexico and Japan, with the promise that Mexico would recover the lands it had lost to the United States in 1848. By now, America had sold thousands of millions of dollars in munitions and other goods to the Allies, largely on credit.

In April 1917, Wilson asked Congress for a declaration of war—not just to defeat Germany or to end submarine warfare, but to secure "the rights and liberties...of free people everywhere." For Wilson, the war would be a great crusade for world peace and national self-determination. "The world must be made safe for democracy," Wilson proclaimed as America entered "the war to end all wars."

As in Britain and Germany, the necessities of war forced the United States to expand temporarily the authority of the federal government, which was empowered to coordinate railroad administration, war industries, labor relations and food production.

When war was declared, the American army was a small force of 200,000 soldiers. Millions of men had to be drafted, trained, equipped and shipped across a submarine-infested ocean to Europe. A full year passed before the United States Army was ready to make a major contribution to the Allied war effort.

In the spring of 1918, the Germans launched a last desperate offensive, in the hope of reaching Paris before the American army was prepared to fight. But a few American divisions were available to assist the French and the British in repelling this attack. By fall, Germany's position was hopeless: Its armies were retreating in the face of a relentless American buildup.

The previous January, Wilson had outlined his war aims—the Fourteen Points. These called for, among other things, open diplomacy, freedom of the seas, free international trade, disarmament and a just settlement of colonial disputes. The map of Europe would be redrawn to establish independent states for every national group, and a world association of nations would be organized to protect the peace. Wilson hoped that, by offering lenient peace terms, he could persuade Germany to cease fighting. In October, the German government asked for peace, and an armistice was declared on November 11.

In 1919, Wilson went to Europe to draft the peace treaty. He was greeted by cheering crowds in the Allied capitals, but the welcome turned sour when negotiations began at Versailles. Despite Wilson's protests, the Allies imposed crushing reparations on Germany and divided its colonies among themselves. Wilson did succeed in establishing the League of Nations, but many Americans feared that such a world organization might drag the United States into another foreign war. A group of Republican senators attached reservations to the Versailles Treaty: They would accept the League of Nations only on the understanding that Congress, not the League, retained control over American armed forces. Britain and France did not object to that reservation, but Wilson stubbornly refused to modify the treaty. The president and the Congress deadlocked over this issue. The United States never ratified the Versailles Treaty and never joined the League of Nations.

ISOLATION AND PROSPERITY

The majority of Americans did not mourn the defeated treaty, for they had grown disillusioned with the results of the war. After 1920, the United States turned inward and withdrew from European affairs.

At the same time, Americans were growing increasingly suspicious of and hostile toward foreigners in their midst. In 1919, a series of terrorist bombings produced what became known as the "Red Scare." Under the authority of Attorney General A. Mitchell Palmer, raids of political meetings were conducted, arrests were made and several hundred foreign-born political radicals—anarchists, socialists and communists—were deported, although most of them were innocent of any crime. In 1921, two Italian anarchists, Nicola Sacco and Bartolomeo Vanzetti, were convicted of murder on the basis of very dubious evidence. Intellectuals protested that Sacco and Vanzetti had been condemned for their political beliefs, but the two men were denied a retrial and, after exhausting all legal appeal procedures, were electrocuted in 1927.

In 1921, Congress had enacted immigration limits, which were tightened in 1924 and again in 1929. These restrictions favored immigrants from Britain, Ireland, Scandinavia and Germany—"Anglo-Saxon" and "Nordic" stock. Small quotas were reserved for eastern and southern Europeans; none at all for Asians.

In 1920, Republican party leaders arranged the nomination of Warren G. Harding for president. A politician of limited education, Harding promised the voters a return to "normalcy"—and won a landslide victory. After years of reform, high taxes, war and international entanglements, the majority of Americans voted for a candidate who seemed to embody old-fashioned American values.

But the 1920s were anything but normal. It was an extraordinary and contradictory decade, when hedonism and bohemianism coexisted with a puritanical conservatism. It was the age of Prohibition: In 1920, alcoholic beverages were outlawed by a Constitutional Amendment. But drinkers cheerfully evaded the law in thousands of "speakeasies" (illegal bars), and gangsters made fortunes supplying illegal liquor. The Ku Klux Klan, revived in 1915, attracted millions of followers and terrorized blacks, Catholics, Jews and immigrants. At the same time, there was a flowering of black literature—the "Harlem Renaissance"—and jazz caught the imagination of many white Americans, including composer George Gershwin. Also, in 1928 Democrat Alfred E. Smith became the first Roman Catholic to run for president. There was gross corruption in the administrations of President Harding and James J. Walker, the "playboy mayor" of New York City. But in 1927, Charles Lindbergh excited the nation when he completed the first nonstop flight from New York to Paris. In an age of materialism and disenchantment, this modest young aviator reaffirmed for Americans the importance of individual heroism.

The controversies of the decade were summed up in the celebrated 1925 "monkey trial," in which John T. Scopes was prosecuted for teaching Darwin's theory of evolution in the Tennessee public schools. In his last great crusade, William Jennings Bryan assisted the prosecution, affirming the literal truth of the biblical account of creation. Scopes was defended by Clarence Darrow, a famous trial lawyer and agnostic, who exposed Bryan's fundamentalism to public ridicule. The trial received national attention because it embodied the great cultural schism of the 1920s—the clash between modern ideas and traditional values.

President Warren Harding, the champion of normalcy, did do something positive by helping stop the repression of political radicals. His Secretary of State, Charles Evans Hughes, set up the Washington Conference of 1921, at which the world's major powers worked out a plan for naval disarmament and agreed to respect the independence of China.

Harding's successor, Calvin Coolidge, was known as a man of few words. His taciturn manner masked a shrewd mind: He knew that silence was an excellent means of intimidating people who asked for political favors. Frugal, puritanical and thoroughly honest, Coolidge was an immensely popular president. He believed that "The chief business of the American people is business"—and that government should not interfere with private enterprise. "He didn't do anything," quipped comedian Will Rogers, "but that's what the people wanted done."

For business, the 1920s were golden years of prosperity. The United States was now a consumer society, with a booming market for radios, home appliances, synthetic textiles and plastics. The businessman became a popular hero; the creation of wealth a noble calling. One of the most admired men of the decade was Henry Ford, who had introduced the assembly line into automobile production. Ford was able to pay high wages and still earn enormous profits by manufacturing the Model T—a simple, basic car that millions of buyers could afford. For a moment, it seemed that America had solved the eternal problem of producing and distributing wealth.

There were, however, fatal flaws in the prosperity of the 1920s. Overproduction of crops depressed food prices, and farmers suffered. Industrial workers were earning better wages, but they still did not have enough purchasing power to continue buying the flood of goods that poured out of their factories. With profits soaring and interest rates low, plenty of money was available for investment, but much of that capital went into reckless speculation. Thousands of millions of dollars poured into the stock market, and frantic bidding boosted the prices of shares far above their real value. Many investors bought stocks "on margin," borrowing money from their brokers to cover up to 90 percent of the purchase price. As long as the market prospered, speculators could make fortunes overnight, but they could be ruined just as quickly if stock prices fell. The bubble of this fragile prosperity finally burst in 1929 in a worldwide depression, and by 1932 Americans were confronting the worst economic crisis of modern times. That collapse, in turn, led to the most profound revolution in the history of American social thought and economic policy.


HISTORY: 1929 TO THE PRESENT

By Jonathan Rose (Drew University)

GREAT DEPRESSION

On October 24, 1929—"Black Thursday"—a wave of panic selling of stocks swept the New York Stock Exchange. Once started, the collapse of share and other security prices could not be halted. By 1932, thousands of banks and over 100,000 businesses had failed. Industrial production was cut in half, farm income had fallen by more than half, wages had decreased 60 percent, new investment was down 90 percent and one out of every four workers was unemployed.

The Republican president, Herbert Hoover, asked employers not to cut wages, and he tried to reduce interest rates and support farm prices. In 1932, he approved the creation of the Reconstruction Finance Corporation, which lent money to troubled banks.

But these measures were inadequate. To masses of unemployed workers, Hoover seemed uncaring and unable to help them. In the 1932 election, he was resoundingly defeated by Democrat Franklin D. Roosevelt, who promised "a New Deal for the American people."

Jaunty, optimistic and a commanding public speaker, Roosevelt, a former governor of New York State, was able to inspire public confidence as Hoover could not. "The only thing we have to fear is fear itself," Roosevelt stated at his inauguration, and he took prompt action to deal with the emergency. Within three months—the historic "Hundred Days"—Roosevelt had rushed through Congress a great number of laws to aid the recovery of the economy. The Civilian Conservation Corps (CCC) put young men to work in reforestation and flood control projects. The Federal Emergency Relief Administration (FERA) aided state and local relief funds, which had been exhausted by the Depression. The Agricultural Adjustment Administration (AAA) paid farmers to reduce production, thus raising crop prices. The Tennessee Valley Authority (TVA) built a network of dams in the Tennessee River area, in the southeastern region of the United States, to generate electricity, control floods and manufacture fertilizer. And the National Recovery Administration (NRA) regulated "fair competition" among businesses and ensured bargaining rights and minimum wages for workers.

In 1935, the Social Security Act established contributory old-age and survivors' pensions, as well as a joint federal-state program of unemployment insurance. The Wagner Labor Relations Act banned unfair employer practices and protected the workers' right to collective bargaining.

The Works Progress Administration (WPA) was one of the most effective of the New Deal measures, probably because it was based on the belief, originating with the Puritans and almost universally accepted among later Americans, that working for one's livelihood is honorable and dignified, but receiving help which one doesn't earn—"charity"—is demeaning and robs people of their independence and their sense of self-worth. Financed by taxes collected by the federal government, the WPA created millions of jobs by undertaking the construction of roads, bridges, airports, hospitals, parks and public buildings.

Roosevelt's New Deal programs did not end the Depression. Although the economy improved as a result of this program of government intervention, full recovery was finally brought about by the defense buildup prior to America's entering the Second World War.

On the eve of America's entry into World War II in 1941, President Franklin Roosevelt (left) and British Prime Minister Winston Churchill meet aboard ship and issue a declaration known as the Atlantic Charter. Its principles were later reflected in the Charter of the United Nations.

WORLD WAR II

In September 1939, war erupted in Europe. Roosevelt announced that the United States would be neutral, but not indifferent. In September 1940, when Britain was threatened by a German invasion, the United States gave the British 50 overage destroyers in return for naval bases in the western Atlantic. Two weeks later, Congress approved the first peacetime military conscription in American history. By early 1941, Britain could no longer afford to purchase American goods, so Roosevelt persuaded Congress to enact a "lend-lease" bill. Through this program the United States eventually supplied $13.5 thousand million in war supplies to Britain and another $9 thousand million to the Soviet Union.

In the Far East, Japanese forces had invaded Manchuria (1931), China (1937) and French Indochina (July 1941). Roosevelt responded to this aggression by banning American exports of scrap iron, steel and oil to Japan and by freezing Japanese credits in the United States.

On December 7, 1941, carrier-based Japanese bombers struck at Pearl Harbor naval base in Hawaii. The surprise attack sank or damaged eight battleships and destroyed almost 200 aircraft. The United States immediately declared war on Japan. Four days later, Japan's allies, Germany and Italy, declared war on the United States.

In 1941, Japan possessed a large navy and a greater number of aircraft than could be mobilized by the United States. Prospects for a Japanese military victory depended on Japan's being able to defeat the Americans before the United States could retool its mighty industrial complex to produce military equipment. At this Japan failed, and the United States was soon producing huge numbers of ships, aircraft and weaponry.

Spurred by the fear that Germany might develop a nuclear weapon, the government spent $2 thousand million on the top-secret Manhattan Project, which produced and tested an atomic bomb in 1945.

American, British and Soviet war planners agreed to concentrate on defeating Germany first. British and American forces landed in North Africa in November 1942, then proceeded to Sicily and the Italian mainland in 1943, liberating Rome on June 4, 1944, after months of bitter fighting. Two days later, June 6, "D-Day," Allied troops landed in Normandy in the largest amphibious operation in military history. Paris was liberated on August 24, and by September, American units were across the German border. In December 1944, however, the Germans launched a ferocious assault in the Ardennes region of Belgium. It took a week for the Allies to regroup and a month to counterattack and to force a German withdrawal in what became known as the "Battle of the Bulge." This proved to be the last German offensive of World War II. Finally, on April 25, 1945, the western Allied forces met advancing Soviet troops at the town of Torgau, Germany. The Germans surrendered on May 7, 1945.

In the Pacific, Japanese armed forces achieved a series of early victories. By May 1942, they had overrun the Philippines and forced the surrender of 11,500 Americans and Filipinos, who were treated brutally by their captors. In an atmosphere of war hysteria, 110,000 Japanese-Americans living in America's western states were forced into relocation camps. Government officials justified this action as a precaution against sabotage and espionage, but no Japanese-Americans were convicted of any act of disloyalty during the war, and many of them fought bravely in the armed forces.

By May 8, 1942, the Japanese threat to Australia was checked at the Battle of the Coral Sea. In June, the main Japanese fleet, steaming toward Hawaii, was repulsed at the Battle of Midway, with the loss of four aircraft carriers.

Over the next three years, American forces advanced toward Japan by "island-hopping"—capturing some strategic islands in the Pacific and bypassing others. An Allied force under General Joseph W. Stilwell aided the Chinese, and troops under General Douglas MacArthur returned to the Philippines in October 1944. The central Pacific island of Iwo Jima fell to the Americans in March and Okinawa in June 1945. B-29 bombers launched devastating raids against Japanese cities.

American forces now prepared to invade the Japanese home islands. In the hope of bringing the war to a swift end, President Harry Truman ordered the use of the atomic bomb against Hiroshima (August 6) and Nagasaki (August 9). Japan agreed to surrender on August 14. Nearly 200,000 civilians died in the nuclear attacks, but military experts agree that the casualties, Japanese and American, would have been far greater if the Allies had been forced to invade Japan.

COLD WAR

After the war, tensions quickly developed between the United States and the Soviet Union. At the Yalta Conference of February 1945, Roosevelt, Churchill and Soviet leader Josef Stalin promised free elections for all the liberated nations of Europe. The western Allies restored democracy in Western Europe and Japan, but Soviet forces imposed Communist dictatorships in Eastern Europe.

In 1947, Secretary of State George C. Marshall proposed a massive aid program to help rebuild destroyed Europe. The U.S.S.R. and the Eastern European nations were invited to participate in the Marshall Plan, but the Soviets rejected the offer. Americans realized that an impoverished Europe, in which deprivation and despair were widespread, would be susceptible to social and political movements hostile to western traditions of individual freedom and democratic government. The Marshall Plan was a generous and thoroughly successful program. Over four years it paid out $12.5 thousand million in aid and restored the economies of Western Europe.

In May 1947, the United States began sending military aid to the Greek government, which was fighting Communist guerillas, and to Turkey, which was being pressured by the Soviets for territorial concessions. At this time, Germany and Berlin were divided in two—a western zone under American, British and French occupation, and an eastern zone under Soviet domination. In the spring of 1948, the Soviets sealed off West Berlin in an attempt to starve the isolated city into submission. The western powers responded with a massive airlift of food and fuel until the Soviets lifted the blockade in May 1949. A month earlier the United States had allied with Canada, Britain, France, Belgium, the Netherlands, Italy, Luxembourg, Norway, Denmark, Iceland and Portugal to form the North Atlantic Treaty Organization (NATO).

On June 25, 1950, armed with Soviet weapons and acting with Stalin's approval, North Korea's army invaded South Korea. President Truman immediately secured a commitment from the United Nations to defend South Korea, and American troops were sent into battle, later joined by contingents from Britain, Turkey, Australia, France and the Philippines. By September 1950, the North Koreans had conquered most of South Korea. The U.N. forces were confined to an area at Pusan at the southern tip of the Korean peninsula. Then General Douglas MacArthur launched a daring amphibious landing at Inchon in central Korea. The North Korean army was outflanked and shattered, and MacArthur's forces swept north toward the Yalu River—the boundary between North Korea and the People's Republic of China. In November, however, Chinese troops counterattacked and forced the U.N. army south of the 38th parallel (the boundary between North and South Korea). MacArthur advocated air and sea assaults against China, but President Truman believed that such a strategy would lead to a wider conflict, and on April 11, 1951, he relieved MacArthur of his command. Peace talks began three months later, but the fighting continued until June 1953, and the final settlement left Korea still divided.

Frustrated by the Korean stalemate and angered by the Communist takeovers in Eastern Europe and China, many Americans looked for "those responsible" and came to believe that their government, too, might have been infiltrated by Communist conspirators. Republican Senator Joseph McCarthy asserted that the State Department and the army were riddled with Communists. McCarthy's sensational investigations uncovered no subversives, but his accusations and slanders destroyed the careers of some diplomats. In 1954, in the course of hearings broadcast on national television, McCarthy was exposed as a fraud, and he later was censured by the Senate. Toleration of political dissent is one of the most fundamental and essential of American traditions. The McCarthy era—like the passage of the Alien and Sedition Acts of 1798 and the excesses of the Red Scare of 1919-1920—was a serious lapse from this tradition.

PROSPERITY AND CIVIL RIGHTS

From 1945 until 1970, the United States enjoyed a long period of economic growth, interrupted only by brief and fairly mild recessions. For the first time, the great majority of Americans could enjoy a comfortable standard of living. By 1960, 55 percent of all households owned washing machines, 77 percent owned cars, 90 percent had television sets and nearly all had refrigerators.

At the same time, the United States was moving slowly in the direction of racial justice. In 1941, the threat of black protests persuaded President Roosevelt to ban discrimination in war industries. In 1948, President Truman ended racial segregation in the armed forces and in all federal agencies. In 1954, in the decision Brown v. Board of Education of Topeka, Kansas, the Supreme Court unanimously ruled that segregation in the public schools was unconstitutional; nevertheless, southern states continued to resist integration. In 1955, the Rev. Martin Luther King, Jr. led a boycott of segregated public transportation that eventually ended segregation on city buses in Montgomery, Alabama. In 1957, the governor of Arkansas tried to prevent black students from enrolling in an all-white high school in the state capital of Little Rock. To enforce obedience to the law requiring integration, President Dwight D. Eisenhower sent in federal troops.

That same year, Americans were jolted to learn that the Soviet Union had launched Sputnik, the Earth's first man-made satellite. This was a shock for the United States, a nation that had always taken pride in its technological expertise. In response, the American federal government increased efforts already underway to produce a satellite and spent more money on education, especially in the sciences.

NEW FRONTIER AND GREAT SOCIETY

In 1960, Democrat John F. Kennedy was elected president. Young, energetic and handsome, Kennedy promised to "get the country moving again"; to forge ahead toward a "New Frontier." But one of Kennedy's first foreign policy ventures was a disaster. In an effort to overthrow the Communist dictatorship of Fidel Castro in Cuba, Kennedy supported an invasion of the island nation by a group of Cuban exiles who had been trained by the Central Intelligence Agency (CIA). In April 1961 the exiles landed at the Bay of Pigs and were almost immediately captured.

In October 1962, observation planes discovered that the Soviet Union was installing nuclear missiles in Cuba, close enough to strike American cities in a matter of minutes. Kennedy imposed a blockade on Cuba. Soviet Premier Nikita Khrushchev finally agreed to remove the missiles, in return for an American promise not to invade Cuba.

In April 1961, the Soviets scored another triumph in space: Yuri Gagarin became the first man to orbit the Earth. President Kennedy responded with a pledge that the United States would land a man on the moon before the end of the decade. In February 1962, John Glenn made the first American orbital flight, and he was welcomed home as a hero—much as Charles Lindbergh had been celebrated 35 years earlier after he made the first nonstop solo flight across the Atlantic. It took $24 thousand million and years of research, but Kennedy's pledge was fulfilled in July 1969, when Neil Armstrong stepped out of the Apollo 11 spacecraft onto the surface of the moon.

In the 1960s, Martin Luther King, Jr. led a nonviolent campaign to desegregate southern restaurants, interstate buses, theaters and hotels. His followers were met by hostile police, violent mobs, tear gas, fire hoses and electric cattle prods. The Kennedy administration tried to protect civil rights workers and secure voting rights for southern blacks.

In 1963 Kennedy was assassinated in Dallas, Texas. Kennedy was not a universally popular president, but his death was a terrible shock to the American people.

The new president was Lyndon Johnson, who had been vice president under Kennedy and succeeded to the office on the death of the president. He persuaded Congress to pass the Civil Rights Act of 1964, which outlawed racial discrimination in public accommodations and in any business or institution receiving federal money. Johnson was elected to a new term with widespread popular support in 1964. Encouraged by a great election victory, Johnson pushed through Congress many social programs: federal aid to education, the arts and the humanities; health insurance for the elderly (Medicare) and for the poor (Medicaid); low-cost housing and urban renewal. The Voting Rights Act of 1965 finally enabled all black Americans to vote. Discrimination in immigration was also ended: national origins quotas were abolished, allowing a great increase in entry visas for Asians.

Although most Americans had by now achieved affluence, Michael Harrington's book The Other America (1962) identified persistent pockets of poverty—in urban slums, in most black neighborhoods and among the poor whites of the eastern Appalachian mountains. President Johnson responded with his "War on Poverty," which included special preschool education for poor children, vocational training for school dropouts and community service jobs for slum youths.

VIETNAM WAR

American involvement in Vietnam did not begin with President Johnson. When Communist and nationalist rebels fought French colonialism in Indochina after World War II, President Truman sent military aid to France. After the French withdrew from Southeast Asia in 1954, President Eisenhower dispatched American advisers and aid to help set up a democratic, pro-Western government in South Vietnam. Under President Kennedy, thousands of military officers trained South Vietnamese soldiers and sometimes flew Vietnamese warplanes into combat.

In August 1964, two American destroyers sailing in the Gulf of Tonkin reported attacks by North Vietnamese torpedo boats. President Johnson launched air strikes against North Vietnamese naval bases in retaliation. The first American combat soldiers were sent to Vietnam in March 1965. By 1968, 500,000 American troops had arrived. Meanwhile, the Air Force gradually stepped up B-52 raids against North Vietnam, first bombing military bases and routes, later hitting factories and power stations near Hanoi.

Demonstrations protesting American involvement in this undeclared and, many felt, unjustified war broke out on college campuses in the United States. There were some violent clashes between students and police. In October 1967, 200,000 demonstrators demanding peace marched on the Pentagon in Washington.

At the same time, unrest in the cities also erupted, as younger and more militant black leaders were denouncing as ineffectual the nonviolent tactics of Martin Luther King. King's assassination in Memphis, Tennessee, in 1968, triggered race riots in over 100 cities. Business districts in black neighborhoods were burned, and 43 people were killed—most of them black.

Ever increasing numbers of Americans from all walks of life opposed the involvement of the United States in the war in Indochina, and in the 1968 election, President Johnson faced strong challenges. On March 31, facing a humiliating defeat at the polls and a seemingly endless conflict in Vietnam, Johnson withdrew from the presidential race and offered to negotiate an end to the Vietnam War. The voters narrowly elected Republican Richard Nixon. As president, Nixon appealed to "Middle America"—the "great silent majority" who were unhappy with violence and protest at home.

In Indochina, Nixon pursued a policy of "Vietnamization," gradually replacing American soldiers with Vietnamese. But heavy bombing of Communist bases continued, and in the spring of 1970 Nixon sent American soldiers into Cambodia. That action caused the most massive and violent campus protests in the nation's history. During a demonstration at Kent State University in Ohio, National Guardsmen killed four students.

Then, as the American people perceived that the war was being ended, the situation quite suddenly changed: Quiet returned to the nation's colleges and cities. By 1973, Nixon had signed a peace treaty with North Vietnam, brought American soldiers home, and ended conscription. Students began rejecting radical politics and generally became more oriented toward individual careers. Many blacks were still living in poverty, but many others were finally moving into well-paid professions. The fact that many big cities—Cleveland, Newark, Los Angeles, Washington, Detroit, Atlanta—had elected black mayors contributed to the easing of urban tensions.

DECADES OF CHANGE

Political activism, however, did not disappear in the 1970s—it was rechanneled into other causes. Some young people worked for the enforcement of antipollution laws, joined consumer-protection groups or campaigned against the nuclear power industry. Following the example of blacks, other minorities—Hispanics, Asians, American Indians, homosexuals—demanded a broadening of their rights.

Women had been gradually moving into the labor force since World War II, and in the 1970s a women's liberation movement pressed for legal abortion, day-care centers, and equal pay and jobs for women. In 1973, the Supreme Court banned most restrictions on abortion, but that ruling only intensified a furious national debate: feminists defended abortion as a Constitutional right; others denounced it as the destruction of innocent life.

President Nixon achieved two major diplomatic goals: re-establishing formal relations with the People's Republic of China and negotiating the first Strategic Arms Limitation Treaty (SALT I) with the Soviet Union. In the 1972 election, he easily defeated George McGovern, a liberal antiwar Democrat.

During the campaign, however, five men were arrested for breaking into the Democratic party headquarters at the Watergate building in Washington, D.C. Journalists investigating the incident discovered that the burglars were employed by President Nixon's reelection committee. The White House made the scandal worse by trying to cover up its connection with the break-in. In July 1973, it was revealed that President Nixon had recorded his office conversations concerning the Watergate affair. Congressional committees, special prosecutors, federal judges and the Supreme Court all demanded that the President surrender the recordings, and after prolonged resistance he finally made them public. The tapes revealed that President Nixon was directly involved in the cover-up. By the summer of 1974, it was clear that Congress was likely to impeach and to convict the president. On August 9, Richard Nixon became the only American president to resign his office.

Republican Gerald Ford, who succeeded to the presidency on the resignation of Richard Nixon, was likable and conciliatory. Ford did much to restore the trust of the citizens, though some voters never forgave him for pardoning his former boss, Richard Nixon. The 1976 election was won by Democrat Jimmy Carter, former governor of Georgia. Carter had limited political experience, but many voters now preferred an "outsider"—someone who was not part of the Washington establishment.

Precisely because he was an outsider, President Carter had difficulty working with Congress. He also could not control the chief economic problem of the 1970s—inflation. The Organization of Petroleum Exporting Countries (OPEC) had been increasing the cost of oil since 1973, and those increases fueled a general rise in prices. By 1980, inflation had soared to an annual rate of 13.5 percent, and the nation was experiencing a period of economic difficulty. Carter signed a second Strategic Arms Limitation Treaty (SALT II) with the Soviet Union, but it was never ratified by the Senate after the Soviet invasion of Afghanistan in December 1979. He also seemed ineffectual in the face of another crisis: In 1979, Iranian radicals stormed the United States embassy in Teheran and held 53 Americans hostage. Carter's greatest success was the negotiation of the Camp David Accords between Israel and Egypt, which led to a historic peace treaty between the two nations.

In the presidential race of 1980, American voters rejected Carter's bid for a second term, and elected Ronald Reagan, a conservative Republican and former governor of California. As a result of the election, the Republican party gained a majority in the Senate for the first time in 26 years. By giving Ronald Reagan an overwhelming election victory, the American public expressed a desire for change in the style and substance of the nation's leadership. Throughout his presidency, Reagan demonstrated the ability to instill in Americans pride in their country, and a sense of optimism about the future.

If there was a central theme to Reagan's national agenda, it was his belief that the Federal Government had become too big. Upon taking office in 1981, the administration's immediate problems were stagnant economic growth, high inflation and soaring interest rates. Reagan soon began a drastic reshaping of the federal budget, directed largely at domestic-spending programs. Reagan's domestic program was rooted in the belief that the nation would grow and prosper if the power of the private economic sector were unleashed. The administration also sought and won significant increases in defense spending.

Despite a growing federal budget deficit, by 1983 the economy as a whole had rebounded, and the United States entered into one of the longest periods of sustained economic growth since World War II. Presiding, like Eisenhower, over a period of relative peace and prosperity at the end of their first term, President Reagan and Vice President George Bush overwhelmingly won reelection in 1984. They carried 49 of 50 states in defeating the Democratic party ticket of former Vice President Walter Mondale and Geraldine Ferraro, who was the first woman in U.S. history to run as a vice presidential candidate.

In foreign policy, President Reagan sought a more assertive role for the nation. The United States confronted an insurgency in El Salvador, and the Sandinista regime in Nicaragua. In 1983 U.S. forces landed in Grenada to safeguard American lives and to oust a regime which took power after the assassination of the country's elected Prime Minister. The U.S. also sent peace-keeping troops to Lebanon, in an effort to bolster a moderate, pro-Western government. The mission ended tragically when 241 American Marines were killed in a terrorist bombing. In 1986, U.S. military forces struck targets in Libya, in retaliation for Libyan-instigated attacks on American personnel in Europe. Additionally, the United States and Western European nations kept the vital Persian Gulf oil-shipping lanes open during the Iran-Iraq conflict, by escorting tankers through the war zone.

U.S. relations with the Soviet Union during the Reagan years fluctuated between political confrontation and far-reaching arms control agreements. In December 1987, the U.S. and the Soviet Union signed the Intermediate-Range Nuclear Forces (INF) Treaty, which provided for the elimination of a whole category of nuclear missiles. However, efforts to make major cuts in other strategic weapons systems were not concluded, in large part due to the Reagan Administration's strong desire to develop the Strategic Defense Initiative (SDI), commonly known as the "star wars" ballistic missile defense system.

On January 28, 1986, after 24 successful flights, the space shuttle Challenger exploded 73 seconds after liftoff, killing all on board. The Challenger tragedy was a reminder of the limits of technology at a time when another technological revolution, in computers, was rapidly transforming the way in which millions of Americans worked and lived. It was estimated that by mid-decade Americans possessed more than 30 million computers. By late 1988, however, the U.S. successfully launched a redesigned space shuttle Discovery, which deployed a satellite in the first shuttle flight since the Challenger disaster.

The Reagan Administration suffered a defeat in the November 1986 congressional elections when the Democrats regained majority control of the U.S. Senate. However, the most serious issue confronting the administration at that time was the revelation that the U.S. had secretly sold arms to Iran in an attempt to win freedom for American hostages held in Lebanon, and to finance the Nicaraguan contras during a period when Congress had prohibited such aid. During the Congressional hearings which followed, the country addressed fundamental questions about the public's right to know, and the proper balance between the executive and legislative branches of government. Despite these problems, President Reagan enjoyed unusually strong popularity at the end of his second term of office.

Reagan's successor, George Bush, benefited greatly from the popularity of the former president. In the 1988 election, Bush defeated the Democratic Party's nominee, Michael Dukakis, by a wide margin, becoming the first sitting vice president since 1836 to be elected to the Presidency. During his campaign, Bush promised to continue the economic policies of the Reagan Administration. He echoed some of Reagan's positions on social issues, such as his strong stand against abortion, while quieting some of Reagan's critics with a call for a "kinder, gentler nation," and by stressing a commitment to be the "education president."

The U.S.-Soviet dialogue continued to broaden and deepen during the first year of the Bush Administration, at a time of ferment and remarkable political change in the Soviet Union and Eastern Europe—symbolized most eloquently by the opening of the Berlin Wall in November 1989. In the two years following that event, the world witnessed the dissolution of the Soviet Union and the end of its dominating influence in Eastern Europe. The Bush Administration promoted the concept of a "new world order," based on a new set of international realities, priorities, and moral principles.

The idea of a "new world order" faced its first test when Iraq invaded oil-rich Kuwait in August 1990. In January 1991, when Iraq did not comply with United Nations resolutions designed to force its withdrawal from Kuwait, U.S. military forces, as part of a multinational coalition, liberated Kuwait in a swift and decisive victory. Immediately after the war, the Bush Administration took the lead in bringing together the age-old antagonists in the Middle East for a series of unprecedented peace conferences. As the 1992 elections approached, President Bush focused his attention more on domestic issues and problems such as economic recession, unemployment, crime, education, and health care.

IMMIGRATION TO AMERICA

The story of the American people is the story of immigrants. The United States has welcomed more immigrants than any other country in the world. More than 75 percent of all people who ever moved from their homeland settled in the United States. Since its early days, the United States has accepted more than 50 million newcomers.

Migration to America began more than 20,000 years ago. At that time, groups of wandering hunters followed herds of game (animals hunted for food) from Asia to America across a northern land bridge where the Bering Strait is today. These people settled throughout North and South America. They are considered to be the only "native" Americans. By the time Christopher Columbus, an Italian navigator employed by the Spanish monarchs, "discovered" the American continents in 1492, about one million Native Americans lived in the area that later became the United States. Today, there are about 1.4 million Native Americans in the United States (0.6 percent of the total population).

Groups of Spanish settlers established outposts in what is now Florida, in the southeastern United States, during the 1500s, and a small French colony was founded on the New World's northeastern seacoast (Maine) in 1604. In 1607, Great Britain founded its first permanent North American settlement: Jamestown, in the colony of Virginia—now a state located along the southeastern coast of the United States. Communities of Dutch, Swedish and German settlers were also established along North America's Atlantic coastline.

Why did these early European colonists risk a dangerous ocean journey and great hardship to settle in an unknown land? They were brought by the same desires that still bring immigrants today. The first European settlers were seeking land, wealth and freedom—a better life. Some colonists came to America to find religious freedom; one example of these was a group of English religious dissenters called Pilgrims. In 1620, the Pilgrims founded the colony of Plymouth, in what is now Massachusetts. The Pilgrims disagreed with the religious teachings of the official Church of England, and they came to America to be free to worship as they pleased.

BRITISH NORTH AMERICA

By 1700, Great Britain had established clear colonial dominance over that part of North America which later became the eastern United States. Thirteen separate British colonies, governed indirectly by the British Parliament, provided raw material for the "mother country" and bought goods produced in Britain. Though people of many European nationalities lived in the colonies, the official language spoken was English and British laws and social institutions prevailed.

Over time, the colonial inhabitants of North America became increasingly dissatisfied with the power of Great Britain's king and Parliament to control their lives. They believed Britain's taxes to pay for colonial administrative expenses to be an unjustified burden, and they feared that the broad degree of personal freedom which existed in the colonies might be curtailed. This situation led, in 1775, to the outbreak of hostilities between British military and colonial marksmen and to the colonists formally declaring their independence from Great Britain in 1776. The American Revolutionary War (1775-1783) established the independence of the United States.

Even during the Revolutionary War, immigration from many countries continued, and in 1776 Thomas Paine, an Englishman who emigrated to America in 1774, wrote "Europe, not England, is the parent country to America." These words encouraged immigrants from Holland, Germany, France, Switzerland, Spain and Scotland to come to the New World. In 1780, three out of every four Americans were of English or Irish background. Most of the rest came from other countries in northern and western Europe.

UNWILLING IMMIGRANTS

Among the flood of immigrants to North America, one group of people came unwillingly. These were Africans. About 500,000 Africans were brought to the colonies as slaves between 1619 and 1808. This trade began as an outgrowth of the Spanish traffic in slaves between Africa and South America. Importing slaves to the United States became a crime in 1808, 21 years after the adoption of the Constitution of the United States in 1787, but slavery itself was not eliminated until after the Civil War (1861-1865). By 1810, there were 7.2 million people in the United States, of whom 1.2 million were slaves and 186,768 were free blacks. In 1863, President Abraham Lincoln signed the Emancipation Proclamation, freeing slaves in areas of the nation which were in rebellion. Slavery was completely abolished in 1865, with the passage of the 13th Amendment to the Constitution. Some blacks had gained their freedom before this time. Today, black Americans compose about 12 percent of the total population.

THE GOLDEN DOOR

Between 1840 and 1860, the United States received its largest wave of immigrants to date. In Europe, famine, poor crops, rising populations and political unrest caused an estimated five million people to leave their homelands. Between 1845 and 1850, the Irish people faced famine. The potato crop, upon which the Irish depended for subsistence, suffered blight for five years, and about 750,000 Irish starved to death. Many of those who survived left Ireland for the United States. In one year alone—1847—118,120 Irish people emigrated to the United States. By 1860, one of every four people in New York City had been born in Ireland. Today in the United States there are more than 13 million Americans of Irish ancestry.

During the Civil War, the federal government encouraged immigration from Europe, especially from the German states, by offering grants of land to those immigrants who would serve as troops in the armies of the North. In 1865, about one in five Northern soldiers was a wartime immigrant. Today, fully one-third of Americans have German ancestors.

Until about 1880, most immigrants came from northern and western Europe. Then a great change occurred. More and more immigrants began coming from countries in eastern and southern Europe. They were Poles, Italians, Greeks, Russians, Hungarians and Czechs. By 1896, more than half of all immigrants were from eastern or southern Europe.

One group of people who came to the United States during this period were Jews. The first Jewish people actually settled in North America as early as 1654, but Jews did not move to the United States in great numbers until the 1880s. During the 1880s, Jews suffered fierce pogroms (massacres) throughout eastern Europe. Many thousands of Jews escaped an almost certain death by coming to the United States. Between 1880 and 1925, about two million Jews immigrated here. Today, there are about 5.7 million Jewish Americans living in the United States, comprising about 2.2 percent of the total population. In certain states, such as New York, a state along the mid-Atlantic coast, their numbers are higher, and they account for more than 10 percent of the population.

During the late 1800s, so many people were entering the United States that the government was having trouble keeping records on all of these people. To solve this problem, the government opened a special port of entry in New York harbor. This port was called Ellis Island. Between 1892, when Ellis Island was opened, and 1954, when it closed, more than 20 million immigrants entered the United States through this port of entry. During its busiest days, almost 2,000 immigrants a day passed through. Today, about half of all Americans have ancestors who entered the United States by way of Ellis Island.

The United States was becoming known throughout the world as a refuge and a welcoming place for people of many nations. In 1886, as a gesture of friendship, France gave the United States the Statue of Liberty, which stands on an island in the harbor of New York City, near Ellis Island. Since that time, the Statue of Liberty has been one of the first sights many immigrants to the United States see. It is a symbol of the hope and freedom that the country offers. The words etched on the base of the statue have been an inspiration for people in many lands who hope to come to the United States: "Give me your tired, your poor, your huddled masses yearning to breathe free, the wretched refuse of your teeming shore. Send these, the homeless, tempest-tossed, to me: I lift my lamp beside the golden door!"

Ellis Island is now a national park and historical museum. In the first year that it was open, more than a million people visited— many of them to see the place where their ancestors entered the United States. Visitors enter the museum through the baggage room, just as their ancestors did. Then they walk upstairs and sit on the benches where new arrivals waited their turn to fill out forms and undergo medical examinations.

The Statue of Liberty began lighting the way for new arrivals just at a time when native-born Americans began worrying that the United States was being overrun by immigrants. In the 30 years between 1890 and 1920, more than 18.2 million immigrants flooded America's shores. By 1910, 14.5 percent of all residents were foreign-born; today, about 6.2 percent of all American residents are foreign-born.

How could the United States absorb so many foreigners? Many citizens worried that these new Americans would take away their jobs. Citizens began demanding that the Congress limit the number of immigrants.

LIMITS ON NEWCOMERS

Gradually, responding to the demands of American citizens, Congress began to pass laws barring the entry of certain types of immigrants. The United States refused to accept immigrants who were prostitutes, convicts, insane, mentally retarded, beggars or revolutionaries, as well as persons suffering from serious diseases and children without at least one parent.

These rules held back only one percent of all immigrants. So Congress tried to deny entry to immigrants who could not read or write. However, President Grover Cleveland refused to give his approval. Some people protested that the United States was being overrun by "inferior" races. But President Cleveland wrote: "Within recent memory...the same thing was said of immigrants who, with their descendants, are now numbered among our best citizens."

In 1924, Congress passed the Johnson-Reed Immigration Act. This law, which reflected the fears and prejudices of an "older" wave of immigrants from northern Europe, set limits on how many people from each foreign country would be permitted to immigrate to the United States. The number of people allowed from each country was based on the number of people from that country already living here. This system was designed primarily to limit immigration from southern and eastern Europe. For example, 87 percent of the immigration permits went to immigrants from Great Britain, Ireland, Germany and Scandinavia.

The mix of American people today reflects this old system. For example, until the 1970s, almost 81 percent of all newcomers to the United States emigrated from 10 nations: 14.8 percent came from Germany; 11.1 percent from Italy; 10.3 percent from Great Britain; 10 percent from Ireland; 9.2 percent from Austria and Hungary; 8.6 percent from Canada; 7.1 percent from Russia; 4.1 percent from Mexico; 3 percent from the West Indies; and 2.7 percent from Sweden.

Immigration was slow during the "Great Depression" years of the 1930s. This was a time when one out of four Americans was without a job. In fact, more people moved out of the United States during these years than entered.

Earlier, laws had been passed specifically to exclude Asian immigrants. Citizens in the American West were afraid that the Chinese and other Asians would take away jobs building railroads, and there was much animosity toward them. In 1882, the United States banned most Chinese immigration. Other Asians were refused entry as well. By 1924, no Asian immigrants were permitted into the United States.

The law that kept out Chinese immigrants was changed in 1943, and thereafter, Chinese were once again allowed to enter and to become American citizens. Other Asians have been permitted to become American citizens since 1952. Today, Asian-Americans are one of the fastest-growing ethnic groups in the United States. About 6.5 million Asians live in the United States, comprising about 2.5 percent of the population. Asian immigrants come from many very different countries, including the People's Republic of China, Japan, the Lao People's Democratic Republic, the Philippines, Vietnam, South Korea, Cambodia and Thailand.

Although most Asians in the United States emigrated relatively recently, they have become one of the most successful immigrant groups in the country. Large numbers of them study in the best American universities. They also have a higher average income than many other ethnic groups.

REFUGEES

After World War II, the United States began accepting refugees as a special group. The first refugees to the United States were Europeans who were uprooted by the horrors of war. Since then, the United States has taken in refugees from many places in the world. In 1956, thousands of Hungarians sought refuge in the United States after the Soviet Union crushed the attempt to establish a non-communist government in Hungary. After Fidel Castro took control of Cuba in 1959, the United States accepted 700,000 Cuban refugees. Many of these people settled together in communities around Miami, Florida. Again, in 1980, the United States accepted a special group of more than 110,000 Cuban refugees who came in crowded boats.

The United States has also accepted other groups of special political refugees. These include Southeast Asians, who were fleeing persecution after the end of the Vietnam War. Since 1975, the United States has accepted 750,000 refugees from Vietnam, Cambodia and The Lao People's Democratic Republic.

WHO IMMIGRATES?

The year 1965 marked a most important change in American immigration law. A new law signed by President Lyndon B. Johnson ended the old system of immigration that had favored northern and western Europeans. Under the new law, there is no consideration of people's country of origin, just as there have never been legal bars based on their race or their beliefs. Since 1965, the United States has accepted immigrants strictly on the basis of who applies first, within overall annual limits.

Soon after the 1965 law was passed, immigration patterns began to change. As recently as the 1950s, two-thirds of all legal immigrants came from Europe and Canada. By the 1980s, only one immigrant in seven came from these traditional sources. In the 1950s, immigrants from Asia accounted for only six percent of total immigration—or 150,000 in the entire decade. During the 1980s, 2.6 million Asian immigrants arrived, making up 44 percent of legal immigrants. The top points of origin from which new immigrants came in 1990 were Mexico (57,000), the Philippines (55,000), Vietnam (49,000), the Dominican Republic (32,000), Korea (30,000), China (29,000), India (28,000), the Soviet Union (25,000), Jamaica (19,000) and Iran (18,000).

The United States welcomes more immigrants than any other nation in the world. In 1990, its population included 18 million foreign-born people. But it cannot let everyone who wants to immigrate come at once. Under the Immigration Act of 1990, the total number of immigrants may not exceed 700,000 a year. But the new law also provides for exceptions that could increase that number significantly. For example, immediate family members of U.S. citizens would be admitted without regard to the cap, and more than 230,000 family members of U.S. citizens entered the country in 1990. And refugees—who numbered 97,000 in 1990—are also admitted without regard to the numerical limit.

The Immigration Act of 1990 also attempts to attract more skilled workers and professionals to the United States by reserving places specifically for them. And 10,000 places are reserved for entrepreneurs who promise to invest a minimum of $500,000 to start new businesses that will employ at least 10 people. People from countries whose immigrants were restricted by a previous immigration law are given special consideration. In 1990, 20,000 people entered under that category, most of them from Ireland, Canada, Poland and Indonesia. The new law also attempts to bring in more immigrants from countries that have supplied relatively few new Americans in recent years. It does this by providing some 50,000 "diversity visas" each year. About 9,000 people entered the United States in 1990 on diversity visas. The largest numbers came from Bangladesh, Pakistan, Peru, Egypt, and Trinidad and Tobago. Special provision is also made for Amerasian children born of American fathers stationed overseas. Some 3,000 nurses, most of them from the Philippines, also entered the country in 1990 under a law designed to alleviate the shortage of nurses in the U.S.

The United States also has a long tradition of taking in refugees, people fleeing from war or political persecution. But the United States cannot help all of the approximately 14 million people in that category who need help from the international community. Every year, the executive branch of the government and the Congress jointly decide from which countries the United States will admit refugees. Other people are admitted on a case-by-case basis. In 1990, most of the 97,000 refugees came from the Soviet Union, Vietnam, Eastern Europe, Iran, Afghanistan, Iraq, Cuba, and Ethiopia. The government also grants asylum to people who, when visiting the United States, ask to stay because of fear of persecution if they return to their home country. Not all requests for asylum are granted. The applicant must prove that his or her fears of persecution are well-founded. Altogether, 656,000 immigrants were admitted to the U.S. in 1990.

ILLEGAL IMMIGRANTS

Not all immigrants enter the United States legally. In 1986 there were an estimated 3 to 5 million people living in the country without permission, and the number was growing by about 200,000 a year.

Native-born Americans and legal newcomers worry about illegal immigrants. Many believe that illegal immigrants take jobs from citizens of the United States, especially from minority people and young people. Moreover, they can place a heavy burden on tax-supported social services. Some American employers have also exploited illegal workers, paying them less than the legal minimum wage and making them work under sub-standard conditions. The illegal immigrants cannot complain, for if they do, the employer can turn them over to government law enforcement officials and have them sent out of the country.

In order to eliminate some of these problems with illegal immigration, the U.S. Congress approved a revision of American immigration policy in 1986. Under the new law, many illegal immigrants who have been in the United States since 1982 can apply for legal residency that will eventually allow them to stay in the country permanently and give them full protection of the country's laws. In 1990, about 880,000 people gained legal status under this provision, and more than 2.5 million people are expected to "legalize." The new immigration law also provides protected status for illegal immigrants who have recently worked in U.S. agriculture and will also allow them to apply for permanent residency and citizenship. Historically, illegal immigrants have been very important in the harvesting of U.S. crops. While the new law brings significant benefits and protection to illegal immigrants who have been living in the United States, it also provides for strong measures to prohibit new illegal immigrants from entering the country and still imposes penalties on businesses that knowingly employ illegal aliens in the future.

IMMIGRATION TODAY

The 1965 change in the immigration laws—as well as rising illegal immigration—has changed the nature of the American population. It is not uncommon today to walk down the streets of the United States and hear Spanish spoken. In 1950, there were fewer than four million United States residents from Spanish-speaking countries. Today, there are an estimated 17.6 million Hispanic people here. About 60 percent of the Hispanics in the United States have origins in Mexico, America's neighbor to the south. The other 40 percent come from a variety of countries, including El Salvador, the Dominican Republic and Colombia. Today, about 50 percent of the Hispanics in the United States live in California or Texas. Other states with large groups of Hispanics are Florida in the South, and Arizona, New Mexico and Colorado in the West.

In the past, Americans used to think of the United States as a "melting pot" of immigrants. A "melting pot" meant that as immigrants from many different cultures came to the United States, their old ways melted away and they became part of a completely new culture. The United States was likened to a big pot of soup, which had bits of flavor from each different culture. All of the different cultures were so well blended that together they formed a new flavor of their own.

Today, Americans realize that the simple "melting pot" theory is less true. Instead, different groups of people keep many of their old customs. Often groups of Americans from the same culture band together. They live together in distinctive communities, such as "Chinatowns" or "Little Italys"—areas populated almost exclusively by Americans of a single ethnic group—which can be found in many large American cities. Living in ethnic neighborhoods gives new Americans the security of sharing a common language and common traditions with people who understand them.

In time, however, people from different backgrounds mix together. They also mix with native-born Americans. Old traditions give way to new customs. The children of immigrants are often eager to adopt new, American ways. They often want to dress in American fashions, to speak English and to follow American social customs. By one estimate, about 80 percent of European immigrants marry outside their own ethnic groups by the time they reach the third generation. Third generation means that their great-grandparents were immigrants. Yet as successive generations become more "Americanized," they often retain significant elements of their ethnic heritage.

AMERICAN TRAITS

A look at immigration in the past and at immigration today shows where the American people came from. Understanding immigration also helps to explain some of the traits of the American people. For example, immigrants move to the United States because they are looking for a better life. It takes a lot of courage to leave behind everything that is familiar and come to a new country. Since before the independence of the United States, Americans have been a people willing to take risks and try new things. This willingness to strike out for the unknown requires an independence and an optimism that are also thought to be characteristic of the American people today.

Immigrants also come to the United States because they differ from the majority of people surrounding them and because Americans are known to be generally accepting of people with different ideas.

Americans are also a people who are quick to learn and are open to new experiences. They have to be. Immigrants both today and in the past have a whole new world to learn about. They often have to learn everything from a new language to new social customs and new ways to make a living.

Immigrants also believe in the dream of the United States. They believe that by working hard and obeying the laws, they can have a better life. Often, Americans who have been here longer become less acutely aware of the rights and advantages that they have. Immigrants help native-born Americans to appreciate the good things to be found in the United States.

John F. Kennedy, who was president during the early 1960s, was the grandson of an Irish immigrant. Kennedy once said that the United States was "a society of immigrants, each of whom had begun life anew, on an equal footing. This is the secret of America: a nation of people with the fresh memory of old traditions who dare to explore new frontiers...."

Although there is sometimes friction and ill-feeling between new immigrants and people whose families have been Americans for generations, most Americans welcome newcomers. There is a popular feeling that immigrants have made America great and that each group has something to contribute. When President Bush signed the 1990 Immigration Act into law, he declared that its liberalized provisions would be "good for America."

Suggestions for Further Reading

Archdeacon, Thomas J. Becoming American: An Ethnic History. New York: Free Press, 1983.

Dinnerstein, Leonard, and David M. Reimers. Ethnic Americans: A History of Immigration and Assimilation. New York: Harper and Row, 1982.

Easterlin, Richard A., David Ward, William S. Bernard, and Reed Ueda. Immigration. Cambridge, MA: Harvard University Press, 1982.

Fix, Michael, and Jeffrey S. Passel. The Door Remains Open: Recent Immigration to the United States and a Preliminary Analysis of the Immigration Act of 1990. Washington, DC: The Urban Institute, 1991.

Glazer, Nathan, ed. Clamor at the Gates: The New American Immigration. San Francisco: Institute for Contemporary Studies, 1985.

Morris, Milton D. Immigration: The Beleaguered Bureaucracy. Washington: Brookings Institution Press, 1985.

AMERICAN REGIONALISM

On every coin issued by the government of the United States are found three words in Latin: E pluribus unum. In English this phrase means "out of many, one." The phrase is an American motto. Its presence on coins is meant to indicate that the United States is one country made up of many parts.

On one level of meaning, the "parts" are the 50 states that march across the North American continent and extend to Alaska in the north and Hawaii in the mid-Pacific. On another level, the "parts" are the nation's many different peoples, whose ancestors came from almost every area of the globe. On a third level, the "parts" are the environments or geographical surroundings of the United States. These environments range from the rolling countryside of the Penobscot River Valley in central Maine to the snowcapped peaks of the Cascade Mountains in western Washington state and from the palm-fringed beaches of southern Florida to the many-colored deserts of Arizona.

From the standpoint of government, the United States is one single country, of course. Its center is the national government in Washington, D.C. There are many other signs that the United States is indeed united—a national language (English) and a national coinage, to name only two. The many parts remain, however, making it difficult in some ways to gain an idea of the United States as a whole.

How, then, do Americans think of the United States? They often speak of it as a country of several large regions. These regions are cultural rather than governmental units.

They have been formed out of the history, geography, economics, literature and folkways that all parts of a region share in common.

The development, over time, of culturally distinctive regions within a country is not unique to the United States. Indeed, in some countries, regionalism has acquired political significance and has led to domestic conflict. In the United States, however, regions have remained culturally defined, to the point that there are no easily demarcated borders between them. For this reason, no two lists of American regions are exactly alike. One common grouping creates six regions. They are:

  • New England, made up of the six northeastern states: Maine, New Hampshire, Vermont, Massachusetts, Rhode Island and Connecticut.

  • The Middle Atlantic Region, composed of New York, New Jersey, Pennsylvania, Delaware and Maryland.

  • The South, which runs from Virginia south to Florida and then west as far as central Texas. The region also takes in West Virginia, Kentucky, Tennessee, Arkansas, Louisiana and large parts of Missouri and Oklahoma.

  • The Midwest, a broad collection of states sweeping westward from Ohio to Nebraska and southward from North Dakota to Kansas, including eastern Colorado.

  • The Southwest, made up of western Texas, portions of Oklahoma, New Mexico, Arizona, Nevada and the southern interior area of California.

  • The West, comprising Colorado, Wyoming, Montana, Utah, California, Nevada, Idaho, Oregon, Washington, Alaska and Hawaii.

Defining the six main regions of the United States does not fully explain American regionalism. Within the main regions are several kinds of subregions. One type of subregion takes a river valley as its center. Thus, historians and geographers often write of the Mississippi Valley, the Ohio Valley or the Sacramento Valley. A second type of subregion is centered around mountain areas such as the Blue Ridge country of Virginia or the Ozark country of Arkansas and Missouri.

DIFFERENT PLACES, DIFFERENT HABITS

What makes one region of the United States different from another? There are many answers to the question and the answers vary from place to place.

As a case in point, consider the role of food in American life. Most foods are quite standard throughout the nation. That is, a person can buy packages of frozen peas bearing the same label in Idaho, Missouri or Virginia. Cereals, rice, candy bars and many other foods also appear in standard packages. The quality of fresh fruits and vegetables generally does not vary from one state to another.

A few foods are not available on a national basis. They are simply regional dishes, limited to a single locale. In San Francisco, one popular dish is abalone, a large shellfish from Pacific waters. Another is a pie made of boysenberries, a cross between raspberries and blackberries. Neither abalone nor boysenberry pie is likely to appear on a menu in a New England restaurant, however. If you were to ask a Boston waiter for either dish, you might discover that he had never heard of it.

To take another example, consider the way Americans use the English language. For many years experts have been writing rules for standard American English, both written and spoken. With the coming of radio and television, this standard use of the English language has become much more generalized. But within several regions and subregions local ways of speaking, known as dialects, still remain quite strong.

In some farming areas of New England the natives are known for being people of few words. When they speak at all, they do so in short, rather choppy sentences and clipped words. Even in the cities of New England there are definite styles of speech. Many people pronounce the word idea as "idear," or Boston as "Bahstun."

Southern dialect tends to be much slower and more musical. People of this region have referred to their slow speech as a "southern drawl." For instance, they commonly use "you-all" as the second person plural.

Regional differences extend beyond foods and dialects. Among more educated Americans, these differences sometimes center on attitudes and outlooks. An example is the stress given to foreign news in various local newspapers. In the East, where people look out across the Atlantic Ocean, papers tend to show greatest concern with what is happening in Europe, North Africa and western Asia. In the towns and cities that ring the Gulf of Mexico, the press tends to be more interested in Latin America. In California, bordering the Pacific Ocean, news editors give more attention to events in East Asia and Australia.

To explain the nature of regionalism more fully, it is necessary to take a closer look at each of these areas and the people who live there.

NEW ENGLAND

This hilly region is the smallest in area of all those listed above. It has not been blessed by large expanses of rich farmland or by a climate mild enough to be an attraction in itself. Yet New England can lay historic claim to having played a dominant role in the development of modern America. From the 17th century into the 19th century, New England was the nation's preeminent region with regard to economics and culture.

The earliest European settlers of New England were English Protestants of firm and settled doctrine. Many of them came in search of religious liberty, arriving in large numbers between 1630 and 1830. These immigrants shared a common language, religion and social organization. Among other things, they gave the region its most famous political form, the town meeting (an outgrowth of the meetings of church elders). In these meetings, most of a community's citizens gathered in the town hall to discuss and decide on the local issues of the day. Only men of property could cast a vote. Even so, town meetings allowed New Englanders a kind of participation in government that was not enjoyed by people of other regions before 1790. Town meetings remain a feature of many New England communities today.

From the first, New Englanders found it difficult to farm land in large lots, as was possible in the South. By 1750, many settlers had turned to other pursuits. The mainstays of the region became shipbuilding, fishing and trade. By the mid-19th century, New England possessed the largest merchant marine in the world. In their business dealings, New Englanders became known for certain traits, and are still thought of as being shrewd, thrifty, hardworking and inventive.

These traits were tested in the first half of the 19th century when New England became the center of America's Industrial Revolution. All across Massachusetts, Connecticut and Rhode Island, new factories appeared. These factories produced clothing, rifles, clocks and many other goods. Most of the money to run these industries came from the city of Boston, then the financial heart of the nation.

One famous writer even referred to Boston as "the hub of the universe." And, in fact, the cultural life of the region was very strong. Nathaniel Hawthorne wrote novels such as The Scarlet Letter exploring the themes of sin and guilt. Ralph Waldo Emerson and Henry David Thoreau wrote essays on the importance of following "one's own star." Older colleges and universities blossomed, while newer ones sprang up. New England's oldest schools of higher learning, such as Harvard University (Massachusetts), Yale University (Connecticut), Brown University (Rhode Island) and Dartmouth College (New Hampshire), were originally religious in their purpose and orientation, but gradually became more secular.

During this period, New England was also a source of pioneers for the westward movement. New Englanders transplanted themselves and many of their ideas to Ohio and the northern Midwest, to the Pacific Northwest and finally all the way to Hawaii. As some of the older stock of New England traveled onward, a newer stock gradually began to take its place. Immigrants from Ireland, Italy and Eastern Europe arrived in large numbers in the cities of the southern part of the region. Immigrants from French Canada moved into the mill towns of New Hampshire and Maine.

Despite a changing population, much of the older spirit of New England still survives today. It can be seen in the simple, woodframe houses and white church steeples that are features of many small towns. It can be heard in the horn blasts from fishing boats as they leave their harbors on icy, winter mornings. Living may be easier in some other regions, but most New Englanders envy none of them. "However mean your life is, meet it and live it," wrote Henry David Thoreau; "do not shun it and call it hard names."

Thoreau's advice has a new meaning these days. Many industries have left the region and moved to places where goods can be made more cheaply. Clothing mills, shoe plants, clock factories and other businesses have shut their doors for the last time. In more than a few factory towns, skilled workers have been left without jobs. Yet there are also signs of hope for a brighter future. One of them is the growth of newer industries such as electronics. The electronics industry produces radios, television sets, computers and similar items.

Whatever the future brings, there is not much doubt that the region will face it with pride. True New Englanders do not think of their hills and valleys merely as home but also as a center of civilization. A woman from Boston was once asked why she rarely traveled. "Why should I travel," she replied, "when I'm already there?"

THE MIDDLE ATLANTIC REGION

If New England supplied the spirit of invention, the Middle Atlantic region provided 19th-century America with its muscle. The largest states of the region, New York and Pennsylvania, became major centers of heavy industry. Here were most of the factories that produced iron, glass and steel. Here, too, were a number of the nation's greatest cities.

The Middle Atlantic region had been settled from the first by a much wider range of people than New England. The Dutch made their homes in the woodlands along the lower Hudson River in what is now New York. Swedes established tiny communities in present-day Delaware. English Catholics founded Maryland and an English Protestant sect, the Quakers, settled Pennsylvania. In time, the Dutch and Swedish settlements all fell under English control. Yet the Middle Atlantic region remained an important early gateway to America for people from many parts of the world.

Early settlers of the region were mostly farmers and traders. The traders dealt mainly in furs brought to coastal towns by trappers from inland areas of New York and Pennsylvania. Many of the farmers of New York, northern Pennsylvania and northern New Jersey were New Englanders. These people had moved south and west in search of better land, bringing their way of life with them. Another large group of farmers in Pennsylvania came from Germany. These people included the Mennonites, members of a Protestant sect that believed in living simply.

In the early years, the Middle Atlantic region was often used as a bridge between New England and the South. Philadelphia, Pennsylvania, a mid-point between the northern and southern colonies, became the home of the Continental Congress, the group that led the fight for independence. The same city was the birthplace of the Declaration of Independence in 1776 and the United States Constitution in 1787.

At about the same time, some eastern Pennsylvania towns first tapped the iron deposits around Philadelphia. Steam soon replaced water as a source of power, creating a greater need for iron. Heavy industries sprang up throughout the region because of nearby natural resources. Several mighty rivers, such as the Hudson and the Delaware, were transformed into vital shipping lanes. Cities along these waterways—New York on the Hudson, Philadelphia on the Delaware, Baltimore on Chesapeake Bay—expanded into major urban areas.

Industries needed workers and many of them came from overseas. Late in the 19th century, the flow of immigrants to America swelled to a steady stream. In the words of the region's most beloved poet, Walt Whitman, the United States became "not merely a nation but a teeming nation of nations." New York City was the port of entry for most newcomers. In the 1890s and early 1900s, millions of them sailed past the Statue of Liberty in New York harbor on the way to a fresh start in the United States.

Today New York ranks as the nation's largest city, its hub of finance and a cultural center for the United States and the world. It still bears traces of its Dutch past in the names of neighborhoods such as Harlem. Yet very few faces on the city's streets are Dutch faces. New York has the largest Jewish population of any city in the world. About three out of 10 of the faces one sees are likely to be those of black Americans, many of whose families moved to the city long ago from the South. Another three out of 10 New Yorkers come from overseas, nowadays from a mixture of countries that include Jamaica and South Korea, Haiti and Vietnam.

Elsewhere in the region, the pattern of settlement is less varied. Black Americans are an important force in all the region's cities. But families of Italian and Eastern European descent are more apparent in urban areas outside New York City.

Recently the region's heavy industries have fallen on hard times. Like the factories of New England, these industries have found it hard to compete with cheaper goods made elsewhere. Even so, the Middle Atlantic region has managed to prosper. It has done so partly by building new industries such as drug manufacturing and communications.

THE SOUTH

If all regions of the United States differ from one another, the South could be said to differ most. At several times in the nation's history, in fact, the region has shown a pride in its differences that has approached defiance and even blossomed into southern nationalism. The South was devastated socially and economically in the mid-19th century by the American Civil War. Nevertheless, it has remained distinct, and it played a major role in forming the character of America from before the War of Independence to the Civil War.

Perhaps the most basic difference between the South and other regions is geographic. Southerners generally enjoy more days free of frost than northerners do. The South also has more rainfall than the West. A southerner once described his region as a land of yellow sunlight, clouded horizons and steady haze. He thought the climate an inspiration for the southern spirit of romance.

The first Europeans to settle this sultry region were, as in New England, mostly English Protestants. These were Anglican rather than Calvinist, however, and few of them came to America in search of religious freedom. Most sought the opportunity to farm the land and live in reasonable comfort. Their early way of life resembled that of English farmers, whom they often imitated, in the days before the Industrial Revolution. Whereas New England prided itself on its distinctness from England, the South emulated it.

In coastal areas some settlers grew wealthy by raising and selling crops such as tobacco and cotton. In time some of them established large farms, called plantations, which required the work of many laborers. To supply this need, plantation owners came to rely on slaves shipped by the Spanish, Portuguese and English from Africa. Slavery is unjust. The fact remains, however, that it became a part of southern life in the United States, as it did throughout Central and South America. Nevertheless, the great majority of southern agriculture was carried out on single family farms, just as it was in the North, and not on large plantations.

The South played a major role in the American Revolution of the 1770s. Soon afterward, it provided the young United States with four of its first presidents, including George Washington. After about 1800, however, the apparent interests of the manufacturing North and the more agrarian South began to diverge in obvious ways. The North became more and more industrial, while the South was wedded to the land.

In the cotton fields and slave quarters of the region, black Americans created a new folk music, Negro spirituals. These songs were religious in nature and some bore similarities to a later form of black American music, jazz.

As the century wore on, slavery became a steadily more serious problem for the South. "Nothing is more certainly written in the book of fate," asserted Virginia's Thomas Jefferson, than that black people "are to be free." As the United States expanded, Jefferson's words came to seem an increasingly accurate forecast. Nonetheless, many southern leaders defended the slave system; to them, an attack on slavery seemed an unwarranted attack on the southern way of life.

The issue led to a national political crisis in 1860. Eleven southern states from Virginia to Texas left the federal union to form a nation of their own. The result was the most terrible war in the history of the United States, the Civil War (1861-1865). With all its largest and most important cities in ashes, the South finally surrendered. It was then forced to accept many changes during the period of Reconstruction (rebuilding), which lasted officially until 1877. Many of the subsequent political alignments in the United States stem from the passions and perceptions of this period.

The leaders of Reconstruction were members of the Republican party in the national government. They not only ended slavery, but planned to put black southerners on an equal footing with whites and to redistribute old plantation lands. White southerners opposed and resented such efforts and the Republicans who supported them. For the next century, white southerners voted for the Democratic party with such fervor that their region became known as the "Solid South."

For a time, black Americans gained a voice in southern government. By the end of the 19th century, though, they faced a new barrier to equality. Southern towns and cities refined and legalized the practice of racial segregation. Blacks attended separate schools from whites, rode in separate railroad cars and even drank at separate water fountains.

Gradual change did come, however, and this time from within. It began in about 1900 as the region turned to manufacturing of many different kinds. By 1914, the South had at least 15,000 factories and the number was increasing, although the population remained largely rural. At about the same time, many black Americans began moving from southern farms to the cities of the North.

The pace of change quickened throughout the first half of this century. Coastal sections of Florida and Georgia became vacation centers for Americans from other regions. In cities such as Atlanta, Georgia, and Memphis, Tennessee, the populations soared. For decades some southern leaders had been speaking of a "New South." Now, it seemed, a "New South" was coming into being.

The greatest change of all took place after the return of the veterans of World War II. In the 1950s and 1960s, after years of black protest, Supreme Court rulings and the passage of sweeping civil rights legislation, the obvious forms of segregation came to an end. For the first time since Reconstruction, blacks gained a greater voice in local government throughout the South. Although their struggle for equality had not ended, it was finally having an effect.

All these changes produced many tensions among southerners. In the period between the first and second World Wars, a southern literary movement arose which gave the nation some of its greatest writers of this century. Novelists such as Thomas Wolfe, Robert Penn Warren, Carson McCullers and William Faulkner spun stories of southern pride and displacement. Playwrights such as Tennessee Williams built dramas around the same themes. Why this literary outpouring? Georgia's Flannery O'Connor, a major novelist, once explained it this way: "When a southerner wants to make a point, he tells a story; it's actually his way of reasoning and dealing with experience."

Today sleek, new, high-rise buildings crowd the skylines of cities such as Atlanta, Georgia and Little Rock, Arkansas. Late model cars cover the parking lots of iron mills in Birmingham, Alabama, and oil refineries in Houston, Texas. Along the Atlantic and Gulf coasts of Florida, builders put up new apartments for vacationers from almost everywhere. The South is booming as never before.

THE MIDWEST

For the first 75 years of American history, the area west of the Appalachian mountains was not really a region at all. It was a beacon summoning the nation to its future and, later, measuring how far the United States had come.

In what are now the states of Ohio, Indiana and Illinois, people moving to the frontier found gently rolling countryside. If tickled with a hoe, they said, the land would laugh with the harvest. As they moved west across the Mississippi River, though, the land became flatter and more barren. Here the horizons were so broad that they seemed to swallow travelers in space.

The key to the region was the mighty Mississippi itself. In the early years it acted as a lifeline, moving settlers to new homes and great amounts of grain and other goods to market. In the 1840s, Samuel Clemens spent his boyhood beside the Mississippi. Writing under the name of Mark Twain, he later described the wonders of rafting on the river in his novel, The Adventures of Huckleberry Finn.

As the Midwest developed, it turned into a cultural crossroads. The region attracted not only easterners but also Europeans. A great many Germans found their way to eastern Missouri and areas farther north. Swedes and Norwegians settled in western Wisconsin and many parts of Minnesota. The Irish came and so did Finns, Poles and Ukrainians. As late as 1880, 73 percent of the residents of Wisconsin had parents who had been born in foreign countries.

Gradually, the Midwest became known as a region of small towns, barbed-wire fences to keep in livestock, and huge rectangular fields of wheat and corn. Midwestern farmers raised more than half of the nation's wheat and oats and nearly half of its cattle and dairy cows. A hectare of land in central Illinois could produce twice as much corn as a hectare of fertile soil in Virginia. For these reasons, the region was nicknamed the nation's breadbasket.

Midwesterners are praised as being open, friendly, straightforward and "down-to-earth." Their politics tend to be cautious, though the caution could sometimes be peppered with protest. The region gave birth to the Republican party, formed in the 1850s to oppose the extending of slavery into western lands. The Midwest also played an important role in the Progressive Movement at the turn of this century. Progressives were farmers, merchants and other members of the middle class who generally sought less corrupt, fairer and more efficient government.

Perhaps because of their location, midwesterners lacked the interest in foreign affairs shown by many Americans in the financial and immigration centers of Boston and New York. In the years after World War I, many leaders argued that the nation should stay out of overseas quarrels. This movement, called isolationism, died with Japan's surprise attack on the United States in 1941. Yet the Midwest is still remembered as the region least ready to rally to foreign causes.

Today the hub of the region remains Chicago, Illinois, the nation's third largest city. This major Great Lakes port has long been a connecting point for rail lines and air traffic to far-flung parts of the nation. At the heart of the city stands the world's tallest building, Sears Tower. This skyscraper soars a colossal 1,454 feet (443 meters) into the air.

THE SOUTHWEST

The Southwest differs from the Midwest in three primary ways. First, it is drier. Second, it is emptier. Third, the populations of several of the southwestern states comprise a different ethnic mix.

Rain-laden winds blow across most of the region only in the spring. During that season, the rain may be so abundant that rivers rise over their banks. In summer and autumn, however, little rain falls in much of Arizona and New Mexico and the western sections of Texas. Only in the river valleys of those areas can any intensive farming take place.

Partly because this region is drier, it is much less densely populated than the Midwest. Outside the cities, the region is a land of wide open spaces. One can travel for miles in some areas without seeing signs of human life.

Parts of the Southwest once belonged to Mexico. The United States gained this land following a war with its southern neighbor between 1846 and 1848. Today three southwestern states lie along the Mexican border—Texas, New Mexico and Arizona. All three have larger Spanish-speaking populations than any other area except southern California.

THE WEST

Americans have long regarded the West as a "last frontier." Yet California has a history of European settlement much older than that of most midwestern states. Spanish priests and soldiers first set up missions along California's coast a few years before the start of the American Revolution. In the 19th century, California and Oregon entered the Union ahead of many states to the east.

In the West, scenic beauty exists on a grand scale. All eleven states are partly mountainous, and in Washington, Oregon and northern California the mountains present some startling contrasts. To the west of the mountains, winds off the Pacific Ocean carry enough moisture to keep the land well watered. To the east, however, the land is very dry. Parts of western Washington receive 20 times the amount of rainfall received in eastern Washington. The wet climate near the coast supports great forests of trees such as redwoods and stately Douglas firs.

In many areas the population is sparse. Colorado, Wyoming, Montana, Utah and Idaho—the Rocky Mountain States—occupy about 15 percent of the nation's total land area. Yet these states, so filled with scenic wonders, have only about three percent of the nation's total population.

Except for Hawaii, the westernmost states have all been settled primarily by people from other parts of the nation. Thus, the region has an interesting mix of ethnic groups. In southern California—also considered part of the Southwest—people of Mexican descent play a role in nearly every part of the economy. In the valleys north of San Francisco, Italian families loom large in the growing of grapes and the bottling and selling of California wine. Americans of Japanese descent traditionally managed truck farms in northern California and Oregon, and Chinese Americans were once mostly known as farmers, laborers and the owners of laundries and restaurants. In recent years large numbers of the younger generation have achieved positions of prominence in medicine, law, engineering, scientific research, music and many other fields. In the 1980s, large numbers of people from Korea and Southeast Asia settled in California, mainly around Los Angeles.

Hawaii is the only state of the Union where Asian-Americans outnumber residents of European stock. Among Asian-Americans, those of Japanese descent are the largest group. People of Chinese and Filipino ancestry are also well represented.

New Englanders have left their mark on much of the West. Many northwesterners prize "Yankee" virtues such as shrewdness and thrift. In much of California, however, life is more flamboyant. Some observers trace this quality to the gambling instincts of the Gold Rush of 1848, which first brought many Americans west in search of gold discovered there. Others say that the Gold Rush did not last long enough to leave a lasting mark on the culture of the state. These observers claim that the California experience is mostly the result of a sunny climate and the self-confidence that comes of success.

The success is not much debated. In 1860 El Pueblo de Nuestra Senora la Reina de los Angeles de Porciuncula was a hodgepodge of adobe huts on the edge of a sandy wilderness. A century and a quarter later, Los Angeles had become the second most populous city in the nation. To millions of people, the city means Hollywood, the center of the film industry. Yet Los Angeles also produces aircraft parts, electronic equipment and other products of today's technology.

Fueled by growth in Los Angeles and smaller cities such as San Jose, California now has a larger population than any other state. Still, the richness of America is not measured exclusively in numbers but in the diversity and resourcefulness of its people from all the various regions.

URBAN CULTURE: THE AMERICAN CITY

By Bruce Oatman (Fordham University)

Three hundred years ago a handful of town dwellers lived in a few scattered locations along the Atlantic coastline of what is now the United States. In the early years of this century, over 50 percent of the population of the United States still lived in rural areas. Today, however, the United States is a nation of urban dwellers. Almost 80 percent of the national population lives either within the formal boundaries of cities or in the huge suburban rings (clusters of communities socially and economically connected to the cities) which surround them. More than two hundred of these metropolitan regions now make up the everyday setting of American life.

The influence of cities in modern America is extensive. Thanks in part to urban-based national news media, in a country in which only two people in 100 now live on farms, the power of cities to influence life far beyond their borders is very great. From urban centers, through suburban communities, into the smallest and most distant rural villages flow many social and economic values, ways of making a living, clothing styles and manners, and a modern technological spirit. As a result, many of the once sharp distinctions that could be made between rural and urban ways of life no longer exist. The geography may differ between city and country, and social and political attitudes may still vary, but the forms of living and working are remarkably similar.

How did this come about and what does it mean for the quality of American life today?

EARLY YEARS: 1625-1812

The original North American colonies were regarded by the mother countries of Britain, Holland and France primarily as sources of raw material from field, forest, ocean and mine, and as potential markets for finished goods manufactured in Europe. While this approach required rural and wilderness settlement, it was necessary, at the same time, to establish small towns in the colonies as administrative centers to control the emerging trans-Atlantic trade. These towns were gathering places for artisans and shopkeepers who served the agricultural hinterlands. In the large and frightening wilderness, the towns provided security and also served as social centers.

Eventually, with increasing numbers of European settlers arriving in the New World, coastal cities—the largest of which were Boston, New York, Philadelphia and Charleston, South Carolina—came into being, and their economic and social influence stretched into extensive rural backlands. At the same time, as port cities, they rapidly grew to be flourishing centers of international commerce, trading with Europe and the Caribbean.

By 1660, Boston contained about 3,000 people. One of its inhabitants described it as a "...metropolis...[with] two handsome churches, a market place and a statehouse. The town is full of good shops well furnished with all kinds of merchandise—artisans and tradesmen of all sorts."

New York (then called New Amsterdam) was founded in 1625 by the Dutch West India Company, which exported furs, timber and wheat. Captured by the British in 1664, New Amsterdam was renamed New York. Because of its favorable geography, it soon became an important trading port. By 1775, its population was about 25,000.

William Penn, who planned the city of Philadelphia, believed that a well-ordered city was necessary to economic growth and moral health. He wanted to build a "green country town" which would not be sharply cut off from the surrounding forest and farmlands. Inside the town were markets, residential housing, small factories, churches, public buildings, recreational areas and parks. Farming areas would be on the periphery but close enough to be accessible to the city dwellers.

Penn's ideas were widely copied in his day. An echo of them can be heard in contemporary planned communities which preserve parks and open spaces within a town's boundaries.

Most American towns of this early period featured open spaces alternating with built-up areas. Much free land was available, and, as fewer than 10 percent of the people lived in the towns, few opposed their growth. By the middle of the 18th century, however, many people opposed this growth because the towns had begun to seem too large and crowded. In 1753, the newspapers printed a debate which seems very similar to the arguments of today. The positive view of cities was expressed by a writer who argued that the economic specialization of cities led to increased wealth for both city and farm dwellers:

"...different handicrafts ought to be done by different persons, that (such) work might be done to perfection, which would be a considerable profit to the country...and to those who are proficient in the handicrafts. [Specialization] would cause an extraordinary market for provisions of all kinds...."

The contrary view of cities was expressed in an argument dating back to antiquity, and reflecting a strong belief in the virtues of an agrarian life in the United States, which portrays cities as places that undermine self-sufficiency and encourage meaningless social activity and moral decay:

"...Every town not employed in useful manufacture...is a dead weight upon the public.... When families collect themselves into townships they will always endeavor to support themselves by barter and exchange which can by no means augment the riches of the public.... Another consequence of the clustering into towns is luxury—a great and mighty evil, carrying all into...inevitable ruin."

By 1750, the larger cities were dominated by a wide range of commercial and craft activities. A corresponding range of social groups developed: from an economically and socially dominant merchant and administrative class to a middle class of artisans, shopkeepers, farmers and smaller traders. On the edge of society, groups of the poor and dispossessed scrambled for an economic foothold, and were sometimes dependent upon charity.

Culturally, the colonies were outposts of Britain. The colonial cities were visited by touring actors and musicians and enriched by the development of schools, libraries and lecture halls. All of this increased the differences between city and country life and contributed to the importance of the American city as an initiator of social change. In terms of administration, the development of towns created a dense web of social, economic and governmental structures and regulations. However, the forms of municipal government varied greatly from place to place. In New England, the town meeting prevailed. This was a gathering of all citizens to discuss common concerns, and was an outgrowth of Protestant leader John Calvin's ideas about providing for representative government in a religious community. This form of community government continues today in the small towns of the Northeast. Councilmen were first elected to govern New York City in 1684. In contrast, the city of Charles Town (now called Charleston), in South Carolina, had no local representatives, but was governed by the State Assembly.

The War of Independence (1775-1783) was largely brought about by the grievances of city dwellers. Strict limitations imposed by the British on manufacture and trade, and the British Parliament's repeated levying of taxes without prior consultation with the colonists were widely perceived as unjust and punitive measures. Furthermore, one hundred years of inter-city trade had forged a sense of nationhood. The famous Boston Tea Party, during which colonists destroyed tea imported on British ships rather than pay taxes on it, expressed the colonists' frustration and their growing sense of national unity.

The war secured political independence for the United States, but economically, the new nation was still dependent upon the trading patterns that had developed over a century. The country supplied raw material and imported finished goods. This situation lasted until the War of 1812 (with England), during which great suffering occurred as a result of the British blockade of American ports. Even those Americans who had earlier resisted the development of a larger manufacturing sector and the growth of cities now changed their minds.

Thomas Jefferson, president of the United States from 1801 to 1809, had written in 1800 that, "I view great cities as pestilential to the morals, the health and the liberties of man." However, after 1812, he wrote, "We must now place the manufacturer by the side of the agriculturist." Economic growth and independence were necessary to guarantee political liberty, however undesirable the growth of manufacturing cities might be.

Some of Jefferson's contemporaries had even earlier chosen to view the cities from the positive rather than the negative perspective and to turn their practical intelligence to the improvement of city life. Benjamin Franklin of Philadelphia was one of these:

"I began now to turn my thoughts to public affairs, beginning with small matters—our city had the disgrace of suffering its streets to remain long unpaved so that it was difficult to cross them. By talking and writing on the subject, I was at length instrumental in getting the street paved with stones—all the inhabitants of the city were delighted."

MIDDLE PERIOD: 1812-1918

At the time of the War of 1812, less than one in 10 Americans lived in cities. By the end of World War I (1914-1918), one in two did. In 1812, American cities had experienced little of the overcrowding and decay of European cities of that time. Within a few decades, however, the very rapid growth of urban population gave American cities all of the unpleasant qualities long associated with older cities everywhere.

This growth can be traced to four causes: rapid industrialization, with its ever-increasing demand for workers; the relentless construction of roads and railways, making easier the movement of goods and people from, to and through the urban manufacturing centers; a steady stream—at times a flood—of immigrants fleeing war, persecution and poverty in their countries of origin and concentrating in America's major ports of entry; and farm workers displaced by machinery or discouraged by low wages, making their way to a supposed brighter future in the cities.

Boston's population increased from 43,000 in 1820 to 250,000 in 1870. New York's population went from 124,000 in 1820 to 942,000 in 1870; Philadelphia's population rose from 64,000 to 674,000 in the same period, and Chicago's population climbed from 0 to 299,000. During the same period, the ratio of urban dwellers in the much expanded national population rose from eight percent to 25 percent. This was also the period of westward migration, which settled the territory from Chicago to California. By the end of the 19th century, the United States was dotted with large and small cities. These were bound together in a continent-wide web of social and economic relations made possible by the building of road and rail systems. From the 1820s to the 1880s, changes occurred so rapidly that city governments struggled to cope with them.
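The scale of this half-century of growth is easier to grasp as multiples. A quick calculation, using only the city populations quoted above, makes the comparison explicit:

```python
# Population figures for 1820 and 1870, as quoted in the text.
populations = {
    "Boston": (43_000, 250_000),
    "New York": (124_000, 942_000),
    "Philadelphia": (64_000, 674_000),
}

# How many times over did each city grow in those fifty years?
for city, (pop_1820, pop_1870) in populations.items():
    growth = pop_1870 / pop_1820
    print(f"{city} grew {growth:.1f}-fold between 1820 and 1870")
```

Boston thus grew roughly six-fold, New York nearly eight-fold, and Philadelphia more than ten-fold in a single fifty-year span.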

By 1830, New York had gained a reputation, which it still holds, as a place of great motion and constant activity. The city was considered to be the showcase of American modernism. At the same time, New York experienced archaic sanitation, typhoid and dysentery epidemics, contaminated water, severe poverty, insufficient housing and schools, and an overwhelming influx of immigrants. Juvenile crime was so widespread that in 1849 New York's police chief devoted his entire annual report to the subject. Garbage filled the streets and, until the 1860s, bands of pigs were typically let loose to roam as scavengers in all the larger cities.

The immigrants came from practically every country and area of the world, though the majority of the earlier wave (1830-1870) were from northern and western Europe and most of the later wave (1880-1920) came from eastern and southern Europe. These immigrants crowded into the cities, often living together in distinct communities, or ethnic neighborhoods demarcated by linguistic, religious and cultural differences. Many of these enclaves—less well-defined and less separated from the surrounding culture—still exist today.

Most city governments were characterized by a spirit of laissez-faire (let people do as they please). City government leaders saw their role as one of maintaining civil order, not as engaging in city planning. Generally, as compared with many other industrial countries, this attitude toward planning is still the rule. The American emphasis on individual freedom argues against central regulation and management.

Between 1880 and 1920, many urban problems found at least temporary solutions. Movement to bring about social, economic and political reform arose in all the large cities. Collectively, these reform activities came to be known as the Progressive Movement. The same creative impulses that were transforming industrial production were turned to the social problems of the new cities.

Public health programs were started and groups were founded to offer help to the poor. Public school systems were enlarged and strict qualification standards for teachers were set. Government reform was brought about partially by a system of promotion for public employees based upon merit rather than upon political favoritism.

Housing quality laws were passed. Agencies were created to teach language and job skills to millions of immigrants. In addition, there were many technical innovations that improved the quality of city life. These included the electric light and the electrification of machinery, water and sewage systems, the trolley car and subway, and the elevator and skyscraper.

By the 1920s, it seemed that the American city was finally gaining the ability to solve its many problems.

THE EMERGING METROPOLIS

By about 1918, half of the United States population lived in cities and metropolitan areas; by 1990, almost 80 percent lived in such places. Strong economic and social currents encourage the continued concentration of the urban population which otherwise might disperse into more sparsely settled areas. The creation of large metropolitan markets for goods, services and jobs acts as a magnet for further growth. In addition, as farming has become more mechanized over the last half century, increasing numbers of unneeded farm workers have followed those who earlier sought better lives in urban areas. There are many activities which can only thrive in central locations with large populations. These include manufacturing, business and government administration, large-scale cultural and retail activities, and a whole host of service occupations.

Despite this, many central city areas have experienced a decrease in population since the mid-1960s, as suburbs grew. This loss is not the result of people's returning to live on farms or in villages. It is a product of Americans' increasing prosperity and of their desire to own a piece of land.

The growth of American cities between 1860 and 1960 has always been viewed in the United States with feelings of both pride and dismay. The city is a product of the machine age; it is a creation of the industrialization which produced much of the country's wealth and strength. Much that is best and most innovative in education, culture, and political and social thought results from the intellectual exchange and excitement which city life makes possible. On the other hand, poverty, overcrowding, social conflict and criminal violence are also much more common in cities than in rural areas. Demands for social services which go beyond the ability of the cities to provide have, over time, created problems which make living in the cities less attractive.

The response of many city dwellers has been to relocate from the city center to less heavily populated areas at the edge of the city. These areas, known as "suburbs," have combined elements of both urban and rural living, and have blurred the dividing line between city and countryside. Many business and manufacturing firms have moved to these suburbs, attracted by lower taxes, low land prices, and the growing labor pool and retail markets there.

Older distinctions between city and suburb, central business district and suburban shopping area, and even city slum and single home residential district are not very useful today. This is because these places are no longer relatively independent. The suburban rings around all central cities must be regarded as part of the urban structure. Central cities and their suburbs together form metropolitan regions and must be considered economic and social wholes. Highways have been constructed to make travel from city to suburb easier, and the provision of social services has been extended, so that living in a suburb is nearly as convenient as living in a city, and yet the problems of overcrowding and crime are much less serious.

Meeting the needs of these expanding outer rings of metropolitan areas requires more complex systems of urban government. A variety of urban governmental forms, often distinguished by whether they are headed by an elected individual (mayor), a hired manager or a council of elected officials, is being tried to determine which is most effective at meeting modern urban/suburban needs.

Also as a result of the expansion of these suburban rings, many metropolitan areas have grown so large in recent decades that they have overlapped, and have begun to merge. This new urban network has been called "megalopolis" by French geographer Jean Gottman. He identified the largest of these as occupying an area on the Atlantic seaboard from north of Boston, through New York, south to Washington, D.C.—"Bosnywash." This megalopolis contains more than one-sixth of the entire United States population. It is bound together by many economic and social relationships. It is estimated that by the year 2000, 80 percent of Americans will live in 28 or so of these megalopolises.

As many of America's urban dwellers have moved to the suburban rings in search of greater privacy, cleaner air and less social conflict, a pattern of urban living has emerged which is in sharp contrast to that in cities in other industrialized countries. Elsewhere in the world, because of the advantages which city life can offer, city centers—or inner cities—are regarded as the most desirable living space and are occupied by the most affluent groups. In the United States, many of the wealthy and the middle class have moved to the periphery. As a result, cities have lost tax money that these groups paid to provide needed services. The lessening of services further encourages those who can afford to move outside the city limits to do so, and the city centers are perceived as among the least desirable areas to live.

This does not mean that those areas are unoccupied. It means that, because of the low rents, newly arrived groups, the members of which are the least educated, least skilled, poorest and least adapted to urban life, move first into the most undesirable living space near the center of the city. Who are these groups?

An important source of urban population growth, especially since 1945, has been the migration to cities of black Americans and Hispanics. Many of these newcomers had been farm workers whose livelihood was lost through the mechanization of farms. They followed the trail of earlier migrants to the city, expecting to find semiskilled factory and service jobs.

Unfortunately, their migration occurred when economic changes were causing a loss of such jobs, many to other countries. The consequence is that all the larger American cities have experienced an increase of relatively unskilled, poor people for whom jobs are not readily available. However, as these people gain skills, get jobs and become more affluent, they, in turn, move outward and their places are taken by a less affluent and more rootless population.

These are only general tendencies and there are many exceptions. For example, during the past two decades cities such as New York, Boston, Baltimore, Washington, D.C. and San Francisco have accomplished major "urban renewal" projects, rebuilding and renovating huge tracts of the central city area, and thus once again attracting businesses and more affluent groups to settle there. In many cities young middle class business and professional families have returned to deteriorating neighborhoods and restored the economic and cultural vitality of the areas. Though it probably represents only a minority trend, this is a hopeful sign for the American cities.

It is only to be expected that the enormous century-long growth of cities should have left many unsolved problems. Most of these problems were not foreseen. Probably they could not have been. Many are the consequences of successes of one sort or another. The noise and congestion of automobile traffic, for example, is a result of almost universal car ownership. Cars fill many city streets which were intended for horse and foot traffic.

The federal government has been deeply involved in the fate of the cities since the economic depression of the 1930s. Before that, the role of Washington had simply been to coordinate local efforts. In recent years, the federal government has assisted city governments in coping with the increased costs of services, the loss of tax revenues and the poverty of many residents. In general, ups and downs of the national economy can have a profound effect on city life, and the cities need help to lessen the impact of those ups and downs. In 1965, a Department of Housing and Urban Development was created in the federal government to manage programs concerned with community development and housing needs.

City administrators have tried in recent years to strengthen their abilities to organize the delivery of services. Mayors in many cities have been given wider powers to cope with the magnitude of the problems with which they are faced. One reform effort is the attempt to create metropolitan-wide governments.

Mass production and distribution of necessary goods are best accomplished when many people live together in a community. In this sense, the city is a product of industrialization and trade—the foundations of the modern American economy. Americans live in cities from economic necessity and a desire to enjoy the social and cultural advantages cities offer. At the same time they yearn to own a separate piece of land, to be closer to nature and to be free of the limitations imposed by living too close to others. This dichotomy has been made more difficult by America's extremely rapid change from a rural to an urban society and by the multinational nature of the American society, in which members of many different ethnic groups find themselves living very close to one another—and trying to tolerate and accept one another's different ways of living—in the huge cities of the United States.

The social problems that are products of the rapid growth of urban populations will be alleviated as more and more creative approaches to urban living are found. Urban planning and renewal with a central consideration for human well-being—an unaffordable luxury in the early stages of industrialization—have become the standard in America's post-industrial phase. The outlook for America's cities and for the quality of life for the nearly 80 percent of the American people who live in urban settings is hopeful.

ETHNIC GROUPS AND MINORITIES

By Bruce Oatman (Fordham University)

Erica Ward is a sixteen-year-old high school student who lives in a small town in New York State. For a recent school history project she was asked to count the different ethnic groups from which she is descended. After discussing this question with older relatives, she put together this list:

  • Nationality groups—English, Dutch, German, Irish and French.

  • Racial groups—white, black and Native American.

  • Religious groups—Catholic Christian and at least five types of Protestant Christian: Baptist, Mormon, Methodist, Congregationalist and Unitarian. In addition, some of her cousins are Jewish.

Erica's earliest known ancestor to migrate to the New World was Dutch, and landed in New York in 1678. The most recent migrant was a German who came to Philadelphia in about 1848. Of course her Creek Indian ancestors have been in America for thousands of years.

According to the 1990 census, about one-quarter of Americans trace their dominant ancestry to Great Britain. Half are descended from people from other European nations. The remainder are descended from Native Americans, Africans, Hispanics and Asians.

For 300 years, the coming of different groups to the United States has involved their struggles to make a living and to be accepted as equal partners in American life. Many immigrant groups have moved from a position of disdained outsider to one of full participation in social and economic life; some other groups have yet to complete this journey.

NATION OF DIVERSITY

The United States is a country of many ethnic groups. An ethnic group is made up of people who share one or more characteristics which make them different from other groups. They may share specific racial or physical traits, speak their own language or practice a distinctive religion. They are usually bound to one another by common traditions and values, and by their own folklore and music. Some of their activities may be determined by unique institutions, such as a complex family structure or the social practices within their communities. Members of an ethnic group tend to see themselves—and to be seen by outsiders—as separate from other people.

The Harvard Encyclopedia of American Ethnic Groups lists 106 major groups in the United States today, including Native Americans, Albanians, Afro-Americans, Arabs, Burmese, Chinese, Eskimos, Filipinos, Greeks, Irish, Italians, Jews, Mexicans, Puerto Ricans and Swiss. There are really more. For example, there are more than 170 different Native American tribes. For the sake of simplicity, the encyclopedia treats them as one. In the same way, Syrians, Jordanians, Egyptians and Palestinians are all counted as Arabs.

Most members of ethnic groups long established in the United States have lost much of the distinctiveness of their culture. Third generation Germans, for example, may only speak English and may think of themselves as "plain" Americans. Third generation Chinese, however, often retain their language and many cultural and family traditions. They will usually define themselves as Chinese-Americans.

Members of most ethnic groups are full participants in the broad tapestry of American life, even if they keep alive many of their old traditions. The Irish, Danes, Germans, Italians, Poles, Jews, Mormons and Catholics, for example, have moved into almost all social, economic and political sectors.

Some ethnic groups, however, suffer disadvantages which continue to keep them from freely participating in some areas of American professional and cultural life. Poverty and all the deprivation that goes with it often make it more difficult for black Americans and Puerto Ricans to acquire the social and educational skills needed to enter more desirable and more highly paid occupations. Racial prejudice and discrimination against blacks, Chinese and Native Americans has often meant that many members of those groups have been forced to live and work in narrow sectors of American life. Recent Hispanic immigrants, such as Mexicans and Puerto Ricans, also have encountered discrimination based on their ethnicity.

Those ethnic groups which suffer systematic economic or social disadvantages are called minority groups. About one of every five Americans is a member of such a group.

In the past, many minority groups overcame the barriers that confronted them. The Irish, the Italians and the Germans, the Catholics and the Jews all faced hostility and discrimination which severely restricted their opportunities for decades. In time they largely overcame these barriers and became fully integrated into national life. There are many signs that today's minorities are following the same path. For several decades, it has been an official aim of public policy to encourage such an outcome.

COLONIAL BEGINNINGS

Among the major European powers that attempted to settle North America, Britain was the most successful. Its colonies in Virginia (1607) and Massachusetts (1620) laid the foundation for the experiences of ethnic groups in the following centuries.

The English language, as well as English laws, and social, economic and religious customs were successfully transplanted to the New World. All of the groups which followed these earliest colonists were measured by their adherence to English standards. This meant that later immigrants had to undergo a period of adjustment during which they were treated as outsiders. During the colonial period Germans, Scotch-Irish, French Protestants and others had to undergo these trials.

The colonists' relations with the Native Americans were full of conflict from the beginning. This was because the two communities did not share the same social and economic values. When the colonists found they could not turn the "Indians" into trading partners, they perceived them only as an obstacle to a more rapid exploitation of the land by Europeans. As many thousands of immigrants were brought to the colonies in the first few decades, they entered an intense competition with the Native Americans for land. By the 1670s, the pattern had been set: Most territorial or economic conflicts between whites and Native Americans were settled by force of arms. That practice continued for 200 years.

Slaves had been imported from Africa into Virginia by Dutch and Spanish traders as early as 1619. Later in the same century, immigration from England slowed, while the need for cheap labor increased. This led to an enormous increase in the slave trade after 1662. Most of these Africans were imported to work on large agricultural plantations, but they soon were found in a wide range of craft and service occupations. In 1671, one in 20 residents in Virginia was black; by 1770, four in 10 were.

One of the longest-lasting aspects of the subjugation of blacks and Indians was the common European view of them as uncivilizable, naturally cruel and simple-minded peoples. In one form or another these racist ideas must still be combated today.

INDEPENDENCE TO CIVIL WAR

The patterns of the colonial period long endured. Immigration was encouraged when people were needed—to settle the newly annexed lands of the Northwest Territories in the early 1800s and to help build canals and, later, railroads, for example. The new immigrants were usually poor and found themselves on the bottom of the social and economic scale. Over the course of a generation or two, most European immigrants could merge into the larger Anglo-American society and escape the burden of minority status. This was not possible for Afro-American slaves and for Native Americans.

While ethnic and minority groups were struggling with one another for economic security, the new United States had become the most democratic nation on earth. Free competition encouraged people to feel that each person's ideas (and efforts) were worthy to be judged against every other person's ideas.

The recognition that the rights of each citizen depended upon maintaining the rights of all was a central theme in the Declaration of Independence (1776). In that document, each citizen was declared to have natural rights to the security of life, the exercise of social and political liberty, and to the pursuit of the economic goals of his own choosing.

The Declaration also asserted that "all men are created equal." It may seem strange that this idea was emphasized in the presence of slavery and a clear inequality among actual groups. However, the writers were repeating a view which was already a fundamental ideal within the American system. A Massachusetts legal code of 1641 had asserted the right of every person "... whether inhabitant or foreigner to enjoy the same justice and law that is general for the colony."

Ideas have real consequences, even when they only imperfectly describe the world. One consequence of the idea of equality in American history is that all groups have felt free to struggle to raise their economic and political status in relation to other groups.

A great influx of immigrants occurred after 1820. The opening of the territory in the West and the development of industry created new opportunities for millions of people. By 1850, the population numbered 23 million. Only 10 years later it totalled 31 million.

Between 1840 and 1860, Europeans from Ireland, Germany and Great Britain came in great numbers to the United States. A few of these came to escape religious or political persecution, but most sought greater economic opportunity.

Most of these immigrants landed at one of the five major American ports: New York, Boston, Philadelphia, Baltimore and New Orleans. New York was the nation's largest city and led all the others as a center of commerce and industry.

Many immigrants remained in the cities. Others moved inland and to the West. By 1860, immigrants and their children were a majority of the population in New York, Chicago and New Orleans. Jobs were created as fast as boats could bring people to fill them. Except for the port city of New Orleans, the South attracted few settlers, since it provided little industrial development to create jobs.

The Germans were the largest 19th century immigrant group. They settled in a wide range of locations. By the end of the century they made up the largest single foreign element in 26 states. They worked as farmers, craftsmen and professionals, and came from all classes of citizens in their native land.

The Germans attempted to retain their language and their traditional ways of life. They created communities with old-world institutions such as concert and lecture halls, schools and theaters, beer gardens, and social and athletic societies. They were both Protestant and Catholic.

In contrast, the three million Irish who came to these shores in the 1840s and 1850s were almost all poor and had been peasants in Ireland. They were also Catholic. This aroused a great deal of fear and anxiety among native Protestants. Poor Protestant workers felt threatened by the willingness of the Irish to work for low wages.

The Irish were the poorest of the 19th century immigrants. They were crowded into the eastern port and industrial cities, where they formed a readily available unskilled labor market for the growing industrial enterprises.

For decades this combination of poverty, Catholicism and economic rivalry led to the hostile isolation of the Irish. As a result, the Irish suffered the worst discrimination of any immigrants of that era.

The coming of the Civil War (1861-1865) provided the most serious American domestic crisis since the Revolution. This crisis was brought about by a national rivalry between the agricultural slave-owning states of the South and the industrializing states of the North. America's acquisition of vast territories extending to the Pacific Ocean led to a conflict between those who wanted to extend slavery to the new lands and those who did not.

In order to protect their interests, 11 Southern states withdrew from the Union. The war that followed not only abolished slavery but also guaranteed the continued expansion of industry. This led directly to another increase in immigration after the war had ended.

The abolition of slavery created the largest minority group in the nation's history. In a period of only a few years, millions of ex-slaves entered a world where, in principle, they were free to compete with all other groups for jobs and other resources. They were greatly handicapped, however, by inadequate education, an agrarian background and widely held prejudice among the other ethnic groups in the United States.

Even before the war had ended there were riots in many cities, often led by the poor Irish, whose low economic position made them feel most threatened by competition from the waves of black job seekers.

In the midst of the Civil War, Abraham Lincoln was moved to reaffirm the underlying ideals of the American Republic in his famous Gettysburg Address. He spoke of the United States as a nation "... conceived in liberty and dedicated to the proposition that all men are created equal."

STREAM OF NEWCOMERS

The demand for workers created by the industrial growth after the Civil War could not be met by native labor alone. Between 1880 and 1930, more than 25 million people came to the United States. In addition, many freed slaves moved to Northern cities, as did many white rural dwellers.

At the end of the 19th century, immigrants began to come from areas that had been little represented in earlier periods. Mexicans moved northward into the southwestern states. Millions of Italians, Slavs and Eastern Europeans, and hundreds of thousands of Chinese, Greeks, Armenians and other groups made the journey to North America.

New York was still the largest port of entry. By 1930, 75 percent of New Yorkers were foreigners or the children of foreigners. Most were Italians and East European Jews.

Italians were the largest group of new citizens. Like the Irish of 50 years before, they were mostly poor peasants. They found work in the United States in construction and heavy industry and on the railroads.

The second largest group were two million Jews from several countries of Eastern Europe. Most settled in and around New York City. The experience of finding themselves a minority group was not new to the Jews, whose history in Europe had been that of suffering from prejudice and discrimination.

Most arrived in poverty. Jewish men worked as skilled or semiskilled laborers. They found jobs in light industries such as clothing, cigar and toy manufacturing. After two or three generations, many Jewish families had found a solid foothold in the middle class.

The third major group which arrived in the early decades of the century were the Slavs. These included Poles, Czechs, Ukrainians, Russians, Bulgarians and people from what today is Yugoslavia.

After 1910 and continuing until the 1970s, black Southerners took part in a large internal migration to the North. Their reasons for moving were similar to those of so many others: to improve their condition in new surroundings and to escape from unpleasant situations. In this case, the persistence of southern racism and the mechanization of southern agriculture made the uncertainties of the North seem mild by comparison.

The adjustment of the various minority groups varied widely. All groups faced prejudice, not only from more established groups, but also from one another. But until the economic depression of the 1930s, unskilled jobs existed everywhere in the country.

Low wages for long hours in unpleasant, sometimes dangerous, surroundings were the common lot of the immigrants. Reform movements in all the major cities in the early decades of the century led to some improvements in working conditions. There were also improvements in housing, public health and sanitation.

Social workers established community houses to provide recreational and educational services. These supplemented the activities of the social and self-help organizations, including churches, that were established by each ethnic group.

Most of the new groups wished to retain their traditional ways of life as much as possible while coming to terms with the realities of American life. Many of the associations that they formed to aid themselves remain active today, long after the children and grandchildren of their founders have become comfortably situated and regard themselves as "typical" Americans.

The greatest outside influence on the children of the immigrants after the turn of the century was the newly expanded system of public education. The public schools saw their role as one of "Americanizing" those children by providing a door into the larger society. They were very successful in achieving this goal.

Next to the church and school, the ethnic newspaper was the most important educational influence in the communities of newcomers. The papers helped their readers to engage successfully in individual and group struggles in the economic and political arenas.

The urban Irish proved to be the ethnic group most skilled at political action. They created complicated political organizations within the Democratic party and dominated the politics of larger cities for decades. Their political "machines" functioned like huge welfare agencies. They distributed jobs, food, advice and favors of all sorts. It was only when public agencies took over many of these functions and when new ethnic groups moved to the cities that the Irish declined as a political force.

After a generation or two, most ethnic minority groups had considerably improved their economic situations. Their success depended upon their acceptance of dominant American cultural traditions. This meant that there occurred a progressive loosening of strict ethnic ties among most groups. The twin themes of social acceptance and ethnic loss have been reflected in popular literature for more than a century.

NEW ETHNIC CONTOURS

The Second World War (1939-1945) and the 20 years which followed were times of rapid economic expansion in the United States. Tens of millions of American workers were able to move into the middle class. Young Italians, Irish, Jews and others found it easier than ever to get the education and skills that would let them improve their situations. The "minority" classification of these groups became steadily less important. Large numbers of Hungarians who came to the United States in the mid-1950s also moved quickly into the mainstream of American society.

Yet other disadvantaged minorities remain. Black Americans are only now beginning to overcome the effects of 250 years of slavery. The continued movement of blacks to the larger cities since 1945 has coincided with the loss, because of increased technology, of the unskilled jobs that served other groups as the first step up the economic ladder. Since the 1950s, Black Americans have been moving into the mainstream of American life. Though a fairly large black middle class has emerged, many blacks continue to exist on the economic margins.

Millions of poor Mexicans and other Hispanics have entered the country in recent years, along with more than one million Spanish-speaking American citizens from Puerto Rico. Hispanics are now the fastest growing minority group in the United States. Many have found it difficult to move out of marginal positions. One notable exception is the immigrants from Cuba, who have, in a relatively short time, established themselves in business and the professions and gained both affluence and political power.

The situation of Native Americans, many of whom must choose between living on reservations or moving outside them, remains difficult as well.

New waves of immigrants have recently begun to arrive from Korea, the Philippines, Haiti and Southeast Asia. These groups, following the pattern set by earlier waves of immigrants from China and Japan, are establishing themselves in small businesses, working tirelessly, and investing all of their efforts and money to ensure that their children receive the education and learn the skills necessary to build a prosperous and satisfying life. They are only the latest to seek a new life on these shores. They will almost certainly not be the last.

LEGAL AND POLITICAL STRUGGLES

Since the 1940s, the federal government has taken a very active role in providing ways for the poor and disadvantaged to move into the social and economic center of American life.

Special efforts have been made on behalf of blacks, whose heritage of disadvantage has made them unique. President Truman's integration of the armed services in the late 1940s was an important step. Another was the ruling of the United States Supreme Court in the 1954 school desegregation case. The Court ruled that "Separate educational facilities are inherently unequal. People cannot be deprived of the equal protection of the laws guaranteed by the Constitution."

In 1964, President Lyndon Johnson declared a "war on poverty." He was prompted by a wide protest movement whose aim was to secure full civil rights for blacks and other deprived Americans. Johnson declared:

"... our goal—an America in which every citizen shares all the opportunities of his society, in which every man has a chance to advance his welfare to the limit of his capacities. We have come a long way toward this goal. We still have a long way to go."

In the last 20 years, certain people who feel themselves systematically disadvantaged have argued that their situation is similar to that of ethnic minorities. Women have successfully lobbied for the full range of jobs and pay that are available to men. Women have won the right to be covered under the antidiscrimination statutes of the civil rights laws. They can sue for redress if they believe they have been denied rights generally open to men.

Other groups have made similar claims. Older people are fighting against mandatory retirement rules; other groups have organized to fight the systematic discrimination that they face.

THE FUTURE

Future success in raising the economic level of blacks and other minorities depends largely on the growth of the economy. When economic life falters, group conflict and prejudice increase. This is because people see themselves as competing for the same scarce resources, such as jobs.

The American economy is undergoing an historic transformation. Traditional industrial jobs are being lost to other countries. The recent enormous growth of jobs has been concentrated in service sectors. Many of these jobs require skills beyond the level of many ethnic minority members.

Many people are also trapped by poverty in the central areas of large cities, where few new jobs are being created. The social demoralization of some ethnic minorities is also a barrier that keeps them from taking advantage of actual opportunities that are available to them.

The belief in an essential equality among all people has been part of the cultural heritage of Americans since the founding of the United States. Many efforts have been made in recent decades to reform social and economic life to conform to the ideal. Today, despite vigorous debate over specific programs and policies, all levels of government regard aid to the disadvantaged and the enforcement of antidiscrimination laws as very important areas of their activities.

The social drama of the struggle for equality and acceptance will continue, as it has for over 300 years. As always, the leading roles in this drama will be played by ethnic groups, old and new.

Suggestions for Further Reading

Foster, David William. Sourcebook of Hispanic Culture in the United States. Chicago: American Library Association, 1982.

Glazer, Nathan. Ethnic Dilemmas, 1964-1982. Cambridge, MA: Harvard University Press, 1983.

Sowell, Thomas. Ethnic America: A History. New York: Basic Books, 1981.

Thernstrom, Stephan et al., eds. Harvard Encyclopedia of American Ethnic Groups. Cambridge, MA: Harvard University Press, 1980.

Walzer, Michael et al. The Politics of Ethnicity. Cambridge, MA: Harvard University Press, 1982.

BLACK AMERICA

By Michael Cusack

(Senior Editor, Scholastic Magazine)

JAMESTOWN BEGINNING

The history of blacks in North America began in August 1619, when a small Dutch warship sailed up the James River to the young English colony of Jamestown, Virginia.

The Dutch ship had captured a Spanish ship in the Caribbean Sea carrying black men and women to Spanish colonies in South America. At that time, the Jamestown colonists needed workers to help clear and till the land and build houses. So the Jamestown settlers welcomed the blacks as a source of free labor.

In 1619, the English did not have the practice of slavery—the complete ownership of one person by another person. But they did have the practice of indentured service. That is the ownership of a person's labor for a period of time by another person or group of people. Many of the first English settlers in North America were indentured servants. They had pledged their labor to pay for their ship passage to the New World, to pay old debts, or to make up for some small crime. In some cases, they were tricked, cheated, or even kidnapped into indentured service.

The 20 blacks landed from the Dutch ship were viewed as indentured servants. Black and white indentured servants worked side by side at Jamestown, clearing fields, planting crops, making roads and building houses. The death rate at Jamestown was extremely high—for landowners and servants, black and white—and the need for labor was great. To meet this demand, ships' captains often bought, traded or captured blacks from the Spanish and Portuguese.

[Photo: Martin Luther King, Jr., one of the outstanding black leaders of the 20th century, joins with fellow civil rights activists in escorting black children to a previously all-white school in Mississippi in 1966. The civil rights movement to end racial discrimination is one of the milestones of modern American history. Wide World Photos]

Though an increasing number of black servants arrived in the English colonies during the early 1600s, the vast majority of indentured servants were white. During the period, black and white indentured servants had the same status. When their period of service was over, they were considered to be free. They were then able to marry, own property and, in some colonies, exercise all the rights and responsibilities of citizenship.

SLAVERY

Gradually, however, the status of black servants changed. Between 1640 and 1680, Virginia and the other southern colonies drifted steadily toward the establishment of a system of slave labor.

Most white indentured servants had a set term of servitude, and they knew it. No matter how badly they were treated, they could look forward to eventual freedom. They usually had written contracts stating when they would be free.

Blacks had no such contracts. They were brought to America by ships' captains who sold them to the highest bidder. In the early 1600s, the buyers and sellers sometimes agreed on a period of servitude for black indentured servants. That helped support the feeling that the buyers and sellers were trading in labor, not people. However, the black servants had no voice in these dealings. And since the buyers wanted to get the greatest value for the price they paid, it became commonplace that black servants were indentured for life. It also became customary that the children of black indentured servants were considered to be indentured from birth to death—in other words, they were held in slavery. Near the end of the 17th century, all pretense that such a system wasn't slavery faded away.

Because blacks could be owned for life, the demand for black slaves outstripped the demand for white indentured servants. The demand for black labor on the large plantations of Maryland, Virginia and the Carolinas was great. To satisfy this demand, special ships were built to transport captive blacks directly from the west coast of Africa to the slave markets of North America. During the 18th century, the slave trade boomed. It brought death and untold suffering to millions of blacks. At the same time it made a number of people in Britain and in the British American colonies immensely wealthy.

Throughout the 18th century, an increasing number of people in Britain and North America spoke out against the slave trade. But the wealthy slave owners and slave traders had powerful friends in government and were able to defeat all attempts to end the slave trade.

CONFLICTS OF CONSCIENCE

During the late 1600s and early 1700s, slavery existed in practically all the North American colonies. While most black slaves were held on large farms and plantations, it wasn't unusual for small farmers and tradespeople to own one or two slaves.

By the mid-1700s, many small farmers and tradespeople had mixed feelings about slavery. They wanted cost-free labor, but they were uncomfortable with the idea of owning another person. This was in conflict with the growing revolutionary idea that all men are created equal.

At about the same time, many small farmers and tradespeople found that it was not always profitable to own slaves. Slaves and indentured servants had to be fed all year round, but the need for their labor might vary from season to season. Some farmers found that it was cheaper to hire day laborers when needed than to own slaves.

As small farmers started disposing of their slaves, some were freed, but most were sold to plantations in the West Indies, Virginia and the Carolinas. Unlike a small farm or tradesman's shop, a plantation provided an impersonal setting for slavery. Hundreds—even thousands—of slaves might live and work on a large plantation. The plantation owner, who hired professional overseers, did not usually have daily contact with most of the slaves. Food, housing and clothes for the slaves were seen as costs to be kept as low as possible.

The plantation economy was based on the large scale production of cash crops, such as tobacco and cotton, through the use of very cheap labor. The farmland of entire regions— much of Virginia, the Carolinas and Georgia— became linked to that economy. It was felt that any change in the institution of slavery could cause the economic and social collapse of those regions. This fear caused a number of people to contradict their own ideals of freedom, equality and the rights of man.

During the 1770s and '80s, the American colonists fought for independence from Britain. They called for self-determination, democracy, equality and recognition of the natural rights of man. Yet many outspoken advocates of American freedom—including Patrick Henry, George Washington and Thomas Jefferson—lived within a system of slavery. They sometimes wrote against slavery, and Washington even wrote a provision in his will that led to the eventual freedom of his slaves. But the system of slavery was firmly entrenched. Some colonists said that while they personally deplored slavery, they had to accept it as an economic necessity. Others argued that blacks were secure and happy as slaves.

Other advocates of slavery went a step further. They used pseudoreligious and pseudoscientific arguments to "prove" that blacks were inferior and therefore suited to be slaves. All these attempts to justify slavery in a land where personal freedom was highly valued created a barrier between black and white communities.

Over the years, several black men and women achieved fame and fortune in the arts, sciences, religion and commerce. Some had high standing in colonial society. Many joined in the struggle to forge a new nation—the United States. Yet all were subject to the constant handicaps and indignities imposed by prejudice and discrimination.

ACHIEVEMENT AND STRUGGLE

As the United States of America entered its first century of existence, free blacks faced a double struggle. One part of the struggle was for personal achievement—for a chance to use one's talents and abilities to gain a secure, respected place in society. The other part was to cast off the yoke of slavery that oppressed all blacks—free and captive.

Against tremendous odds, many blacks raised themselves to positions of influence in American society. From those positions, they campaigned for freedom and dignity for all blacks.

Many names stand out. One was Benjamin Banneker (1731-1806) who gained fame as an astronomer, mathematician, author and inventor. He also helped design the city of Washington, D.C. Banneker, who had always been free, could have enjoyed his prestige and wealth without conflict. Yet he chose to challenge the American establishment on the issue of slavery. In a famous letter to Thomas Jefferson, Banneker asked the statesman to live up to the full meaning of his words, "all men are created equal."

Paul Cuffe (1759-1817) was one of 10 children of a former slave. Growing up free but poor, in Massachusetts, Cuffe gradually gained wealth through farming and shipping. By 1800, he was one of the wealthiest men in Massachusetts. But he used most of his wealth to help others. At his own expense, he built a public school and hired a teacher to educate all the children of Westport, Massachusetts—white and black.

Cuffe worked hard to end slavery. He helped to free many individual slaves. But as he saw that many free blacks ended up in conditions of inequality, dire poverty and frequent humiliation, he concluded that freedom alone was not enough. In Cuffe's opinion, the answer was in Africa, the continent of their ancestors.

Cuffe organized the Friendly Society to help former slaves go to Africa as free people to set up a new nation. After Cuffe's death, this resulted in the creation of the Republic of Liberia. Between 1820 and 1860, about 11,000 American blacks moved to Liberia.

Though Cuffe's "back to Africa" idea has been echoed several times during the long struggle for freedom and equality, most American blacks never considered it the best solution. A number of black leaders who had at first been in favor of the idea, turned against it in the 1840s and '50s. In their opinion, the best course would be an all-out drive to end slavery and then gain full equality and citizenship rights for all blacks.

One of the leaders who held this view was Frederick Douglass (1817-1895). Born a slave in Maryland, Douglass escaped when he was 21. He went north to New Bedford, Massachusetts, where he was welcomed by black and white abolitionists—people trying to end slavery. There, it soon became obvious that Douglass was a great writer and public speaker. And he used both talents to rally people against slavery. He demanded full freedom and complete equality for all blacks. He argued that blacks were as much a part of the American tradition as any other group. And he urged that they be given the freedom and opportunity to contribute and participate in all aspects of American life.

Contending that slavery was morally wrong, Douglass and other abolitionists openly encouraged blacks to escape to freedom. Means of helping runaway slaves were set up in various places. This led to the creation of an escape route called "the underground railroad."

ESCAPE TO THE NORTH

From the first days of slavery in America, there were escape attempts. In colonial times, runaway slaves often took refuge in swamps, forests, mountains, and among Indian tribes. Then, starting with Pennsylvania in 1780, several northern states abolished slavery. So fugitive slaves frequently sought refuge in those "free" states. To stop that, the Congress passed the Fugitive Slave Law of 1793. This law required the authorities of all states and territories to arrest and return fugitive slaves. It also led to "bounty hunting."

Slave owners offered bounties (rewards) for the return of runaways. Not only did this tempt people along the way to capture fugitive slaves, it also created a group of professional "bounty hunters." These hunters pursued fugitives across state borders in the hope of collecting rewards.

During the early 1800s, the men and women who tried to escape from slavery were usually alone and unaided. Their attempts often ended in recapture or death. Then, starting in the 1830s, people opposed to slavery provided money, food and hiding places for fugitives. Escape routes were mapped out, and word of them spread through the slave quarters of plantations.

The system of escape routes became known as the "underground railroad." Hiding places were called "depots." People providing money were called "stockholders." And guides who led fugitives along the escape routes were called "conductors."

Many of the "conductors" were free blacks or former slaves. They often plunged deep into slave states to contact escapees. This was dangerous. If captured, former slaves went back to slavery. But free black "conductors" were also likely to end up in slavery ... or dead. Gunfights between bounty hunters and armed "conductors" increased as the number of escapes from slavery sharply increased.

To blunt the work of the "underground railroad," a tougher Fugitive Slave Law was passed by Congress in 1850. The 1850 law called for "severe penalties to be imposed on anyone assisting Negroes to escape from bondage." It also authorized federal marshals to "command all good citizens to aid in the capture of fugitives." As a result, bounty hunters were appointed as marshals in slave states. Then, with the full backing of the law, they were able to prowl the free states in search of fugitive slaves. This did not stop the "underground railroad conductors." It just made their work harder.

The most famous of the underground conductors was a young woman named Harriet Tubman (1821-1913). In 1849, she escaped from slavery in Maryland and made her way to Philadelphia. Over the next 10 years, Harriet Tubman made 19 trips into slave states and led more than 300 men, women and children to freedom. On early trips, Harriet Tubman led the fugitives to such northern cities as New York and Philadelphia. But the 1850 law made those cities unsafe. So Tubman decided to lead the people in her care all the way to Canada, where they would be beyond the reach of lawmen and bounty hunters.

THE END OF SLAVERY

Emancipation, or the ending of slavery, didn't happen in a single day. The process began in April 1861 with the outbreak of the American Civil War between free states of the North and slave states of the South. During the war, wherever the Union or Northern Army gained control, slavery, for all practical purposes, was ended. It's estimated that half-a-million slaves escaped to Union-controlled areas.

The next big step in the process took place on January 1, 1863. President Abraham Lincoln issued the Emancipation Proclamation, declaring that slaves in states, or portions of states, at war against the United States were free. Few slaves were freed, however, since most lived in the rebellious South. Nevertheless, the Proclamation was a critical turning point: It increased Northern support by making the end of slavery a principal objective of the war. Freedom for all slaves came later, in 1865, when the war ended and Congress passed the 13th Amendment to the Constitution, which completely abolished slavery. Another Amendment, the 14th, gave blacks full citizenship rights. For a time, many hoped that blacks and whites could live together in a state of equality and tolerance. But local laws and customs were used to deprive blacks of voting rights. In most former slave states, a system of racial segregation arose, and blacks had to use separate schools, churches, hospitals, parks, swimming pools, lunchrooms, washrooms, bus sections and theater sections.

In the early years of the 20th century, lynchings—the illegal killing of people for real or imagined crimes—greatly increased. After the First World War, the promise of equality and opportunity in the South for blacks seemed further away than ever. As a result, many blacks moved from the rural South to the great cities of the North. Although northerners did not practice formal segregation, blacks encountered discrimination in jobs and housing.

However, progress did occur during the difficult years from 1919 to 1950. Individual blacks made breakthroughs in education, science, sports, entertainment, business, engineering and most of all in music and the arts. Blacks gained influence in organized labor, industry and government. There were black university presidents and black millionaires.

American blacks also achieved fame far from home. Dr. Charles Drew (1904-1950) advanced medical science and saved millions of lives during World War II by his discovery of a way to preserve blood. Dr. Ralph Bunche (1904-1971), Undersecretary General of the United Nations, saved countless lives by promoting peace in the Middle East during the late 1940s. For this achievement, he was awarded the 1950 Nobel Peace Prize.

RENAISSANCE AND WAR

Black talent in the arts and music flowered during the 1920s, '30s and '40s. This artistic awakening began in Harlem, a mostly black section of New York City, and was known as "the Harlem Renaissance."

The Harlem Renaissance produced the novels of Zora Neale Hurston (1903-1960), Richard Wright (1908-1960) and Frank Yerby (1916- ). It inspired the poetry of Countee Cullen (1903-1946), Langston Hughes (1902-1967), James Weldon Johnson (1871-1938), Claude McKay (1890-1948) and Sterling Brown (1901-1984). It drew strength from the philosophical writings of Alain Locke (1885-1954), the first black to win a Rhodes scholarship to Oxford University in England. It echoed with the music of Duke Ellington (1899-1974), Count Basie (1904-1984) and Louis Armstrong (1900-1971), and the glorious voices of Paul Robeson (1896-1976), Roland Hayes (1887- ) and Marian Anderson (1902- ). And it glowed with the paintings and murals of Jacob Lawrence (1917- ), Charles White (1918- ) and Lois Mailou Jones (1908- ).

However, neither the glory of the Harlem Renaissance nor the achievements of individual artists did much to improve the daily lives of most blacks. The decade of the Great Depression—the 1930s—was a difficult time for all Americans, but it was particularly hard for black Americans. In many communities, when welfare aid or jobs were given out, whites came first. For many black families, staying alive was a daily struggle.

The economic depression ended with the outbreak of World War II. As America's factories started turning out the weapons of war, blacks as well as whites benefited from the employment boom. American society became more mobile during the war years and many of the discriminatory practices against blacks were eased, particularly in the North.

In September 1940, President Franklin D. Roosevelt ordered the nation's first peacetime conscription. Young men—blacks and whites—answered the call. But the call affected blacks and whites differently. White youths were rushed to training camps. But black youths had to wait around for months until there was room for them in black units.

Before the draft, fewer than 4,000 blacks were serving in the Army. Most were in support units—supply, construction, food service and transportation. Saying "we want to be soldiers not servants," young blacks strongly objected to this situation. And they were backed up by the entire black community. Many whites, including Eleanor Roosevelt, the president's wife, joined the campaign to get blacks the right to fight for their country.

The campaign succeeded. On December 1, 1941, all specialties in the Army, including the Army Air Force, were opened to qualified blacks. Six months later, the Navy, Coast Guard and Marine Corps followed suit. As the war continued, black combat units fought on all fronts and gained the admiration of the entire nation. One black unit in particular made a name for itself. It was the 332nd Fighter Group of the United States Army Air Force. In the skies over France and Germany, pilots of the 332nd destroyed 261 enemy planes. In March 1945, the Group was awarded a Distinguished Unit Citation. Individual pilots were awarded a total of 904 medals for bravery. The actions of the 332nd group came to symbolize the struggle of all blacks for full equality and an end to segregation in the armed forces.

That goal was achieved on July 26, 1948. President Harry S Truman ordered "...equality of treatment and opportunity for all persons in the Armed Forces without regard to race, color, religion, or national origin." This was one of several cracks that appeared in the wall of segregation in the late 1940s and early '50s. Another was the Supreme Court decision on May 17, 1954 banning segregation of the races in public schools. This was a major blow against segregation, and it inspired many black leaders to press for integration in all aspects of American life.

CIVIL RIGHTS MOVEMENT

At the same time, black leaders felt that the people themselves would have to take action to end discrimination and denial of civil rights. One opportunity for action was presented by the arrest of a woman named Rosa Parks in Montgomery, Alabama, on December 1, 1955, for refusing to give up her seat to a white person on a city bus.

After getting Rosa Parks out of jail on bail, the National Association for the Advancement of Colored People (NAACP) planned a course of action to end segregation on buses. They decided to ask Montgomery's blacks to boycott—not use—the city's buses. This would be costly for the bus company since most of their riders were blacks. But it would also be hard on Montgomery's poor blacks who didn't have cars and couldn't afford taxis.

The following Monday was set for the boycott. Montgomery's black churches joined in the planning and preparation. Car pools were organized. Black taxi owners made their cars available. Leaflets were handed out to black families all over the city.

Martin Luther King, Jr., the new pastor of the Dexter Avenue Baptist Church, was asked to take charge of the boycott. Several people wondered about that choice. Dr. King was only 27 years old at that time and had had no experience in social action. He was a great preacher who could capture the emotions of a congregation, but could he manage a bus boycott?

Early on Monday morning, December 5, 1955, King and his wife Coretta Scott King started watching the bus stop near their home. The first bus was empty. So was the second. The third had two passengers—both white. The boycott was working.

Over 95 percent of the black riders stayed off the buses. The boycott lasted over a year and cost the city more and more money each day. Finally, on November 13, 1956, the Supreme Court decided that segregation on buses was unconstitutional. The Montgomery bus boycott showed that nonviolent direct action could produce results. It brought blacks from all walks of life together in an almost religious fellowship. And it produced a black leader—Martin Luther King, Jr.—who could move millions to action and touch the conscience of the nation.

Moving on from Montgomery, King led direct nonviolent actions for civil rights in all parts of the country. In the South, old barriers of segregation crumbled. In the North, more subtle forms of discrimination in housing and jobs were slowly chipped away.

In the spring of 1963, King went to Birmingham, Alabama, a city with a bad record of discrimination. Parks, eating places, drinking fountains and rest rooms were segregated. King organized local blacks to march quietly and nonviolently through downtown areas of Birmingham. At first, the police arrested thousands of marchers. When that failed to stop the marches, the police attacked the demonstrators with clubs, dogs and firehoses. Through it all, the demonstrators remained nonviolent. And the whole nation watched by means of television. This caused such a public outcry against the white authorities of Birmingham that they had to back down and desegregate their public facilities.

A high point of the civil rights movement occurred on August 28, 1963, when 250,000 people of all races marched in Washington, D.C., to demand that the nation keep its pledge of "justice for all." In a moving and dramatic speech, Martin Luther King said: "I have a dream that one day on the red hills of Georgia the sons of former slaves and the sons of former slaveholders will be able to sit down together at the table of brotherhood.... I have a dream that my four little children will one day live in a nation where they will not be judged by the color of their skin, but by the content of their character."

The focus of civil rights activity then shifted to Washington, where, after lengthy debate, the Congress passed laws prohibiting discrimination in voting, education, employment, housing and public accommodations.

The Civil Rights Acts of 1964, 1965 and 1968 were landmarks in dismantling the legal basis for discrimination.

TODAY

Martin Luther King continued to conduct civil rights campaigns throughout the country, and in 1964 he was awarded the Nobel Peace Prize in recognition of his decade of leadership in nonviolent protest against discrimination. Tragically, he was assassinated in Memphis, Tennessee, on April 4, 1968.

King's murder led to riots in several cities, and also lent strength to another wing of the civil rights movement. While King had preached non-violence and worked with whites, other black leaders repudiated those tactics.

"The day of non-violence is over," said Malcolm X, a leader of the "Black Power" movement.

Nevertheless, most historians credit King—who made the white establishment see the injustices inflicted on blacks—with doing the most to raise the status of black people.

How much of Dr. King's dream has come true? And what problems remain to be solved?

There are still poor, all-black areas in American cities. The average income of blacks is lower than that of whites. Unemployment of blacks—particularly of young men—is higher than that of whites.

On the other hand, the black middle class continues to grow. In 1989, 44 percent of employed blacks held "white collar" jobs—managerial, professional and administrative positions rather than service jobs or jobs requiring physical labor—compared to 40 percent in 1983. And this trend is expected to continue, partly because more blacks are getting a university education. In 1989, 23.5 percent of blacks between 18 and 24 were enrolled in college, compared to 15.5 percent in 1983.

In recent years, the civil rights debate has focused less on outright racial discrimination, which the overwhelming majority of Americans agree is wrong, than on whether the effects of past discrimination require further programs by the government. Such programs are often referred to by the term "affirmative action." They may set goals for employing a certain number of blacks (or other minorities) in a business by a target date, or call for enrolling a certain number of minority students in a school or college. The necessity and effectiveness of such "affirmative action" programs remain a controversial issue in the United States today.

In 1991, as Congress—and the public—debated the merits of legislation designed to combat job discrimination, the focus was on these issues. Supporters of the legislation said it was needed because some Supreme Court decisions had made it harder to prove discrimination. Opponents said it would force employers to hire blacks even if they were less qualified than whites. And since an economic slowdown was making competition for jobs very tough, many people felt very strongly about the legislation.

But, although such issues remain to be resolved, there will be no turning back from the goals of Dr. King's dream.

Young white Americans now share with black Americans a new appreciation of blacks in history. Older generations were often aware of just three outstanding blacks. These were educator Booker T. Washington (1856-1915), founder of Tuskegee Institute; George Washington Carver (1864-1943), world-renowned botanist; and Mary McLeod Bethune (1875-1955), promoter of equal education for black women. Now most young people are aware that blacks have played important roles in all phases of American history. They are aware of the early American poet Phillis Wheatley (1753-1784), the exploits of black frontiersman Jim Beckwourth (1798-1867), Arctic explorer Matthew Henson (1866-1955), aviation pioneer Eugene J. Bullard (1894-1961) and crusading journalist Ida Wells Barnett (1862-1931).

Moreover, increasing numbers of blacks are playing important roles in American life. In 1983, Guion S. Bluford, Jr., a black astronaut, traveled in space. In 1988, Jesse Jackson, once an aide to Martin Luther King, was a leading contender for the Democratic nomination for President, and in 1991, Douglas Wilder of Virginia, the first black governor in the U.S., announced his candidacy for the Presidency. In 1990, blacks held more than 7,000 elective offices, ranging from school board officials to members of Congress and mayors of major cities—including the nation's largest city, New York.

Perhaps the greatest change in the past few decades has been in the attitudes of America's white community. A generation has come of age since Martin Luther King's "I have a dream" speech. Characteristic of this new generation is a new tolerance between blacks and whites and an increasing acceptance by whites of blacks in all walks of life and social situations.

THE NATIVE AMERICAN

By Shelly Orenstein (Managing Editor, Scholastic Magazine)

The story of the Native American—or American Indian—is one that is unique, tragic and ultimately inspiring. It is unique because the Indians were the original inhabitants of the American continent and experienced every phase of its European settlement, from the earliest 17th century colonies to the closing of the western frontier at the end of the 19th century. It is tragic because the conflict between the Indians and whites paralleled the experience of traditional peoples throughout the world who have come in contact with expanding, industrialized societies. It is an inspiring story because the Native Americans, although dispossessed of much of their land in the 19th century, have survived, have asserted their political and economic rights, and have succeeded in retaining their identity and culture despite the onslaught of modern civilization.

In the state of Wisconsin, Chippewa Indian children are using computers to learn the language of their ancestors. Says one Chippewa father: "It is important for them to learn Ojibway in order to understand the spiritual aspects of our religion."

Today, Native Americans are full citizens of the United States who are proud to be Americans. However, they are equally proud of their own cultural heritage, and, though it is difficult in the modern world, they are trying to protect and maintain it.

Marks of that heritage can be found all over the United States. Many of the names on United States maps—Massachusetts, Ohio, Michigan, Kansas, Idaho and more—are Indian words. The Indians taught the Europeans how to cultivate crops such as corn, tomatoes, potatoes and tobacco. Canoes, snowshoes and moccasins are all Indian inventions. Indian handcrafted artifacts such as pottery, silver jewelry, paintings and woven rugs are highly prized.

Painting of Pawnee Indian warriors by artist Charles Bird King. The Pawnee are a tribe which lived in what is now the state of Nebraska, where they raised corn and hunted buffalo. In 1875 they moved to lands in what later became the state of Oklahoma. (National Collection of Fine Arts)

About 62 percent of the Indians in the United States live in large cities and rural areas scattered throughout the country. The remainder live on about 300 federal reservations (land set aside for their use). Together, the reservations comprise 52.4 million acres (21 million hectares) of land, or about 2.5 percent of the land area in the United States. Most reservations are located west of the Mississippi River.

In recent decades, the Native American population has been increasing steadily. Today, there are about 1.9 million Native Americans (0.8 percent of the total population of the United States), which is believed to be more than there were when the first European explorers arrived in the New World. At that time, about one million Native Americans were living in North America. These people were soon overwhelmed by a flood of European settlers. By the time of the American Revolution in 1776, there were some 4 million whites and 600,000 blacks—mostly slaves— living on the continent. Just 40 years later, the white population had swelled to 12.9 million and the black population to 2.5 million. In 1990 there were 251,400,000 people living in the United States.

As European civilization spread rapidly across the continent, the native population declined. Disease and warfare took their toll. By 1920, the Indian population had fallen below 350,000. For a time it seemed the Indians would vanish.

The transfer of land from Indian to European—and later American—hands was accomplished through treaties, war and coercion. It was accompanied by a long struggle between the Indian and European ways of life. In many ways, the history of the United States is the story of this struggle.

WHO WERE THE INDIANS?

In 1492, an Italian navigator named Christopher Columbus set sail from Spain in search of a sea route to Asia. Columbus hoped to obtain access to the wealth of spices, silks and gold for which the Asian continent was famous. Six weeks later, his men sighted land.

Thinking he had landed in the Indies, a group of islands east of the coast of Asia, he called the people on the first island on which he landed "los Indios," or, in English, "Indians." Of course, Columbus had not reached Asia at all. He had landed in the New World (the American continent). But the name "Indians" remains fixed in the English language.

Though Columbus had one name for them, the Indians comprised many groups of people. The Indians north of Mexico in what is now the United States and Canada spoke over 300 languages. (Some 50 to 100 of these languages are still spoken today.) And they lived scattered across the continent in small bands or groups of bands called tribes. To them, the continent was hardly new. Their ancestors had been living there for perhaps 30,000 years.

Scientists speculate that people first came to North America during the last ice age. At that time, much of the earth's water was frozen in the glaciers that covered large parts of the globe. As sea levels dropped, a strip of land was exposed in the area that is now the Bering Strait. Man probably followed the big game he was hunting across this land bridge from Siberia into Alaska.

Over time, these people increased in number, adapted to different environments and spread from the far northern reaches of Alaska and Canada to the tip of South America.

Some groups, such as the peaceful Pueblo of the American Southwest, lived in busy towns. They shared many-storied buildings made of adobe (mud and straw) bricks. They grew corn, squash and beans.

Their neighbors, the Apache, lived in small bands. They hunted wildlife and gathered plants, nuts and roots. After acquiring horses from the Spanish, they made their living by raiding food and goods from their more settled white and Indian neighbors.

In the eastern woods of the North American continent, the Iroquois hunted, fished and farmed. Like the Pueblo, they were excellent farmers, and 12 varieties of corn grew in their communal fields. Their long houses, covered with elm bark, held as many as 20 families. Each family had its own apartment, on either side of a central hall.

The Iroquois were fierce warriors. They surrounded their villages with wooden stockades to protect them from attack by their neighbors. They fought for the glory of their tribe and for the glory of individual warriors.

The Indians of the North Pacific coast harvested ocean fish and seafood. Tribes like the Haida lived in large plank houses with elaborately carved doorposts. These were called totem poles, and the figures on them were a record of the history of the family which lived in the house.

Many Indians were fine crafts workers. They made pottery, baskets and carvings, and wove cotton and plant-fiber cloth. They traveled in small boats and on foot, never having developed the wheel. Some, such as the Plains Indians, used dogs to pull a load-carrying frame called a travois. Others, such as the Winnebagoes of the Midwest, developed a sophisticated calendar that took the motions of both the sun and the moon into account.

Different as they were, all tribes were greatly affected by the coming of the white man, with his firearms, iron cooking pots, horses, wheeled vehicles and with his diseases, to which the Indians had no immunities. The European arrival changed the Indian way of life forever.

EARLY ENCOUNTERS

Other Europeans quickly followed Columbus to the New World. Spanish settlers arrived in North America in the early 1500s. They settled in what are now Florida and California and in the southwest section of the continent. They sent missionaries to bring Christianity and "civilization"—farming, crafts and so on—to the "Indians," and they forced the Indians to labor in their fields, mines and houses.

Other Europeans, such as the French and the Dutch, came to the New World in search of profit. Some came to fish the rich waters off the Atlantic coast; many came to trade with the Indians. They exchanged guns, iron tools, whiskey and trinkets for beaver and otter pelts.

Most often, though, the Europeans came to establish new homes; they came to farm. And for that they needed land. At first, the Indians were glad to share their land and their food with the Europeans. The American holiday of Thanksgiving celebrates this Indian generosity. The first to celebrate it were the Pilgrims, a group of English settlers who arrived in America in 1620. They gave thanks for having survived their first year in the harsh American wilderness. But there would have been no Thanksgiving had it not been for the Indians.

The Pilgrims arrived on the shores of Massachusetts in November and survived their first winter with help from the Wampanoag and Pequamid Indians who shared corn with them, and showed them where to fish. Later, they gave seed corn to the English settlers and showed them how to plant crops that would grow well in the American soil.

THE QUEST FOR LAND

To the Europeans, much of the Indians' land appeared vacant. The Indians didn't "improve the land" with fences, wells, buildings or permanent towns. Many settlers thought the Indians were savages and that their way of life had little value. They felt they had every right to farm the Indian lands.

On Manhattan Island, the present site of New York City, beaver, deer, fox, wild turkey and other game (wild animals) were plentiful. The Shinnecock Indians used the island for fishing and hunting, but they didn't live there. In 1626, the Dutch "bought" the island from them. The Shinnecock did not understand that once the land was sold, the Dutch felt it was their right to keep the Indians off. Like most Indians, they had no concept of private property.

The Indians believed that the land was there to be shared by all men. They worshipped the earth that provided them with food, clothing and shelter. And they took from it only what they needed. They didn't understand when the settlers slaughtered animals to make the woods around their towns safer. They didn't like the roads and towns that to them, scarred the natural beauty of the earth.

To the Europeans, game existed to be killed and land to be owned and farmed. Many did not bother to discuss with the Indians whether or not they wanted to give up their land. To make room for the new settlers, hunting lands, fields, even Indian towns were seized through war, threats, treaties or some combination of the three.

UNIONS

Small Indian bands and tribes could do little against the well-armed and determined colonists, but united, they were often a more powerful force. King Philip, a Wampanoag chief, rallied neighboring tribes against the Pilgrims in 1675. For a year, they fought bloody battles. But even his 20,000 allies could do little against the numerous colonists and their guns. By 1700, few remnants were left of the tribes that had greeted the Pilgrims.

The Iroquois, who inhabited the area below Lakes Ontario and Erie in northern New York and Pennsylvania, were more successful in resisting the whites. In 1570, five tribes joined to form the most democratic nation of its time, the "Ho-De-No Sau-Nee," or League of the Iroquois. The League was run by a council of 50 representatives drawn from the five member tribes. The council dealt with matters common to all of the tribes, but it had no say in how the free and equal tribes ran their day-to-day affairs.

No tribe was allowed to make war by itself. The council passed laws to deal with crimes such as murder.

The League was a strong power in the 1600s and 1700s. It traded furs with the British. It sided with the British against the French in a war for the dominance of America from 1754 to 1763. The British might not have won that war without the support of the League of the Iroquois. Had they lost, North America might have had a very different history.

The League stayed strong until the American Revolution. Then, for the first time, the council could not reach a unanimous decision on whom to support. Member tribes made their own decisions, some fighting with the British, some with the colonists, some remaining neutral. As a result, everyone fought against the Iroquois. Their losses were great and the League never recovered.

WESTERN FRONTIER

At the time of the American Revolution, the western boundary of the United States was the Appalachian Mountains. Land had become expensive in the colonies and many people were eager to settle the wilderness that lay beyond those mountains.

Armed with only an ax, a rifle and their own self-confidence, these people carved settlements out of the forests of Kentucky, Tennessee and Ohio.

The Indians fought these invaders of their hunting grounds with a vengeance. Encouraged by the French or the British, who were trying to retain control of the lands west of the United States, Indians attacked frontier settlements. The white settlers struck back, sometimes massacring entire Indian villages. Indian warfare quickly became a part of frontier life.

At first, the new United States government tried to keep the peace by discouraging settlements beyond the mountains. "The utmost good faith shall always be observed toward the Indians, their lands and property shall never be taken from them without their consent; and in their property, rights, and liberty they never shall be invaded or disturbed..." read the Northwest Ordinance, designed to regulate the settling of the new frontier. But the frontier was far away and "good faith" was rarely demonstrated.

The United States tried different ways of dealing with their "Indian problem." Basically, they all boiled down to this: The Indian had to be either assimilated or removed farther west to make room for the European civilization the white Americans felt was destined to rule the continent.

In 1817, President James Monroe wrote: "The hunter or savage state, requires a greater extent of territory to sustain it, than is compatible with the progress and just claims of civilized life, and must yield to it. Nothing is more certain, than, if the Indian tribes do not abandon that state, and become civilized, that they will decline, and become extinct."

The Indians' only chance for survival, felt Monroe, was to be removed to an area where they would not be disturbed by the settlers. Given time to learn civilized ways, or to practice their own way of life, they could survive.

And so, in 1830, the United States passed the Indian Removal Act. All Indians in the East would be removed to lands set aside for them west of the Mississippi River.

One of the tribes slated for removal was the Cherokee. Ironically, the Cherokee had already adopted many of the white man's ways. Many owned large farms and brick homes in the state of Georgia. Their towns had stores, sawmills, blacksmith shops, spinning wheels and wagons.

In 1821, a Cherokee named Sequoyah developed a written language for his people. Using his 85-character alphabet, the Cherokee printed Bibles and a newspaper. They adopted a constitution modeled on that of the United States government.

Like Monroe, some whites thought removal was a way of saving the Indian peoples. Others saw it as a way to get more land from the Indians. When gold was discovered on Cherokee land, pressure for removal mounted.

A few Cherokees were willing to move to the new lands. Though they did not represent the Cherokee nation, they signed a treaty with the American government agreeing to the removal of the Cherokees.

The peaceful Cherokees were driven from their homes and forced to march overland to Indian Territory, in what is now the state of Oklahoma. The difficult journey took three to five months. In all, some 4,000—one quarter of the Cherokee nation—lost their lives in the course of this removal. This shameful moment in American history has come to be called "The Trail of Tears."

In 1803, the United States bought a huge block of western land from the French. Called the Louisiana Purchase, the new lands stretched from the Mississippi River to the Rocky Mountains. It doubled the area of the United States. It was to this vast, "far away" land that the Indians were removed.

In 1846, the United States went to war with Mexico and won yet another vast territory stretching from Texas to California. In 1849, gold was discovered in California. Miners traveling to the gold fields often moved directly through the Indian Territory that was supposed to be undisturbed by whites.

The trip to the new lands in California and Oregon took six months by horse-drawn wagon. But in 1869, when the transcontinental railroad was finished, that same trip could be made in six days. This made it much easier for the settlers to move westward.

BROKEN TREATIES

On the Plains, tribes such as the Sioux roamed on horseback, hunting the buffalo that ranged there. The buffalo gave them everything they needed to live. They ate its meat. They used its skin and fur to make clothing. They stretched its hides over a frame of poles to make the tepees, or tents, they lived in. They carved buffalo bones into knives and tools. The clothing of the Plains Indians was decorated with bead work, and their hair with eagle feathers. These were the proud Indians depicted in television dramas and films about the American West.

The Sioux allowed the wagon trains heading west to pass through their lands. But then whites began to settle the Plains. At first, the Sioux made treaties with the government, giving up large pieces of their land. In return, the government promised them peace, food, schools, supplies and the fair arbitration of all conflicts. One such treaty was the Fort Laramie treaty of 1868. It solemnly declared the vast lands between the Missouri River and the Rocky Mountains to be Sioux territory, on which whites were prohibited from passing or settling.

Six years later, gold was discovered in the Black Hills of South Dakota, a land the Sioux considered sacred. A gold rush was on, and the treaty of Fort Laramie was ignored. The United States tried to buy the Black Hills from the Sioux. But the Sioux refused. Crazy Horse, a great Sioux chief, summed up their feelings: "One does not sell the Earth upon which the people walk."

At the same time, the buffalo that the Sioux depended on had begun to disappear. The land they roamed was being fenced by farmers and ranchers. And whites began to hunt the buffalo for sport and for its hide. In 1850, there were still 50 million buffalo on the Plains. By 1885, there were almost none.

By 1871, the American government had determined that the treaty was no longer an appropriate means of regulating Indian-white relations and that no Indian nation or tribe should be recognized as an independent nation or power. Agreements continued to be worded as treaties, but they were in fact laws governing individual behavior. The American government pressured the Indians to give up their traditional way of life and to live only on reservations. Many resisted. One was Sitting Bull, a Sioux leader. "We lived in our country in the way our fathers and fathers' fathers lived before us," he said, "and we sought trouble with no men. But the soldiers came into our country and fired upon us and we fought back. Is it so bad to fight in defense of one's country and loved ones?"

The Sioux delivered some stunning defeats to the United States cavalry. Among them was "Custer's Last Stand" in 1876 at the Little Big Horn River, where a whole company of cavalry was killed. But the Indians could not live on the Plains without the buffalo to feed them. Half starved, they eventually surrendered and came to live on the reservations.

In 1890, unrest developed, resulting from the rapid advance of settlers, the failure of the government to keep many of its treaty agreements, the suffering and dependence of the Indians caused by the disappearance of game and crop failures, the spread of diseases and the resentment among some Indians of an agreement which reduced the size of the Sioux reservation. A messianic movement grew, characterized by a belief among the Indians in a miraculous re-establishment of Indian supremacy and the return from the dead of ancient warriors. It was symbolized by the "Ghost Dance" and spread among the disaffected members of several tribes. These Indians left the reservations and banded together. At Wounded Knee, South Dakota, a bloody confrontation between this group and an American cavalry regiment resulted in over 300 deaths—mostly Indian—and marked the end of all hope for a return to the Indians' traditional way of life on the Plains.

THE RESERVATION SYSTEM

By 1890, almost all of the West, from the prairies to the Pacific, had been settled by cattle ranchers, farmers and townspeople. There was no more frontier, no mountains beyond which the Indians could live undisturbed. Most were confined to reservations. The government had promised to protect the remaining Indian lands. It had also promised supplies and food. But poor management, inadequate supplies and incompetent or dishonest government agents led to great suffering on the reservations. Diseases swept through the tribes and for a while it seemed as though the Indians really were a vanishing race.

Some people were aware of the poor conditions on the reservations. A writer named Helen Hunt Jackson heard Ponca chief Standing Bear speak about the sufferings of the dispossessed Plains Indians. His words moved her to write A Century of Dishonor in 1881. Her book and her efforts helped bring the plight of the Indians to the attention of the nation.

To survive, many believed, the Indians would have to adopt white ways. On the reservations, Indians were forbidden to practice their religion. Children were sent to boarding schools away from their families.

By the General Allotment Act of 1887, each Indian was allotted 160 acres to farm. But there was no magic in owning private property. Many Indians had no desire to farm. Often, the land given them was unfertile. After each Indian was given his plot, the government sold the remaining lands to white settlers. The result was disastrous: By 1934, Indian land holdings had been reduced from 138 million acres (56 million hectares) to 48 million (19 million hectares).

A NEW DEAL FOR THE INDIANS

In 1924, Congress passed the Indian Citizenship Act, which declared all Indians born within the territorial limits of the United States to be citizens. The origin of this act can be attributed to the increased respect of white legislators for the Indians which resulted from their exemplary contribution during World War I. The Act was passed after a period of agitation by pan-Indian groups and by friends of the Indian who demanded enlarged political rights for American Indians and an end to the paternalistic policy of the Bureau of Indian Affairs. In 1928, presidential candidate Herbert Hoover selected Charles Curtis, a Kaw Indian from Kansas, as vice presidential candidate.

However, it wasn't until 1934 that the Indians got a "New Deal." The Indian Reorganization Act encouraged the Indians to set up their own governments and ended allotment on the reservations. It halted the policy of trying to persuade or coerce Indians to give up their traditional culture and religion. In 1946, the government set up the Indian Claims Commission to deal with claims of unfair treatment or fraud. In the 32 years the Commission operated, it awarded $818 million in damages.

The United States was becoming proud of its diverse population. And that included a desire to recognize its Native Americans and to try to compensate them for the unfair treatment they had received.

INDIAN POWER & INDIAN RIGHTS

At a time when blacks were protesting violations of their civil rights, Indians, too, took their protests to the American public. In the mid-1960s, they called for an "Indian Power" movement to parallel the "Black Power" movement. In 1972, the American Indian Movement (AIM) and other Indian rights groups staged a protest march on Washington called the "Trail of Broken Treaties."

In 1973, national attention once again focused on Wounded Knee, South Dakota. AIM occupied the small village there for 71 days. They demanded the return of lands taken in violation of treaty agreements.

Indians today continue to fight for Indian rights, although less militantly than AIM did in the early 1970s.

Books such as Bury My Heart At Wounded Knee by Dee Brown and Custer Died for Your Sins by Vine Deloria, one of a number of widely read Sioux and other Indian authors, have helped bring the Indian cause to the attention of the American public.

Many Indians have united to fight in the political arena for Indian rights. Groups such as The National Tribal Chairman's Association, the National Congress of American Indians and the National Indian Youth Council watch out for Indian interests in Washington.

Recently, many tribes have carried on the battle for Indian rights in court. They have sued for the return of lands taken from their ancestors. In 1972, two tribes, the Penobscot and the Passamaquoddy of Maine, sued for the return of 12.5 million acres of land (five million hectares)—58 percent of the state of Maine—and $25 billion in damages. The tribes settled for $81.5 million from the federal government in 1980 and invested the money, in the name of the tribes, in a variety of profitable business enterprises operated by members of the tribes.

The Sioux in South Dakota sued for the return of the Black Hills, seized from them in 1877. They were awarded $122.5 million. But many do not want to accept the settlement and continue to fight for the return of the sacred land itself.

AN UPHILL BATTLE

Many of the attempts by individual Indians and by tribes to respond to white society rather than accept victimization by it have been highly successful. Two examples are the prosperous Crow and Blackfoot reservations in Montana, on which these two tribes have established and manage a profitable complex of industrial and service-oriented enterprises.

However, in spite of many gains made by the Indians, they still lag far behind most Americans in health, wealth and education. In 1988, the unemployment rate on Indian reservations averaged 64 percent—ten times the national rate. And 27 percent of Native Americans lived below the poverty line—that is, they earned less than the government considered necessary for a decent lifestyle. Diabetes, pneumonia, influenza and alcoholism claim twice as many Indian lives as other American lives.

Since the 1950s, the government has helped Indians who want to move from the reservations to cities. A few have found highly paid jobs in business, education, law and medicine. But most urban Indians still lack the education and job training to find skilled jobs. Many end up trading rural poverty for urban slums.

Life on the reservations varies greatly.

The Navajo reservation, located in parts of three states in the Southwest, is the nation's largest. It is also one of the poorest. Its 16 million acres (6,667,000 hectares) are home for 160,000 Indians. Government housing stands side by side with mobile homes and hogans. These eight-sided, one-roomed traditional Navajo homes are made from logs and have an earthen roof. Many reservation homes lack electricity and plumbing. The reservation has few towns and few jobs. Unemployment on the reservation ran 48 percent in 1988.

In contrast, the Mescalero Apache reservation nearby in New Mexico is one of the nation's wealthiest. It sits on 460,384 acres (186,390 hectares) in some of the highest mountains in the area. The tribe owns and operates a logging company and a cattle ranch. Both are multimillion dollar businesses. They recently built a $22 million luxury resort offering everything from skiing to horseback riding. Three quarters of the reservation's inhabitants live in new two-story houses built on large plots of land. Most who want to work do. Presently, white managers help to run some of their businesses. But the aim of the Apaches is independence—they hope to take over management of all of their own programs.

In all, the Indians signed 370 treaties with the United States. In return for Indian land, the government promised to protect their remaining lands and resources. Government funds support many reservation programs. Since 1824, the Bureau of Indian Affairs (BIA) has been responsible for Indian lands, resources and programs. But slowly, Indians are gaining a stronger voice in determining how the reservations are operated.

Today, most reservations are governed by a tribal council. Many run their own police forces, schools and courts that try minor offenses.

Like the Apache, the aim of most Indian tribes is to become self-supporting. They are trying to attract businesses to the reservations. Others hope that the natural resources on their reservations will provide much needed income. The Navajo, for example, possess oil, coal and uranium reserves. Other reservations are rich in timber, gas, minerals and water.

In the past, the Bureau of Indian Affairs has negotiated the leases between tribes and private companies developing reservation resources. Today, tribes are taking a larger economic role. But the Indians have mixed feelings about development. "What do we do when we want to extract mineral resources from the earth, our mother?" asks Peterson Zah, chairman of the Navajo nation. "Will we dig out our mother's beautiful face to squeeze out this energy for ourselves and for others? Will we go that far?"

Today, most Indians hope for the best of both worlds. Says college-educated Fred Kaydahzinne, great-grandson of a famous Apache warrior: "My generation spent all our time learning the white man's ways. We mastered them, but we lost a lot of Indian heritage. Now we are trying to regain what we have lost."

THE AMERICAN POLITICAL SYSTEM

The United States is a democracy. But what do Americans mean when they use that word?

Abraham Lincoln, one of the best-loved and most respected of America's presidents, said that the United States had a government "of the people, by the people, and for the people." He called the United States "a nation conceived in liberty and dedicated to the proposition that all men are created equal." No one has formulated a better description of the principles of the American political system as Americans understand it. The Constitution, laws and traditions of the United States give the people the right to determine who will be the leader of their nation, who will make the laws and what the laws will be. The people have the power to change the system. The Constitution guarantees individual freedom to all.

ORIGINS OF DEMOCRACY

The idea that the citizens of a nation should elect their officials or have a voice in making laws was not a new one when the United States came into being. Athens and other city-states of ancient Greece had forms of democracy.

Democracy as a form of government disappeared from ancient Greece and, over the centuries, the translation of the principles and ideals of democracy into practice has been very rare throughout the world. Most people have been ruled by kings, queens, emperors or small elite groups and, except for certain members of the nobility, the people have had no voice in their government. That was the situation in Europe in 1492 when an Italian named Christopher Columbus, in ships provided by the king and queen of Spain, sailed westward, seeking Asia, and landed in the "New World."

[Photo caption: The American political system is based on the principles of representative government and individual freedom. Here, voters at a town meeting in New England cast their ballots for town representative in a tradition of local democracy that dates back to early colonial days. (The National Geographic)]

NEW WORLD

The New World consisted of what are now the continents of North and South America. Most Europeans did not know until after Columbus had made his great voyage that these land masses existed. Within a few years, the more powerful nations of Europe were claiming great areas of each continent and establishing colonies to support their claims.

By the 1700s, England had established 13 colonies in the eastern part of what is now the United States. Most of the colonists were English or from other parts of the British Isles, such as Scotland, Ireland and Wales. There were also, however, many Germans in Pennsylvania, Swedes in Delaware and Dutch in New York, which was originally the Dutch colony of New Netherland but was captured by Britain in 1664.

Some of the early British colonists had come to the New World in hopes of enriching themselves; others came because Britain forced them to leave—they were troublemakers or people who could not pay their debts. Some came because of the opportunity, which did not exist for them in Europe, to own land or practice a trade. But there were other reasons, and those other reasons had great influence on the eventual shaping of the political system of the United States.

In the course of its long history as a nation, Great Britain had taken several steps toward democracy. England (including Wales) had a parliament which made laws, and most people enjoyed a degree of individual freedom. England, however, had an official state religion, the Church of England, and those who did not accept that religion as their own were often persecuted. Many, such as the Puritans who settled in Massachusetts, left for the colonies in order to be able to practice their religion and not suffer for it. Though the Puritans themselves did not tolerate religious dissent in Massachusetts, early settlers often welcomed groups fleeing religious persecution in their homelands. Maryland was established as a haven for Catholics. William Penn, a member of the Religious Society of Friends (Quakers), was granted a large tract of land in the New World by King Charles II. There he founded the colony of Pennsylvania, where he set up laws protecting freedom of religion and speech. Those laws also enabled the Pennsylvania colonists to have a voice in their local government. Others fleeing persecution in Europe, including many Germans, eagerly settled in the colony. People such as William Penn set an example which helped spread democratic ideals and practice throughout the colonies.

Life in the colonies also helped strengthen democratic ideas. The colonists were far from their old homelands and in a sparsely inhabited new land of forest and wilderness. They had to work together to build shelter, provide food, clear the land for farms and in general to make their new homeland livable for them. This need for cooperation and sharing, combined with a belief in individualism, strengthened the idea that in the New World people were equal; that no one should have special rights and privileges.

Each colony had its own government. In the northern colonies (New England), for example, the colonists met in town meetings to enact the laws by which they would be governed. Other colonies were ruled by representatives of the British king, but always with some consultation with the colonists.

WAR AND INDEPENDENCE

As time passed, the colonists began to resent the governing power that Britain exercised over them. The British government required them to pay taxes to help pay for colonial expenses, but gave them no voice in passing the tax laws. British troops were stationed in the colonies and some people were forced to house the troops in their homes. The British motherland determined what the colonists could produce and with whom they could trade.

In 1774, a group of leaders from the colonies met and formed the "Continental Congress," which informed the king of the colonists' belief that, as free Englishmen, they should have a voice in determining laws that affected them. The king and the conservative government in London paid no heed to the concerns of the colonists, and many colonists felt that this was an injustice which gave them reason to demand independence from Britain. In 1775, fighting broke out between New England militia and British soldiers.

On July 4, 1776, the Continental Congress issued a Declaration of Independence, primarily written by Thomas Jefferson, a farmer and lawyer from the colony of Virginia. This document listed many grievances against the king and declared that from that time the "United Colonies" were no longer colonies of England. The Declaration described them as "free and independent states" and officially named them the United States of America.

Besides declaring the colonies to be a new nation, the Declaration of Independence set forth some of the principles of American democracy. The document says that all people are created equal, that all have the right to "Life, Liberty, and the Pursuit of Happiness," and that governments obtain their powers from "the consent of the governed." The Declaration, and the Constitution after it, combined America's colonial experience with reflection upon the thought of political philosophers such as John Locke to produce the new concept of a democracy governed by the people's representatives for the purpose of protecting the rights of individuals.

With help from France, England's old enemy, and from other Europeans, the American armies, led by George Washington, a surveyor and gentleman farmer from Virginia, won the War of Independence. The peace treaty, signed in 1783, set the western boundary of the new nation at the Mississippi River. The United States covered most of the eastern third of North America.

ARTICLES OF CONFEDERATION

When peace came, the United States was not one unified nation as it is today. Each new state had its own government and was organized very much like an independent nation. Each made its own laws and handled all of its internal affairs. During the war, the states had agreed to work together by sending representatives to a national congress patterned after the "Congress of Delegates" that conducted the war with England. After the war was won, the Congress would handle only problems and needs that the individual states could not handle alone. It would raise money to pay off debts of the war, establish a money system and deal with foreign nations in making treaties. The agreement that set up this plan of cooperation was called the Articles of Confederation.

The Articles of Confederation failed because the states did not cooperate with the Congress or with each other. When the Congress needed money to pay the national army or to pay debts owed to France and other nations, some states refused to contribute. The Congress had been given no authority to force any state to do anything. It could not tax any citizen. Only the state in which a citizen lived could do that.

Many Americans worried about the future. How could they win the respect of other nations if the states did not pay their debts? How could they improve the country by building roads or canals if the states would not work together? They believed that the Congress needed more power.

The Congress asked each state to send delegates to a convention in Philadelphia, the city where the Declaration of Independence had been signed, to discuss the changes which would be necessary to strengthen the Articles of Confederation.

The smallest state, Rhode Island, refused, but delegates from the other 12 states participated. The meeting, later known as the Constitutional Convention, began in May of 1787. George Washington, the military hero of the War of Independence, was the presiding officer. Fifty-four other men were present. Some wanted a strong, new government. Some did not.

CONSTITUTION

In the course of the Convention, the delegates designed a new form of government for the United States. The plan for the government was written in very simple language in a document called the Constitution of the United States. The Constitution set up a federal system with a strong central government. A federal system is one in which power is shared between a central authority and its constituent parts, with some rights reserved to each. The Constitution also called for the election of a national leader, or president. It provided that federal laws would be made only by a Congress made up of representatives elected by the people. It also provided for a national court system headed by a Supreme Court.

In writing the Constitution, the delegates had to deal with two main fears shared by most Americans.

One fear was that one person or group, including the majority, might become too powerful or be able to seize control of the country and create a tyranny. To guard against this possibility, the delegates set up a government consisting of three parts, or branches, the executive, the legislative and the judicial. Each branch has powers that the others do not have and each branch has a way of counteracting and limiting any wrongful action by another branch.

Another fear was that the new central government might weaken or take away the power of the state governments to run their own affairs. To deal with this, the Constitution specified exactly which powers belonged to the central government and which were reserved for the states. The states were allowed to run their own governments as they wished, provided that their governments were democratic.

To emphasize its democratic intent, the Constitution opens with a statement, called a Preamble, which makes it clear that the government is set up by "We, the People" and its purpose is to "promote the general welfare and secure the blessings of liberty to ourselves and our posterity" (descendants).

Before the new government could become a reality, a majority of the citizens in nine of the 13 states would have to approve it. Those in favor of the adoption of the Constitution argued long and hard in speeches and writing. They finally prevailed, but the states made it clear that one more change would have to be made as soon as the new government was established.

Representatives of various states noted that the Constitution did not have any words guaranteeing the freedoms or the basic rights and privileges of citizens. Though the Convention delegates did not think it necessary to include such explicit guarantees, many people felt that they needed further written protection against tyranny. So, a "Bill of Rights" was added to the Constitution.

Although the world has changed greatly in the past 200 years, it has proved possible for the Constitution to be viewed as a living document, one that could be interpreted by scholars and judges who have been called upon to apply its provisions to circumstances unforeseen at the time it was written.

LEGISLATIVE BRANCH

The legislative branch is made up of elected representatives from all of the states and is the only branch that can make federal laws, levy federal taxes, declare war or put foreign treaties into effect. It consists of a Congress that is divided into two groups, called houses:

  • The House of Representatives comprises lawmakers who serve two-year terms. Each House member represents a district in his or her home state. The number of districts in a state is determined by a count of the population taken every 10 years. The most heavily populated states have more districts and, therefore, more representatives than the smaller states, some of which have only one. In the 1980s, there were 435 representatives in the United States House of Representatives.

  • The Senate comprises lawmakers who serve six-year terms. Each state, regardless of population, has two senators. That assures that the small states have an equal voice in one of the houses of Congress. The terms of the senators are staggered, so that only one-third of the Senate is elected every two years. That assures that there are some experienced senators in Congress after each election.

The main duty of the Congress is to make laws, including those which levy taxes that pay for the work of the federal government. A law begins as a proposal called a "bill." It is read, studied in committees, commented on and amended in the Senate or House chamber in which it was introduced. It is then voted upon.

If it passes, it is sent to the other house where a similar procedure occurs. Members of both houses work together in "conference committees" if the chambers have passed different versions of the same bill. Groups who try to persuade congressmen to vote for or against a bill are known as "lobbies." When both houses of Congress pass a bill on which they agree, it is sent to the president for his signature. Only after it is signed does the bill become a law.

THE EXECUTIVE BRANCH

The chief executive of the United States is the president, who, together with the vice president, is elected to a four-year term. Under a Constitutional Amendment passed in 1951, a president can be elected to only two terms. Except for the right of succession to the presidency, the vice president's only Constitutional duty is to serve as the presiding officer of the Senate; the vice president may vote in the Senate only in the event of a tie.

The powers of the presidency are formidable, but not without limitations. The president, as the chief formulator of public policy, often proposes legislation to Congress. The president can also veto (forbid) any bill passed by Congress. The veto can be overridden by a two-thirds vote in both the Senate and House of Representatives. As head of his political party, with ready access to the news media, the president can easily influence public opinion regarding issues and legislation that he deems vital.

The president has the authority to appoint federal judges as vacancies occur, including members of the Supreme Court. All such court appointments are subject to confirmation by the Senate.

Within the executive branch, the president has broad powers to issue regulations and directives regarding the work of the federal government's many departments and agencies. He also is commander in chief of the armed forces.

The president appoints the heads and senior officials of the executive branch agencies; the large majority of federal workers, however, are selected through a non-political civil service system. The major departments of the government are headed by appointed secretaries who collectively make up the president's cabinet. Each appointment must be confirmed by a vote of the Senate. Today these 13 departments are: State, Treasury, Defense, Justice, Interior, Agriculture, Commerce, Labor, Health and Human Services, Housing and Urban Development, Transportation, Energy and Education.

Under the Constitution, the president is primarily responsible for foreign relations with other nations. The president appoints ambassadors and other officials, subject to Senate approval, and, with the secretary of state, formulates and manages the nation's foreign policy. The president often represents the United States abroad in consultations with other heads of state, and, through his officials, he negotiates treaties with other countries. Such treaties must be approved by a two-thirds vote of the Senate. Presidents also negotiate with other nations less formal "executive agreements" that are not subject to Senate approval.

THE JUDICIAL BRANCH

The judicial branch is headed by the Supreme Court, which is the only court specifically created by the Constitution. In addition, the Congress has established 11 federal courts of appeal and, below them, 91 federal district courts. Federal judges are appointed for life or until voluntary retirement, and can only be removed from office through the process of impeachment and trial in the Congress.

Federal courts have jurisdiction over cases arising out of the Constitution; laws and treaties of the United States; maritime cases; issues involving foreign citizens or governments; and cases in which the federal government itself is a party. Ordinarily, federal courts do not hear cases arising out of the laws of individual states.

The Supreme Court today consists of a chief justice and eight associate justices. With minor exceptions, all its cases reach the Court on appeal from lower federal or state courts. Most of these cases involve disputes over the interpretation of laws and legislation. In this capacity, the Court's most important function consists of determining whether congressional legislation or executive action violates the Constitution. This power of judicial review is not specifically provided for by the Constitution; rather, it is the Court's interpretation of its Constitutional role as established in the landmark Marbury v. Madison case of 1803.

CHECKS AND BALANCES

When Americans talk about their three-part national government, they often refer to what they call its system of "checks and balances." This system works in many ways to keep serious mistakes from being made by one branch or another. Here are a few examples of checks and balances:

  • If Congress proposes a law that the president thinks is unwise, the president can veto it. That means the proposal does not become law. Congress can enact the law despite the president's views only if two-thirds of the members of both houses vote in favor of it.

  • If Congress passes a law which is then challenged in the courts as unconstitutional, the Supreme Court has the power to declare the law unconstitutional and therefore no longer in effect.

  • The president has the power to make treaties with other nations and to make all appointments to federal positions, including the position of Supreme Court justice. The Senate, however, must approve all treaties and confirm all appointments before they become official. In this way the Congress can prevent the president from making unwise appointments.

BILL OF RIGHTS

To all Americans, another basic foundation of their representative democracy is the Bill of Rights, adopted in 1791. This consists of 10 very short paragraphs which guarantee freedom and individual rights and forbid interference with the lives of individuals by the government. Each paragraph is an Amendment to the original Constitution.

In the Bill of Rights, Americans are guaranteed freedom of religion, of speech and of the press. They have the right to assemble in public places, to protest government actions and to demand change. They have the right to own weapons if they wish. Because of the Bill of Rights, neither police nor soldiers can stop and search a person without good reason. They also cannot search a person's home without legal permission from a court to do so.

The Bill of Rights guarantees Americans the right to a speedy trial if accused of a crime. The trial must be by a jury and the accused person must be allowed representation by a lawyer and must be able to call in witnesses to speak for him or her. Cruel and unusual punishment is forbidden.

There were 16 other amendments to the Constitution as of 1991. That is not many changes considering that the Constitution was written in 1787. Only a few need to be mentioned here. One forbids slavery and three others guarantee citizenship and full rights of citizenship to all people regardless of race. Another gives women the right to vote and another lowered the national voting age to 18 years.

POLITICAL PARTIES

There is one more very important part of the American political scene which is not part of any formal written document: the political party system.

Political parties are organized groups of people who share a set of ideas about how the United States should be governed and who work together to have members of their group elected in order to influence the governing of the country. When members of a political party form a majority in Congress, they have great powers to decide what kinds of laws will be passed. With exceptions, presidents tend to appoint members of their party or supporters of the views of their party to executive branch positions, including those of secretaries (heads of federal executive agencies) within the presidential cabinet.

The writers of the Constitution feared that parties representing narrow interests rather than the general interest of all the people could take over the government. They hoped the government would be run by qualified people who did not have a second loyalty—a loyalty to a party. They believed their government would work well without parties. Despite this, parties began to form shortly after the Constitution went into effect; parties proved to be an effective way within a system of checks and balances for people with similar views to band together to achieve national goals.

Today, the United States has two major political parties. One is the Democratic party, which evolved out of Thomas Jefferson's party, formed before 1800. The other is the Republican party, which was formed in the 1850s, by people in the states of the North and West, such as Abraham Lincoln, who wanted the government to prevent the expansion of slavery into new states then being admitted to the union.

Most Americans today consider the Democratic party the more liberal party. By that they mean that Democrats believe the federal government and the state governments should be active in providing social and economic programs for those who need them, such as the poor, the unemployed or students who need money to go to college. The Democrats earned that reputation in the 1930s when there was a worldwide economic depression. Under President Franklin D. Roosevelt's "New Deal" plan, Democrats set up government programs that provided paid employment for people building dams and roads and public buildings. The government under the Democratic party established many other programs, including Social Security, which ensures that those who are retired or disabled receive monthly payments from the government. Labor unions also received active support from the government, and from the Democratic party, in the New Deal era.

Republicans are not necessarily opposed to such programs. They believe, however, that many social programs are too costly to the taxpayers and that when taxes are raised to pay for such programs, everyone is hurt. They place more emphasis on private enterprise and often accuse the Democrats of making the government too expensive and of creating too many laws that harm individual initiative. For that reason, Americans tend to think of the Republican party as more conservative.

Both major parties have supporters among a wide variety of Americans and embrace a wide range of political viewpoints.

There is so much variety in both major parties that not all members of Congress or other elected officials who belong to the same party agree with each other on everything. There are conservative Democrats who tend to agree with many Republican ideas and liberal Republicans who often agree with Democratic ideas. These differences often show up in the way members of Congress vote on certain laws. Very frequently, there are both Democrats and Republicans who do not vote the way their party leaders suggest. They put their own views or the views of the people they represent ahead of the views of their party leaders.

Americans do not have to join a political party in order to vote or to be a candidate for a public office. However, running for office without the money and campaign workers a party can provide is difficult. Many voters become members of a party because they feel strongly about the party goals or want a voice in selecting its candidates. Whether or not they belong to a party, voters may cast ballots for any candidate they wish. Everyone votes in secret, and no one can know how another votes or force another person to vote for any particular program or candidate.

There are other, smaller parties in the United States besides the two major parties. None of these smaller parties has enough popular support to win a presidential election, but some are very strong in certain cities and states and can have their own state or city candidates elected or can determine which major party wins by supporting one or the other.

Many people from other nations are surprised to learn that among the political parties in the United States is a Communist party and other Marxist Socialist parties. They are surprised because the United States is seen by many as the leader of the nations opposed to communism. Most Americans do not like the ideas represented by the Communist party and distrust communism in general. The fact that the party exists, seeks to attract supporters and participates freely in elections, however, is considered evidence that there are no exceptions to the freedoms and rights guaranteed in the Bill of Rights.

One concern many Americans have about their political system is the high cost of campaigning for public office. These costs have risen sharply in recent years, in part because most candidates, in order to reach a large number of voters, buy advertising time on television. In 1990, the average winning candidate for election to the House of Representatives spent $406,000—more than four times the average spent in 1976. People worry that the high cost of getting elected may force candidates to spend more time raising money than dealing with important issues and may discourage many qualified people from running for public office. They are also concerned because much of the money to fund political campaigns comes from organized interest groups rather than individuals. Many Americans question whether, after election, these officials will feel more beholden to the groups which gave them money than to the people they represent.

The concerns of the public—and of elected office-holders themselves—have started a movement to change the financing of elections. Some people advocate voluntary spending limits. Others want the government to set limits. It's uncertain exactly what changes will be made, but public concern is so great that reforms in political campaign spending are bound to come soon.

The emphasis on freedom, rights and equality has created in citizens of the United States strong feelings of independence, self-worth and even resistance to discipline, as well as a belief that people should be able to do what they want without interference so long as they don't interfere with the rights of others. These feelings and beliefs have brought about many social and political changes in the United States.

Some of the changes may not seem significant, but they tell something about the democratic American character. One example: In the early 1800s, President Thomas Jefferson began shaking hands with people he met, no matter who they were. He did this because he believed the old European custom of bowing was undemocratic. Americans have been shaking hands as a way of greeting ever since.

There have been many highly important changes brought about in citizens' lives because of Americans' demands for a better life. For example, in the 1930s, Congress passed laws that increased pay, decreased working hours and improved working conditions for workers in factories, in mines and on railroads. These laws recognized the rights of workers to form and be represented by independent unions. The laws resulted from protests by workers against what they considered unjust treatment by employers and the system of courts.

For most Americans, for most of the time, life is peaceful. They do their jobs and enjoy their homes and families, though they remain interested in public issues. They keep up with news of what the president or Congress is doing. Some may at times write letters to congressmen or to newspapers expressing their views. They might discuss taxes or government activities with friends and family; others will become actively involved in local political debates or in supporting candidates for political office. Unless something unusual is taking place, however, they do no more than that. They quietly let their democratic system work, confident that their freedoms are protected.

Suggestions for Further Reading

Acheson, Patricia C. Our Federal Government: How It Works. 4th ed. New York: Dodd, Mead, 1984.

Burns, James M., J.W. Peltason and Thomas E. Cronin. Government by the People: National, State, Local. 12th ed. Englewood Cliffs, N.J.: Prentice-Hall, 1984.

Irwin, Wallace, Jr. America in the World: A Guide to U.S. Foreign Policy. New York: Foreign Policy Association, 1983.

Rossiter, C., ed. The Federalist Papers. New York: New American Library, 1961.

U.S. Congress. House. Committee on the Judiciary. How Our Laws Are Made. Prepared by Edward F. Willett, Jr. Washington: U.S. Government Printing Office, 1980.

U.S. Congress. House. Our American Government. What Is It? How Does It Function? 150 Questions and Answers. Washington: U.S. Government Printing Office, 1981.

Wilson, James Q. American Government: Institutions and Policies. 3rd ed. Lexington, MA: D.C. Heath, 1986.

GOING TO SCHOOL IN AMERICA TODAY

Each fall almost 50 million young Americans walk through the doorways of about 100,000 elementary and secondary schools for the start of a new school year. Filling classrooms from kindergarten to the 12th grade, they attend classes for an average of five hours a day, five days a week, until the beginning of the following summer.

These students are part of one of the most ambitious undertakings in the history of education: the American effort to educate an entire national population. The goal is—and has been since the early decades of the republic—to achieve universal literacy and to provide individuals with the knowledge and skills necessary to promote both their own welfare and that of the general public. Though this goal has not yet been fully achieved, it remains an ideal toward which the American educational system is directed. The progress which has been made is notable both for its scope and for the educational methods which have been developed in the process of achieving it.

About 85 percent of American students attend public schools (schools supported by American taxpayers). The other 15 percent attend private schools, for which their families choose to pay special attendance fees. Four out of five private schools in the United States are run by churches, synagogues or other religious groups. In such schools, religious teachings are a part of the curriculum, which also includes the traditional academic courses of reading, mathematics, history, geography and science. (Religious instruction is not given in public schools.)

The combined expenses of both education systems, public and private, exceed $190 billion a year. From that point of view, American education is a powerful consumer. Who decides how many of these billions of dollars should be used annually for teachers' salaries, new computers or extra books? Private schools that meet state standards use the fees they collect as they think best. But where public taxes are involved, spending is guided by boards of education (policymakers for schools) at the state and/or district level. The same thing is true for decisions about the school curriculum, teacher standards and certification, and the overall measurement of student progress.

EDUCATION—A LOCAL MATTER

From Hawaii to Delaware, from Alaska to Louisiana, each of the 50 states in the United States has its own laws regulating education. From state to state, some laws are similar; others are not. For example:

  • All states require young people to attend school. (The age limits vary: 32 states require attendance to age 16; eight require it to age 18; and so on.) Thus, every child in America is guaranteed up to 13 years of education. This is true regardless of a child's race, religion, sex, learning problems, physical handicap or inability to speak English.

  • Some states play a strong role in the selection of learning material for their students. For example, state committees may decide which publishers' textbooks can be purchased with state funds. In other cases, decisions about buying instructional material are left entirely to local school officials.

Americans have a strong tendency to educate their children about major public concerns—problems such as environmental pollution, nuclear issues, neighborhood crime and drugs. Responding to public pressure, boards of education in different areas often add courses on various relevant issues to the elementary and secondary school curriculum.

WHAT AN AMERICAN STUDENT LEARNS

American students pass through several levels of schooling—and thus, several curricula—on their way to a high school diploma. They attend:

• Elementary School. In statistical reports published by the federal government, "elementary school" usually means grades kindergarten (K) through 8. But in some places, the elementary school includes only grades K-6. And sometimes grades 4, 5 and 6 make up what is called a "middle grade" school. (Many Americans refer to the elementary grades as "grammar school.")

• Secondary School. Again, in statistical reports, "secondary school" generally means grades 9-12. These grades are popularly called "high school." However, in many districts, "junior high school" includes grades 7-9. And when grades 7-9 are included with the 10th, 11th and 12th grades, all six are said to form a "senior high school."

Although there is no national curriculum in the United States, certain subjects are taught in all K to 12 systems across the country.

Almost every elementary school provides instruction in these subjects: mathematics; language arts (a subject that includes reading, grammar, composition and literature); penmanship; science; social studies (a subject that includes history, geography, citizenship and economics); music; art; and physical education. In many elementary schools, courses in the use of computers have been introduced. And in some cases, a second language (other than English) is offered in the upper elementary grades.

Most secondary schools offer the same "core" of required subjects: English, mathematics, science, social studies and physical education. But school boards differ greatly from one district to another in the amount of class time they want high school students to spend on these basic subjects. In some high schools, for example, students must complete three years of mathematics before graduation. The national average is lower.

Students are guided by school counselors in choosing electives, which can range from specialized academic to vocational subjects. For example, high schools offer more than one year—in most cases, several years—of math, science and the other core subjects. After they complete the required units in these core areas (for example, one year of American history), students can take additional units as electives (perhaps a year of European history and a year of world political issues).

Other elective courses vary from school to school. Some high schools specialize in particular types of subjects—business education, or industrial trades, or foreign languages, for example. A student planning to be a physician would want to attend a school offering many electives in science.

A CRACK IN THE SYSTEM?

By the early 1980s, the most popular electives were physical education, music performance, remedial (basic) English grammar and composition, driver education, health, "shop" (construction and repair of tools and machinery), marriage training and home economics (home care).

The trend in electives was clearly not toward academic subjects. This was the issue Americans debated with some concern in the early 1980s. The opportunity for elective courses in high school satisfies some ideals that are very important to Americans:

  • The opportunity to get an education that prepares a person for his or her life's work—whether in computer science, office work, agriculture or a trade.

  • The opportunity to pursue and study one's own interests—whether child development, political science or speaking a foreign language.

  • The opportunity to discover one's own talents and perfect them—whether in music, creative writing or ceramics.

The vision of school as the place for satisfying such goals is not a new one, but, until the 1950s, school boards made most decisions about which curricula would best prepare students for a productive life after high school. The trend of the 1960s and 1970s was to offer more and more choices to students. By the 1980s, American parents and educators were taking a second look at this practice. One reason for this concern was that allowing more free choice to students seemed linked to another trend that had also emerged in the previous two decades—the slow but steady decline of American students' average scores in standardized tests of mathematics, reading and science.

There was no mistaking the evidence. Nationwide testing services used at different grade levels and college entrance examinations demonstrated the drop in student scores. College administrators and business executives began to complain that high school graduates needed remedial courses in reading, mathematics and composition. About 99 percent of adult Americans reported in the 1980 census that they could read and write. But critics claimed that about 13 percent of America's 17-year-olds were "functionally illiterate." That is, they were unable to deal successfully with everyday demands such as reading printed instructions, filling out a job application, etc.

This was gloomy news. In the American mind, schools are a guarantee that the next generation will be informed, self-reliant citizens. Was the system failing some children? Every possible cause for the decline in average scores was examined and written about in the newspapers in the early 1980s. Publishers were blamed for producing textbooks that were too easy. The makers of standardized tests were criticized for using poor questions. Television was blamed for the effect of its uninspiring programs. (In a recent year, it was estimated that Americans between the ages of 6 and 19 years watched television for an average time of 25 hours a week.) School boards were criticized for not paying teachers enough to keep them in the field of education. And parents were accused of not making sure that their children did their homework.

It was easy, at the moment, to overlook how much the American education system had accomplished since its origin, 350 years earlier.

THE PURITANS AND EDUCATION

Americans trace the origins of their nation to the English colonists (settlers) who came to the eastern coast of North America in the early 17th century. The largest group of these first colonists, the Puritans, founded the Massachusetts Bay Colony in 1630. Like others who followed them to America, the Puritans sought the freedom to practice their religion—a freedom they could not enjoy in their native country. They found this freedom in the small towns and villages they built on the edge of the forest in Massachusetts.

One of the things the Puritans believed was that every person should be able to read the Bible. One hundred percent literacy seemed like a dream in the 17th century. Within just a few years after their arrival, they took steps to set up a system of education in their colony:

  • In 1634, they opened a "Latin grammar" school, a school for those who wanted to prepare for college.

  • In 1636, Harvard College was founded for the training of religious ministers.

  • In 1634 and 1638, the Puritans passed laws declaring that all property could be taxed for the common good, which included the support of schools.

  • In 1642 and 1647, the Bay Colony passed laws requiring all parents to provide reading education for their children.

Thus, in less than 20 years, the Puritans introduced two practices that still influence American youth: compulsory education for all children, and public taxation for schools. The situation was different in other British colonies in North America. In Pennsylvania, for example, where there were several different religious groups, decisions about education were left to the leaders of each church. In southern colonies such as Virginia, those who could afford tutors hired them for their sons (and sometimes for their daughters). The older sons of wealthy landowners were sent to England for their education. Occasionally, a landowner might allow a literate adult to teach reading to the children of poor whites and, perhaps, a few blacks. But mostly, custom forbade the teaching of children of slaves to read.

Throughout the colonies, young men and women could receive an education in reading by becoming an apprentice in a small business. It had been a practice in England to have young boys and girls live with the families of those for whom they worked (bakers, printers, etc.). In return for a youth's work, the business owner promised to teach him or her to read, as well as how to do a craft (bake or print, for example). This practice was brought to North America.

EDUCATION IN A NEW NATION

During the 17th and 18th centuries, the English continued to develop new settlements along the eastern seaboard of the continent. (Swedish, Dutch, German, and other European immigrants also settled in these colonies.) Each colony developed its own economy, its own style of local government and its own religious and cultural traditions. Most religious groups were Protestant.

On July 4, 1776, the 13 colonies issued a Declaration of Independence, and went to war for their freedom from England. They won the war for independence in 1781, and negotiated a favorable treaty in 1783. But it took until 1789 for them to shape a unified national government. The shape and power of this new government, described in the Constitution, were determined after many debates and compromises. The new United States was to be a federal republic—a union of states with a strong central government representing all the people.

The states did not easily give up their own political powers to this new central government. In fact, the 10th Amendment was added to the Constitution to guarantee that "powers not delegated to the United States by the Constitution...[would be] reserved to the [governments of the] States..." One of these reserved powers was the right of each state to provide for the education of its people.

Actually, at the end of the 18th century, elementary education throughout the United States was in local hands. State governments were allowing local districts (small towns and villages) to set up and run their own elementary schools. Most often, these schools were in one-room buildings, with one teacher for all the students who could attend. The teacher, who was hired by a committee of citizens, had to teach what the local community expected.

What kind of education did Americans want for their children in 1800? At the very least, they wanted each child to learn to write his or her name, to do simple arithmetic, to learn the local rules of conduct. Most of all, they wanted their children to learn to read.

The first colonists had believed that literacy was important to the preservation of religious freedom. Americans in the early 1800s also believed that the ability to read was important to preserving a democratic republic. Thomas Jefferson, third president of the United States, argued that Americans should be given an excellent education. He felt that this was the only way to guarantee "the preservation... of liberty."

Several leaders in Jefferson's time urged the formation of a national system of education with uniform standards for schools in all the states. But none of these plans was ever tried. Americans wanted the best education for their children, but they feared the effects of giving its direction to the national government.

EARLY CHALLENGES, EARLY REFORMS

And so the future was decided: Education in the United States was to remain in the hands of state and local governments. But while the national government had no role to play in shaping American education, many national events did.

The 19th century was a time of great change within the United States:

  • Both the population and the land area of the nation grew rapidly after 1800. By 1850, there were 31 states in the nation and more than 23 million people. New states were added to the American nation as its people pushed westward.

  • After the middle of the 19th century, steel, railroad and other giant industries increased rapidly in America. This industrialization led to the growth of large cities—especially in northern states.

  • Immigration to the United States swelled to almost two million in the 1840s, and continued to increase for the rest of the century. (Between 1820 and 1920, about 33 million people immigrated to America.)

  • After 1840, the immigrant population included the first large numbers of Catholics; after 1870, many of America's immigrants did not speak English. Tensions often grew out of the cultural differences between newcomers and those who had lived in America for a longer time.

  • Between 1861 and 1865, a tragic civil war tore America apart. Northern (Union) and Southern (Confederate) armies fought one another at a terrible cost of lives and property. One of the outcomes of this war was the end of slavery.

All of these events and changes placed enormous pressure on the practice of education in America. From the 1830s, reformers such as Horace Mann (in Massachusetts) had been trying to improve the quality of schools in each state. The reformers wanted especially to do three things: (1) to train better teachers; (2) to shift some decision-making from school districts back to the state government; and (3) to increase student enrollment in elementary ("common") schools. The combined pressures of population moves, industrialization, urbanization and cultural tensions occasionally stalled these reform efforts. Still, there were certain ideas that refused to die in the young and changing nation. As each new group of immigrants arrived on American soil, they caught the spark of these ideas and made them their own. The Declaration of Independence, read aloud at every Fourth of July celebration, reminded Americans: "We hold these Truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights that among these are Life, Liberty, and the Pursuit of Happiness."

Increasingly, Americans believed that a basic education was an unalienable right, too. Because of prejudice, some groups were forced to wait longer than others for the exercise of this right: blacks, Native Americans (Indians), young women of every race. Nevertheless, improvements in educational opportunity continued to be made—in both the 19th and the 20th centuries:

  • Between the 1820s and 1860s, public high schools were opened in city after city throughout the United States. Until the 20th century, the purpose of these schools was to offer Latin and other academic courses to students preparing for college.

  • In 1852, Massachusetts passed the first compulsory education law in the new nation. It affected young people six to 16 years old. Over the next few decades, all other states in the country passed similar laws.

  • In 1917, the Congress passed the Smith-Hughes Act to provide federal money to high schools that devoted at least half their time to vocational education. This act gave a big boost to courses in specific trades and industries.

(It also showed one of the ways the federal government would influence education in the states—through funding.)

CURRICULUM IN THE 20TH CENTURY

By the early 20th century, compulsory education was in place in the United States. The old dream of universal literacy seemed about to come true. More than that, the schools were becoming centers for the Americanization of immigrants from all over the world. But there was still one major issue under debate— the matter of school curricula.

Some reformers in the early 1900s wanted secondary schools to offer "modern" subjects such as history, science and English composition. They also urged that high schools offer elective courses. Philosopher John Dewey and educator Francis Wayland Parker went beyond suggestions about courses. Dewey and Parker wanted to see changes in the methods of education and even in its goals. They felt, for example, that American schools should help develop the special abilities of each child and that subject matter should be adjusted to the child's innate capacities. They disliked formalized teaching methods, arguing that each school should be a "little community" in which students were filled with a "spirit of service" toward the interests of the whole group. Dewey's suggestions about self-directed training and about dealing with each child's abilities were widely adopted. These methods were practiced in the laboratory school at the University of Chicago between 1896 and 1904 and became known as "Progressive Education." Individualized instruction within a classroom is still an important idea in American education, and "learning by doing" remains a highly respected approach to child development.

In 1918, a committee of the National Education Association, a teachers' organization, reported seven new goals for secondary education. The report urged schools to educate students in (1) health, (2) the command of fundamental skills, (3) family living, (4) vocational skills, (5) citizenship, (6) the proper use of leisure time and (7) ethical (moral) conduct. School boards across the nation responded eagerly to these suggestions. New courses were added to the curriculum: history, geography, science and English. Sports and other after-school activities became an unofficial part of the school curriculum. Teachers were now being asked to prepare children for every aspect of life.

LEARNING TO BE WORLD CITIZENS

After 1920, K-12 education in America remained very much the same until World War II. That tragic event introduced changes that affected every institution in America—including the schools. American parents—especially young couples who married in the late 1940s—wanted their children to be educated for the post-war world. At the same time, American blacks and other minority groups demanded educational opportunities equal to those of whites. In 1954, the Supreme Court ruled that the practice of segregating blacks into separate schools was unconstitutional. By 1955, the United States was a nuclear power, a member of the United Nations, and a competitor with the Soviet Union for world influence. American jobs were changed by new technology (especially by the computer), and American businesses spread around the globe. Television brought the faces of presidents, entertainers and people from all over the world into America's homes each evening. Discoveries by scientists opened new secrets of the stars and of the atom. Between 1950 and 1960, more new knowledge was developed than in all of the world's history before 1950.

Schools were asked not only to teach this new information, but to help students ask their own questions about it. The "inquiry" method of learning, focusing on solving problems rather than memorizing facts, became popular. More science courses were added to the curriculum, some as a result of the orbiting of the first man-made satellite, Sputnik, by the Soviet Union in 1957. The federal government began to spend millions for the development of new science curricula and for training teachers to use them. (Federal spending would spread to other fields, too, especially for programs to aid students with learning difficulties. By the early 1980s, the federal government was spending about eight to 10 thousand million dollars annually on elementary and secondary education.)

But a good secondary education was no longer enough for many Americans. In one school district after another, parents insisted on high school programs that would prepare their sons and daughters for admission to a university. More and more Americans viewed the university as the doorway to a medical or law degree, a position in government, or a management position in a major business office.

TOWARD THE 21ST CENTURY

The late 1960s and early 1970s—years of the American involvement in the Vietnam War—were difficult for all Americans. Drug abuse became a problem, even for teenagers. Despite the rapid development of new, well-equipped schools, students began to "drop out" of (leave) high school before graduation. Average test scores declined.

Were young people disillusioned by the war? Were they confused by conflicting values in the world they saw every day on TV? In an effort to help students deal with their problems, schools added more "attractive" courses and increased their counseling services. But leading educators across the country claimed that American schools were drifting away from their chief task: to stimulate, challenge and educate students.

The Department of Education, established in 1979 by consolidating smaller federal elements, sponsored the National Commission on Excellence in Education to examine the question. In 1983, the Commission made several recommendations: lengthen the school day and year; establish a new core curriculum for all students (four years of English; three years each of math, science and social studies; one-half year of computer science); and raise the standards of performance in each subject.

Once more, across the nation, school boards responded. Course requirements were tightened.

As a result of the concern for excellence, test scores for American children are once again on the rise and many schools are returning to basic educational principles.

Schools also face new challenges. They must deal with a large influx of immigrant children, many of whom do not speak adequate English. They must meet community demands that the curriculum reflect the culture of the children attending the school. They must make sure that students develop basic job skills and, in many cases, they must meet the diverse needs of non-traditional students, such as teenaged mothers.

Schools are meeting these challenges in ways that reflect the diversity of the U.S. educational system. They are training large numbers of teachers of English as a second language and, in some communities, setting up bilingual schools. Many communities are working with teachers and school administrators to move the school curriculum away from the traditional European-centered approach and to include more African, Asian and other studies. They believe that will help raise the self-esteem of children whose ancestors came from places other than Europe and that it will also teach children of European ancestors an appreciation of other cultures. And since schools are, for many people, their only point of contact with the government, some communities have started "one-stop-shopping" schools, which offer counseling, child care, health services and other social services. Schools are also working to teach "thinking skills" to the 70 percent of U.S. students who do not go on to higher education.

These students, in order to advance beyond entry-level jobs, need to be able to cope with today's technologically advanced workplace. In the words of a recent report by the Commission on Achieving Necessary Skills:

"A strong back, the willingness to work and a high school diploma were once all that was necessary to make a start in America. They are no longer. A well developed mind, a continued willingness to learn and the ability to put knowledge to work are the new keys to the future of our young people, the success of our business and the economic well-being of the nation."

In 1989, President Bush and the governors of all 50 states set six basic educational goals to be achieved by the end of the century. They are:

  • That all children will start school ready to learn

  • That 90 percent of all high school students will graduate

  • That all students will achieve competence in core subjects at certain key points in their education careers

  • That American students will be first in the world in math and science achievement

  • That every adult American will be literate and have the skills to function as a citizen and as a worker

  • That all schools will be free of drugs and violence and offer a disciplined environment conducive to learning.

Two years later, in 1991, the government issued a "report card," assessing progress toward those goals. The report found that progress had been made toward most of those goals, but that everyone—schools, students and the government—would have to work very hard in order to meet all of the goals by the end of the century.

Suggestions for Further Reading

Cremin, Lawrence A. American Education: The National Experience, 1783-1876. New York: Harper and Row, 1982.

Cremin, Lawrence A. The Transformation of the School: Progressivism in American Education, 1876-1957. New York: Random House, 1964.

Johnson, James A., and others, eds. Foundations of American Education: Readings. 5th ed. Newton, MA: Allyn & Bacon, 1981.

The National Commission on Excellence in Education. A Nation at Risk: The Imperative for Educational Reform. Washington: Superintendent of Documents, U.S. Government Printing Office, 1983.

Rippa, S. Alexander. Education in a Free Society: An American History. 3rd ed. New York: David McKay Co., 1976.

Higher education

Out of more than three million students who graduate from high school each year, about one million go on for "higher education." Simply by being admitted into one of the most respected universities in the United States, a high school graduate achieves a degree of success. A college at a leading university might receive applications from two percent of these high school graduates, and then accept only one out of every ten who apply. Successful applicants at such colleges are usually chosen on the basis of (a) their high school records; (b) recommendations from their high school teachers; (c) the impression they make during interviews at the university; and (d) their scores on the Scholastic Aptitude Tests (SATs).

The system of higher education in the United States is complex. It comprises four categories of institutions: (1) the university, which may contain (a) several colleges for undergraduate students seeking a bachelor's (four-year) degree and (b) one or more graduate schools for those continuing in specialized studies beyond the bachelor's degree to obtain a master's or a doctoral degree; (2) the four-year undergraduate institution—the college—most of which are not part of a university; (3) the technical training institution, at which high school graduates may take courses ranging from six months to four years in duration and learn a wide variety of technical skills, from hair styling through business accounting to computer programming; and (4) the two-year, or community, college, from which students may enter many professions or may transfer to four-year colleges or universities.

Any of these institutions, in any category, might be either public or private, depending on the source of its funding. There is no clear or inevitable distinction in quality of education between publicly and privately funded institutions. However, this is not to say that all institutions enjoy equal prestige or that there are no material differences among them.

Many universities and colleges, both public and private, have gained reputations for offering particularly challenging courses and for providing their students with a higher quality of education. The great majority are generally regarded as quite satisfactory. A few other institutions, conversely, provide only adequate education, and students attend classes, pass examinations and graduate as merely competent, but not outstanding, scholars and professionals. The factors determining whether an institution is one of the best or one of lower prestige are quality of teaching faculty; quality of research facilities; amount of funding available for libraries, special programs, etc.; and the competence and number of applicants for admission, i.e., how selective the institution can be in choosing its students. All of these factors reinforce one another.

In the United States it is generally recognized that there are more and less desirable institutions in which to study and from which to graduate. The more desirable institutions are generally—but not always—more costly to attend, and having graduated from one of them may bring distinct advantages as the individual seeks employment opportunities and social mobility within the society. Competition to get into such a college prompts a million secondary school students to take the SATs every year. But recently, emphasis on admissions examinations has been widely criticized in the United States because the examinations tend to measure only competence in mathematics and English. In defense of using the examinations as criteria for admissions, administrators at many universities say that the SATs provide a fair way for deciding whom to admit when they have 10 or 12 applicants for every first-year student seat.

WHY AMERICANS GO TO COLLEGE

The United States leads all industrial nations in the proportion of its young men and women who receive higher education. Why is this? What motivates a middle-income family with two children to take loans for up to $120,000 so that their son and daughter can attend private universities for four years? Why would both parents in a low-income family take jobs to support their three children at a state university—each at an annual cost of $4,000? Why should a woman in her forties quit her job and use her savings to enroll for the college education she did not receive when she was younger?

Americans place a high value on higher education. This is an attitude that goes back to the country's oldest political traditions. People in the United States have always believed that education is necessary for maintaining a democratic government. They believe that it prepares the individual for informed, intelligent political participation, including voting.

Before World War II, a high school education seemed adequate for satisfying most people's needs, but the post-war period produced dozens of complex new questions for Americans, including issues such as use of atomic power, experiments in splitting genes, space programs and foreign aid. Americans rarely express a direct vote on such complex matters, but the representatives they elect do decide such issues. In recent years, as a result, many Americans have begun to regard a college education as necessary to deal with such questions as an informed American voter.

In addition to idealistic reasons for going to college, however, most Americans are concerned with earning a good (or better) income. For some careers—law, medicine, education, engineering—a college education is a necessary first step. Some careers do not require going to college, but many young Americans believe that having a degree will help them obtain a higher salary on their first job. Today, that first job is likely to involve handling information: More than 60 percent of Americans now work as teachers, computer programmers, secretaries, lawyers, bankers, and in other jobs involving the discovery, exchange and use of data (facts). A high school diploma is not sufficient preparation for most such employment.

SELECTING A COLLEGE OR UNIVERSITY

In addition to learning about a school's entrance requirements (and its fees), Americans have a lot of questions to think about when they choose a university or college. They need to know:

• What degrees does the school offer? How long does it take to earn one? At the undergraduate (college) level, a four-year "liberal arts" course of study is traditionally offered which leads to a bachelor of arts (B.A.) degree in such subjects as history, languages and philosophy. (The term "liberal arts" comes from artes liberales, a Latin expression for free, or human, arts and skills. In the time of the Roman Empire, these were skills and arts that only a free person—not a slave—could acquire.) Many liberal arts colleges also offer a bachelor of science (B.S.) degree in physics, chemistry or other scientific subjects. A technical training institution offers courses of varying length in such fields as agriculture or business skills, and community college studies last two years.

Graduate schools in America award master's and doctor's degrees in both the arts and sciences. (The term "doctor" comes from the Latin word docere, meaning "to teach.") The courses for most graduate degrees can be completed in two to four years. But if a graduate program requires original research, a student could spend many additional months or even years in a university library or laboratory.

  • What curricula does a college or university offer? What are the requirements for earning a degree? In an American university, each college and graduate school has its own curriculum. At the undergraduate level, there may be some courses that every student has to take (for example, classes in world history, math, writing or research). But students do select their "major" (the field in which they want their degree), plus a specific number of "electives" (courses that are not required but that students may choose). The National Institute of Education, a government agency, reports that a total of more than 1,000 majors are offered in America's colleges and universities. The combined electives available in these schools probably amount to a number in the tens of thousands.

Typically, an undergraduate student has to earn a certain number of "credits" (about 120) in order to receive a degree at the end of four years of college. Credits are earned by attending lectures (or lab classes) and by successfully completing assignments and examinations. One credit usually equals one hour of class per week in a single course. A three-credit course in biology could involve two hours of lectures plus one hour in a science lab, every week. A course may last 10 to 16 weeks—the length of a "semester."

  • Is the college or university a public institution (operated by a state or local government) or a private one? If it is private, is it a religious school? The United States does not have a national (federal) school system, but each of the 50 states operates its own university, and so do some large city governments. (The government does grant degrees in the schools it operates for professional members of the armed services— for example, the United States Naval Academy at Annapolis, Maryland.)

About 25 percent of all schools of higher education in the United States are privately operated by religious organizations. Most are open to students of different faiths, but in some religious schools all students are required to attend religious services. There are also privately owned schools with no religious connection.

Both public and private colleges depend on three sources of income: student tuitions, endowments (gifts made by wealthy benefactors) and government funding. Some endowments are very large: Harvard, Princeton and Yale Universities have more than a thousand million dollars each. Public institutions receive a larger portion of public tax monies than do private schools.

  • How large is the school? There are many small American colleges—some with fewer than 100 students. But the larger universities tend to keep attracting larger numbers of enrollments. By the mid-1980s, at least seven universities had total enrollments of over 100,000 each. (One of the seven, the State University of New York, has more than 60 campuses in different parts of the state.)

Why do the large universities flourish? Until recent years, a major answer to this question was: They offer the best libraries and facilities for scientific research. Access to a "mainframe" (very large) computer and to modern laboratories attracts leading scientists to the faculties of such schools. And students enroll to study with the experts. Research programs continue to be important to the reputation of America's universities. But in recent years, the percentage of advanced degrees awarded in the "pure" (research) sciences has declined. The same has been true for the liberal arts. Students continue to seek out the largest, most respected universities—but for new and different programs.

TRENDS IN DEGREE PROGRAMS

During the 1970s and 1980s, there was a trend away from the traditional liberal arts. Instead, students were choosing major fields that prepared them for specific jobs. In 1987, 56 percent of the four-year bachelor's degrees were conferred in business and management, computer and information science, education, engineering, health professions, and public affairs. Only 13 percent of the degrees were conferred in the traditional arts and sciences.

But some observers believe this trend toward pre-professionalism may be ending and that students are switching back to traditional areas of study. They cite the fact that bachelor's degrees in mathematics had risen since the low point they had reached in 1981. Bachelor's degrees in English and literature, foreign languages, history, and physics also saw an upswing.

In many ways, this new popularity of the liberal arts is a return to the early traditions of American education.

TRADITIONS IN EDUCATION

When the colonies that eventually became the United States of America were settled in the 1600s, the world already had some very old universities. The University of Al-Azhar in Cairo was then more than 600 years old. Italy had had its University of Bologna for centuries. Oxford and Cambridge in England and the University of Paris were founded in the 12th century.

European colleges were an offspring of universities. The first colleges were opened in Paris in the 15th century as residence halls for university students. Usually, all the students in one residence studied the same subject (for example, law, medicine or theology). The word "college" gradually came to mean a place for studying a specific subject, and thus colleges developed into schools.

Both institutions—colleges and universities—were an important part of England when its people began to migrate to North America. Within the first 25 years of the Massachusetts Bay Colony, graduates from Oxford and Cambridge joined its founders. It was natural for these early colonists to set up the same institutions in America that they had known in their native country. And—since many colonists came to America for religious freedom—it is not surprising that their first colleges trained young men to be ministers. Harvard College was founded for this purpose in 1636; so were William and Mary College (Virginia) in 1693, and Yale (Connecticut) in 1701. By the time the colonies won their independence from Britain in 1783, six more colleges had been added: Princeton (New Jersey), Pennsylvania, Columbia (New York), Brown (Rhode Island), Rutgers (New Jersey) and Dartmouth (New Hampshire). All are active, respected universities today.

NEW WORLD COLLEGES

The colonies prospered in the 18th century. Men and women who left England and other European countries as poor people became landowners and traders on American shores. In Europe, college was regarded as a place for the elite (members of the wealthy "upper class"), but in early colonial America no rigid traditions of class existed. So those who could afford it sent their sons (but not their daughters) to a colonial college. Not all these sons, however, went on to the religious ministry. By the middle of the 18th century, only half the graduates of American colleges were becoming ministers. The other 50 percent usually chose careers as lawyers, doctors and businessmen.

What did colonial colleges teach? As in Europe, Latin and Greek were basic subjects. So were philosophy and the study of religion. But, responding to the interest of the "new student" in the New World (as America was then called), colleges introduced "modern" subjects, too. Students read and discussed the new political ideas of England's John Locke and France's Montesquieu. They were given a taste of geography. A few colonial colleges even offered courses in the so-called "practical" subjects—surveying, navigation, husbandry (farming), commerce (trade) and government.

But the basic goals and methods of 18th-century academic education did not change in colonial colleges. These colleges still followed the models set down by Oxford and Cambridge: They were dedicated to forming their students' characters, to handing down the knowledge of previous generations. They did not offer to lead their students in doing fresh research or in adding new ideas to what the world already knew. Even after the independence of the United States in 1783, this model of higher education would continue in the United States for most of the next century.

DEMOCRACY AND EDUCATION

By the time of George Washington's inauguration as the first president in 1789, several very powerful ideas had worked their way into American thinking. Inspiring documents had attended the birth of the new nation: Thomas Paine's pamphlet, "Common Sense," the Declaration of Independence, the Constitution of the United States and the Federalist Papers (essays in which the new Constitution was discussed). Reading and debating the contents of these works was an education in itself. Americans became deeply conscious of the principles of democracy and of the proper role of government in a republic.

The two principles of excellence in education and popular control of government were sometimes difficult to keep in balance. For example, when the founders of the new nation urged more education for all citizens, Americans applauded the idea. But when Washington and his first five successors proposed opening a national university in the nation's capital, the Congress said no. The people's representatives feared giving too much power to the new central government. Decisions about education, they decided, should continue to be made by each state and locality.

THE 19TH CENTURY

The 19th century hit the United States like a series of strong, gusting winds. If these winds had a common direction, it was westward: Millions of Europeans sailed west across the Atlantic to live in the new nation. And millions of these newcomers joined the descendants of earlier immigrants in a westward trek across the North American continent. As pioneers, they planted homes, farms, towns and colleges as they moved toward the Pacific Ocean. Most of these new colleges were poor, but they accepted almost everyone who had the time and interest to apply. And with this development, a crack appeared in the European model.

Another crack appeared with the admission of women into college. The first three women to receive their B.A.'s from an American school graduated from Oberlin College, Ohio, in 1841. But Oberlin—which had admitted all applicants regardless of race or sex since the 1830s—was an exception. Most colleges in the first half of the 19th century refused women applicants. It was also considered improper for women to attend the same class as men. (Even in Oberlin, women were not allowed to attend an evening class demonstrating the use of telescopes for observing stars.) These attitudes changed slowly. Vassar (New York), the first American college founded for women, did not open until 1865. Wellesley, Smith (both in Massachusetts) and a few others followed within the next 35 years.

The most unusual change in American higher education may have begun with an unusual law—the Land Grant College Act of 1862. Under this law, the federal government gave huge tracts of public land to the states for the development of agricultural and technical colleges. The states sold the land and used the money to build these colleges. The Land Grant Act marked the beginning of federal influence on higher education—an influence based on financial aid. It also was the beginning of another trend: Land grant colleges became deeply involved in researching new methods of scientific farming.

In 1869 Harvard's new president, Charles Eliot, reorganized his college—the nation's oldest—into a university. He raised Harvard's entrance requirements, added new courses (including electives), and toughened Harvard's standards for awarding degrees. A few years later, the Johns Hopkins University opened in Maryland, followed more than a decade later by the University of Chicago (Illinois) and Stanford University (California). These new research-oriented institutions introduced graduate school programs (a level of education European nations had had for some time). By the beginning of the 20th century, almost all the other characteristics of American higher education were in place:

  • A number of graduate and undergraduate schools began to specialize (focus on just one field of study). "Normal Schools," for example, were founded to prepare those who wanted to be teachers.

  • Many colleges and universities that had been operated by religious groups were now simply private—or even public—schools.

  • Most colleges and universities were coeducational (open to men and women). In the years following the end of slavery, black Americans, too, began to attend colleges and universities. (But it would take many more years to erase school segregation—the practice of educating blacks and whites in separate schools.)

Despite all these changes, however, higher education in the United States was still regarded as something for a sort of elite: the most talented, the wealthy, or at least those who could afford not to work full-time while they attended college or a university.

EDUCATION FOR ALL

In 1944 Congress passed the Servicemen's Readjustment Act, soon popularly called the "GI Bill of Rights." ("GI," at the time, was a nickname for the American soldier. The nickname came from an abbreviation for "Government Issue"—the uniforms and other articles "issued" to a soldier.) The Act promised financial aid, including aid for higher education, to members of the armed forces after the end of World War II.

The war ended in the following year. The prediction had been that 600,000 war veterans would apply for aid for education. By 1955, more than two million veterans of World War II and of the Korean War had used their GI Bill of Rights to go to college. Many of these veterans were from poor families. Thirty percent were married when they applied for college aid; 10 percent had children. More than a few had to work part-time while they took courses. It was difficult, but these veterans believed that a college degree (which they could not afford on their own) would improve their chances for a good job in the post-war economy. Some went to liberal arts colleges; others to technical and professional institutions. Their outstanding success in all these schools forced everyone connected with higher education to rethink its purpose and goals. Within just a few years, American veterans had changed the image of who should go to college.

In post-war America, other groups sought their place on America's campuses, too. The enrollment of women in higher education began to increase. Racial segregation in elementary and secondary education ended, and thus blacks achieved an equal opportunity to get into any college of their choice.

By the 1960s, some colleges introduced special plans and programs to equalize educational opportunities—at every level, for all groups. Some of these plans were called "affirmative action programs." Their goal was to make up for past inequality by giving special preference to members of minorities seeking jobs or admission to college. (In the United States, the term "minority" has two meanings, often related: (a) A minority is any ethnic or racial group that makes up a small percentage of the total population; (b) The term also suggests a group that is not the dominant political power.) Some colleges, for example, sponsored programs to help minority students prepare for college while still in high school.

By the 1970s, the United States government stood firmly behind such goals. It required colleges and universities receiving public funds to practice some form of affirmative action. But when colleges began to set quotas (fixed numbers) of minority students to be admitted, many Americans (including minority citizens) protested. They felt that this was another form of discrimination.

As with most (but not all) problems in American public life, the conflict was resolved by change and compromise. Colleges continued to serve the goal of affirmative action—but in less controversial ways. One large university, for example, announced a new policy: It would seek to admit students who would add diverse talents to the student body. It thus dealt with all applicants—minorities included—on a basis that was not restricted to high school performance and entrance tests, but which took into account the talents, voluntary activities and "life experience" of the student.

What success did these efforts have? American college students are an increasingly diverse group. In 1987, 54 percent were women. Women received 51 percent of the bachelor's and master's degrees awarded that year, and 35 percent of the doctorates and professional degrees. But not all groups are doing so well.

Although 59 percent of the students who graduated from high school in 1988 enrolled in college that same year, only 45 percent of the African-American high school graduates went on to college. Educators and others are working to increase that percentage.

U.S. colleges and universities are also enrolling a higher percentage of non-traditional students—students who have worked for several years before starting college or students who go to school part-time while holding down a job. In 1987, 41 percent of college students were 25 years of age or older and 43 percent were part-time students.

HIGHER EDUCATION—TOMORROW

Can America's colleges and universities rest on their accomplishments? About 12 million students currently attend schools of higher education in America. They are students in a society that believes in the bond between education and democracy. They have at their disposal great libraries (Harvard alone has more than 10 million volumes); the latest in technology; and faculties with a tradition of research accomplishments. (The world's first electronic computer was developed at the University of Pennsylvania, for example.) They are free to pursue their interests, to develop their talents, and to gain professional rank.

Still, many Americans are not satisfied with the condition of higher education in their country.

Perhaps the most widespread complaint has to do with the college curriculum as a whole, and with the wide range of electives in particular. In the mid-1980s, the Association of American Colleges (AAC) issued a report that called for teaching a body of common knowledge to all college students. According to the AAC report, this common core of subjects should include science and the study of cultural differences (as well as basic literacy). A somewhat similar report, "Involvement in Learning," was issued by the National Institute of Education (NIE). In its report, the NIE concluded that the college curriculum has become "excessively vocational," or work-related. The report also warned that college education may no longer be developing in students "the shared values and knowledge" that traditionally bind Americans together. A serious charge: Is it true?

For the moment, to some degree, it probably is. Certainly, some students complete their degree work without a course in Western civilization—not to mention other world cultures. Others leave college without having studied science or government. As one response, many colleges have begun reemphasizing a core curriculum that all students must master.

On the other hand, many students and some professors have charged that university curricula are too "Euro-centered," that they emphasize European culture at the expense of the cultures of Africa, Asia or Latin America, for example. This has led to a movement toward "multiculturalism," or the addition to the curriculum in many institutions of courses on such subjects as African literature or on the contributions of women to society. Some traditionalists argue that this trend has gone too far.

Such problems are signs that American higher education is changing, as it has throughout its history. And as in the past, this change may be leading in unexpected directions: The Puritans set up colleges to train ministers. But their students made their mark as the leaders of the world's first Constitutional democracy. The land grant colleges were founded to teach agriculture and engineering to the builders of the American West. Today, many of these colleges are leading schools in the world of scientific research. American universities were established to serve a rather small elite. In the 20th century, GIs, women and minorities claimed their right to be educated at these same universities. The full impact of this change is probably yet to be seen.

Americans have always had a stake in "making the system work." They have especially critical reasons for doing so in the field of education. People in the United States today are faced with momentous questions: "What is America's proper role as the world's oldest Constitutional democracy; its largest economy; its first nuclear power?"

Americans cherish their right to express opinions on all such issues. But the people of the United States are also painfully aware of how complex such issues are. To take part in dealing with new problems, most Americans feel they need all the information they can get. Colleges and universities are the most important centers of such learning. And whatever improvements may be demanded, their future is almost guaranteed by the American thirst to advance and be well-informed. In fact, the next change in American education may be a trend for people to continue their education in college—for a lifetime.


THE PUBLIC WELFARE SYSTEM IN AMERICA

By Richard Pawelek (Educational Writer)

Although people in countries around the world know about the aid provided by the government and people of the United States to other nations in times of need, many are unfamiliar with the public welfare system which exists within the United States itself. Because the economic system of the country is one of private, individual, free enterprise, even those who have studied the United States believe, in many cases, that American citizens must always fend for themselves. While it is true that Americans are expected to provide for their own needs—and for most American citizens it is a point of honor to be able to do so without accepting help from other individuals or from the government—the United States has had, since the 1930s, an extensive system of social welfare to help those who cannot help themselves.

HISTORY OF AMERICAN WELFARE

From the days of British colonial rule in North America until the 1930s, there was little disagreement about the proper role of government with regard to the welfare of the American people. Local government gave a small amount of money to the very poorest, but most people refused to accept this help unless they were desperate. The feeling that one should work hard and be self-reliant was strong—and remains so today. People provided for themselves as well as they could. They expected to do this, and it was expected of them by society. The needy could get help from churches, charitable organizations or their family and friends, but most Americans believed that anyone who was willing to work could find a job. Immigrants arriving in the New World with little or no money depended on others from their old homeland to help them get started on a new life.

In the late 19th and early 20th centuries, a number of nations in Europe were establishing and administering government-funded public welfare programs. No equivalent movement existed in the United States until the beginning of the 20th century, due largely to the availability of abundant and fertile land, industrialization providing limitless jobs for immigrants, and no social hierarchy to prevent them from competing for such jobs.

Millions of European immigrants had almost limitless opportunities to establish a good life for themselves, and many of them, through applied intelligence and hard work, succeeded beyond their wildest dreams. Such opportunities were not always available to black Americans (most of whom were held in slavery until the end of the Civil War in 1865) and Native Americans. The majority of Americans could, if they worked hard, establish themselves in comfort, both socially and economically within a generation or two. Government aid was unnecessary for this majority.

By 1900, however, there began a public recognition that part of the population was disadvantaged and there was a need to do something for those citizens. A social movement known as "Progressivism," which preached the reform of society through government intervention, gradually began to replace the "laissez-faire" philosophy of the preceding century. The needs of the poor and disadvantaged were, for the first time, attended to by government-employed social workers. Health and recreational services were established for the poor living in urban tenements. Many states passed laws restricting child labor, protecting workers, limiting work hours and providing workmen's compensation. Government at all levels came to accept responsibility for the general welfare of every citizen.

In 1929, the United States and the rest of the industrialized world entered a period of severe economic decline known as the Great Depression. With lessened demand for products, industries shut down. Millions of people lost their jobs; the amount of money in circulation shrank. For the first time in the history of the United States, many people who wished to work were in need of help in order to provide food for their families, but there was no federally organized system to provide that help. The President, Herbert Hoover, introduced programs to solve the problem, but changing a system takes time, and most people felt that not enough was being done.

This difficult situation called for changes, and to most Americans the first step in this direction was to replace the leaders of their government. In 1932 the people elected Franklin D. Roosevelt president and gave his party majorities in both houses of Congress.

Within days after Roosevelt took office in 1933, the old idea that direct federal government support was not a useful way to help people faded into history. Suddenly, Congress was establishing many public welfare programs which were radically different from any earlier activities undertaken by the American government. The government began using its money and power to provide jobs for people on public projects such as nature conservation, building dams, repairing roads, renovating public buildings and establishing new electrical systems for rural areas.

Among the programs that began during the Depression years was the Social Security program, approved in 1935. The program, assuring that retired people have a small regular income each month and providing unemployment insurance, disability insurance, public assistance to the needy and child welfare, has been a major program in the United States ever since.

In the years since Roosevelt, other presidents have established other government social welfare activity. One of them, Lyndon B. Johnson, asked Congress in 1964 to declare a "war on poverty." Another set of government programs was enacted. As with those established under Roosevelt, some of these (such as the Medicare program), are now an accepted aspect of American life. Medicare helps elderly people pay medical bills. In addition, Medicaid provides medical help for the poor, and the food stamp program provides subsidized food to poor families.

Since the 1930s, however, many Americans have felt that though some public welfare programs are needed and beneficial, they can, when taken to excess, become too costly and can weaken individual initiative.

In 1980, as a Republican presidential candidate, Ronald Reagan argued that one cause of unemployment, rising prices and government deficit spending was that too much money was being spent by the federal government, including on its social welfare programs. He was elected in 1980 and then reelected in 1984 by an electorate seeking a remedy for economic problems which had become steadily worse during preceding administrations.

President Reagan asked Congress to decrease the number and scope of social welfare programs, and at the same time, to decrease taxes. In the wake of the president's actions, inflation declined and employment increased. As the economic situation improved for the majority, many came to accept the idea that public welfare on too broad a scale may cause economic problems throughout the society. Two questions arose: "Can even a rich nation afford broad public welfare programs?" And "are public welfare programs the most effective way to address problems of economic or social inequality?" The debate over affluence, economic policies, public welfare, individual initiative and what might be done to solve the problem of poverty seems destined to continue for quite some time.

WELFARE

The majority of Americans—about 85 percent—are neither wealthy nor poor. They belong to the broad economic category considered to be "middle class." This means that they have jobs in factories or offices, run stores, or are trained professionals such as teachers, nurses, farmers, police officers and salespeople. Middle class people ordinarily live comfortably, own cars, spend some time each year on holiday and can pay—at least in part—for a university education for their children. Economically above this middle class are some very wealthy people; below the middle class are the poor. Poverty in the United States is difficult to define. Generally, a family of four with a yearly income of $11,600 or less is considered to be poor by American standards. Many of the poor have less income than this "minimum" amount. Daily life is difficult for the very poor. Without the welfare system they would not earn enough money to buy enough food or other necessities. Many would live in inferior housing and would not be able to pay for medical treatment or higher education for their children.

Most Americans are troubled by the fact that poverty exists in their land. The United States is, after all, known for its wealth, its abundance of food and its opportunity for all to build a good life. The goal is to operate a free enterprise economy in which everyone who wants to work can find employment at which he or she can earn enough money to live comfortably. Despite that goal, there is always a percentage of people who want to work but who cannot find employment for which they are suited. The percentage of the population unemployed varies with the national economic situation. In recent years, the official figure for unemployment has averaged between five and seven percent.

The plight of the poor and unemployed would be much worse than it is if it were not for help that they can and do receive from the federal and state governments. The public welfare system in the United States is so large that in the early and mid-1980s nearly one half of all money spent by the federal government was for "social payments"—money used to help people. The percentage has doubled since the 1960s, when only about 25 percent of the money spent by the federal government supported these welfare needs.

In addition to federal programs, there are programs in each of the 50 states which are designed to help people in need.

Among the many programs that help people living in poverty are:

  • Welfare payments—sums of money which are given by the government each month to those whose income is too low to provide necessities such as food, clothing and shelter.

  • Medicaid—free medical and hospital care.

  • Food stamps—books of special stamps which can be used to buy food at any store.

  • School breakfast and lunch programs providing free meals to schoolchildren.

  • Surplus food programs, under which food is purchased in huge quantities by the government and distributed free of charge to the poor.

In addition, the poor—and even people who are not poor—can become eligible to live in public housing. Public housing developments are groups of apartment buildings built at government expense. Federal, state and city government agencies are in charge of seeing that the apartments are made available to people with low incomes. Government agencies also take care of the buildings, providing guards, maintenance and heat.

When public housing is not available, poor people who need a place to live are sometimes placed in privately owned apartments or in hotels for which the rent is paid by the government.

SOCIAL SECURITY

There are many other government programs that provide help to people. The Social Security program remains the largest. It is financed by a tax paid by all working people. Virtually everyone who works in the United States has seven percent (in 1990) of his or her wages deducted to support the Social Security program. This money is used in several ways:

When people reach retirement age—they must be at least 62—they can stop working and receive a monthly Social Security payment. (Most Americans do not retire until after age 65, however, when the payment is slightly higher.)

When a worker becomes disabled and cannot work, he or she is usually eligible for Social Security payments. Social Security payments are available to widows and young children of workers who die before retirement age.

Older Americans (over age 65) are also eligible for medical and hospital care under a federal government program called Medicare. Although this program does not pay all medical expenses, it does help a great deal. On average, it pays about 74 percent of the money needed for hospital care and about 55 percent of the money needed to pay doctors' fees.

BENEFIT PROGRAMS

There are a number of other ways in which the federal or state governments help people:

Unemployment Insurance: Each state provides money to workers who lose their jobs through no fault of their own. The unemployed worker can receive weekly payments for up to six months while he/she looks for a new job. In times of recession—times when jobs are very hard to find—payments are sometimes made over a longer period, up to a year. Payments vary from state to state and federal money is also used to supplement the payments. The states also have agencies which retrain workers or help them find new jobs, using information about available work provided by private companies.

Veterans' Benefits: Persons who have served in the armed forces can receive inexpensive or cost-free hospital care at special veterans' hospitals. Those wounded or disabled while serving the nation in the military also receive pensions and free medical care.

Education: Public schools are located in all states. There is at least one in every city neighborhood. Every town, no matter what size, and all rural areas have them. All children—even children who are not American citizens—must be given a completely free education at these schools for up to 12 years, ending when the young person is 17 or 18 years old. Higher education at a college or university is not free in most cases, but all states and many cities operate colleges and universities at which the cost of an education is much lower than that at privately operated schools. Young people who qualify because their family's income is low can get loans or grants through government programs. Loans must be repaid when the student is working after graduation.

All states and cities also have free public library systems. Anyone can come to these libraries to read or borrow books, magazines or phonograph records.

Business: There are certain government agencies which help people who run businesses or who wish to run a business, sometimes by providing loans to business people. The idea is that by helping business, the government can help others, too, because businesses provide jobs as well as needed goods and services.

Job Training: Government programs help young people and adults from poor families or minority groups learn a skill that will get them a good job. These programs, which are designed to help young people with talent in mechanics, the arts or certain other trades, are in addition to the free public high schools.

MIDDLE CLASS LIVING

A part of any discussion of the public welfare system in the United States must be a mention of how those Americans who do not receive aid from this program—that is, those who are not poor, aged or disabled—manage to provide for themselves in a society in which few services are subsidized by the government.

People from many countries find it difficult to understand how the majority of Americans live comfortable lives without the support of a public welfare system. Medical care in the United States is expensive; university education can cost $20,000 per year; living well after a worker retires requires more money than will be paid through the Social Security system. Most Americans prepare for these needs by saving a part of their salaries in savings banks; others invest in industries or service corporations in hopes of receiving greater profits.

Most Americans also buy insurance. Private companies sell insurance of many kinds. In buying insurance, a working person agrees to pay a set sum of money every month or at other regular intervals. In exchange, he or she receives money when needed. Life insurance guarantees a sum of money to survivors of the person in case of death. Medical and hospital insurance guarantees payment of large medical and hospital bills. There is also dental insurance and insurance that pays money when a home burns down. An American can also insure a car, furniture or other personal belongings.

Other benefits for working Americans are provided by the companies they work for or the labor unions to which they belong.

All large businesses and many smaller ones offer their workers benefits. These benefits can include free or low-cost medical insurance and life insurance. Many companies also have retirement plans. The companies put money aside to pay their workers when they retire. There are also profit-sharing plans through which extra money is put aside for workers when the company makes a great deal of money in any one year.

Many labor unions also have special funds from which workers can receive monthly checks when they retire or if they become disabled and cannot work. Some unions also pay for medicine that the workers need but which may not be purchased by medical insurance. Some pay workers a small amount of money if they lose their jobs.

The cost of higher education is usually paid by a combination of private savings, income from a part-time job held by the student, and low interest loans or grants of money given to needy students by the federal government but administered by the university.

VOLUNTARISM AND PHILANTHROPY

In addition to receiving aid from the public welfare system, funded through taxation and administered by the federal and local governments, Americans in need can depend on help from a broad spectrum of private charities and voluntary organizations. Some of these organizations are supported by business and industry. Three of many major organizations funded by industry—the Carnegie Foundation, the Ford Foundation and the Rockefeller Foundation—have provided tens of millions of dollars to programs for teacher education, the strengthening of libraries, biomedical research, social science research and the improvement of public administration. American companies have provided 50 thousand million dollars for the promotion of the public welfare during the past 25 years.

On a smaller scale, 55 percent of all American adults do some form of volunteer work, donating a total of 84 thousand million hours to the public welfare each year. On average, each American also donates 1.8 percent of his or her income to charity. This money supports privately operated colleges and universities, hospitals, orphanages, homes for the blind and aged, and such internationally active organizations as the Red Cross. Both businesses and private individuals may record the amounts of their contributions to charity and decrease their tax obligations. This system constitutes a public policy in support of private policy.

DEBATE OVER WELFARE

The future of public welfare programs and the public welfare system in the United States is not in question. The direction in which the system will move and its scope will depend on the consensus at which the American people arrive regarding its overall benefits and disadvantages.

Some believe that increased direct expenditure by the federal government is the best means to eliminate poverty.

Others believe that there should be limits on government-funded public welfare programs. Public welfare programs, critics argue, are too costly, ineffective, and above all remove incentives for poor people to work to attain the education, training and jobs which could allow them to help themselves escape poverty.

They speak of "empowering" America's poor to help themselves. They say the welfare system does not reward individual initiative—it encourages people to stay unemployed and spend, rather than save, money. They favor programs that help poor people buy their own homes. In several cities, these programs have helped tenants buy—and run—public housing projects. They also favor "enterprise zones," in which businesses are encouraged by tax incentives to provide jobs to inner city residents.

Still other people propose maintaining a minimal "safety net" program for the disabled and the truly needy, and look to continued general economic growth as the best means of aiding the poor.

All of the studies and the arguments about poverty and public welfare programs clearly show that Americans are concerned about a problem that has not been solved. They differ only in their view as to how much the federal government should do about it directly.

MEDICINE AND HEALTH CARE

By Anne Cusack (Educational Writer)

In the final decades of the 20th century, Americans increasingly view good health as something to which they have a right. They believe they have a right to good health because widespread advances in medical research have made it possible to treat many previously "unbeatable" diseases, and because the Constitutional responsibility of the American government to "promote the general Welfare" is far more broadly interpreted today than it has been in the past. These rising expectations regarding health care in the United States are a result of vastly increased medical knowledge, and the belief that in an affluent and democratic society all people should have access to well-trained physicians, fully equipped hospitals and highly sophisticated procedures for the treatment of disease. While remarkable progress in the field of medicine has satisfied many of these expectations, each new discovery or procedure brings with it new challenges to be overcome and new questions to be answered. One example is the treatment of heart disease.

Treatment of heart disease is one of modern medicine's triumphs. Today surgeons routinely perform heart surgery that would have been extraordinary, or even unthinkable, just a few years ago. Even heart transplants, though by no means routine, are becoming more common. In 1987, 1,441 were performed in the United States. Transplants, however, pose serious difficulties: a donor heart must become available, blood and tissue must match, and the patient's immune system must be suppressed with medication to ensure that the body does not reject the new heart.

THE ARTIFICIAL HEART

In 1982, American physician William C. DeVries undertook a major step beyond transplants when he implanted an artificial heart known as the Jarvik-7 into the chest of a retired dentist. Working later with the Humana Corporation, which owns a chain of private hospitals, Dr. DeVries implanted artificial hearts into two more patients, which successfully kept blood pumping steadily through their bodies. However, both patients remained ill, suffering strokes and other complications. One of these patients nevertheless survived for nearly two years before dying in mid-1986.

The artificial heart is a great achievement for modern medicine, but it also poses important questions that are at the center of the debate over the course of medical care in the United States. For example, does the artificial heart offer enough benefits to patients to justify the suffering caused by such an operation? What is the quality of life for an individual who, for the time being, must remain attached to the bulky air compressor which powers the heart? Who should be chosen to receive artificial hearts? What other medical needs might be neglected if many millions of dollars are spent on providing people with artificial hearts?

ACHIEVEMENTS AND LIMITS

The development of the artificial heart represents the kind of dramatic medical advance that Americans have come to expect in recent decades. As medical knowledge has advanced, so has average life expectancy, from 69 years in the 1950s and '60s to 75 years today. Physicians now can treat heart disease and cancer with a variety of drugs or surgical techniques. Individuals whose kidneys have failed can live for years with regular dialysis, or cleansing of their blood, to remove waste products. Drugs are used to control high blood pressure—a risk factor in both strokes and heart attacks. Cardiac pacemakers, or heart regulators, keep many people from dying of abnormalities in the heart rhythm. Surgery, drugs and radiation treatments keep cancer patients alive longer. Childhood leukemia and Hodgkin's disease no longer carry with them an automatic sentence of death. Surgeons can replace damaged joints with artificial ones, and eye doctors use lasers and other advanced techniques to preserve or restore sight. Advances in microsurgery have even made it possible to reattach limbs which have been detached in accidents, and burn victims benefit from the development of new skin grafting techniques. Among the hundreds of newly developed drugs are tranquilizers, or calming drugs, which have made it possible to release many patients from mental hospitals.

Physicians, however, are not miracle workers, and the public's expectations of medical progress sometimes outstrip reality. About 65 percent of Americans who died in 1988 suffered from cancer, heart disease or other problems of the circulatory system. Modern medicine can treat—but usually not cure—such conditions: There are no inoculations against cancer or heart disease. Since physicians often cannot predict who will benefit from a treatment, they generally recommend treating every patient who has even a slight chance of benefiting. On the other hand, many medical tests and procedures involve risk, so the value of medical treatment must be weighed against the possibility that the procedure itself may cause disease or injury.

THE PHYSICIAN

Self-employed private physicians who charge a fee for each patient visit are the foundation of medical practice in the United States. Most physicians have a contractual relationship with one or more hospitals in the community. They send their patients to this hospital, which usually charges patients according to the number of days they stay and the facilities— operating room, tests, medicines—that they use. Some hospitals belong to a city, a state or, in the case of veteran's hospitals, a federal government agency. Others are operated by religious orders or other non-profit groups. Still others operate for profit.

Some medical doctors are on salary. Salaried physicians may work as hospital staff members, or residents, who often are still in training. They may teach in medical schools, be hired by corporations to care for their workers or work for the federal government's Public Health Service.

Physicians are among the best paid professionals in the United States. In the 1980s, it is not uncommon for medical doctors to earn incomes of more than $100,000 a year. Specialists, particularly surgeons, might earn several times that amount. Physicians list many reasons why they deserve to be so well rewarded for their work. One reason is the long and expensive preparation required to become a physician in the United States. Most would-be physicians first attend college for four years, which can cost nearly $20,000 annually at one of the best private institutions. Prospective physicians then attend medical school for four years. Tuition alone can exceed $10,000 a year. By the time they have obtained their medical degrees, many young physicians are deeply in debt. They still face three to five years of residency in a hospital, the first year as an intern, an apprentice physician. The hours are long and the pay is relatively low.

Setting up a medical practice is expensive, too. Sometimes several physicians will decide to establish a group practice, so they can share the expense of maintaining an office and buying equipment. These physicians also take care of each other's patients in emergencies.

Physicians work long hours and must accept a great deal of responsibility. Many medical procedures, even quite routine ones, involve risk. It is understandable that physicians want to be well rewarded for making decisions which can mean the difference between life and death.

MEDICAL COSTS

Physicians' fees are only one reason for rising health costs in the United States. Medical research has produced many tests to diagnose, or discover, patients' illnesses. Physicians usually feel obliged to order enough tests to rule out all likely causes of a patient's symptoms. A routine laboratory bill for blood tests can easily be more than $100.

Sophisticated new machines have been developed to enable physicians to scan body organs—even the brain—with a clarity never before possible. One technique involves the use of ultrasound—sound waves beyond the frequencies that human beings can hear—to produce images. Others use computers to capture and analyze images produced by X-rays or magnetic fields.

These machines often make older diagnostic tests, which are painful and sometimes dangerous, unnecessary. But the machines are extremely expensive: The price of a single machine can exceed one million dollars.

New technologies also mean new personnel. Physicians, nurses and orderlies no longer staff a hospital alone. Hospitals now require a bewildering number of technical specialists to administer new tests and operate advanced medical equipment.

Physicians and hospitals also must buy malpractice insurance to protect themselves should they be sued for negligence by patients who feel they have been mistreated or have received inadequate care. The rates that physicians were charged for this insurance rose very steeply in the 1970s and '80s as patients became more medically knowledgeable, and juries sometimes awarded very large amounts of money to injured patients.

As a result, hospital costs and physicians' fees rose steadily through the 1960s and '70s. By 1986, the average cost of a stay in the hospital had climbed to more than $500 a day. Government agencies became convinced that it was necessary to limit rising medical costs. One approach is to require hospitals to prove that a need exists for new buildings and services. Hospitals have also faced pressure to run their operations more efficiently, and to decrease the duration of hospital stays for patients receiving routine treatment or minor surgery.

PAYING THE BILLS

The United States today has evolved a mixed system of private and government responsibility for health care. While private citizens and health insurance companies spent about 230 thousand million dollars on health care in 1986, federal, state and local governments spent 179 thousand million dollars for medical services of all kinds. Public funds financed much of the research on the artificial heart, but it was a private corporation, Humana, that paid for artificial heart surgery and patient care. This interchange between public and private sectors is typical of how the United States provides many kinds of health and medical services.

How do most Americans pay their medical bills? For the vast majority, the answer is medical insurance. About five out of every six workers, along with their families, are covered by group health insurance plans, paid for jointly by the employer and employee or by the employee alone. Under the most common type of health plan, the individual pays a monthly premium, or fee. Typically, employees who wish more extensive medical coverage will choose a plan requiring higher premiums.

In return, the insurance company covers most major medical costs, except for a minimum amount, called the "deductible," which the employee pays each year before insurance coverage begins. Benefits then cover a certain percentage, often 80 percent, of the patient's bills in excess of the deductible. Some policies provide that after the employee's bills have reached a certain amount, the insurer covers 100 percent of all additional costs. Depending on the plan, deductible amounts in most health insurance policies range from $50 to $300. Insurance plans vary considerably, with some offering coverage for dental costs and others providing for mental health counseling and therapy.
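The cost-sharing arithmetic described above can be sketched in a few lines of Python. The deductible, the 80 percent coverage rate and the stop-loss cap are illustrative assumptions in the spirit of the figures quoted, not terms of any actual policy:

```python
def patient_share(bill, deductible=200.0, coinsurance=0.20, stop_loss=None):
    """Out-of-pocket cost under a typical deductible-plus-coinsurance plan.

    All figures are hypothetical; real plans vary widely.
    """
    if bill <= deductible:
        # Below the deductible, the patient pays the whole bill.
        return bill
    # Patient pays the deductible plus a share (e.g. 20 percent)
    # of everything above it; the insurer pays the rest.
    share = deductible + coinsurance * (bill - deductible)
    # Some policies set a stop-loss: once the patient's own costs
    # reach that amount, the insurer pays 100 percent of the rest.
    if stop_loss is not None:
        share = min(share, stop_loss)
    return share

# A $5,000 hospital bill with a $200 deductible and 80 percent coverage:
# the patient pays 200 + 0.20 * 4,800 = $1,160.
print(patient_share(5000))                   # 1160.0
print(patient_share(5000, stop_loss=1000))   # 1000.0
```

A plan with a lower deductible or a stop-loss cap shifts more of a large bill onto the insurer, which is why such plans carry higher monthly premiums.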

Another type of health care plan available to many workers is a Health Maintenance Organization (HMO). An HMO is staffed by a group of physicians who agree to provide all of an individual's medical care for a set fee paid in advance. HMOs emphasize preventive health care, since the organization loses money rather than gaining fees when it is necessary to prescribe treatment or place someone in the hospital. For this reason, medical experts generally credit HMOs with helping to hold down overall medical costs. In 1987, about 660 HMOs served about 29 million people.

MEDICAID AND MEDICARE

Although most families have some form of private health insurance, some citizens cannot afford such insurance. These people receive medical coverage through two major social programs enacted in 1965.

Medicaid is a joint federal-state program which funds medical care for the poor. The requirements for receiving Medicaid, and the scope of the medical care available, vary widely from state to state. Medicaid has proved more costly than expected, and has been exploited for unjustified gain by some physicians. As a result, the government has decreased Medicaid services by making the requirements for those entitled to participate in the program more strict. Nonetheless, Medicaid has greatly increased the use of health care services by the poor.

Medicare is a federal program financed through the Social Security Administration, which provides a national system of retirement and other benefits. Medicare pays a substantial part of the medical bills of Americans who are over 65 years of age or are disabled. Medicare is not a poverty program, but is rather a form of federally administered and supported health insurance. One part of Medicare covers a major portion of hospital bills for the elderly and is financed by a portion of the Social Security tax. Another part is financed by premiums paid by Medicare recipients, as well as from direct federal funds. Everyone who collects Social Security is covered by Medicare.

As is the case with the rest of the health care system in the United States, Medicare has felt the pressure of rising costs. In response, the government has taken two steps. First, Medicare has raised the amount of the deductible that patients must pay before insurance benefits begin. Second, it has changed its method of paying hospitals. Instead of paying hospitals through a vague formula called "reasonable charges," Medicare now pays according to the patient's diagnosis. This provides an incentive for the hospital to keep costs down. If, for example, the hospital can treat a patient who needs gall bladder surgery for less than Medicare pays to treat such an illness, the hospital makes a profit. If the patient's treatment costs more than Medicare pays, the hospital loses money.

In addition to controlling costs, the United States confronts the problem of those who cannot afford private health insurance and yet are not eligible for either Medicaid or Medicare. One estimate is that more than 30 million people, or one in seven Americans, have no health insurance during at least part of the year. These may be individuals who are unemployed for a time, families close to the poverty line or those living in remote rural areas. Such individuals can go to public hospitals, where they can always receive treatment in an emergency, but they often fail to obtain routine medical care that could prevent later chronic or serious illness.

ETHICAL ISSUES

The very successes of modern medicine have produced issues and dilemmas unknown in previous periods. The ability to treat newborn infants with severe deformities is one example. Should expensive operations be performed to save the lives of babies who will be seriously retarded or disabled all of their lives? Some parents want every possible effort made to save such babies, in the hope that treatment to improve their child's condition may be developed in the future. Others, less optimistic, think that an early death is better than a life of pain and suffering. In either case, who should make such life-or-death decisions: the parents, the physician, the hospital administrators, the community (through passage of laws)?

The availability of amniocentesis and legal abortion also raises complicated ethical questions. Physicians can now withdraw a small amount of the amniotic fluid that surrounds a fetus in the womb. They can thus obtain fetal cells and study them for possible abnormalities. They can tell, for example, whether the fetus has Down's Syndrome, a defect that causes mental retardation and, often, other physical disabilities. Since amniocentesis carries a slight risk of harming the fetus, it is usually performed only on older mothers who are at greater risk for giving birth to infants suffering from birth defects. Many types of birth defects, however, cannot be discovered through amniocentesis.

Parents who learn of severe abnormalities can choose to abort the fetus prior to the 24th week of pregnancy. Abortion, however, is an intensely controversial subject in the United States, as it is in many other countries. Although abortion is legal in the United States, many feel that it should be legal only when the mother's life is in danger. Others believe that abortion should never be undertaken under any conditions.

Occasionally, a very small living infant is born prematurely. Such infants seldom survive, and the risk of their suffering permanent handicaps is great. Many hospitals have established special intensive care units which can now save many such premature babies. But should all premature infants be treated in this manner, especially if they are below a certain weight and therefore likely to suffer severe disabilities?

At the other end of the spectrum, the situation of unconscious patients also triggers intense debate. Physicians can use respirators—machines that breathe for patients—and other medical equipment to keep patients alive indefinitely, even though the patients will not regain consciousness. When is it proper to turn off these machines and let the patient die?

Most physicians now recognize that there is a point at which further treatment merely prolongs the agony of death, and with the family's consent, they may decide not to resuscitate (restart the stopped heart) an old person dying of cancer. Young victims of auto accidents who are unconscious pose a different set of issues. Often, the decision to maintain an unconscious, critically ill patient may turn on whether or not the person is "brain-dead"— with no measurable electrical activity in the brain. Physicians today recognize that these patients are, in fact, dead, and their life support systems can be removed. Such patients also become valuable sources of organs for transplants for other patients.

HEALTH CARE CHALLENGES

Although Americans, on the average, are healthier and live longer today than ever before, a number of challenges still confront the medical care system in the United States. While advanced technology can provide artificial hearts or transplanted kidneys to a few at high cost, others still suffer from diseases, such as tuberculosis, that medicine already has "conquered."

Older Americans are one of the fastest growing segments of the population. About five percent of the elderly population live in nursing homes. Many suffer from Alzheimer's disease, an increasingly common ailment that affects the brain, leaving its victims mentally confused and hard to care for. Other patients, who might have died in previous years from strokes and other ills, live on; but they suffer from speech and memory defects, paralysis and other disabilities. As Americans have grown more aware of the specific health needs of the elderly, the field of gerontology, the study of the aging process, has attracted increasing numbers of physicians. Medical research has focused on this health issue as well, notably with the establishment of the federal government's National Institute on Aging.

The nation's infant mortality rate is also a concern. The number of infants per thousand live births who died before their first birthday remains higher for the United States than for several other industrialized nations. This rate is also higher for blacks and other minorities than for white Americans. Health authorities agree that better nutrition and prenatal (before birth) health care could substantially lower the infant mortality rate among these minority groups.

Delivering better health care to poor and disadvantaged groups in the United States is only one way of improving the nation's overall health. Research in recent years has made it clear that much disease is the result of the way people choose to live. Money spent to persuade people to lose weight, exercise regularly, eat more healthful foods and stop smoking can often provide greater benefits for more people than the most advanced medical technologies. For example, studies have linked a significant drop in the rate of lung cancer to a nationwide decline in cigarette smoking.

Another severe challenge to the health care system is Acquired Immune Deficiency Syndrome, or AIDS.

This worldwide disease, first reported in the United States in 1981, is caused by a virus spread by sexual contact, needle sharing (such as in illegal drug use) or exchange of blood (such as in transfusions). Since 1981, more than 83,000 Americans have died of AIDS. Scientists and pharmaceutical companies are working on vaccines to prevent this disease and medicines to treat it. As of 1991, several drugs had been developed to treat some of the symptoms of AIDS, but not to cure or prevent the disease.

In addition to the grief and pain caused by this disease, it has strained the system because many AIDS patients do not have adequate health insurance. Some are cared for by friends and relatives or at clinics run by churches and other groups. Others are treated in hospitals under the Medicaid program.

PATTERNS OF CHANGE

The health care system in the United States today is in a period of rapid change on many different fronts. One example is the distribution of medical services. By the mid-1980s, the United States, in a reversal of a long-standing pattern, no longer faced a shortage of physicians. There was, in fact, a developing surplus of medical doctors. But physicians often prefer to practice in urban areas or comfortable suburbs. As a result, many inner city areas and rural communities still lack sufficient physicians and adequate medical facilities.

As the number of medical specialties has grown in recent years, patients sometimes have found it frustrating to deal with a number of different physicians for differing ailments, rather than with the traditional family physician. Medical schools have responded by creating a new specialty—family medicine. Such family physicians can diagnose and treat many kinds of illnesses, though they also send patients to specialists when necessary. Not every medical problem requires a highly trained specialist, or even a physician. In some communities, physicians' assistants, working with medical doctors, perform some routine medical procedures. Nurse-midwives manage normal pregnancies and deliveries, calling upon obstetricians only if problems develop.

The Humana Corporation's highly publicized artificial heart program highlights another change in American medical practice. Profit-making corporations are playing an increasingly large role in providing medical care, and chains of private, "for-profit" hospitals are growing. Private companies also compete for contracts to run public hospitals for a fee, promising more efficient and cost-conscious management.

Can profit-making corporations deliver more economical and higher quality medicine? Or do they simply draw patients with sufficient funds or health insurance away from non-profit and public hospitals, leaving these institutions to cope with the poorest and sickest patients?

Liberal social critics deplore the lack of government planning and central oversight inherent in a free market approach to health care. Conservative critics, on the other hand, feel that government-funded health insurance and medical programs are inefficient and more expensive than private medical care in the long run. Critics on both sides often agree, however, that the medical profession has been given too much freedom in determining the cost of medical care.

While some groups might benefit from funds spent to improve medical care further, many people feel that differences in the way people live account for much of the health gap between rich and middle class and the poor. Is it possible to spend too much money saving a single life? Would spending less money on advanced medical treatments increase the amounts available for better nutrition, pollution controls, safety devices, campaigns to increase exercise and cut back smoking, and other preventive measures? Should people be held responsible for habits and behaviors which make them sick?

Physicians, politicians, medical experts and ordinary citizens were debating these questions in the early 1990s. The answers are by no means clear-cut, but involve a number of trade-offs and compromises between equally desirable goals. In a nation in which more than 11 percent of the Gross National Product (the value of all goods and services) is spent on medical services of all kinds, Americans are in agreement on one central point: Quality, affordable health care must be available to everyone.

Suggestions for Further Reading

Aaron, Henry J. Painful Prescription: Rationing Hospital Care. Washington: Brookings Institution Press, 1984.

Gorovitz, Samuel. Doctor's Dilemmas: Moral Conflict and Medical Care. New York: Macmillan, 1982.

Starr, Paul. The Social Transformation of American Medicine. New York: Basic Books, 1982.

Thomas, Lewis. The Youngest Science: Notes of a Medicine-Watcher. New York: Viking, 1983.

U.S. Office of Technology Assessment. Medical Technology and Costs of the Medical Program. Washington: U.S. Government Printing Office, 1984.

RELIGION IN AMERICA

A PILGRIM PEOPLE

"They knew they were pilgrims," wrote William Bradford, one of their first governors, of the little group of English men and women who set sail from the city of Leyden, Holland, in 1620. Though the people of Holland had welcomed them, the little group of English Protestants had never felt really at home there. Now they were sailing for England on the first step of their journey to the New World.

The Pilgrims left behind them a continent torn by religious quarrels. For over a thousand years, Roman Catholic Christianity had been the religion of most of Europe. But by the 16th century, many people had grown to resent the richly decorated churches and ornate ceremonies of the Catholic Church. They resented the power of the Pope, the head of the Catholic Church, as well as the bishops, many of whom lived as magnificently as civil rulers.

[Photo caption: Since the United States became a nation, Americans have insisted upon the right to practice the religion of their choice freely. Religious freedom is guaranteed by the First Amendment to the Constitution. Here, a baptismal ceremony at a Methodist Church in Carlisle, Ohio. Photo: Gordon Baer]

Early in the 16th century, Martin Luther, a German monk, broke with the Catholic Church. Luther's teaching emphasized direct personal responsibility to God, challenging the role of the Church as an intermediary. A few years later, John Calvin, a French lawyer, also left the Catholic Church. One of his basic concepts was the idea of God as absolute sovereign, another challenge to the Church's authority.

As a result of their protesting of widely accepted teachings, Luther, Calvin and other religious reformers soon became known as Protestants. Their ideas spread rapidly through northern Europe. Soon established Protestant Churches had arisen in several European lands.

The modern concept of religious tolerance was not widespread. People were expected to follow the religion of their king. Catholics and Protestants fought each other and many religious people on both sides died for their beliefs.

In England, King Henry VIII (1491-1547) formed a national Church with himself as its leader. But many English people considered the Church of England too much like the Catholic Church. They became known as Puritans, because they wanted a "pure" and simple Church. The ideas of John Calvin particularly appealed to these Puritans.

When James I became King of England in 1603, he began to persecute the Puritans. Many went to prison or left the country. The Puritans could not always agree among themselves either. Many small Puritan groups formed in England. The Pilgrims who went to the New World belonged to one of them.

The Pilgrims left England with a patent, or permission to settle land, from the Virginia Company, a private company which already owned another colony at Jamestown, Virginia. The Pilgrims landed at Cape Cod, a sandy hook of land in what is now the state of Massachusetts. Their patent gave them no authority to settle there—they were too far north. The Pilgrims turned south, but they ran into waves and storms. So they turned north again and anchored in the cape harbor.

Some of the people who had joined the Pilgrims in London began to complain. They said the Pilgrim leaders had no right to govern land not controlled by the Virginia Company. The Pilgrim leaders were faced with a governmental crisis. How could they unite their people to face the dangers of the wilderness?

As religious believers, the Pilgrims had formed a congregation, or small group of church members, by joining themselves together and choosing a minister. They did this through a covenant, or contract, and they considered such congregations the basic unit of the Church.

When the Pilgrims assembled on their ship, the Mayflower, they formed a government in the same way they had formed their congregation. They made a contract, which became known as the "Mayflower Compact."

With this contract, they agreed to form a "civil body politic" which could make "just and equal laws" for the colony. Most of the grown men of the group signed the compact. Then they began the search for a place to build their homes.

Other Puritans soon followed the Pilgrims to Massachusetts and established towns there. Like other Protestants, they read the Bible often and claimed the right to interpret or explain the meaning of the Holy Book for themselves. The Puritans were particularly interested in the Old Testament.

The Old Testament describes the history of the Jewish people as a contract between God and Israel. God's contract with the Jewish people was the model for the covenants by which the Puritans formed their congregations. The Puritans thought of themselves as a special people. "We shall be as a city upon a hill," wrote John Winthrop (1588-1649), another Puritan leader. "The eyes of all people are upon us."

Like other followers of Calvin, the Puritans considered worldly success a sign of being saved. They considered their increasing prosperity a sign that God was pleased with them. They generally assumed that those who disagreed with their religious ideas were not saved and therefore should not be tolerated.

In 1636, Roger Williams (1603-1683) was forced out of Massachusetts for disagreeing with the ministers there. He founded a colony in what later became the state of Rhode Island. Rhode Island allowed religious freedom to everyone, and it became a refuge for people persecuted for their religion.

Two other American states began as havens of religious freedom. Maryland was founded as a refuge for Catholics. And Pennsylvania was founded as a refuge for Quakers, a religious group which adopted a very plain way of life and refused to participate in war or to take oaths.

RELIGIOUS LIBERTY FOR ALL

By the middle of the 18th century, many different kinds of Protestants lived in America. Lutherans had come to America from the Palatinate in Germany. The Dutch Reformed Church flourished in New York and New Jersey. Presbyterians (one of the largest Calvinist groups) came from Scotland and Huguenots (French Protestants who subscribed to Calvin's doctrines) from France. Congregationalists, as the Puritans came to be called, still dominated in Massachusetts and the neighboring colonies, an area which came to be known as New England.

Although the Church of England was an established Church in several colonies, Protestants lived side by side in relative harmony. Already they had begun to influence each other. The Great Awakening of the 1740s, a "revival" movement which sought to breathe new feeling and strength into religion, cut across the lines of Protestant religious groups, or denominations.

At the same time the works of John Locke (1632-1704) were becoming known in America. John Locke reasoned that the right to govern comes from an agreement or "social contract" voluntarily entered into by free people. The Puritan experience in forming congregations made this idea seem natural to many Americans. Taking it out of the realm of social theory, they made it a reality and formed a nation.

It was politics and not religion that most occupied Americans' minds during the War of Independence (1775-1783) and for years afterward. A few Americans were so influenced by the new science and new ideas of the Enlightenment in Europe that they became deists, believing that reason teaches that God exists but leaves man free to settle his own affairs.

Many traditional Protestants and deists could agree, however, that, as the Declaration of Independence states, "all men are created equal, that they are endowed by their creator with certain unalienable rights," and that "the laws of Nature and Nature's God" entitled them to form a new nation. Among the rights that the new nation guaranteed, as a political necessity in a religiously diverse society, was freedom of religion.

The First Amendment to the Constitution of the United States forbade the new federal government to give special favors to any religion or to hinder the free practice, or exercise, of religion. The United States would have no state-supported religion. In this way, those men who formulated the principal tenets of the newly established political system hoped to insure that diversity of religious belief would never become the source of social or political injustice or disaffection. But Protestant Churches kept a privileged position in a few of the states. Not until 1833 did Massachusetts cut the last ties between Church and State.

The First Amendment insured that the American government would not meddle in religious affairs or require any religious beliefs of its citizens. But did it mean that the American government would have nothing at all to do with religion? Or did it mean that the government would be religiously neutral, treating all religions alike?

In some ways, the government supports all religions. Religious groups do not pay taxes in the United States. The armed forces pay chaplains of all faiths. Presidents and other political leaders often call on God to bless the American nation and people. Those whose religion forbids them to fight can perform other services instead of becoming soldiers.

But government does not pay ministers' salaries or require any belief—not even a belief in God—as a condition of holding public office. Oaths are administered, but those who, like Quakers, object to them, can make a solemn affirmation, or declaration, instead.

The truth is that for some purposes government ignores religion and for other purposes it treats all religions alike—at least as far as is practical. When disputes about the relationship between government and religion arise, American courts must settle them.

American courts have become more sensitive in recent years to the rights of people who do not believe in any God or religion. But in many ways what Supreme Court Justice William O. Douglas wrote in 1952 is still true. "We are a religious people," he declared, "whose institutions presuppose a Supreme Being."

In the early years of the American nation, Americans were confident that God supported their experiment in democracy. They had just defeated Great Britain—probably the most powerful nation in the world at that time. Protestant religion and republican forms of government, they felt, went hand in hand. America had a divine mission to make her unique combination of political freedom and "Yankee" thrift and ingenuity a model for the world to follow.

LIBERAL PROTESTANTS

In the early 19th century, another Great Awakening, or revival, swept through New England. By no means were all of New England's clergymen happy with this upswelling of religious feeling.

Many had given up Calvin's idea of predestination, which is the belief that God chooses those who will be saved, and that man cannot win salvation through good works or other means—salvation can only come from God, and then, only to the "elect." Some Protestant clergy now preached that all men had free will and could be saved. Others moved on to positions yet more liberal, giving up many traditional Christian beliefs.

In this liberal setting, poets and philosophers flourished. Ralph Waldo Emerson (1803-1882) developed a Transcendental philosophy, which stressed the presence of Spirit in man and nature. Individual experience and Puritan virtues like self-reliance received a new spiritual foundation. The writings of Emerson and other Transcendentalists are read by millions of schoolchildren in American elementary and high schools.

The idea of progress was appealing to liberal Protestants of the 19th century. Why should religious doctrines not become more rational as science made the natural world more open to human understanding?

In Europe, and particularly in Germany, scholars were reading and studying the Bible in a new way. They questioned the reality of Bible miracles, and challenged traditional beliefs about Bible authors. Moses, it was said, could not have written the first books of the Old Testament. Not all the Pauline letters had been written by Paul.

These and other opinions of Bible scholars frightened many religious people. But liberal Protestants believed that if Christianity were to continue to appeal to educated people, it must accept these ideas.

In the same spirit, liberals wrestled with the problems which Charles Darwin's theory of evolution presented. If human beings had descended from other animals—an idea which almost all scientists quickly accepted—then the story of Adam and Eve, the Biblical first parents of human beings, could not be literally true.

To the many questions raised by the progress of science, Protestants sought and found answers. These answers stressed the moral and spiritual meaning of the Bible but did not depend on its reliability as a book of factual history.

What set apart 19th century liberal ministers from their descendants in the 20th century was their optimism about man's ability to make progress. Some, like Henry Ward Beecher (1813-1887), still held that poverty and sin went hand in hand. Some liberal ministers were not very critical of the excesses of capitalism. But others, like Walter Rauschenbusch (1861-1918), thought that the Church should concern itself with reforming society. They discovered a "social message" in the Gospels, the Biblical accounts of Christ's life, and began to concern themselves with the problems of workers and the city poor.

Modern liberal clergymen are less optimistic about the speed and extent of social reform. But they are still convinced that the Church must fight for the rights of poor people. They manage shelters for homeless people. They feed the poor, run day-care centers for children and speak out on social issues. They seek areas of agreement with other Christians, with Jews and with those of other world religions. Many are active in the ecumenical movement, which seeks to bring about the reunion of Christians into one church.

EVANGELICAL RELIGION

While some New England clergymen embraced the rational side of Puritanism, others turned toward the emotional or spiritual side. These ministers welcomed the "Second Awakening" of the early 19th century. They preached the message of man's sinfulness and Christ's redeeming grace. Evangelical religion, a conservative kind of Protestantism which relies on the authority of the Bible, spread rapidly.

Evangelical preachers spoke simply and directly about the Christ of the New Testament Gospels who died to save mankind. The religious enthusiasm which this preaching aroused often led to the forming of associations, or groups, to carry on the work of reforming morals or spreading the gospel. These groups were often interdenominational; all Protestants were welcome to join them.

Some groups were formed to fight sin; others were formed to spread God's word around the world. Missionaries were sent to Africa, the Far East and to the American Indians in the western United States. Religious tracts, or books, were printed. Some of these groups, such as the American Bible Society, exist today.

Evangelical religion was fervent throughout America and especially on the frontier. Methodist and Baptist preachers competed with each other to win the settlers' souls for Christ.

The Methodists, beginning as an evangelical society of the Church of England, became established as an American church in 1784, sending traveling preachers, or circuit riders, into the Appalachian mountains and beyond. The Baptists, like the Methodists, used "lay" preachers (unordained but dedicated laymen who did not have the benefit of formal seminary educations) who preached to small frontier congregations on Sunday. The Baptists believed in adult baptism by immersion, symbolizing a mature and responsible conversion experience. Traveling evangelists preached at camp meetings, revival gatherings which became a regular part of life in the American West.

Settlers would ride many miles to hear a famous revival preacher or evangelist. They would camp for days in the open fields, hearing sermons, and staying up, sometimes all night, to pray, sing hymns and talk with each other. "Conversions," or religious experiences of God's grace and remorse for sin, were often very dramatic. In some cases, people wept, fainted and danced about as if in a trance.

The Methodists and Baptists grew rapidly in numbers. As both denominations matured, their pastoral leadership was assumed by ordained pastors with formal seminary educations. They are still the chief denominations in the southern United States. They have many members in other parts of the country as well.

Evangelical religion won over black slaves as well as their white masters. On some plantations, or large farms, black preachers held their own services. In the North, free blacks organized two different African Methodist Episcopal Churches early in the 19th century.

Most religious people were slow to condemn slavery, though from the earliest days the Quakers opposed it and risked their lives helping black slaves to freedom. By the 1850s, however, northern ministers of many denominations were preaching that slavery was a national sin.

In the South, however, many clergymen defended slavery and even owned slaves. They said that both the Old and New Testaments treated slavery as a normal part of society. The slavery question and the Civil War caused a splitting of the Baptist, Methodist and Presbyterian denominations which lasted into the 20th century.

Northern victory in the Civil War (1861-1865) meant freedom for the slaves. In the war-damaged South, most of the freed slaves became poor farmers, working land they did not own for a share of the crop. Segregation, or racial separation, became a way of life.

Many whites were just as poor as blacks. Black and white alike sought comfort in a conservative, evangelical form of religion. The South became a stronghold of "old time religion." In 1925, a biology teacher, John Scopes, was convicted under a Tennessee state law which forbade teaching the theory of evolution in a public school. Scopes' conviction was overturned on a legal technicality. But a number of other states in the South passed laws against teaching Darwin's theory. Even today, teaching the theory of evolution to the exclusion of religious teachings is controversial in parts of the United States.

After the Civil War, northern factories grew rapidly. American Protestants did not give up trying to help the poor or convert non-Christians. But they spent a major part of their moral energy for the next 50 years on the temperance movement—an attempt to make all alcoholic drink illegal. Finally they succeeded, and for over ten years (1920-1933) it was illegal to buy beer, wine or liquor in the United States.

But America was changing. By the late 19th century, a kind of Protestant consensus, or agreement, about God's place in American life and government had developed. The arrival of large numbers of Catholic and Jewish immigrants challenged that consensus.

CATHOLICS

By the Civil War, over a million Irish Catholics, many driven by hunger, had come to the United States. Most were working people. Anti-Catholic prejudice was so strong that, on a few occasions, it broke out in mob violence. In 1844, two Catholic churches were burnt and 13 people died in rioting that swept through the city of Philadelphia, Pennsylvania. More often prejudice took the form of discrimination, particularly at the polls. In 1960, however, John F. Kennedy's presidential election victory put to rest the Catholic religion as an issue in national politics. (Kennedy was a Roman Catholic.)

Catholics were not shut out of public schools and hospitals but they wanted their own institutions. So they built their own schools, colleges and hospitals. Catholics believed that these institutions were needed to preserve their faith. Many Catholics now attend public schools and secular colleges. But Catholic institutions, especially in large cities, still serve large numbers of Catholics and a growing number of non-Catholics, who are attracted by the discipline and education offered in these schools.

By the 1950s, many Catholics had risen to positions of leadership, not only in labor unions, but in business and politics as well. As Catholics grew more confident about their place in American life, they began to challenge, not the basic idea of separation of Church and State, but the way American courts interpreted it. The costs of modern education had made their schools very expensive to maintain. Catholics began to seek some way in which they could obtain public funds to help meet these expenses. Other private schools, not necessarily religious in origin or concern, also sought this help.

The lawmaking bodies of many states were sympathetic to these demands. But most attempts to provide help for religious schools were ruled unconstitutional (declared to violate the Constitution) by the Supreme Court of the United States. Giving public money to a religious school was held to violate the clause, or part, of the First Amendment which prohibits the establishment of religion. Public money for religious schools remains an issue in American politics in the 1980s.

If Catholics feel that government should support the non-religious aspects of private education, other American groups call for even less government connection to religion. Sunday closing laws were a real hardship to Jews and Seventh Day Adventists. In effect, they were forced to observe two Sabbaths, or days of rest—their own and the majority Christian one as well. Non-believers, and some religious people as well, objected to prayer and Bible reading in public schools. They thought that a modern government in a free society should be basically secular.

In 1962, the Supreme Court declared that prayer and Bible reading could not be used to start the day in public schools. Such activities, the court ruled, amounted to an establishment of religion. The Court decision was extremely unpopular. In 1983, a survey showed that eight out of 10 Americans favored amending the Constitution to allow prayer in school.

THREE FAITHS

Like Catholics, Jews were a small minority in the first years of the American republic. Until the late 19th century, most Jews in America were of German origin. Many of them belonged to the Reform movement, a liberal branch of Judaism which had made many adjustments to modern life. Anti-Semitism, or anti-Jewish prejudice, was not a big problem before the Civil War. But when Jews began coming to America in great numbers, anti-Semitism appeared. At first Jews from Russia and Poland, who as Orthodox Jews strictly observed the traditions and dietary laws of Judaism, clustered in city neighborhoods.

Usually, Jewish children attended public schools. The children of the immigrants moved rapidly into the professions and into American universities, where many became intellectual leaders. Many remained religiously observant. Others, while they continued to think of themselves as ethnically Jewish, adopted a secular, non-religious outlook.

When faced with prejudice and discrimination, Jews responded by forming organizations to combat prejudice. The Anti-Defamation League has played a major role in educating Americans about the injustice of prejudice and making them aware of the rights, not only of Jews, but of all minorities.

By the 1950s, a kind of "three faiths" model of the United States had developed. Americans were considered to come in three basic varieties: Protestant, Catholic and Jewish, the order reflecting the strength in numbers of each group. In 1990, Protestants of all denominations numbered about 79,000,000 people. Catholics, the largest single denomination, numbered 55,000,000. Over 5,900,000 Jews lived in the United States. But an increasing number of Americans did not fit into any of these categories. And some who could be considered Protestant had styles of life and beliefs that did not fit into "mainstream" America.

RELIGIOUS DIVERSITY

The United States has always been a fertile ground for the growth of new religious movements. Frontier America provided plenty of room to set up a new church or found a new community. For example, the ancestors of the Amish, very strict Protestants who live in rural areas and scorn modern life, came from Germany in the 18th century to escape persecution.

Many religious communities and secular Utopias, or experiments in new forms of social living, were founded in 18th- and 19th-century America. Most did not last long. But some prospered for a while and a few are still in existence. Twentieth century Americans who follow the impulse to withdraw from society and "join a commune" are following in an old American tradition.

Small sects and "cults" do have certain tendencies in common. Often they regard the larger society as hopelessly corrupt. Prohibition of alcohol, tobacco and caffeine are common. Sometimes dramatic expectations about the future—predictions of the end of the world or the dawning of a new age—form the main tenets, or doctrines, of the group. Often the founder is a charismatic person, a dynamic personality who claims some special revelation or relationship with God. Some groups never win a large following. Others grow smaller or disappear when the founder dies or his prophecies fail to come true. Still others prosper, win large followings and "graduate" into the ranks of the "respectable" denominations.

Some groups, like the Amish of Pennsylvania, simply want to be left alone in their rural communities. They wish to keep their children out of high school so they will not be affected by modern society.

A few prefer faith healing to modern medicine or object to certain medical practices.

What should society do when a Jehovah's Witness refuses a blood transfusion for himself or his child?

Questions like these often come before the courts in the United States. They are generally settled according to a principle the Supreme Court established when it ruled that members of the Mormon church, a large and prosperous Christian sect which settled the state of Utah, could not marry more than one wife. Individuals may believe anything they please in America, but they may not do anything they want, even if the action is based on a religious belief. Such questions do not usually cause great controversy, because they do not reflect basic divisions in American society. The Mormons, for example, continue to flourish, and are one of the fastest growing church groups in the United States.

But other questions reflect continuing conflicts in American life. When a 1973 Supreme Court decision made abortion legal in America, many Catholics were shocked. Many evangelical Protestants and Orthodox Jews also objected. Yet more liberal Protestant and Jewish clergymen joined nonbelievers in maintaining that abortion is a basic right in a pluralistic, or religiously varied, society.

Open religious prejudice is relatively rare in America today. Inter-religious meetings and discussions are frequent. One major cause of the new harmony between members of the "three faiths" has been the Second Vatican Council of the Catholic Church (1962). This Council modified many Church rules, including burdensome restrictions on interfaith marriages. Catholics felt much freer to participate in interdenominational worship services than they had before the Council.

Other world religions are increasing their numbers and influence in America. Over two million members of the Islamic religion live in America. Some are immigrants or the children of immigrants; others are Americans, including some black Americans who have converted to Islam.

Buddhism is a growing faith in America. Recent immigration from Asia has raised the number of Buddhists in America to several hundred thousand—no one seems quite sure how many. Several hundred thousand Hindus have also come to America. In recent years, young native-born Americans have shown great interest in these and other Eastern religions and philosophies.

American pastors are as varied as the flocks they serve. Some of them are women. The Protestant Episcopal Church now ordains women as priests, although the Catholic Church continues to have an all-male clergy. The United Methodist Church has appointed women as bishops. Women can also be ordained as rabbis among some Jewish congregations. Contemplative monks like the Trappists spend their lives in prayer and labor in the monastic tradition of the Middle Ages. Catholic nuns teach and manage large hospitals. Chaplains of all faiths visit the sick in hospitals and nursing homes.

Pastors of churches are expected to be active in the civic affairs of their communities. Often they have psychological training and spend part of their time counseling people with personal problems. They preach to congregations assembled in small chapels and huge city cathedrals, in modern synagogues, and even sometimes in drive-in churches, where people can worship without leaving their cars! Some evangelical preachers reach a television audience of millions.

How do Americans of so many different religions manage to live together under common laws and pursue common goals? Most Americans are proud of America's religious variety. They consider it a natural result of religious freedom. On public occasions they stress the ideas most religious people share—belief in God and the importance of living a good life.

THE AMERICAN FAMILY

Belonging to a family is one bond almost everyone in the world shares, but family patterns vary from country to country. In some countries, for example, the grandparents are the family leaders. In other countries, many families live and work together as one on community farms. What are families like in the United States?

The family in the United States is diverse and changing, but still central to the identity and well-being of virtually all Americans. Here, the Drane family of Massachusetts enjoys playing a soccer game in the yard of their house. (Photo: A. Diakopoulos)

FAMILY PATTERNS

The United States has many different types of families. While most American families are traditional, comprising a father, mother and one or more children, 22 percent of all American families in 1988 were headed by one parent, usually a woman. In a few families in the United States, there are no children. These childless couples may believe that they would not make good parents; they may want freedom from the responsibilities of child-rearing; or, perhaps they are not physically able to have children. Other families in the United States have one adult who is a stepparent. A stepmother or stepfather is a person who joins a family by marrying a father or mother.

Americans tolerate and accept these different types of families. In the United States, people have the right to privacy and Americans do not believe in telling other Americans what type of family group they must belong to. They respect each other's choices regarding family groups.

Families are very important to Americans. One sign that this is true is that Americans show great concern about the family as an institution. Many Americans believe there are too many divorces. They worry that teenagers are not obeying their parents. They are concerned about whether working women can properly care for their children. They also worry that too many families live in poverty. In one nationwide survey, about 80 percent of the Americans polled said the American family is in trouble. At the same time, when these people were asked about their own families, they were much more hopeful. Most said they are happy with their home life.

How can Americans be happy with their individual families but worried about families in general? Newspapers, motion pictures and television shows in the United States highlight difficulties within families. Family crimes, problems and abuse become news stories. But most families do not experience these troubles. Since the earliest days of the United States, people have been predicting the decline of the family. In 1859, a newspaper in the city of Boston printed these words: "The family in the old sense is disappearing from our land." Those words could have been written yesterday. But the truth is that families are stronger than many people think.

Four out of five people in the United States live as members of families and they value their families highly. In one poll, 92 percent of the people who were questioned said their family was very important to them.

Families give us a sense of belonging and a sense of tradition. Families give us strength and purpose. Our families show us who we are. As one American expert who studies families says, "The things we need most deeply in our lives—love, communication, respect and good relationships—have their beginnings in the family."

Families serve many functions. They provide a setting in which children can be born and reared. Families help educate their members. Parents teach their children values— what they think is important. They teach their children daily skills, such as how to ride a bicycle. They also teach them common practices and customs, such as respect for elders and celebrating holidays. Some families provide each member a place to earn money. In the United States, however, most people earn money outside the home. The most important job for a family is to give emotional support and security.

Families in a fast-paced, urban country such as the United States face many difficulties. American families adjust to the pressures of modern society by changing. These changes are not necessarily good or bad. They are simply the way Americans adjust to their world.

CHANGING AMERICAN FAMILY

When Americans consider families, many of them think of a "traditional family." A traditional family is one in which both parents are living together with their children. The father goes out and works and the mother stays home and rears the children. The biggest change in families in the United States is that most families today do not fit this image. Today, one out of three American families is a "traditional family" in this sense.

The most common type of family now is one in which both parents work outside the home. In 1950, only 20 percent of all American families had both parents working outside the home. Today, it is 60 percent. Even women with young children are going back to work. About 51 percent of women with children younger than one year old now work outside the home.

Another big change is the increase in the number of families that are headed by only one person, usually the mother.

Between 1970 and 1988, the number of single-parent families more than doubled—from 3.8 million to 9.4 million. In 1988, nearly one out of every four children under 18 lived with only one parent.

Some families look even less like the typical traditional family. They may consist of a couple of one race who have adopted children of another race, or from another country. In many states, single people may also adopt children. Some people take in foster children—children whose parents cannot take care of them.

Another change is that families in the United States are getting smaller. In the mid-1700s, there were six people in the average household. Today the average household contains between two and three people. A household is defined as any place where at least one person is living.

One recent change is that the number of marriages is rising. The number of babies born also has been climbing steadily for the past 10 years. Many experts see these trends as a sign that Americans are returning to the values of marriage and family.

HISTORY OF THE AMERICAN FAMILY

To understand why these changes are happening, let us look at the history of the family in the United States.

When the United States was established, more than 200 years ago, it was a big, sparsely settled country. Earlier, this land had been a colony of Great Britain. For many years the immigrants who settled in the United States were nearly all of European origin, but later people came to the United States from all over the world. Life was hard for these early families. The average marriage in colonial America lasted only 10 years because many people died young. Few people lived to be older than 60. A widow or widower often remarried many times. Even with today's high rate of divorce, many marriages last longer now than marriages did in the 1700s.

Later, Americans began settling the American West. They were looking for land to farm and for a better life. They left behind their homes, their relatives and their friends. When these settlers said good-bye to the people they loved, usually it was forever. These first settlers of the Midwest and the Great Plains of the northwestern United States were isolated; often their nearest neighbor was many miles away. Family members had to work together and to depend on each other to survive.

The family formed an important economic group. All of its members helped to bring food and money into the home. They worked on a farm, planting and harvesting, or they worked making goods to sell at a market. Few people got married as a result of love or affection alone. Most people married because they needed a family in order to make a living. When people married, often they looked for the husband or wife who could bring the most material goods into the marriage. In colonial America, men who did not marry were heavily taxed. Almost 99 percent of the population married.

Many changes came to families when the United States shifted from being mainly a farming nation to being an industrial nation. This happened in the late 1800s. In 1820, fewer than eight percent of Americans lived in cities. By 1900, about 40 percent of all people lived in cities. People began earning their money outside the home in factories. Instead of getting married on the basis of economic need, people could marry primarily for love.

As men and women became less dependent on their families for a livelihood, the number of divorces began to increase. Between 1900 and 1920, the divorce rate doubled; in 1900, there were four divorces for every 1,000 married couples. This trend alarmed people, but divorce was not new. The first divorce in the United States occurred in 1639 and involved a man who had married two women. Still, divorce was difficult. A wife was her husband's property. If a husband abused his wife, she had few alternatives, and sometimes a wife, or even a husband, would run away from a bad marriage.

The decade of the 1950s is thought to have been the most family-oriented period in American history. People praised and glorified families. Hundreds of thousands of young couples married. They married at the youngest ages in the history of the United States. In the 1950s, by the time men and women reached 21 years old, more than two-thirds of them were married. Today fewer than half of all 27-year-olds are married.

The 1950s was also a "baby boom" time, with very high birth rates. In one year alone more than 4.3 million babies were born. The average mother had more than three children; today the average mother has one or two children.

Today, some people look at the American family of the 1950s as a model or as a goal for the family. Many experts, however, see the 1950s as an exceptional period. They say that the marriage and family patterns of Americans today are closer to those prevalent during the rest of American history than was the pattern of the 1950s.

Slowly some of the values accepted during the 1950s began to change. During the 1960s and the 1970s, some women found that they wanted more from life than rearing children and caring for household matters. Women began to see that they had choices. They could have a job or a family, or both. More women began taking jobs. According to the magazine U.S. News & World Report, the number of families in which both husbands and wives worked grew by four million during the 1970s.

The period of the late 1970s and early 1980s has also been called the decade of the "me generation." This is a time in which people have explored new ways of living. In the 1970s many couples began living together without being married. These couples questioned why they needed a marriage license.

For about 10 years, the number of unmarried couples living together grew rapidly. Birth control also became more widely accepted. Couples were able to choose when they wanted to start a family.

Other changes also occurred. One change was an increase in divorces. In 1970, there were 47 divorces for every 1,000 married couples. By 1980, this number had grown to 114 divorces for every 1,000 married couples.

In the mid-1980s, more traditional marriage and family practices returned. Today, married couples are the fastest growing type of household in the United States. Women and men are rediscovering the joys of home and family life. Even leaders who speak out strongly for women's rights are modifying their views regarding the relative importance of the family.

Looking at the history of families in the United States helps to explain how the American family is changing. But what do these changes mean? Are they good or bad? In order to understand, let us look at what is behind these numbers.

DIVORCE

About half of all marriages in the United States end in divorce. These numbers are very high, as they are in many other industrialized countries. A divorce happens when a husband and a wife legally end their marriage. The number of divorces grew steadily in the United States for many years. Now, however, the number has stopped growing. During the past few years the number of divorces has been decreasing.

Couples in the United States may still be getting divorced at a fairly high rate, but this does not mean that they do not believe in marriage. It simply means that they are giving up being married to a particular individual. Most people in the United States who get divorced marry again. About 80 percent of all men who get divorced remarry. About 75 percent of all women who get divorced remarry.

United States divorce laws allow men and women to terminate bad marriages; getting a divorce is now rather easy in the United States. And while a 1924 study of families in one town in the American Midwest found few happy couples, in 1977, researchers who went back to the same town found that more than 90 percent of the married couples in that town said they were satisfied or very satisfied with their marriages.

WORKING MOTHERS

Today 60 percent of all American women work outside their homes. This is a big change for the United States. Only 40 years ago, 75 percent of all Americans disapproved of wives who worked for wages when their husbands could support them financially. Today most people accept that many women work outside the home.

There are two reasons why mothers and wives work. One reason is that there are many opportunities for women. A woman in the United States can work at many jobs, including those of engineer, physician, teacher, government official, mechanic or manual laborer. The other reason women work is to earn money to support their families. The majority of women say they work because it is an economic necessity.

About 80 percent of women who work support their children without the help of a man. These women often have financial difficulties. One in three families in the United States headed by a woman lives in poverty. Many divorced Americans are required by law to help their former spouses support their children, but not all fulfill this responsibility.

A wife's working may add a strain to the family. When both parents work, they sometimes have less time to spend with their children and with each other.

In other ways, however, many Americans believe that the family has been helped by women working. In a recent survey, for example, the majority of men and women said that they prefer a marriage in which the husband and wife share responsibilities for home jobs, such as child rearing and housework.

Many teenagers feel that working parents are a benefit. On the other hand, when parents have younger children, who require more time and care, people's views are more mixed about whether having a working mother is good for the children.

What happens to children whose parents work? More than half of these children are cared for in daycare centers or by babysitters. The rest are cared for by a relative, such as a grandparent. Some companies are trying to help working parents by offering flexible work hours. This allows one parent to be at home with the children while the other parent is at work. Computers may also help families by allowing parents to work at their home with a home computer.

MARRIAGE AND CHILDREN

Unlike their parents, many single adult Americans today are waiting longer to get married. Some women and men are delaying marriage and family because they want to finish school or start their careers; others want to become more established in their chosen profession. Most of these people eventually will marry. One survey showed that only 15 percent of all single adults in the United States want to stay single. Some women become more interested in getting married and starting a family as they enter their 30s.

One positive result may come from men and women marrying later. People who get married at later ages have fewer divorces. Along with the decision to wait to marry, couples are also waiting longer before they have children, sometimes in order to be more firmly established economically. Rearing a child in the United States is costly.

Some couples today are deciding not to have children at all. In 1955, only one percent of all women expected to have no children. Today more than five percent say they want to remain childless. The ability of a couple to choose whether they will have children means that more children who are born in the United States are very much wanted and loved.

GENERATION GAP

If children in the United States are wanted and loved, why do they fight with their parents? At least this is one view of families that American television shows present. The other type of family shown on American television is one in which everyone is great friends with everyone else. These families seem to have no problems.

In real life, most families in the United States fall somewhere in the middle. Talk about a "generation gap" has been exaggerated. The generation gap is a gap between the views of the younger generation of teenagers and the views of their parents.

Many parents in the United States want their children to be creative and question what is around them. In a democratic society, American children are taught not to obey blindly what is told to them. When children become teenagers, they question the values of their parents. This is a part of growing up that helps teenagers stabilize their own values. In one national survey, 80 percent of the parents answering the survey said their children shared their beliefs and values. Another study showed that most teenagers rely on their parents more for guidance and advice than on their friends.

When American parents and teenagers do argue, usually it is about simple things. One survey found that the most common reason parents and teenagers argue is because of the teenager's attitude towards another family member. Another common reason for arguments is that parents want their children to help more around the house. The third most common basis for arguments between parents and teenagers is the quality of the teenager's schoolwork.

Arguments which involve drug or alcohol use occur in a much smaller group of families. Most parents (92 percent) said they were happy with the way their children are growing up.

UPROOTEDNESS

How do problems arise in American families? One view is that American families do not have enough stability and that people move too much to have community roots. Of course, many American families remain for generations in the same town or even in the same house. At the same time, the United States is a mobile, adaptable country. People are willing to work hard in order to advance in their jobs. Good workers are offered new opportunities in their jobs, sometimes in a different city. Families must make the decision. Do they want to take the new job in a new town? Or do they want to give up the opportunity?

The thousands of American families who do decide to move each year may face a difficult time adjusting to a new life. They leave behind a community that they know. They leave behind schools that they trust and friends and family members whom they love. They leave behind a church or religious group. They leave behind a web of supports that helps keep a family strong.

In a new town, children and parents can become lonely. This loneliness strains a family. For example, the area of the United States where people move the most often, the Southwest, also is the area with the greatest number of divorces.

People in the United States know how hard moving can be, so they try to lessen the strain for these families. Many neighborhoods form groups to make newcomers feel at home. Teachers in schools also have meetings to welcome new students. These teachers might pair a new student with a "buddy"—another student to help the new student.

Some children and parents mature from meeting new people and living in a new place. These experiences can bring families closer together.

Americans are actually moving less often than they did 20 years ago. In 1960, about 20 percent of the population moved. In 1987, about 18 percent of the population moved. These people moved shorter distances, too. Almost 90 percent of the people who moved in 1987 stayed within the same state. In families in which both parents are working, a family may decide not to move because one parent would have to give up his or her good job.

FAMILY VIOLENCE

Not all families learn to work out their problems. Sometimes family problems can explode into violence. Twenty percent of all murders in the United States involve people who are related. Often people learn violence from their mothers or fathers. These people repeat the vicious pattern by abusing their children or beating their wives. There are also cases of wives abusing their husbands. Violence in the family is a serious problem in the United States, as it is in many countries.

People are looking for answers. One solution is to arrest people who abuse members of their family. Traditionally, police in the United States hesitate to interfere with family problems. However, the shame of an otherwise law-abiding man being arrested for hurting his wife has been shown to be effective in stopping him. Many cities and towns in the United States also offer "safe homes" in which an abused person can find shelter. Help is also available for parents who abuse their children. By working together in groups, parents can learn how to break the pattern of hurting their children.

STRONG FAMILIES

In a perfect world, families would have no problems. Parents would know how to rear their children to be responsible adults. Americans and others throughout the world are trying to learn what makes strong families. Perhaps families can learn how to solve their problems. Researchers at the University of Nebraska have found some answers. Strong, happy families share some patterns whether they are rich or poor, black or white.

Strong, happy families spend time together. After dinner, for example, happy families may take walks together or play games. Strong families also talk about their problems. They may even argue so that problems can be resolved before they get too big. Members of strong families show each other affection and appreciation. Members of strong families are also committed to one another and they tend to be religious. Finally, when problems arise, strong families work together to solve them.

The values that Americans cherish, such as democracy and economic and social freedom, are values that Americans want for their families. Americans work hard to make their families successful. Today, however, families are changing, but they are not disappearing. Americans accept that strong, happy families come in many sizes and shapes.

Suggestions for Further Reading

Berger, Brigitte and Peter L. Berger. The War Over the Family: Capturing the Middle Ground. New York: Anchor/Doubleday, 1984 (c1983).

College of Home Economics, Iowa State University. Families of the Future: Continuity and Change. Ames, Iowa: Iowa State University Press, 1983.

Gordon, Michael, ed. The American Family in Social-Historical Perspectives. 3rd ed. New York: St. Martin, 1983.

Levitan, Sar A. and Richard S. Belous. What's Happening to American Families?: The Family and Its Discontents. Baltimore: Johns Hopkins University Press, 1981.

Scott, Donald M. and Bernard Wishy, eds. America's Families: A Documentary History. New York: Harper and Row, 1982.

THE LAW AND THE JUDICIARY

"Equal Justice Under Law." These words, which affirm that the United States is a nation governed according to law and that that law protects and directs the actions of all people equally, are carved in marble, high overhead, on the front of one of the most significant buildings in Washington, D.C. The four-story marble building, in the style of an ancient Greek temple, is the one in which the Supreme Court of the United States does its work.

The Supreme Court consists of a chief justice and eight associate justices, and the responsibility and power of these nine people are extraordinary. Supreme Court decisions can affect the lives of all Americans and can change society significantly. This has happened many times in the course of American history. In the past, Supreme Court rulings have halted actions by American presidents, have declared unconstitutional—and therefore void—laws passed by the Congress (the government's lawmaking body), have freed people from prison and have given new protection and freedom to black Americans and other minorities.

[Photo caption: The United States prides itself on being a nation of laws. The Supreme Court, which considers cases involving the interpretation of the meaning of the U.S. Constitution, is the country's highest and most powerful court. (James K.W. Atherton, The Washington Post)]

The Supreme Court is the court of final appeal and it may rule in cases in which someone claims that a lower court ruling on a Federal law is unjust or in which someone claims there has been a violation of the United States Constitution, the nation's basic law.

THE COURT SYSTEM

There are many federal courts in the system which has the Supreme Court as its head. In addition, each state within the United States has established a system of courts, including a state supreme court, to deal with civil, criminal and appellate proceedings. There are also county and city courts. Even many of the smallest villages, those in which only a few hundred people live, have a local judge, called a "justice of the peace," who handles minor legal matters. There are separate military courts for members of the armed forces and other specialized courts to handle matters ranging from tax questions to immigration violations.

In the United States, a person accused of a crime is considered to be innocent until he or she is proven guilty. The Constitution requires that any accused person must have every opportunity to demonstrate his or her innocence in a speedy and public trial, and to be judged innocent or guilty on the basis of evidence presented to a group of unbiased citizens, called a jury. A person who has been judged guilty must still be treated justly and fairly, as prescribed by law. A person treated unjustly or cheated by another or by a government official must have a place where he or she can win justice. That place, to an American, is a court.

ROLE OF THE CONSTITUTION

American concern for justice is written into the basic law of the land, the United States Constitution, which establishes the framework for the federal government and guarantees rights, freedom and justice to all.

The Constitution, written in 1787, established a government of three branches. One of these is the judicial branch, and the Supreme Court of the United States is the most powerful part of it.

The other two branches of the national government are the legislative, which consists of a Congress of elected representatives of the people, and the executive, headed by the president as chief of state. The people who designed this government and wrote the Constitution distributed power among the three branches so that no one person or group of people in the government could exercise enough power to control the others. The procedure for naming justices to the Supreme Court is one example of how this distribution of powers, called "checks and balances," works.

The chief justice and the associate justices are named by the president. This authority represents great power, considering the major effect court decisions have on the legal system and on society in general. The writers of the Constitution tried to make certain, however, that presidents would name only qualified justices and also that they could not remove justices with whose decisions they disagreed. This insures the independence of the judicial branch. For that reason, no one can become a member of the court unless the upper house of Congress—the United States Senate— approves. The Senate does not approve an appointment until its members are satisfied that the candidate is qualified. Once approved, a justice cannot be removed by either the president or the Congress without very good reason, nor can the salary of the justices be reduced. The chief justice and associate justices, therefore, serve on the court for life and need not—and should not—take into consideration political issues or the opinions of officials in the other branches of government when making legal decisions.

WHAT THE COURT DOES

The main work of the Supreme Court is to make the final decision in legal cases in which a charge of violation of the Constitution is made. The Constitution gives certain powers to each branch of the federal (national) government. It also gives certain powers to the governments of the states, creating a federal system in which power is divided between national and state authorities. Whenever a charge is made that a person or agency in any part of the federal or a state government has broken the law, the Supreme Court may eventually be asked to decide the case. When it does, the decision itself becomes law.

Most cases—and some of the best-known— that come before the Supreme Court involve charges that individual rights or freedoms have been violated. Such cases arise because the Constitution guarantees these rights and freedoms to everyone.

Most of the rights and freedoms that Americans enjoy are guaranteed in 10 short paragraphs amended (added) to the United States Constitution in 1791. These first 10 amendments make up "the Bill of Rights." They guarantee freedom of speech, freedom of religion, freedom of the press and freedom to assemble in public and to ask the government to consider grievances. Among the other guarantees are the right in criminal cases to be judged in a public trial by an impartial jury, to be represented by a lawyer at one's trial and freedom from cruel or unusual punishment. Because of the Bill of Rights, police cannot stop and search or arrest a person without good reason, nor can they search anyone's home without clear cause and the permission of a court.

Elsewhere, the Constitution recognizes other rights. A very important one is the right to "due process." That means that no one can be deprived of life, liberty or property unless all proper legal procedures have been followed. Police, government officials and agencies and judges must be very careful not to omit or shorten these prescribed legal procedures in any case. No one person, group of persons or institution can be deprived of even the most minor legal right by the enactment of a law, by official action, by arrest, or in the course of a trial.

The importance to Americans of the Constitution, the law and the principles of equal justice is best understood through discussion of some cases that the Supreme Court has decided. While this discussion does not cover all the types of cases that come before the court, it shows the variety of decisions the court makes.

CHILDREN AND SCHOOLS

Most schools in the United States below the college level are public schools, though there are some private schools. Public schools are paid for by tax money and free to those who attend them. Each state has its own public schools for the children who live in the state. Rules for operating the schools are made by the state government, by lower-level school districts or by city governments in the cities where the schools are located. The federal government usually has no right to determine how the schools should be run. That doesn't mean, however, that schoolchildren do not have rights guaranteed by the federal Constitution. They have, as the following examples illustrate:

• For many years, public schools in some states were segregated. Some were open only to white children, while black children attended their own "separate but equal" schools. Plessy v. Ferguson, a Supreme Court decision of 1896, accepted the justice of this arrangement and ruled against those who argued that all public schools should be open to students of both races.

In 1954, the father of a black girl living in Kansas decided that it was wrong that his daughter could not attend a school near their home because the school was for white children only. Instead, she had to walk much farther to a school for black students. The father also believed the Constitution was being violated because he considered the education offered in the distant school for black children to be inferior to that offered in the white school, and he took the case to court. The Constitution guarantees equal rights to all, and says no state can offer privileges to some people without offering these privileges to others. In 1954, the Supreme Court was asked to decide whether the girl's Constitutional rights were being violated because she was forced to attend a separate and—as claimed by her father—inferior school. In this case, Oliver Brown v. Board of Education of Topeka, Kansas, the court ruled in favor of the girl's father and several other individuals who joined the case and against the state educational system. Since that time, black children have had the right to attend school with white children in all states. Deliberately segregated public schools are illegal.

• Many people from other countries enter the United States illegally. Among them are people from Mexico and other Central American countries who cross the border in order to find work in the United States. One result of this illegal border crossing is that many children who are not citizens of the United States live in states such as Texas, New Mexico and California, which border Mexico.

People who enter the United States legally and who intend to become citizens enjoy nearly all of the rights of American citizens. Officials of the state of Texas believed, however, since educating children in public schools is very expensive, the children of people who came there illegally didn't necessarily have the right to an education paid for by public tax money. In 1975, the lawmakers of Texas passed a law stating that children of illegal aliens could not attend Texas public schools. Some people in Texas thought the law was unjust. They sued the state of Texas and the case eventually reached the Supreme Court. The Supreme Court ruled that the law deprived people of equal rights—and since that decision no state has been allowed to deny public school education to any child.

RIGHTS OF THE ACCUSED

Many cases that come before the Supreme Court involve charges that the police or a judge has violated the rights of a person accused of a crime. It doesn't matter whether the person actually committed the crime or not; the Supreme Court does not rule on the guilt or innocence of those accused, but only on whether or not laws and legal procedures conform to the Constitution. The Court rules on whether the individual's right to due process— the proper and correct handling of a legal case—has been violated. If it has, the person must go free, possibly to stand trial again with due process guaranteed. Here are two major cases of this type:

• In 1961, a Florida man named Clarence Gideon was arrested by police as he stood near a small store into which someone had broken earlier and stolen some beer. Gideon was arrested because another man said he saw the theft take place. Gideon was not represented by a lawyer in court. He claimed he was innocent, and tried to act as his own lawyer. The witness succeeded in convincing the jury that Gideon was guilty, and Gideon went to prison. Gideon read law books in the prison library and then wrote to the Supreme Court, saying he had been denied the right to be represented by a lawyer. The Court ruled that Gideon was correct. It said that people who are accused of serious crimes must have lawyers to defend them, even if they cannot afford to pay such lawyers. In that case, the state must pay the lawyer's fee.

• In 1963, a man named Ernesto Miranda was arrested in the state of Arizona. As police questioned him, Miranda confessed to a kidnapping and rape. His confession was cited as evidence against him at his trial. Miranda appealed to the Supreme Court. He claimed his rights had been violated because the police had not told him he could remain silent or that he had a right to be represented by a lawyer. The Supreme Court agreed that Miranda's rights had been violated and his conviction was overturned. Ever since, police have been required to inform arrested people that they do not have to answer questions and that they have the right to be represented by a lawyer.

PRESIDENTS

Even the most powerful official in the United States, the president, can have his actions declared illegal by the Supreme Court. One of the best-known examples is a 1952 case involving President Harry S Truman. In 1952, armed forces under the control of the United Nations, those of the United States among them, were fighting a war in Korea. Those forces depended on supplies from the United States. In early 1952, the union to which steelworkers belonged announced a nationwide strike of the steel industry. As president, Truman was also supreme commander of the armed forces. In that capacity, he ordered the government to take over the operation of all steel plants so that the supply of steel for the war effort would not be cut off. The Supreme Court ruled that he could not do this. It stated that only Congress has war powers, and not the president. It said the president did not have the legal right to control any industry.

RELIGION, SPEECH AND PRESS

American concern for freedom of religion, press and speech is reflected in the hundreds of cases that have come before the Supreme Court:

• A well-known Supreme Court case of the early 1960s involved a woman named Madalyn Murray, who believed that freedom of religion also meant the freedom not to have a religion. Mrs. Murray felt it was wrong that in the city of Baltimore, Maryland, public schoolchildren were required to read from the Christian Bible. The Supreme Court agreed with Mrs. Murray. It ruled that the First Amendment to the Constitution requires the state to be neutral in its relations with believers and nonbelievers. Thus, any religious exercises in public schools are unconstitutional.

The ruling in the Murray case was one of many that have caused great controversy. Religious people were offended that the court had decided that a public school—run by a government—could not require Bible readings. Other rulings voided laws that required prayers. (Prayer in religious schools is protected by the Constitution because such schools are run privately and not by a government.)

• A man named Eddie Thomas worked in a factory in which military material for the government was manufactured. Thomas worked in a part of the factory which did not make military material. One day, he was transferred to a department producing military material, despite his claim that his religion forbade him to do anything involving the making of weapons. He was told he couldn't continue to work for the company if he refused to take the new job. Thomas then left his position and went to a state government office to claim unemployment payments, which are made to people who lose their jobs through no fault of their own. He was told he couldn't receive the payments because he had quit his job for no good reason. The Supreme Court, in 1981, ruled that the government office was wrong. It could not force him to go back to work in violation of his religion and his conscience.

• In 1971, two major United States newspapers began publishing a history of American involvement in the war in Vietnam (in Southeast Asia). The history was in the form of a report prepared for high government officials. It had been stolen from government files and given to the newspapers. The American government went to court to stop the newspapers from publishing the report. The Supreme Court ruled, however, that because the Constitution guarantees freedom of the press the government could not do this—and the newspapers continued to publish installments of the report.

ABORTION

In 1973, in Roe v. Wade, the Supreme Court ruled that, under a right to privacy, the Constitution guarantees women the right to have abortions—to end pregnancy by a surgical procedure within the first three months, and with some restrictions thereafter. Ever since, people who believe that abortion means taking a human life have tried to get the court to overturn that controversial ruling. By the end of its 1991 term, the Court had not done so. But it had let stand some restrictions on a woman's right to an abortion. For example, in 1989, a Supreme Court decision gave state legislatures some leeway in passing laws governing abortions within their borders.

WINNERS AND LOSERS

Not everyone whose case goes before the Supreme Court is a winner. Losers have included prisoners who claimed they were treated unjustly because they were locked up two to a cell built for one. The Supreme Court did not think this "overcrowding" was "cruel and unusual punishment," which the Constitution prohibits.

Another loser was a man who was arrested for calling a policeman a "fascist" and using other abusive language loudly in public. The Supreme Court ruled that freedom of speech does not give people the right to use words that unjustly harm the reputation of another person.

It should also be noted that not all Americans are satisfied with all Supreme Court decisions. Many Americans believe that the court too often "takes the side of the criminals" in declaring proceedings invalid because an accused person's rights have been violated. Others argue, however, that protecting the innocent is the real intent of these rulings, and that it is better to have a few criminals go free than to have one innocent person be jailed.

Not all cases are settled in the Supreme Court. Only a small percentage win the attention of the chief justice and the associate justices. Many cases sent to the Supreme Court are studied by the justices and then sent back to the court or person from which they came. That means that, since a lower court has already ruled on the case, that ruling remains in effect.

Lower courts often hear cases and make decisions that are extremely important to large groups of people. In recent years, for example, Native Americans—better known as American Indians—have gone to courts to have land returned to them. The land may have been taken from them by white people a hundred or more years ago. In one case argued in the 1980s, Indians in the state of Connecticut were awarded nearly 400 hectares of land that had been taken from their people in the 1700s. In the 1980s, the land was owned by the people who lived on it, but the federal government awarded the Indians money to buy back the land and to open their own businesses on it.

CRIME AND DRUGS

Why is such an extensive system of courts necessary? Despite the respect of most Americans for law and the determination of the legal system to protect the rights of individuals, the United States, like all other countries, does experience crime. Especially in large cities, the crime rate can be high.

A high percentage of crime in the United States is directly related to the illegal sale and use of drugs. Drugs are smuggled into the country by organized groups of criminals despite intense efforts by the government to stop the illegal drug trade. Those who become addicted to drug use sometimes rob or break into houses or stores to get money to pay for the drugs.

Drug abuse has caused great concern in the United States. The federal government has worked hard to stop the growing of opium poppies, of coca plants and of cannabis (source of marijuana and hashish) in other nations. It has also set up special agencies, sometimes working with agencies from other nations, to catch the smugglers outside and inside the United States. Teachers and many other citizens work together to teach children about the dangers of drug use. Many government agencies in the states and private citizen groups work to help drug addicts give up their drug use and turn to useful lives.

COPING WITH CRIME

Concern about crime has also led to special government programs and special programs of private citizen groups to stop crime and to help prisoners lead useful lives after their prison sentences end.

In one program, young people are brought into the prisons to talk with prisoners. The idea is that prisoners can do more than any other people to stop young people from turning to crime. The experience of being inside a prison also might have a crime-deterrent effect on the young people.

In some programs, prisoners learn a useful trade so they won't return to crime when they are released. Government programs also encourage private businesses to give young people from poor families jobs so they will be able to earn money legally and will not feel that criminal activity is their only means of getting what they need.

Most states have set up funds to help victims of crimes. This government money, taken from taxes, might help to pay doctor or hospital bills if the victim was injured, or to replace certain types of stolen goods, or to make up for wages lost as a result of having to appear in court to testify against an accused person rather than being at work.

Like travelers in foreign countries everywhere, visitors to the United States frequently worry about the crime rate. A visitor might wonder, "Just how safe will I be?" An American might answer, "I wouldn't worry about that if I were you. Here, as elsewhere, you should be careful—all of us should—but, chances are, nothing will happen to you."

Despite this caution, which includes locking their homes and cars, most Americans do not spend their time worrying about crime. They move freely and live their lives aware that, worldwide, wherever there are many people there is crime, and that by exercising some caution they will probably not have difficulty.

Another fact that an American might point out to a person planning to visit the United States is that there is much less crime in some places than in others. Crime rates differ from city to city. Within cities, crime rates vary from neighborhood to neighborhood. A visitor to almost any large city merely has to ask someone if a particular area is safe to visit. One study, published in 1985, compared the amount of crime in cities of all sizes around the United States. Its conclusion: "Some places are so safe you couldn't pay someone to assault you, while others are just plain dangerous."

Most Americans would also probably point out that the rules for safety in the United States are also rules that one should follow anywhere one travels.

In no country, regardless of its political or economic system, has the problem of crime been solved, though the American people and their government continue to search for ways to create a safe and more just society. One thing is certain. Whatever is done to try to decrease criminal activity, it will be done within the strict rules provided by the Constitution and watched over carefully by the system of courts. Summed up, those rules guarantee that which is most important to the American people: "Equal Justice Under Law."

Suggestions for Further Reading

Friedman, Lawrence M. Introduction to American Law. New York: Norton, 1984.

Friendly, Fred W. and Martha J.H. Elliott. The Constitution: That Delicate Balance. New York: Random House, 1984.

Garraty, John A., ed. Quarrels That Have Shaped the Constitution. New York: Harper and Row, 1964.

Germann, A.C., F.D. Day, and R.R.J. Gallati. Introduction to Law Enforcement and Criminal Justice. Springfield, IL: C.C. Thomas, 1985.

The Supreme Court Historical Society. Equal Justice Under Law: The Supreme Court in American Life. Washington: U.S. Government Printing Office, 1980.
