Wages and hours

Wages are the price paid for work. They are usually figured by the hour or by the week.

Wages

Wages are the main source of income for most people in the United States. Wages may be classified as money wages and real wages. Money wages are the actual amount of money a worker receives from an employer. Real wages represent the amount of goods and services workers can buy with their money wages. Because the prices of goods and services may change sharply over time, economists compare real wages, which adjust for changing prices, to determine how workers' purchasing power changes.
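As a rough illustration of how such adjustments work (the article itself gives no formula), real wages are commonly computed by deflating money wages with a price index such as the consumer price index:

```latex
% Illustrative sketch, not a formula from the article; the choice of a
% base year where the price index equals 100 is an assumption.
\text{real wage} \;=\; \frac{\text{money wage}}{\text{price index}} \times 100
```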

Real-wage comparisons are especially important over long periods of time. Over such periods, money wages may rise sharply while real wages rise only slightly or even decline. For example, from 1985 to 1995, money wages increased by 33 percent while real wages actually declined by about 5 percent. Thus, workers could buy more goods and services with their average weekly wages in 1985 than in 1995.
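The arithmetic behind that example can be sketched as follows; the roughly 40 percent rise in prices is inferred from the figures above rather than stated in the article:

```latex
% Inferred worked example: if money wages rose 33 percent while prices
% rose about 40 percent, the real-wage index is
\frac{1.33}{1.40} \;\approx\; 0.95,
% a decline of about 5 percent.
```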

The growth of real wages over time matters much more than the growth of money wages because real-wage levels determine the purchasing power of workers. A main contributing factor in the growth of real wages is growth in productivity. When more goods and services are produced without an increase in the cost of production, prices stay low and wages can buy more. Productivity growth is measured by output per worker-hour, the amount of goods and services an average worker produces in one hour. Output per worker-hour increases as workers become more skilled, and as machinery, tools, and factories become more efficient.
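A minimal definitional sketch of this measure, using illustrative labels that are not the article's own wording:

```latex
% Definitional sketch; "total output" and "total worker-hours" are
% illustrative terms, not taken from the article.
\text{output per worker-hour} \;=\; \frac{\text{total output of goods and services}}{\text{total worker-hours worked}}
```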

From 1909 to 1950, output per worker-hour in the United States rose by an average of 2 percent annually. From 1950 to 1969, it increased at an average annual rate of 2.8 percent. However, from 1969 to 1989, the growth slowed to an average rate of 1 percent yearly. In 1980, 1982, 1989, and 1990, worker productivity actually declined. Since 1990, output per worker-hour has risen by an average of about 2 percent a year.

Since the 1940’s, employers have spent an increasing percentage of their labor costs on fringe benefits rather than on take-home pay. The most popular fringe benefits include pension plans, medical and dental insurance, paid holidays, and paid sick time. Employers usually consider benefits a substitute for wages rather than an additional contribution to workers.

Hours.

Before the Industrial Revolution, most people worked on farms, where the workday ran from sunrise to sunset. Factory operators tried to enforce the same hours during the Industrial Revolution of the 1700’s and early 1800’s, despite the differences in working conditions and the type of work. Gradually, the 10-hour day and the 6-day week became the normal working period in U.S. and European factories.

Labor began its demands for an 8-hour day in the mid-1800’s. But the 8-hour day did not become common in the United States until after World War I (1914-1918). During the 1930’s, the 5-day, 40-hour workweek came into general practice in the United States. This practice has changed little through the years. By the early 1980’s, the average workweek was 35 hours. Flexible work scheduling, called flextime or flexitime, began in West Germany in 1967 and spread to the United States during the 1970’s. Flextime workers may choose their own daily work hours, within certain limits, as long as they work the required number of hours per week. Most flextime systems require all employees to be present during a period called the core hours.