Public Health Milestones
The average life expectancy of a child born in the US in 1900 was 50 years. By 2010, this number had increased 60% to 80 years.1 Biomedical, epidemiological, and behavioral research, combined with public policy, played a major role in this gain. Below are major public health milestones of the last 100 years.
When polio cases peaked in the late 1940s and early 1950s, the fear was so great that parents were afraid to let their children play outside. Thirty-five thousand Americans were afflicted each year, and over half a million people globally died or were paralyzed by polio every year.2
Millions of people donated to the March of Dimes Foundation to advance research on the poorly understood poliovirus. The Foundation funded Dr. Jonas Salk’s research on a polio vaccine at the University of Pittsburgh. His inactivated polio vaccine (IPV) was approved in 1955 and drastically reduced the number of polio cases in the US within the first few years.3
March of Dimes also supported Dr. Albert Sabin’s work on an oral form of the vaccine (OPV) that was approved for use in 1963 and contributed to the eradication of the disease from the US. In 1970, Dr. Sabin gave rights to the OPV to the World Health Organization, and together, the Salk and Sabin vaccines led to a 99% reduction in worldwide polio cases.4 Now, much of the world is polio-free and we are closer than ever to eradicating the disease once and for all. Introduction of the polio vaccine has saved the U.S. an estimated $180 billion in treatment costs for the disease.5
An improved Salk polio vaccine is included as part of routine childhood vaccination and is often combined with the vaccine against diphtheria, tetanus, and pertussis (DTaP). Routine childhood vaccines protect against 15 diseases. The CDC estimates routine vaccination against these diseases will prevent over 700,000 deaths among children born in the last 20 years, with a net savings in direct medical costs of nearly $300 billion.6
LEGACY OF POLIO
Children suffering from polio served as a major catalyst for the establishment of physical therapy (PT) as a field in the early 20th century. In response to polio epidemics in 1917 and wounded soldiers from WWI, the Army Medical Department created a training program in muscle training and corrective exercise that bred the first pioneers in PT. The field has since gone on to play essential roles in helping wounded veterans, AIDS patients, and those suffering from injuries.7
Smoking increased drastically during the 20th century. Amid controversy and skepticism on the harms of smoking, epidemiologists used case-control studies and statistics to first link smoking with lung cancer deaths in the mid-1900s. Pathologists and scientists soon confirmed the effects of smoking on the lung and heart.
A national commission was convened at the National Institutes of Health, spurred by the American Cancer Society (ACS), American Public Health Association (APHA), and others, to review the scientific literature of over 7,000 articles. In 1964 the Surgeon General issued a landmark report concluding that smokers had a 70% higher mortality rate than non-smokers. The report was widely circulated in the media, appearing on the front pages of newspapers and as a lead story on television and radio.
Still, much work remained in reducing smoking rates. In 1965, 42% of adults smoked; by 2013 this rate had been cut by more than half, to 18%. Smoking rates also fell among teens, from 36% in 1997 to 19.5% in 2009. Public policy and taxation played an important role. Cigarette advertising was banned from television effective 1971. In recent years, the FDA banned flavored cigarettes and mandated larger graphic warning labels. By 2010, 25 states and the District of Columbia had smoke-free laws covering restaurants, bars, or workplaces, and the average tax on a pack of cigarettes increased from $0.42 in 2000 to $1.44 in 2010. Additionally, mass media campaigns can be extremely effective: a 2012 CDC media campaign led to an estimated 100,000 smokers quitting.
Lung cancer, for which smoking is the greatest risk factor, is still one of the leading causes of death in the world. Likewise, chronic obstructive pulmonary disease (COPD) is the 3rd leading cause of death in the U.S. Read our COPD fact sheet to learn more.
Heart disease has been the leading cause of death in the US since 1921, when it surpassed pneumonia.11 This marked a transition in the major type of public health challenges facing America from infectious disease to chronic disease.
The death rate from cardiovascular disease (CVD), which includes heart disease, peaked in 1968 and has since steadily declined. Thanks in large part to improved medications, treatments, and public health interventions, there are 1.2 million fewer deaths each year than if the rate had remained the same.12
Up until the mid-20th century, little was known about the epidemiology of cardiovascular disease. In 1948 the Framingham Heart Study (FHS) – under the direction of what is now the NIH’s National Heart, Lung, and Blood Institute (NHLBI) – commenced an ambitious health research project to identify risk factors for CVD. Researchers followed over 5,000 people from Framingham, Massachusetts over decades using physical exams and extensive interviews. The study, now on its third generation of participants, linked CVD to risk factors such as high cholesterol, high blood pressure, and smoking.13
The FHS and other epidemiological studies helped change the way disease was thought about, introducing the concepts of both risk factors and prevention. Intervention studies beginning in the 1960s showed that targeting these risk factors with exercise, diet modification, and later on, aspirin, could put a dent in the disease.14 By the early 1980s it was clear that an intervention strategy was needed not just to target those at high risk, but also the overall population.
The NHLBI established National Education Programs focusing on blood pressure and cholesterol in the 1970s and 80s, respectively. These initiatives continue to issue reports based on scientific literature and increase the public’s awareness of the risk factors for CVD, such as sodium intake, through national campaigns and collaborations with local and state health departments.15
In addition to public policy and education initiatives, major advances in medical treatments, such as stents and balloons used in angioplasty, helped cut heart disease deaths by nearly 60% in the last three decades.16 Statins, first developed by Merck in the mid-1970s, have helped lower blood levels of the harmful LDL component of cholesterol. Recent studies are helping to target statin therapy to those most benefited by it, using genetic screening.17
We are in the midst of an obesity epidemic. The number of obese children tripled from 1970 to 2008, presenting a major challenge as this population ages.18 In addition to biomedical research into diabetes, obesity, and heart disease, public policy and educational initiatives are working to combat the epidemic. New York City banned all trans fats from restaurants in 2008, and in 2015 the FDA banned artificial trans fats from processed foods, steps which are expected to decrease coronary heart disease.19 First Lady Michelle Obama’s “Let’s Move!” initiative has brought attention to the need for healthier school meals and more physical activity for children.
At the turn of the 20th century there was little that could be done for cancer except complete surgical removal. In 1937 Congress established the National Cancer Institute (NCI), and in 1956 NCI researcher Dr. Min Chiu Li demonstrated for the first time that chemotherapy could cure a metastatic malignant cancer.
In 1970 cancer became the 2nd leading cause of death in the US. The following year, President Richard Nixon signed the National Cancer Act, which supported “basic research and applications of the results of basic research to reduce the incidence, mortality and morbidity from cancer.” Mary Lasker, a longtime champion of medical research, played a large role in supporting the legislation.
The first widely-used cancer screening tool was the Pap smear to detect cervical cancer. Though it was developed by researcher Dr. Georgios Papanikolaou at Cornell University in 1928, it wasn’t until the American Cancer Society (ACS) endorsed the test in the 1950s that the Pap smear gained wide acceptance in the medical field. The Pap smear contributed to a 74% decrease in deaths from cervical cancer from 1955 to 1992.20 Still, 12,000 women each year are diagnosed with cervical cancer in the US.21 In recent years, two vaccines were developed against human papillomavirus (HPV), the main cause of cervical cancer, offering hope that most cases will one day be preventable.
Screening also played a major role in the decline in colorectal cancer deaths. Colon cancer is the 2nd leading cause of cancer death for both men and women.22 In order to promote colorectal screening, the CDC launched the “Screen for Life” national campaign in 1999. Thanks to public health campaigns like this, older adults were nearly twice as likely to get a colorectal cancer screen in 2005 than in 1987.23 Better cancer treatments, early detection techniques, and preventative measures have particularly benefited children, with a 66% decrease in childhood cancer deaths in the past 40 years.24
Despite major advances, cancer remains the 2nd leading cause of death in the US. A study from the NCI projects the annual cost for treating cancer to rise to over $150 billion by 2020.25 Recent advances offer promise in the arenas of targeted therapy, immunotherapy, robotic surgeries, and identifying genetic links to cancer. Read our cancer fact sheet to learn more about the need for more investment for research into the disease.
In the early 1980s little was known about AIDS, and a diagnosis was a virtual death sentence, with an average survival time of 18 months. Great progress was made in 1984, when researchers at the Institut Pasteur in Paris and the National Institutes of Health (NIH) in the US independently identified a retrovirus as the cause of AIDS. The virus was later named human immunodeficiency virus (HIV); the disease itself, initially called gay-related immune deficiency (GRID), had already been renamed AIDS.
Education, testing, and partner notification were important interventions in the early days. Much of this burden fell to local and state health departments, doctors, and community groups.26 The first drug against HIV changed the landscape when it was approved by the US Food and Drug Administration (FDA) in 1987. AZT, which inhibited the reverse transcriptase needed by the virus to replicate, was developed by researchers at the National Cancer Institute in partnership with Burroughs Wellcome Company, now GlaxoSmithKline, and Duke University.
Still, HIV deaths continued to rise, and AIDS became the leading cause of death for all Americans aged 25-44 in 1994. The next year the FDA approved the first protease inhibitor, ushering in the age of Highly Active Antiretroviral Therapy (HAART) and a sharp decrease in deaths.27 Now there are over 20 FDA-approved anti-HIV drugs, which are used in different combinations. The Department of Health and Human Services (HHS) provides guidelines to doctors on proper use of these drugs. Thanks to HAART, HIV is now a manageable chronic disease: a 20-year-old diagnosed today can expect to live a near-normal life span.
There has been much fear and misinformation surrounding the disease, especially in the 1980s and early 1990s. Public service announcements (PSAs) and national campaigns played an important role in informing the public. In 1988 the Surgeon General sent a brochure about AIDS, based on CDC guidelines, to every household in the largest public health mailing at the time.28
New HIV infections peaked in the mid-1980s at around 130,000 per year and dropped to around 50,000 per year in the 1990s, where the rate has remained stable.29 The use of antiretroviral drugs by at-risk individuals not infected with HIV, known as pre-exposure prophylaxis (PrEP), was approved in 2012.30 Numerous clinical studies were crucial to changing perceptions in the medical field about the use of PrEP. It is hoped this new tool, combined with education about safe sex practices, will further decrease HIV incidence.
Infection rates are much higher in certain populations in the U.S., such as black gay men, and are at epidemic levels in certain pockets of the country. Young people aged 20 to 24 make up one-fourth of new infections, meaning there is much work to be done in prevention and education. Of those infected with HIV, 86% are diagnosed and only 37% are on anti-retroviral therapy (ART).31
In addition to blindness, convulsions, muscle cramps, and weakness, lead poisoning can cause long-term impairments of intelligence, hyperactivity, and learning and behavioral problems. Children are particularly susceptible because of their tendency to put things in their mouths and because their bodies absorb lead more readily; even very small amounts can cause developmental disorders.
Although exposure to lead had been connected to health problems for over a century, attempts at regulation met strong resistance until research from Harvard Medical School in the 1970s linked lead exposure to lowered IQ in children. This catalyzed federal action, with the CDC issuing guidelines for diagnosing and treating lead poisoning in children. The Environmental Protection Agency (EPA) moved to phase lead out of gasoline, and all gas was unleaded by 1986; Congress banned lead from interior house paints in 1978. Before these bans, 88.2% of young children had blood lead levels (BLLs) considered elevated and at risk. By the early 1990s only 9% were above the level of concern, and the overall mean BLL of children had fallen 78%.32
In one of the most remarkable achievements of public health, the infant mortality rate decreased by over 90% from 1915 to 1997.33 Medical treatments, improvements in nutrition, and public health all made major contributions. For example, sudden infant death syndrome (SIDS) decreased by over 50% between 1994 and 1999 in the wake of a national educational campaign instructing mothers to place sleeping babies on their backs.34
Birth defects are the leading cause of infant mortality and account for 20% of newborn deaths.35 In addition, neural tube defects -- such as spina bifida -- can cause major disability and high healthcare costs throughout life. Recognition of the need to track and study birth defects advanced rapidly in the mid-1990s. In 1996 Congress directed the CDC to establish the Centers for Birth Defects Research and Prevention.
Based on numerous research studies, the CDC and Public Health Service (PHS) recommended in 1992 that all women of childbearing age in the US take daily folic acid supplements to reduce the risk of birth defects. The March of Dimes and CDC launched educational campaigns about the importance of vitamin supplementation in very early pregnancy. In 1998, in a landmark decision, the FDA required all enriched grain to be fortified with folic acid. Research found that neural tube defects, such as spina bifida, declined 35% compared to the years directly prior to fortification.36
In the years between 1997 and 2008, the prevalence of developmental disorders in children increased 17% with a 33% increase in ADHD. Even more worrying, autism increased nearly four-fold.37 Researchers are making progress, identifying genetic and environmental risk factors for the disorder, but much work remains to be done.
Although largely preventable, tooth decay is one of the most common chronic conditions among children in the US. One in four children in poverty lives with untreated tooth decay.38 In addition to pain and eating and speaking difficulties, dental cavities can lead to life-threatening complications, including infection.
Epidemiological research at the NIH first documented the link between high fluoride levels in groundwater and resistance to dental cavities in the 1930s. Fluoridation of community water began in the US in 1945 and spread throughout the country by the mid-20th century.39 In 2012 nearly 75% of the US population was drinking water with sufficient levels of fluoride to prevent tooth decay. Studies have found that fluoridation of water reduces tooth decay by around 25% in children and adults.40 Between fluoridation of water and toothpaste, dental sealants, and better overall dental care, rates of cavities fell precipitously in the second half of the century with the average number of teeth decayed, filled, or missing in 12-year-olds decreasing from 4 in the late 1960s to 1.3 in the early 90s.41 Additionally, the percentage of elderly Americans who were toothless declined from nearly half in 1960 to 13% in 2012.42
The 21st century heralded a new era for dentistry in which the link between the mouth and overall health is increasingly appreciated. In 2000 the Surgeon General issued the first report on “Oral Health in America”, which highlighted the connection between oral health and general health and the existing disparities in dental care: 80% of cavities occur in just 25% of children.43 Disease-causing bacteria can lie dormant in reservoirs in the oral cavity, leading to chronic infection and possibly predisposing individuals to cardiovascular disease.44 Research into the well-established link between gum disease and heart disease is ongoing.
Health disparities in dental care have persisted since the Surgeon General’s 2000 report. In 2015, the CDC reported its latest findings that untreated tooth decay was nearly twice as high in black adults as in white adults. Hispanic adults also had higher levels of untreated cavities than white adults.45 As a recent study found that preventable oral health conditions accounted for over 830,000 emergency room visits each year, addressing health disparities like these could provide incredible returns on investment.46 Read our fact sheet to learn more about childhood oral health.