The first published report of what would ultimately become known as HIV and AIDS appeared in the Centers for Disease Control and Prevention (CDC) Morbidity and Mortality Weekly Report in June 1981.1 When these initial cases emerged, very little was known about the disease; it did not have an agreed-upon name, researchers had not yet determined what caused it, there were no tests or recognized treatments, and by the time most patients presented with symptoms, they had only months to live.2
In the United States, where the disease was first seen in already marginalized communities—including men who have sex with men (MSM), people who inject drugs, and people who exchange sex for money—the public and policymakers were slow to respond to this new health threat.3-5
The history of the HIV epidemic is one of remarkable scientific discovery and fierce advocacy in the face of adversity, but it is also marked by stigma, discrimination, and disparities based on race, sexual orientation, and socioeconomic status. There is hope, as scientific advancements have turned HIV from a death sentence into a manageable chronic condition, but there is still neither a cure nor a vaccine.2
In September 1982, the CDC labeled the condition acquired immunodeficiency syndrome (AIDS), and by 1984, researchers had identified the cause as a virus, later named human immunodeficiency virus (HIV). Researchers also determined how the virus spread and what it did to the immune system. By 1985, there was a test that could detect HIV in people before they developed symptoms.2,6
Treatment advances came next. In 1987, the antiretroviral drug azidothymidine (AZT), now called zidovudine, offered a glimmer of hope. It was moderately effective at prolonging the lives of people living with HIV. Importantly, it was also found to reduce mother-to-child transmission when taken during pregnancy. Combination therapies pairing two antiretroviral drugs were introduced in the early-to-mid 1990s. These were more effective at limiting the amount of virus in the body, but they often required people to take more than 20 pills a day and cope with numerous side effects.2,7,8
A major breakthrough came in 1996 with the introduction of highly active antiretroviral therapy (HAART), a combination of multiple drugs, including a newly developed class of antiretrovirals called protease inhibitors.2 HAART quickly became the standard of care in the US, and in the following year, AIDS-related deaths declined by 47%. Over the next decade, improvements in these therapies, including single tablets that combined multiple medications, reduced the daily pill burden. Testing also continued to improve, with the first oral test approved in 1994, the first at-home testing kit approved in 1996, and the first rapid test approved in 2002.6
Additional medical advancements, most notably the advent of pre-exposure prophylaxis (PrEP) medicine in 2012, have brought us closer to the goal of ending HIV transmission. PrEP is a prevention strategy for people at risk of HIV that may involve a prescription medicine along with safer sex practices to reduce that risk.2 Research shows that PrEP medicine is highly effective when taken as prescribed. The CDC recommends PrEP medicine for those at high risk of HIV, and the agency continues to invest in other prevention strategies, including promoting condom use and syringe service programs. Uptake of PrEP medicine, however, has been lower than hoped.2,9-11
While there is no cure for HIV, scientific advancements have driven significant progress toward ending the epidemic.2
HIV was first identified in MSM, which compounded the discrimination this community was already facing. It quickly became clear, however, that the emerging epidemic was affecting other communities as well. The first cases of HIV in women were identified in 1983, and in 1986, the CDC reported that AIDS was disproportionately impacting Blacks/African Americans and Hispanic/Latino Americans.6,12
Early in the epidemic, misinformation was abundant. The public worried that they could become infected through casual contact like shaking hands, sharing a drink, or using public restrooms.4,13 In 1985, a teenager named Ryan White became the face of AIDS discrimination when his school barred him from attending classes. White had hemophilia and had contracted HIV from a blood transfusion, and the school feared he would infect classmates. His court battle against the district made national news, helped educate the public about how HIV was and wasn’t spread, and raised awareness of the discrimination people living with HIV faced every day.6,14,15
Still, public health messages throughout the 1990s and early 2000s focused on personal behavior, whether “promiscuous” sex or sharing needles for drug use.3 This framing fueled blame and further marginalized the communities most impacted by HIV, and it meant that people living with HIV, as well as those perceived to be at higher risk, faced increased discrimination in areas like medical care, housing, and employment.16,17
In the face of this stigma, many of the communities most impacted by HIV became fierce advocates for their own needs. Rooted in the LGBTQ+ rights movement, the HIV community has a history of activism that has shaped the course of the epidemic.
In 1983, a group of activists met at a gay and lesbian health conference and discussed the importance of self-determination for those most impacted by HIV. The document they created became known as The Denver Principles. It rejected terms like “AIDS victims” and “AIDS patients” that implied defeat or helplessness, introduced the phrase “people with AIDS” to the lexicon, and enumerated ways that people could support those most impacted by the epidemic.6,18
Throughout the 1980s and 1990s, groups like GMHC (formerly the Gay Men’s Health Crisis), ACT UP (the AIDS Coalition to Unleash Power), NMAC (formerly National Minority AIDS Council), the Latino Commission on AIDS, and the Black AIDS Institute (BAI) formed to offer direct services, provide education, amplify the voices of marginalized communities, and advocate for laws and policies that could help those suffering most. They fought for increased investment in research, care, prevention, and other services that people living with HIV needed, like housing and employment programs.19-24
These groups and others also pushed hard for a coordinated government strategy to address HIV in this country. In 1995, President Bill Clinton established the Presidential Advisory Council on HIV/AIDS (PACHA), which included AIDS activists and leaders of advocacy groups.6,25,26 Still, it wasn’t until 2010 that the federal government, under the Obama administration, released the first comprehensive National HIV/AIDS Strategy. Members of those communities most impacted by the epidemic were involved in crafting this plan.6,27
AIDS activists have now spent decades moving the government, industries, and individuals forward, and their contribution to the progress we have made cannot be overstated. Moreover, AIDS service organizations across the country continue to play an important role in making sure people have access to the prevention, treatment, and care they need.
In 2021, the AIDS epidemic turned 40 while the world was immersed in another pandemic. The milestone shed light on the tremendous scientific progress made in understanding, treating, and preventing HIV. An infection that was once a death sentence is now a manageable chronic condition. While there is no cure, with care and treatment as directed by a healthcare provider, people living with HIV can live longer, healthier lives.28,29
Now we must all focus our attention on reducing HIV-related discrimination and disparities and breaking down the barriers to equitable care, so that those most in need have access to these advancements in prevention and treatment.