
Age 25 is Optimal for Screening Adolescents, Young Adults Without Identified Risk Factors for HIV

Although an additional one-time screening at any age between 15 and 30 yields important gains in HIV diagnosis rates and life expectancy for HIV-infected people, screening at age 25 would provide the most favorable clinical outcomes and the best value for money, according to a study in the Journal of Adolescent Health.

For US adolescents and young adults (AYA) without identified risk factors, a one-time routine HIV screening at age 25, near the peak of HIV incidence, would optimize clinical outcomes and be cost-effective, according to a study published in the Journal of Adolescent Health.

Although the CDC recommended in 2006 that everyone aged 13 to 64 be screened for HIV at least once, regardless of risk factors, screening rates in this population remain low. According to the study authors, only 12% of US high school students reported ever being screened in 2005, a figure that rose by just 1 percentage point by 2012. Among older AYA (aged 18-24), the reported screening rate dropped from 37% in 2000 to 30% in 2010.

“Of all AYA aged 13 to 24 living with HIV, 51% are estimated to be unaware of their HIV status, substantially higher than the 13% of HIV-infected US adults estimated to be unaware of their status,” wrote the authors. “People unaware of their HIV infection miss opportunities for treatment and improved individual health, as well as contribute disproportionately to HIV transmission.”

Using the Cost-Effectiveness of Preventing AIDS Complications (CEPAC) microsimulation model, the authors evaluated the cost-effectiveness of alternative strategies for one-time routine HIV screening of AYA aged 13 to 24, layered on top of current US screening and testing practices (13% ever screened by age 18 and 30% by age 24). They simulated HIV-uninfected 12-year-olds without identified risk factors who faced age-specific risks of HIV infection and modeled 5 screening strategies: a one-time screen at age 15, 18, 21, 25, or 30, each in addition to current US practice.
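
The general idea of this kind of analysis can be illustrated with a toy microsimulation. The sketch below is not the CEPAC model: the age-specific infection risks, the diagnosis delay without screening, the cohort size, and the outcome (mean years from infection to diagnosis) are all hypothetical placeholders chosen for illustration, not values from the study.

```python
import random

# Toy microsimulation: follow a cohort of HIV-uninfected 12-year-olds facing
# hypothetical age-specific infection risks, and compare one-time screening
# ages by the average delay from infection to diagnosis.
SCREEN_AGES = [15, 18, 21, 25, 30]                      # strategies compared
RISK = {age: 0.0002 * min(age - 11, 14) for age in range(12, 61)}  # placeholder risks
BACKGROUND_DX_DELAY = 10   # placeholder: years to diagnosis without a screen

def mean_diagnosis_delay(screen_age, n=50_000, seed=1):
    rng = random.Random(seed)
    delays = []
    for _ in range(n):
        infected_at = None
        diagnosed = False
        for age in range(12, 61):
            if infected_at is None and rng.random() < RISK[age]:
                infected_at = age                        # infection occurs this year
            if age == screen_age and infected_at is not None:
                delays.append(age - infected_at)         # caught by the one-time screen
                diagnosed = True
                break
        if infected_at is not None and not diagnosed:
            delays.append(BACKGROUND_DX_DELAY)           # diagnosed later, no screen benefit
    return sum(delays) / len(delays) if delays else 0.0

for screen_age in SCREEN_AGES:
    print(f"screen at age {screen_age}: mean years from infection to "
          f"diagnosis = {mean_diagnosis_delay(screen_age):.1f}")
```

A later screen catches more of the infections that have already occurred, while a very early screen misses most lifetime infections; the study's full model additionally tracks CD4 counts, care continuum stages, costs, and life expectancy.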

In the simulated cohort of 12-year-olds, the projected number of new HIV infections peaked at age 24; however, more than 75% of lifetime infections were projected to occur afterward (mean age, 37.3 years [SD, 16.9 years]). The authors found that any one-time screen added to current practice increased both the projected CD4 cell count at diagnosis and the life expectancy of HIV-infected people. Of the one-time screening strategies, screening at age 25 led to the highest projected mean CD4 cell count at diagnosis (345 cells/μL) and the greatest gains in life expectancy from age 12 (589.82 months, an increase of 4.8 months over current practice).

Each one-time screening strategy detected only a small proportion of lifetime infections (0.1%-10.3%). Compared with current US screening practices, screening at age 25 led to the most favorable care continuum outcomes at age 25: proportion diagnosed (77% versus 51%), linked to care (71% versus 51%), retained in care (68% versus 44%), and virologically suppressed (49% versus 32%). Compared with the next most effective strategy, screening at age 25 provided the greatest clinical benefit and was cost-effective ($96,000 per year of life saved) by US standards (under $100,000 per year of life saved).
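
For context, cost-effectiveness here is judged by an incremental cost-effectiveness ratio (ICER) compared against a willingness-to-pay threshold. The snippet below only illustrates that arithmetic; the incremental cost and life-year inputs are hypothetical placeholders chosen to reproduce the $96,000 figure reported above, not values from the study.

```python
WTP_THRESHOLD = 100_000   # US benchmark cited above, $ per year of life saved

def icer(delta_cost, delta_life_years):
    """Incremental cost divided by incremental life-years versus the comparator strategy."""
    return delta_cost / delta_life_years

# Hypothetical increments for screening at age 25 versus the next most effective strategy
ratio = icer(delta_cost=9_600, delta_life_years=0.10)    # = $96,000 per life-year
print(f"ICER = ${ratio:,.0f} per year of life saved; "
      f"cost-effective at the US threshold: {ratio < WTP_THRESHOLD}")
```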

“Although an additional one-time screen at any age between 15 and 30 led to important gains in HIV diagnosis rates and life expectancy for HIV-infected people, a screen at age 25 provided the most favorable clinical outcomes and the best value for money,” concluded the authors.
