Clinical monitoring as effective as CD4 count and viral load for deciding when to switch to second-line ART in poor countries

Published: 30 April 2008

Modelling study suggests 60% alive after 20 years with access to one second-line regimen

Switching to second-line antiretroviral treatment when HIV-positive patients in resource-limited settings develop new symptoms would be almost as effective in prolonging survival as using CD4 cell counts or viral load test results, according to a mathematical modelling study published last week in The Lancet.

The results diverge from those of the Home Based AIDS Care (HBAC) study in Uganda, which found that individuals who received clinical monitoring alone had a higher risk of new AIDS-defining events or death than patients who received CD4 cell count monitoring in a randomised study.

In an accompanying editorial David Moore of the British Columbia Centre for Excellence in HIV/AIDS and Jonathan Mermin, one of the principal researchers on the HBAC study, note that more research is needed to determine when laboratory monitoring would be most useful.

The issue of laboratory monitoring and when to switch treatment in resource-limited settings was extensively reviewed in HIV & AIDS Treatment and Practice last year.

Laboratory monitoring of antiretroviral treatment

In an ideal world, all patients receiving antiretroviral therapy (ART) would undergo regular viral load testing in order to detect viral load rebound above 50 copies/ml, the current limit of detection of tests in common use.

However, viral load testing is expensive and requires a dedicated laboratory with highly trained staff. Outside national reference laboratories and universities, viral load testing is rarely available to people on ART in sub-Saharan Africa and Asia.

CD4 counts may be easier to measure, and World Health Organization (WHO) guidelines suggest that a CD4 count decline of 50% from the previous peak, or a one-third decline in the previous six months, should be considered a trigger for changing treatment.

But even CD4 counts are out of reach for many clinics, and until cheaper point of care tests for viral load and CD4 cells become available, clinical symptoms are the only indicator available in many resource-limited settings for determining when to change to second-line treatment.

Some clinicians and policy makers have misgivings about expanding access to antiretroviral treatment without these laboratory tests, fearing that patients will develop drug resistance due to prolonged periods of viral replication after a rebound occurs. This could lead to transmission of drug-resistant virus and a poor response to second-line therapy in a context where only a limited number of drugs are available for second-line treatment.

Researchers at the Royal Free and University College Medical School in London, the London School of Hygiene & Tropical Medicine, the University of Copenhagen and the World Health Organization developed a mathematical model that simulated individual patients in order to examine the long-term consequences of different types of treatment monitoring.

At the outset it was assumed that 58% of patients were women, the average age was 30, the median CD4 count was 66 cells/mm3, the median viral load was 5.4 log10 copies/ml (250,000 copies/ml), 32% had a previous history of TB, and all had previously been diagnosed with a WHO stage 4 illness (AIDS). Thirteen per cent had previously used single-dose nevirapine for prevention of mother-to-child transmission.

These assumptions were based on data from existing cohorts.

After the patients started treatment, progression to second-line treatment was then governed by the interplay between baseline viral load, adherence and the number of active drugs in the regimen. The model also incorporated the increased risk of non-HIV related mortality in resource-limited settings.

These factors in the model were programmed based on data from resource-limited settings, derived from meta-analyses of treatment outcomes and adherence in sub-Saharan Africa, Brazil and South-East Asia.

Treatment monitoring was carried out every six months in this model, and patients were switched if:
  • Viral load rebounded above 500 copies/ml or 10,000 copies/ml after at least six months of continuous ART.
  • The CD4 cell count fell by 50% from its peak or 33% within the past six months (after a confirmatory test), and in both cases had fallen below 200 cells/mm3, occurring after at least nine months of treatment.
  • A new WHO stage 3 or 4 event occurred, or two new stage 3 events occurred, at least six months after starting treatment.
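Taken together, the three criteria amount to a simple decision rule applied at each six-monthly visit. The following is a minimal sketch in Python; the field names are illustrative (the model itself is far more detailed), and the clinical criterion is read here as one new stage 4 event or two new stage 3 events:

```python
def should_switch(vl_now, months_on_art, cd4_now, cd4_peak, cd4_6mo_ago,
                  cd4_confirmed, new_stage4, new_stage3_count,
                  vl_threshold=500):
    """Sketch of the modelled switch criteria (field names are illustrative)."""
    # Virological: rebound above the threshold after at least 6 months of ART
    if months_on_art >= 6 and vl_now > vl_threshold:
        return True
    # Immunological: a confirmed fall of 50% from the peak, or 33% in the
    # past six months, with CD4 below 200 cells/mm3, after at least 9 months
    if months_on_art >= 9 and cd4_confirmed and cd4_now < 200:
        if cd4_now <= 0.5 * cd4_peak or cd4_now <= (2 / 3) * cd4_6mo_ago:
            return True
    # Clinical: one new WHO stage 4 event, or two new stage 3 events,
    # at least 6 months after starting treatment
    if months_on_art >= 6 and (new_stage4 or new_stage3_count >= 2):
        return True
    return False
```

Under such a rule, a patient twelve months into ART with a viral load of 10,000 copies/ml would be switched regardless of CD4 count or symptoms.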

Virological failure (viral load above 500 copies/ml) was detected in 16% of patients by year 1, 28% by year 5, 37% by year 10 and 51% by year 20. Eighty-seven per cent of these virological failures would have been detected at year 1 if a viral load threshold of 10,000 copies/ml were used, but only 32% if a new WHO stage 3 or 4 event were used as the trigger for switching. A CD4 cell decline was even less reliable – only 25% of virological failures would have manifested a 33% decline in CD4 cell count in the previous six months.

A similar pattern held true at years 2, 3, 4 and 5.

However, when survival rather than detection of treatment failure was used as the outcome by which the success of the monitoring strategy was measured, there was no difference between any of the monitoring strategies.

Rather than calculating the proportion of patients alive at specific time points, the researchers expressed the outcomes as the percentage of a possible 20 years of life that patients lived.

Strategy                      5 years   10 years   20 years
Viral load                    83%       77%        67%
CD4                           82%       75%        64%
WHO stage 3 or 4 symptoms     82%       76%        61-68%*

*Three different measures were compared – only at year 20 was substantial variance observed.
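To put these percentages in absolute terms, they convert directly into life years lived out of the 20-year horizon. A small arithmetic sketch (the strategy labels are shorthand for the table rows; the clinical range reflects the three measures compared):

```python
HORIZON = 20  # maximum possible life years in the model

# 20-year percentages from the table above
life_years_pct = {
    "viral load": 67,
    "CD4": 64,
    "clinical (low)": 61,
    "clinical (high)": 68,
}

# Convert each percentage into absolute life years lived
life_years = {name: HORIZON * pct / 100 for name, pct in life_years_pct.items()}
for name, years in life_years.items():
    print(f"{name}: {years:.1f} life years of a possible {HORIZON}")
```

So the difference between viral load monitoring and CD4 monitoring at 20 years amounts to roughly 0.6 life years per patient.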

Resistance risk with different switching strategies

The relatively high rates of survival in the clinical symptoms switch group appear to have come about despite very high projected rates of resistance – and a lack of third-line treatment. Patients took a first-line regimen of d4T/3TC/nevirapine and then switched to lopinavir/ritonavir plus AZT/ddI.

Eighty-three per cent of those who switched at a viral load of 500 copies/ml were projected to have nevirapine resistance, and 26% had at least one d4T- or AZT-associated (thymidine analogue) resistance mutation. In those who switched due to clinical symptoms (new WHO stage 3 and 4 events), 48% had three or more thymidine analogue mutations.

But the number of active drugs available for the second-line regimen was only modestly lower for those who switched according to clinical criteria than for those who switched at a viral load above 500 copies/ml: 2.37 versus 2.71 active drugs. This was despite the fact that those who switched on clinical criteria had spent an average of four years on a failing regimen.

Survival outcomes did not differ greatly when tenofovir was used in first-line therapy instead of d4T.

The authors note that second-line regimens that do not contain AZT (using abacavir or tenofovir instead) might have greater activity in the presence of thymidine analogue mutations, and so would be expected to do somewhat better. However they warn that since most data on drug resistance have been gathered in populations infected with HIV-1 subtype B, more data are needed on resistance patterns and responses to second-line therapy in populations where other subtypes - particularly C, D, E and the recombinant forms - predominate.

However, when the risk of resistance was analysed according to life years lived with a mutation, there was little difference between the strategies. But when the researchers calculated the proportion of life years lived with a resistance mutation at a viral load above 1000 copies/ml – the level at which people are likely to be capable of transmitting drug-resistant virus to others – the switch threshold of viral load above 500 copies/ml carried a substantially lower potential risk of onward transmission than the other strategies.

Cost-effectiveness

Clinical monitoring was considerably more cost-effective than viral load monitoring. The cost per life year gained of clinical monitoring, using the occurrence of a new WHO stage 4 event or multiple stage 3 events as the switch trigger, was $927. Viral load monitoring cost $1500 per life year gained with a switch threshold of 500 copies/ml, and $4011 with a threshold of 10,000 copies/ml.
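For orientation, the reported per-life-year costs can be compared directly. The ratios below are derived here from the figures above, not taken from the paper:

```python
# Reported costs per life year gained, in US dollars
cost_per_life_year = {
    "clinical monitoring": 927,
    "viral load, switch at >500 copies/ml": 1500,
    "viral load, switch at >10,000 copies/ml": 4011,
}

# Express each strategy's cost as a multiple of clinical monitoring
baseline = cost_per_life_year["clinical monitoring"]
for strategy, cost in cost_per_life_year.items():
    print(f"{strategy}: ${cost} per life year ({cost / baseline:.1f}x clinical)")
```

By this measure, viral load monitoring cost roughly 1.6 times as much per life year gained at the 500 copies/ml threshold, and over four times as much at the 10,000 copies/ml threshold.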

The authors conclude: "In summary, our results suggest that use of ARV therapy without monitoring of viral load or CD4 cell count does not have marked detrimental effects on patient survival or on development of resistance. This finding is particularly relevant in view of the limited array of anti-retroviral combinations available to the developing world. Access to ART should be expanded to all settings as rapidly as possible; lack of access to laboratory monitoring should not be allowed to hinder this process."

Reference

Phillips AN et al. Outcomes from monitoring of patients on antiretroviral therapy in resource-limited settings with viral load, CD4 cell count, or clinical observation alone: a computer simulation model. The Lancet 371: 1443-1451, 2008.

HATIP #107, 30th April 2008

This content was checked for accuracy at the time it was written. It may have been superseded by more recent developments. NAM recommends checking whether this is the most current information when making decisions that may affect your health.