Stepping into the AI-driven future of HIV prevention and care

Rouella Mendonca, Director of AI Product at Audere, at IAS 2025. Image: Roger Pebody.

Artificial intelligence (AI) featured prominently at the 13th International AIDS Society Conference on HIV Science (IAS 2025) in Kigali, Rwanda last week. Experts expressed excitement – and some caution – about how advances in digital technology and AI could reshape HIV prevention and care services.

The promise of AI

Solange Baptiste, Executive Director of advocacy group ITPC, said the goal of big data is ‘intelligence’ – “the ability to turn vast, diverse data into timely, actionable insight that improves decisions, outcomes and equity.”

AI could make a meaningful difference to HIV prevention and care, she said. It can process massive volumes of health data far faster than any human, identifying patterns, predicting outbreaks, optimising supply chains and personalising care.

This could increase efficiency and precision, which is of particular importance during a time of unprecedented funding cuts for HIV services. Baptiste emphasised that big data should be used to “see the unseen” – predicting issues such as loss to follow-up and service breakdowns, thereby allowing for earlier responses, better funding allocation, and ultimately, empowering communities by providing solutions to everyday challenges. The centrality of communities is something that Baptiste kept returning to: “Data without people isn’t intelligence. It’s noise.”

Shawn Malone, a project director at Population Services International, gave some specific examples of digital health solutions. Importantly, these solutions do not address only clients’ pain points or needs, but also those of healthcare providers and governments.

Malone described Coach Mpilo, a WhatsApp-based chatbot in South Africa that supports clients and answers their questions while they await HIV test results. It can also explain viral load results – clarifying the difference between viral suppression and non-suppression, and why suppression prevents transmission (U=U). Because the chatbot takes on a persona and converses much as a human would, it can deliver this information in a more compelling and motivating way than a simple internet search.

For clients, dead waiting time during a lengthy clinic visit – a result of high caseloads in the South African public health system – could be filled by interacting with a chatbot, which may help the client prepare for the consultation. During rushed consultations, important questions are not always addressed – and are not always welcomed by healthcare providers with full waiting rooms to attend to. The chatbot could also play an important role after the visit, especially if the client leaves confused.

Digital health solutions may also ease pressure on healthcare providers, who may feel more confident with multi-month dispensing and less frequent in-person monitoring, knowing that a client has an additional source of support. For governments, they offer a readily available source of accurate health information and expand access in a scalable, sustainable manner.

Introducing Aimee and MARVIN, the HIV AI bots

Rouella Mendonca, Director of AI Product at Audere, presented data on Aimee, an AI chatbot for adolescent girls and young women, aged 16 to 24, in South Africa. Importantly – and in line with Baptiste’s recommendations – Aimee has been co-designed with youth from across the country. Only a few months in, Aimee has over 1,500 active monthly users who have exchanged over 30,000 messages; 40% of users come back month after month to chat to her. Aimee is built using off-the-shelf large language models, such as ChatGPT and Gemini.

One of Aimee’s advertisements describes her as “your personal AI bestie, right at your fingertips… ready to chat anytime about the stuff you don’t want to Google or ask out loud”. Accessed via WhatsApp, Aimee can provide advice on PrEP, pregnancy worries, relationships, HIV and gender-based violence. The ad says that “nothing’s too awkward, Aimee’s heard it all.”

Beyond providing accurate information, Aimee is designed to build trust through empathic responses. This has proven effective – in only her third conversation with one user, the young woman confided that she was contemplating taking her own life. “Trust isn’t instant,” Mendonca acknowledged. “Behaviour change doesn’t happen on a schedule. It takes time for people to feel safe, to open up and to reveal true barriers they may be facing.”

Aimee builds trust through sentiment detection, progressive disclosure and active listening. If Aimee detects that a user is depressed, she deprioritises collecting and sharing information and instead focuses on listening. She is patient, but probes gently when needed to reveal deeper layers. Navigation protocols ensure that she refers users to a healthcare provider for serious issues, such as suicidal ideation or sexual assault.
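
As a rough illustration of how such a navigation protocol can work, the sketch below routes each incoming message to an escalation, listening or information-sharing path. Everything here – the class names, keyword list and threshold – is a hypothetical simplification for illustration, not Audere’s actual implementation, which was not described in code-level detail.

```python
# Hypothetical sketch of a chatbot navigation protocol: route each message to
# escalation, active listening or normal information sharing. Illustration
# only; this is not Audere's actual code.
from dataclasses import dataclass
from enum import Enum, auto


class Route(Enum):
    ESCALATE_TO_PROVIDER = auto()  # serious issues, e.g. suicidal ideation
    ACTIVE_LISTENING = auto()      # deprioritise information collection/sharing
    SHARE_INFORMATION = auto()     # normal Q&A flow


# Illustrative keyword list only; a production system would use a trained
# classifier rather than string matching.
CRISIS_MARKERS = ("end my life", "kill myself", "suicide", "assaulted")


@dataclass
class Message:
    text: str
    sentiment: float  # -1.0 (very negative) to +1.0 (very positive)


def route(message: Message) -> Route:
    lowered = message.text.lower()
    if any(marker in lowered for marker in CRISIS_MARKERS):
        return Route.ESCALATE_TO_PROVIDER
    if message.sentiment < -0.4:  # hypothetical distress threshold
        return Route.ACTIVE_LISTENING
    return Route.SHARE_INFORMATION


print(route(Message("I feel so alone lately", sentiment=-0.7)))
# Route.ACTIVE_LISTENING
```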

User interactions revealed clear patterns: users tend to delay important disclosures, asking functional questions initially before returning days later with deeper concerns. Users often ‘stack’ topics, for instance starting with menstruation but ending with HIV concerns. These insights informed the design of the chatbot, so that Aimee elicits and provides information gradually, without rushing the interaction.

“You are the first person I ever became so open to,” one young South African woman told Aimee. While a quick informational exchange might be only three messages long, deeper discussions can extend to over 40 messages, indicating a gradual building of trust. Research showed that clients raise topics such as gender-based violence and relationships much more frequently with Aimee than with nurses; HIV and condom use are also discussed more often with her.

After building trust, Aimee moves towards attitudinal change and action. Around a quarter of all clients who have engaged with her have taken up offered services such as HIV testing, contraception, PrEP or social work support.

Mendonca concluded by urging developers to “build tools that listen, not just tools that work”.

“If we design for trust – specifically in the margins – we don’t just expand access, we change care experiences,” she said.

Sebastian Villanueva Guzman, a Master’s student in Biomedical Engineering at Polytechnique Montréal, presented data on MARVIN, an existing chatbot designed to provide information on HIV self-management and related topics in Canada. When presented with insulting language or messages expressing suicidal ideation, however, MARVIN would typically respond by saying that he didn’t understand and asking the user to rephrase the query.

MARVIN was therefore adjusted so that he could respond appropriately to insults and expressions of self-harm. An initial model was trained to differentiate between messages with neutral, positive, negative and very negative sentiment. A second AI model then classified negative and very negative messages as self-harm, insults or ‘normal’. The ‘normal’ category (for example, a treatment-related question) was important to teach MARVIN precisely which types of messages constituted self-harm and insults.
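
Conceptually, the two models form a cascade: every message passes through the sentiment model first, and only negative or very negative messages reach the finer-grained second classifier. A minimal sketch of that flow follows; the two classify_* functions are keyword stubs standing in for the trained models, whose architecture the presentation did not specify.

```python
# Sketch of MARVIN's two-stage classification cascade as described at IAS 2025.
# The classify_* functions are keyword stubs standing in for trained models.
from typing import Literal

Sentiment = Literal["positive", "neutral", "negative", "very negative"]
Category = Literal["self-harm", "insult", "normal"]


def classify_sentiment(text: str) -> Sentiment:
    """Stage 1: label the overall sentiment of a message (stub)."""
    lowered = text.lower()
    if "hopeless" in lowered or "can't go on" in lowered:
        return "very negative"
    if "fed up" in lowered or "useless" in lowered:
        return "negative"
    return "neutral"


def classify_negative(text: str) -> Category:
    """Stage 2: sort negative messages into self-harm, insult or normal (stub)."""
    lowered = text.lower()
    if "hurt myself" in lowered or "can't go on" in lowered:
        return "self-harm"
    if "useless" in lowered:
        return "insult"
    return "normal"  # e.g. a frustrated but legitimate treatment question


def handle(text: str) -> str:
    if classify_sentiment(text) in ("negative", "very negative"):
        category = classify_negative(text)
        if category == "self-harm":
            return "supportive reply with emergency contact information"
        if category == "insult":
            return "guidance on avoiding insulting language"
    return "usual HIV self-management answer"


print(handle("I feel hopeless and can't go on"))
# supportive reply with emergency contact information
```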

For the first model, across 3,750 messages, MARVIN classified sentiment correctly around 85% of the time. However, he tended to misclassify negative messages as neutral, often because those messages were ambiguous in tone.

For the second model, across 3,000 messages, MARVIN correctly classified self-harm, insults and normal messages around 95% of the time for each class, indicating high sensitivity. There was still some confusion between the insult and normal classes, however: ironic insults (“Managing HIV is super chill – just a never-ending quest for meds, paperwork, and people who still don’t get it. Totally effortless”) were incorrectly classified as normal messages.
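
Per-class figures of this kind are read off a confusion matrix: for each class, sensitivity is the share of its true messages that the model labelled correctly. The worked example below uses invented counts, since the actual confusion matrix was not published.

```python
# Per-class sensitivity (recall) from a confusion matrix. Rows are true
# classes, columns are predicted classes. Counts are invented for
# illustration and are not MARVIN's actual results.
classes = ["self-harm", "insult", "normal"]
confusion = [
    [95, 1, 4],   # true self-harm messages
    [1, 94, 5],   # true insults: a few ironic insults predicted as 'normal'
    [0, 3, 97],   # true normal messages
]

for i, cls in enumerate(classes):
    sensitivity = confusion[i][i] / sum(confusion[i])
    print(f"{cls}: sensitivity = {sensitivity:.0%}")
# self-harm: sensitivity = 95%
# insult: sensitivity = 94%
# normal: sensitivity = 97%
```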

In a small pilot with six participants (three people with HIV, two engineers and one healthcare provider), MARVIN successfully generated appropriate responses: emergency contact information for messages expressing self-harm ideation, and guidance on avoiding insulting language.

Next steps for MARVIN’s development include being trained to detect more signs of psychological distress, such as markers of depression and anxiety, which may enable him to become a more well-rounded digital companion for people with HIV.

Will AI simply replicate current inequities?

In one of the many sessions on AI at IAS 2025, an audience member asked about HIV stigma in AI systems stemming from misinformation in training data, developer bias in code, and other sources. What’s more, as some topics have become no-go areas in some settings, is there a risk that an AI bot might say: “Sorry, I can’t talk about trans issues or abortion”? This could isolate users, entrenching marginalisation and stigma.

Baptiste touched on some of these issues: biases in data can easily translate into biases in care. She spoke of the “data poverty” that arises when underinvestment in community data systems renders certain populations – especially key populations – invisible to AI systems. “If we don’t include the lived experience, AI will reproduce the blind spots. So what we have is something nice and fancy and shiny and new that’s just reproducing the inequities we see – and doing it much faster,” Baptiste said.

References

Baptiste S. Intelligent Health Monitoring: Why Community Data Must Be Part of the System. 13th International AIDS Society Conference on HIV Science, Kigali, symposium SY10, 2025.


Guzman Villanueva D S. The development, evaluation, and user testing of the AI-based MARVIN chatbot’s integrated mental health management module. 13th International AIDS Society Conference on HIV Science, Kigali, abstract OAD0602, 2025.


Malone S. Integrating AI into HIV Care Pathways: Supporting Self-Care, Continuity, and Quality. 13th International AIDS Society Conference on HIV Science, Kigali, satellite SAT34, 2025.

Mendonca R. Designing for Impact: Behavioural Insights from AI-Driven HIV Self-Care. 13th International AIDS Society Conference on HIV Science, Kigali, satellite SAT34, 2025.
