AI-Powered Chatbots Bringing Sex Education to the Global South
In a small, remote village in Jharkhand, in eastern India, an adolescent girl hesitates over a question she would never dare ask her mother: “When will I get my period?”
Instead, she picks up the cellphone she shares with her family, opens WhatsApp, and types her question for Disha Didi, a chatbot that feels like a trusted didi, Hindi for older sister. Seconds later, it responds.
This experience mirrors what thousands of women across the Global South, including India, Lebanon, Congo, and Latin America, are now doing — turning to AI-powered chatbots for sexual and reproductive health and rights (SRHR) information that they cannot find at home or in classrooms.
The rise of these tools underscores a critical gap in information and education. Young people are turning to chatbots because schools, families, and health systems have failed to provide even the most basic knowledge about sexual and reproductive health.
A report by the philanthropic NGO Dasra notes that 71% of adolescent girls in India are unaware of menstruation before their first period, leaving them unprepared, anxious, and at risk of health complications. Across much of the Arab world, sexual and reproductive health education is either absent, considered taboo, or filtered through layers of misinformation.
This lack of reliable guidance leaves young people struggling to make informed decisions about their health and rights. Where traditional systems fall short, digital “counselors” like India’s Disha Didi and Lebanon’s Ask Aunty are offering culturally grounded, discreet, and reliable guidance on topics from menstruation to contraception.
Disha Didi from India
Since 2000, the Ipas Development Foundation (IDF), an NGO focused on sexual and reproductive health, has worked to improve access to safe abortions. Over time, they saw that young women in rural Indian communities, constrained by societal norms, minimal family support, and scarce health care resources, were missing access to essential knowledge and services.
To reach them, the nonprofit trained youth leaders to share SRHR information and connect women to public health services. When COVID-19 disrupted in-person outreach, the team created Disha Didi, a WhatsApp chatbot that delivers information directly to youth communities.
“There was an unmet need,” explains Pallavi Lal, who manages digital intervention at IDF. “Among youth of 15 to 24 age group, there was a lack of knowledge and agency, and a stigma around seeking information on sexual and reproductive health.”
The chatbot was designed for young women in rural areas, providing information in Hindi and Bengali across the states of Assam, Madhya Pradesh, Jharkhand, and West Bengal. It doesn’t send notifications, keeping conversations private from family members.
“A key feature of this expanded chatbot was its collaborative development process,” Lal explains. “A team of gynecologists helped develop the content, and we worked closely with youth community members to ensure that the bot addressed specific local needs and contexts.”
Using these insights, and with guidance from health experts, the team built a question bank of over 20,000 queries to ensure that the chatbot responds accurately.
Even the name, Disha Didi, meaning an older sister who offers guidance, was chosen by the community of adolescent women.
Users can interact with the chatbot in three ways: navigate topics through a menu, type open-ended questions, or connect with a human counselor for more detailed guidance. Since its launch in 2020, the chatbot has reached 29,000 unique users, handling over 72,000 conversations. Nearly a third of these chats focus on menstruation, while questions about adolescence and sexually transmitted infections make up 23% and 22% of interactions, highlighting the topics young people are most curious about.
Ask Aunty from Lebanon
When the team at Raseef22, a Lebanon-based media organization, noticed a spike in SRHR searches on their website, they realized young people across the region were turning to them for answers. “A third of our organic clicks last year came from SRHR-related queries,” explains Rokaya Kamel, who oversees AI integration at Raseef22.
“But we didn’t want to just put out more articles. And people [may not] end up reading it.”
To address the lack of accessible and engaging SRHR content, the team created Ask Aunty, an AI-powered chatbot developed in collaboration with Google News Initiative’s JournalismAI Innovation Challenge. The chatbot draws only from Raseef22’s editorial archive and trusted partner content, a deliberately limited dataset, to keep information reliable, relevant, and free from the biases that come with open-source material. It is also programmed to admit when it doesn’t know an answer and, in those cases, to direct users to trusted medical resources.
The team chose to give the chatbot the persona of a witty 56-year-old Egyptian aunty who provides answers in a warm and conversational tone. Line Itani, product and communication manager at Raseef22, explains that Egyptian Arabic was selected as the chatbot’s language because it is the most familiar and accessible dialect across the Arab world.
“We created her to be an older aunty figure — someone with experience, someone who’s seen things. It also feels culturally appropriate to have her speak to these issues, as opposed to a twenty-something. We had to be culturally sensitive throughout the process,” Itani adds.
Other cultural nuances that went into developing the chatbot persona were the tone and expressions. “The tone of Ask Aunty was very important, because it’s the difference between creating a safe space or alienating users,” explains Kamel. “We had to make sure she sounds like a real person.”
The team also chose to use medical terminology rather than everyday colloquial language, which can carry cultural biases. Itani explains that SRHR topics are often clouded by misinformation, such as the cultural practice of female genital mutilation (FGM). The word khatan (implying cut) attempts to equate FGM with male circumcision, which obscures the severe and harmful consequences of FGM. Similarly, the term taharat al-banat (purification of girls) equates the practice to incorrect notions of being clean or pure. By deliberately using the medical term for FGM (tashwih al-a'daa al-tanasuliya al-unthawiyya), the team is moving away from language that reinforces cultural biases or misconceptions.
“These are sensitive issues that have been normalized in society, so we have to address them carefully and break misconceptions bit by bit,” Itani says. As with AI in other languages, the Raseef22 team faces ongoing challenges, continually testing and training the system to ensure that Arabic sentences are grammatically correct and linguistically natural.
Currently in beta, the chatbot will soon be available on both web and app, giving young people across the Arab world a private, reliable space to ask questions and access guidance they have long lacked.
These chatbots are empowering young people to access information and make informed choices that traditional systems have failed to provide. Their rise highlights both the urgent need for SRHR education and the potential of technology to meet it.