A game that protects the gamer: SCROL Project’s silent digital sentinel—ADA
The internet is a boundless world where freedom blooms and filters fade. It’s a space that promises connection, knowledge, and opportunity. But like all spaces built without walls, it also invites shadows.
Behind every user on the screen are predators who wear human skin. They don’t need to chase; they wait with money as bait and manipulation as their language. They exploit the very same tools that were meant to empower.
And too often, their targets are children.
When harm hides in hyperlinks and trust is breached through taps and chats, technology must do more than connect—it must protect. This is where the Abuse Detection App (ADA) takes centre stage.
One of the most groundbreaking innovations under the Safety for Children and their Rights OnLine (SCROL) Project, implemented by Terre des Hommes Netherlands and Bidlisiw Foundation, is ADA—a digital tool cleverly disguised as the child-friendly mobile game “Amazing Land of Adventures.”

At first glance, the app looks like any other colourful game designed for young players. It features vibrant graphics, interactive storylines, and a tone that easily captures a child’s attention. But behind this playful façade is a powerful tool designed to detect signs of abuse and exploitation.
“It’s a very interactive game composed of several districts and zones,” explains Judith Pulvera, SCROL Project Coordinator at Bidlisiw Foundation. “It’s a journey actually—ADA’s journey—from home, to community, to school, and internet experiences, exploring inputs of how the user or the player [responds].”
Through a series of carefully crafted scenarios and questions, Amazing Land of Adventures detects red flags, such as consistent responses indicating that a child has experienced maltreatment or feels unsafe in certain environments. These subtle cues embedded within the app help the project assess whether a child may be experiencing abuse and exploitation.
More than a game
Every question has been carefully refined with the help of psychometricians and psychologists, since each one ultimately feeds into a child’s assessment.
“There are scores assigned to the different questions in every game,” Pulvera adds. “The system generates results, showing us a specific behavioural pattern of signs and manifestations.”

Each stage of the game offers more than entertainment—it invites disclosure in a way that feels safe, empowering, and even rewarding.
“And in this game, we have to go through five different stages before we complete the whole game,” explains Charles Vergara, Project Staff at Bidlisiw Foundation. “The goal here is to provide ADA, the companion in the game, what we call the equipment of royalties.”
The stages—Family Town, Internet Village, School Point, Sad Zone, and Dream District—guide children through questions about their home, digital habits, school life, emotions, and dreams. Each answer earns them badges, while their digital companion, ADA, gains a piece of gear, such as a vest or a crown.
“For example, after they finish the Family Town, the player gets a Hope Badge while ADA is given a vest,” Vergara says. “The challenge here for the players is to give ADA the full equipment of royalties.”
Behind the scenes, each answer feeds into a system developed in collaboration with child protection experts. As children move through the different stages of the game, they respond to scenario-based questions. Each answer carries a score, and as they play, the system tallies these scores and looks for unusual, concerning patterns.
These patterns help the system silently flag possible signs of abuse or distress. When results raise concern, social workers and trained facilitators step in to validate the findings by gently speaking with the child, always with consent. This allows the tool to listen in a child-friendly way, offering professionals a starting point for deeper support.
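To make that tallying idea concrete, here is a minimal illustrative sketch in Python of how per-stage answer scores might be summed and checked against a threshold before a case is referred for gentle follow-up. The stage names come from the game itself; the scores, threshold value, and flagging rule are invented for illustration and do not reflect ADA’s actual assessment model.

```python
# Illustrative sketch only: the answer scores, threshold, and flagging rule
# below are hypothetical placeholders, not ADA's real scoring logic.

STAGES = ["Family Town", "Internet Village", "School Point", "Sad Zone", "Dream District"]

# Hypothetical per-stage threshold at or above which a stage raises concern.
STAGE_THRESHOLD = 6

def flag_for_follow_up(answers: dict[str, list[int]]) -> list[str]:
    """Return the stages whose tallied answer scores meet the threshold.

    `answers` maps each stage name to the scores assigned to a child's answers
    in that stage (higher = more concerning, a made-up convention). A non-empty
    result would prompt a social worker to validate the findings with the
    child, always with consent.
    """
    flagged = []
    for stage in STAGES:
        total = sum(answers.get(stage, []))
        if total >= STAGE_THRESHOLD:
            flagged.append(stage)
    return flagged

# Example: consistently concerning answers in one stage trigger a flag.
example = {
    "Family Town": [3, 2, 2],
    "Internet Village": [0, 1, 0],
    "School Point": [1, 0, 1],
    "Sad Zone": [2, 1, 0],
    "Dream District": [0, 0, 0],
}
print(flag_for_follow_up(example))  # -> ['Family Town']
```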
“As we implement the piloting of this application, we’ve realised that we are not just gathering data from the communities… we’re actually helping them [by] letting them express—these children, these users—what they are feeling, what they are experiencing, that they’ve never said to anyone,” Vergara adds.
ADA doesn’t just play; it listens. It listens in ways that adults sometimes miss, in tones too subtle for the untrained ear. And while it may appear as just another game on a child’s screen, it’s quietly rewriting how protection works in a digital age.
From pitch to platform
ADA was created in 2019 during the first CodeRED: A Hackathon Against Online Sexual Abuse and Exploitation of Children (OSAEC), a tech-driven initiative that Mary Grace Quidet helped design.

“I don’t have all the answers to combating OSAEC, but I know that I can create a space where people of different expertise can come together, and where answers to those problems emerge,” Quidet shares.
The Hackathon does just that: it gathers developers, child rights advocates, NGO workers, law enforcement officials, and government leaders into one room and gives them a singular mission—to develop tech-based solutions against online sexual abuse and exploitation of children.
ADA was one of the standout products of the very first hackathon. What began as a pitch has since grown into a live, child-centred innovation with real potential to spot victims.
“It was really progress, it was really put into an actual thing that solves the problem, and it’s growing,” Quidet adds. “Our core is contributing to what makes a good society, and whatever it is that you’re trying to build monetarily is really something that we also need to build humanity.”
As ADA continues to be refined—particularly in ensuring data accuracy—the goal is to turn it into a tool that not only accelerates social work but also becomes essential for NGOs working with children, cementing an impact that extends far beyond its origins as a pitch and offering a roadmap to systemic, sustainable child safety.
Through the SCROL Project, child protection is no longer confined to physical spaces or traditional systems. It moves into the digital realm, where the threats now linger, arming communities with tools like ADA to not only respond, but to recognise, to listen, and to act. In SCROL’s vision, safety is not a privilege, but a right wired into every child’s tech experience.
Author: Bryan G. Fernandez II