Abuse and neglect are the main reasons kids go into foster care: in the UK alone over 65,000 children live with foster families. What these vulnerable children need is a stable home, but too often a foster match falls apart, and these youngsters end up being shifted from family to family in a punishing cycle.
Loubna Bouarfa says she has a solution: she’s using artificial intelligence (AI) to flag less stable foster families who might need extra support—and to help better foster matches be made in the first place.
She’s the founder of British startup Okra, which has used historic data on successful and unsuccessful foster matches to make evolving predictions about the stability of a match. She says it identifies foster pairings that will not break down with 93% accuracy.
A member of the European Commission’s High Level Expert Group for Artificial Intelligence, Bouarfa first presented her methodology at the ISPOR healthcare conference in November 2018. Now she’s launching a pilot programme with Britain’s government authorities to test her AI in the real world.
“If we can keep a child in one family for two years we can ensure they have a future as an ordinary child, even if the placement isn’t always easy,” Bouarfa tells Sifted.
Inspired by healthcare
Bouarfa wasn’t always focused on fixing social care. When she first founded Okra in 2015 her mission was to use AI to improve healthcare outcomes.
The Moroccan-born technologist had completed a PhD on the subject at Delft University in the Netherlands before becoming a research associate at Britain’s Imperial College London. After this, she worked as a consultant for the medical market and spent a stint at AI-powered fintech startup Featurespace (which raised €28m last week).
Okra’s first customers were big pharma companies seeking to benefit from AI analysis of medical data. “We stretched to social care because it is very similar to healthcare in many ways,” Bouarfa explains.
Both healthcare and foster care have a need for early intervention (spotting cancer patients early on, for example, or a problematic foster match), and both benefit from prediction programmes (where analysis of evidence can help prescribe a better drug, or a more stable foster match).
“What excites me about AI is how we can embrace uncertainty, not stigmatise it or be scared of things we don’t know,” says Bouarfa. “We can use the information we have at hand to know what is the most likely path to success.”
Pioneering AI for social care
Excited to expand into the foster care space, Bouarfa sought an introduction with Jim Cockburn, a social worker who founded his own agency Foster Care Associates in 1994. Cockburn went on to lead Okra’s €3.62m Series A round in March 2018: funding which allowed the team to build and test cutting-edge AI systems for foster care from St John’s Innovation Centre in Cambridge (just a few miles from the city’s historic centre).
Okra trained its initial AI on thousands of existing foster care matches. It had access to 10 years of historic data from an (undisclosed) leading foster care agency, and used the first eight years to train the model, and the last two years to check the outcomes of the model.
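The chronological split described above—training on the first eight years and evaluating on the final two—can be sketched roughly as follows. This is an illustrative assumption about the approach, not Okra's actual code; the field names and helper are hypothetical:

```python
def temporal_split(records, train_years=8):
    """Split placement records chronologically: the earliest
    `train_years` of data train the model, the rest evaluate it.
    This avoids leaking future outcomes into training."""
    years = sorted({r["year"] for r in records})
    cutoff = years[0] + train_years
    train = [r for r in records if r["year"] < cutoff]
    test = [r for r in records if r["year"] >= cutoff]
    return train, test

# Toy data spanning ten years, one record per year, mirroring the
# 8-year train / 2-year test split described in the article
records = [{"year": y, "stable": y % 2 == 0} for y in range(2008, 2018)]
train, test = temporal_split(records)
print(len(train), len(test))  # → 8 2
```

A chronological split matters here because a random split would let the model "see" outcomes from later years while predicting earlier ones, inflating the measured accuracy.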
In this process, Okra’s AI learned which factors might result in a breakdown, where a child leaves a foster carer. For example, it learned that placement instability is sometimes higher if a foster carer is inexperienced with children, or if a child has previous failed matches or a history of incidents (things like family fights or running away).
Okra’s report, which Bouarfa unveiled at ISPOR, says its AI is 93% accurate at predicting a successful stable foster care match, and 86% accurate at identifying a foster match breakdown (when a child is moved on).
Okra’s study was based on real-world historical data and outcomes. It wasn’t a live study and so couldn’t be used by real-life social workers to inform their decisions. This is why Okra is now working with select local authorities in England to determine how its AI interprets local data sets and informs real-time decision making.
As of 2019, social workers are being invited to use Okra’s tools on their smartphones to proactively offer more support to less stable matches (in a bid to prevent breakdowns). Local authorities will also be able to use the AI to help better allocate children into stable foster placements. This live pilot will confirm whether Okra’s algorithms are as effective in real life as they are on paper.
If Okra’s UK pilot is successful, Bouarfa plans to scale across the UK and into Europe. However, the founder does not foresee a smooth journey ahead. “For foster care and healthcare, the main bias for scaling is regulation,” says Bouarfa, pointing to GDPR.
Regulation like GDPR, and attitudes around it, made it challenging for Okra to access data: its tools are only able to learn from lists of placements, incidents and success rates and it has few background details about individual children or carers. “Imagine how much more accurate we could be in our suggestions for the social worker if we could see data around age or demographic,” Bouarfa says.
Reducing data regulation for AI businesses is something Bouarfa is currently pushing in Brussels within the European Commission’s High Level Expert Group for Artificial Intelligence. One way the entrepreneur believes Europe can reduce regulation is by making allowances for companies that (unlike Google and Facebook) are focused on “creating a better future” rather than profit. (Bouarfa recently read and recommends The Age of Surveillance Capitalism, by American scholar Shoshana Zuboff, which discusses this topic).
“It’s important to identify the reasons we are building tech: are you linked to sustainable development goals like improving climate sustainability, wellbeing, or education?” says Bouarfa. “I see no point in developing technology that does not add value for our society.”