
AI + Restorative Justice: From Release to Renewal

Updated: Jul 30

“Justice is ultimately connected with the way people’s lives go, and not merely with the nature of the institutions surrounding them.” 1


Restorative Justice, at its heart, is society’s promise that every person will be treated with dignity, given a fair chance to thrive, and protected from arbitrary harm. It binds communities together in mutual trust. When justice systems fall short of that promise because of entrenched bias, excessive punishment, or sheer neglect, the result can be fractured social trust, diminished public safety, and a cycle of disadvantage, especially for those already on society’s margins.


What is “AI in Restorative Justice”?

Artificial Intelligence (AI) in justice refers to the application of machine-learning and rule-based systems to support or automate processes across policing, courts, and corrections. Across the globe, justice systems are already putting AI to work, and the momentum is growing fast. More than 30 countries now run pilot programs that embed AI into policing, court administration, and corrections. Early evidence is promising: machine‑learning triage tools have trimmed court case backlogs by as much as 30%, while data‑driven risk‑reduction platforms are linked to 10–20% drops in re‑offending. Together, these gains show how thoughtfully deployed AI can speed fair outcomes while clearing a path for more humane, community‑focused reintegration 2,3. AI tools are commonly used for:

  • Predictive Policing: Identify crime hotspots proactively.

  • Risk Assessments: Estimate an individual’s likelihood of re-offending.

  • E-discovery: Rapidly sift through millions of legal documents.

  • Plain-Language Translation: Convert complex legal jargon into clear, accessible language.

  • Sentencing and Diversion Recommendations: Suggest suitable rehabilitation programs or sentencing guidelines.

  • Mental Health and Emotional Support: AI-driven chatbots and sentiment-analysis dashboards are being piloted in correctional facilities, providing self-help exercises and tele-psychiatry services 4.


Why This Matters: Breaking the Cycle of Recidivism

While AI technologies hold significant promise for improving restorative justice outcomes, justice systems continue to grapple with persistent challenges such as staff shortages, bureaucratic complexity, and geographic disparities in care, factors that collectively impose immense human costs. If current trends continue, states will spend approximately $8 billion to re-incarcerate people released in 2022 alone 5.

Each year, over half a million individuals in North America leave incarceration facing significant, interconnected barriers:

  • Employment: Early employment post-release drastically reduces recidivism. Yet many employers categorically reject applicants with criminal records, perpetuating unemployment 6.

  • Housing: Many individuals with criminal records are denied housing and pushed into precarious living situations such as shelters or unstable accommodations 6. Formerly incarcerated people are 10× more likely to experience homelessness than their peers, a disparity further worsened by the criminalization of homelessness. Temporary shelter rules, anti‑vagrancy laws, and having no fixed address dramatically increase the chances of arrest or missed parole appointments, perpetuating the cycle of re‑incarceration 7.


  • Transportation: Suspended licenses, unaffordable fares, and patchy bus or train service leave many returnees stranded; more than one‑third say they cannot secure a car, and nearly a quarter struggle with public transit, making it harder to reach jobs, treatment, or parole check‑ins 6.

  • Stigma & Mental Health: Isolation, unresolved trauma, and public mistrust can pull people back toward crime. Mental health struggles often show up as rule‑breaking rather than as a direct call for help, hiding the true depth of need. The first year after release is especially dangerous: suicide rates in this period are about double those of the general population. To meet that risk, re‑entry programs must be culturally and racially responsive and guarantee continuity of care and long‑term support. Regular mental‑health screenings, tele‑health check‑ins, and 24/7 AI helplines are not extras, but essential safeguards that save lives and reduce public costs 8,9.

  • Digital & Legal Navigation: Complex parole conditions and online job or benefits applications presume constant internet access and digital literacy, creating additional obstacles for many formerly incarcerated individuals 9.


Addressing these hurdles is crucial to improve individual outcomes and reduce the broader societal costs of re-incarceration, fractured families, and ongoing poverty and crime cycles.

Imagine a system that meets individuals at the very moment of release, instantly offering personalized mental health support and clear pathways to essential resources. Responsibly deployed AI solutions can make this vision a reality, transforming the transition from incarceration into a meaningful opportunity for rehabilitation and reintegration 6.


Building Trust: Why It’s Important

As developers and innovators, we are naturally optimistic about technology’s potential for positive impact. But we have to be honest about the context we’re working in: the criminal justice system has long emphasized punishment over restoration, meaning it often prioritizes control and compliance rather than healing harm, rebuilding trust, and supporting successful re‑entry. When new technologies (AI or otherwise) are dropped into that environment without a shift in approach, they tend to inherit and even amplify existing punitive patterns. These dynamics lower trust, reduce uptake, and can widen disparities across different groups who interact with the system.

That is why co‑creation is critical. Co‑creation means engaging justice‑impacted individuals, families, community service providers, front‑line officers, case workers, legal advocates, and policymakers throughout problem definition, model design, and evaluation, not just at the end for ‘feedback.’ With this approach, the tech shifts from policing people to supporting them.


Some safeguards to keep in mind as solutions are built in this space (a rough sketch of what the bias‑audit piece could look like follows the list):

  • Explainability: every decision is traceable.

  • Human‑in‑the‑Loop: no critical outcomes are fully automated.

  • Continuous Bias Audits: benchmarked by race, gender, and disability.

  • Privacy by Design: end‑to‑end encryption, zero‑knowledge storage.

  • Compliance Alignment: GDPR, HIPAA/HIA, local open‑records laws.
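
To make the bias‑audit safeguard concrete, here is a minimal sketch of what a recurring audit could look like, assuming a table of past decisions with columns for demographic group, the model’s high‑risk flag, and the eventual outcome. The column names, the 0.05 tolerance, and the audit() helper are illustrative assumptions, not a prescribed standard or any vendor’s actual tooling.

```python
# Illustrative sketch of a recurring bias audit: compare false-positive rates
# across demographic groups and flag gaps above a chosen tolerance.
# Column names ("group", "predicted_high_risk", "reoffended") and the 0.05
# tolerance are assumptions for this example, not a prescribed standard.
import pandas as pd

def false_positive_rates(df: pd.DataFrame, group_col: str) -> pd.Series:
    """False-positive rate per group: flagged high risk but did not re-offend."""
    did_not_reoffend = df[df["reoffended"] == 0]
    return did_not_reoffend.groupby(group_col)["predicted_high_risk"].mean()

def audit(df: pd.DataFrame, group_col: str = "group", tolerance: float = 0.05) -> dict:
    rates = false_positive_rates(df, group_col)
    gap = rates.max() - rates.min()
    return {
        "per_group_fpr": rates.round(3).to_dict(),
        "max_gap": round(float(gap), 3),
        "flag_for_review": bool(gap > tolerance),  # hand to a human reviewer
    }

if __name__ == "__main__":
    # Tiny synthetic example; a real audit would pull each month's decisions.
    data = pd.DataFrame({
        "group": ["A", "A", "A", "B", "B", "B"],
        "predicted_high_risk": [1, 0, 0, 1, 1, 0],
        "reoffended": [0, 0, 0, 0, 0, 1],
    })
    print(audit(data))
```

In practice, benchmarking by race, gender, and disability also means handling small group sizes and intersecting categories carefully; the point of the sketch is simply that the audit runs automatically on a schedule and routes anything suspicious to a human reviewer rather than acting on its own.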



Some great examples we’ve seen across the world include: 

  • Family Connection: Platforms like Pigeonly leverage VoIP technology and automated account matching to significantly reduce the cost of maintaining family communication during incarceration, which is critical to successful reintegration 10.

  • Education and Employment Training: APDS provides secure tablets with GED preparation, résumé-building tools, and coding tutorials. Currently serving over 80 U.S. correctional facilities, APDS’s cloud-based platform tracks individual progress and encourages vocational achievements 11.

  • Immediate Resource Access: Paraguay’s conversational AI bot, "Eva," assists justice-impacted women by seamlessly guiding them through ID recovery, legal aid, and trauma counseling within a single communication thread 12.

  • Continuity of Care: Cross-system data platforms like the Health and Re-entry Project (HARP) automatically notify community health services of an individual’s impending release, schedule necessary appointments, and pre-authorize prescriptions, transforming traditionally fragmented care into seamless support 13.

  • Reentry Guidance: SherlockAI is an always‑on, text‑based guide that helps people coming out of jail or prison, and their families, find clear, reliable answers in seconds 14.

  • Mental Health Support: AI chatbots provide incarcerated individuals with 24/7 Cognitive Behavioral Therapy (CBT), motivational interviewing, and coping strategies while machine‑learning monitors behavioral data to flag emerging self‑harm risks in understaffed settings 4.

  • Efficient Resource Navigation: AI-powered text and voice bots can instantly answer queries such as "Where do I find sober housing near me?" across multiple languages and literacy levels, a feat that a single human caseworker would struggle to replicate at scale; a toy sketch of this kind of lookup follows the list. For instance, California courts are developing guidelines that enable clerks to safely use generative AI to create plain-language explanations of expungement options, thereby freeing up human staff for more nuanced support tasks 15.
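
As a rough illustration of the resource‑navigation idea (not any of the products above, and far simpler than a real deployment), the sketch below matches a free‑text question against a small local directory by keyword overlap. The directory entries, keywords, and phone numbers are invented placeholders; a production system would pair a lookup like this with translation, literacy‑level adjustment, and a hand‑off to a human caseworker when confidence is low.

```python
# Toy resource-navigation lookup: match a question against a local directory
# by keyword overlap. The directory and keywords are invented placeholders;
# a deployed system would pair this with a language model, translation,
# and a hand-off to a human caseworker when confidence is low.
from dataclasses import dataclass

@dataclass
class Resource:
    name: str
    category: str
    keywords: set[str]
    contact: str

DIRECTORY = [
    Resource("Sober Living Network", "housing", {"sober", "housing", "recovery"}, "555-0100"),
    Resource("Legal Aid Clinic", "legal", {"expungement", "record", "lawyer", "legal"}, "555-0101"),
    Resource("Transit Assistance Program", "transport", {"bus", "pass", "ride", "transit"}, "555-0102"),
]

def find_resources(question: str, top_n: int = 2) -> list[Resource]:
    """Return the directory entries whose keywords best overlap the question."""
    words = set(question.lower().split())
    scored = [(len(words & r.keywords), r) for r in DIRECTORY]
    scored = [item for item in scored if item[0] > 0]
    scored.sort(key=lambda item: item[0], reverse=True)
    return [r for _, r in scored[:top_n]]

if __name__ == "__main__":
    for r in find_resources("Where do I find sober housing near me?"):
        print(f"{r.name} ({r.category}): {r.contact}")
```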

These AI tools do not replace human intervention; they amplify it, converting limited staff resources into targeted, impactful actions that reduce recidivism and attempt to restore dignity while strengthening community safety and cohesion.


Why do we care? Well, we’re trying to build something here…

Our team has built Hapi, the first AI‑driven reintegration platform purpose‑built for post‑incarceration realities, providing comprehensive housing, health, and supervision support. In its direct‑care version, everything runs entirely over SMS: no data plan, smartphone, Wi‑Fi, or app download required. Think of it as a lifeboat in an ocean of fragmented rules and resources: it breaks overwhelming information into clear, digestible steps so people can rebuild their lives from day one. Where most tools stop at reminders or hotlines, Hapi delivers the entire re‑entry journey (housing, work, health, mental health, and legal navigation) over something everyone has: basic text messaging.

Powered by a multilingual, trauma‑informed AI, Hapi understands more than 100 languages, adjusts its reading level on the fly, and remembers each conversation so no one ever has to repeat their story. Its offline map guidance gets users to services even without Wi‑Fi. The latest pilot showed 96% answer accuracy, evidence that thoughtful AI can turn post‑release barriers into stepping stones.
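
To show the shape of a text‑message‑first flow, here is a toy sketch, our own illustration rather than Hapi’s actual implementation, of how an SMS handler might keep per‑user conversation state and switch to plainer wording when someone asks for simpler language. The phone numbers, reply text, and reading‑level toggle are assumptions made for the example.

```python
# Toy SMS re-entry flow: keep lightweight per-user state between texts and
# choose plainer wording when the user asks for simpler language.
# This is an illustration only, not Hapi's actual implementation; in
# production the state would live in encrypted storage and the replies
# would come from a trauma-informed, multilingual model.
STATE: dict[str, dict] = {}  # keyed by phone number

REPLIES = {
    "housing": {
        "standard": "I can share transitional housing options and how to apply. Reply 1 to start.",
        "plain": "I can help you find a place to stay. Text 1 and we will do it step by step.",
    },
}

def handle_sms(phone: str, text: str) -> str:
    user = STATE.setdefault(phone, {"level": "standard", "topic": None})
    msg = text.strip().lower()
    if "simpler" in msg or "easier" in msg:
        user["level"] = "plain"          # remember the reading-level preference
        return "Okay, I will keep things short and simple."
    if "housing" in msg or "place to stay" in msg:
        user["topic"] = "housing"        # remember where the conversation is
        return REPLIES["housing"][user["level"]]
    if user["topic"]:
        return f"Got it. Continuing with {user['topic']}. Reply HELP to reach a person."
    return "Hi, I can help with housing, work, health, or legal steps. What do you need?"

if __name__ == "__main__":
    print(handle_sms("+15550100", "Can you use simpler words?"))
    print(handle_sms("+15550100", "I need a place to stay"))
```

The design point of the sketch is that all state lives on the server side, so a basic phone is enough, and stored preferences persist across messages so no one has to repeat their story.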

Hapi encrypts every message end‑to‑end, undergoes audits through our partnerships, and meets strict correctional‑ and health‑data standards. The result is a platform that lightens caseworker loads, keeps families in the loop, and gives returning individuals a straight path home. We hope to set the benchmark for what fair, human‑centred, AI‑driven reintegration should look like, but we know this can’t be done alone.


Do you think we’ve missed something? Would you like to try Hapi? Please get in touch! We need to do this together!


References:

  1. Amartya Sen, The Idea of Justice (Cambridge, MA: Harvard University Press, 2009), Preface, p. 8.

  2. https://www.rand.org/pubs/research_reports/RRA3299-1.html

  3. https://www.oecd.org/content/dam/oecd/en/publications/reports/2024/06/governing-with-artificial-intelligence_f0e316f5/26324bc2-en.pdf

  4. https://www.researchgate.net/publication/379958533_Artificial_Intelligence_in_Correctional_Facilities_Enhancing_Rehabilitation_and_Supporting_Reintegration

  5. https://csgjusticecenter.org/publications/50-states-1-goal/

  6. https://www.unlv.edu/sites/default/files/media/document/2024-12/r6-Reentry.pdf

  7. https://www.prisonpolicy.org/reports/housing.html

  8. https://www.bu.edu/sph/news/articles/2024/elevated-suicide-risk-post-incarceration-demands-a-response-rooted-in-equity-justice-and-human-dignity/

  9. https://pmc.ncbi.nlm.nih.gov/articles/PMC4788463/

  10. https://pigeonly.com/

  11. https://insights.samsung.com/2019/09/27/american-prison-data-systems-helps-inmates-pursue-success/

  12. https://restofworld.org/2025/paraguay-ai-chatbot-eva-social-justice/

  13. https://healthandreentryproject.org/

  14. https://www.sherlockai.org/

  15. https://www.reuters.com/legal/government/california-court-system-decide-ai-rule-2025-07-15/

Get in Touch

We'd love to hear from you.

  • LinkedIn

Edmonton, AB

Working Worldwide
