Botler AI Uses Deep Learning to Empower Sexual Harassment Victims

AI is being applied to plenty of worthwhile problems, but most often it ends up shaping the way people make and manage money. Now Botler AI is trying to use the technology to drive social change.

The Montreal-based AI company has released a system intended to empower victims of sexual harassment by helping them determine whether they have grounds for legal action against abusers and improving their chances of pursuing it successfully.

The new platform arrives in the wake of allegations against well-known figures and the rise of the #MeToo movement, which has encouraged more women to step forward and share their experiences of abuse.

“It is very important to understand that what we read on the news doesn’t even begin to scratch the surface of what is actually going on in society,” said Ritika Dutt, 26, co-founder of Botler AI. “Many women are unaware of their rights in a situation. We’re often too scared to ever speak up for fear that the situation might backfire and lead to us being blamed, ostracized or even losing our jobs.”

Using deep learning, Botler AI scanned more than 300,000 U.S. and Canadian criminal court documents, including sexual harassment complaints. The AI-powered platform cross-references those documents to determine whether a situation described by a victim qualifies as sexual harassment and identifies which sections of the criminal code may have been violated. The service can then generate an incident report that the user can hand to the relevant authorities if they wish.
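Botler AI has not published its model or training data, but the workflow described above resembles a standard text-classification pipeline: encode the victim’s description, compare it against patterns learned from labeled court documents, and map it to candidate legal provisions. The sketch below is purely illustrative, using a simple TF-IDF and logistic-regression classifier with invented placeholder labels rather than the company’s deep learning system or real statute citations.

```python
# Hypothetical sketch only: Botler AI has not published its model or data.
# It illustrates the general idea of classifying an incident description
# against criminal-code provisions with a simple text-classification pipeline.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training examples standing in for labeled court documents.
# The labels are placeholder provision names, not real statute citations.
documents = [
    "complainant reported repeated unwanted sexual comments at work",
    "defendant followed the complainant home on several occasions",
    "dispute over unpaid invoices between two companies",
]
labels = ["sexual_harassment", "criminal_harassment", "not_applicable"]

# TF-IDF features feeding a linear classifier; a production system would
# more likely use a deep model trained on hundreds of thousands of documents.
classifier = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
classifier.fit(documents, labels)

# Classify a new incident description and report the most likely provision.
incident = "my supervisor keeps making sexual remarks despite being asked to stop"
predicted = classifier.predict([incident])[0]
probabilities = classifier.predict_proba([incident])[0]
print(f"Predicted provision: {predicted}")
for provision, p in zip(classifier.classes_, probabilities):
    print(f"  {provision}: {p:.2f}")
```

The output of such a classifier could then be folded into an incident report, though the company’s actual system presumably involves far richer models and legal review.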

According to the company, Botler AI does not make judgments based on race, gender, sexual orientation, income, or other factors that commonly introduce human bias.

“This is just the first step,” said Dutt. “The idea is to empower women, and men, suffering through these abhorrent situations with an impartial, trustable tool.”

As the system matures, a tool may be released that gathers all of the information a victim has entered, assembles it into a case, and connects the victim with legal representation if they choose to go to court.

That points to an underlying problem Botler AI could help alleviate. Many victims stay silent because they do not want to relive an assault or harassment, fearing they will have to recount their case multiple times only to see it fall flat and be labeled liars or attention seekers.

If Botler AI can help even a few victims identify what it takes to bring their abusers to justice without risking their careers, professional relationships, or sense of safety, it will help empower more down the line.

Still, the AI is not a licensed lawyer, and nothing it produces constitutes formal legal advice. All Botler AI can do is inform victims and give them more power to do what they feel is right.

“Even when AI passes the bar, it can never replace the emotional support a human provides,” said Amir Moravej, founder of Botler AI. “We’re forming partnerships with legal service providers, ranging from lawyers to social workers, so that we can work in unison.”

Last year Botler AI released technology that allowed immigrants in Canada to learn more about their status through a conversational chatbot. The company also recently brought on AI pioneer Yoshua Bengio as a strategic advisor.

Social justice is certainly a field where AI can continue to be applied. Whether the issue is immigration, sexual assault, or racial profiling, any domain with large data sets can use AI to reveal patterns in how specific cases come about.

Still, identifying a problem and helping build a case is only a start, akin to putting a bandage on a stab wound; it remains up to society to find ways to fix the systemic problems underneath.