The robots are coming, but not in the way you might think. Chatbots, or artificial-intelligence-powered software systems that humans interact with through a chat platform, are creeping into the HR sphere to combat workplace harassment.

In the roughly 13 months since public allegations against Harvey Weinstein helped spur the #MeToo movement, a dozen states have enacted laws affecting employers, the federal government has filed 50 percent more harassment lawsuits, and many executives say they’ve changed their behavior. Yet many employees say plenty of harassing behavior still goes unreported and unaddressed.

That leaves an opening for chatbots and AI.

Automated anti-harassment HR tools are the “next frontier” but haven’t yet proliferated widely, Deloitte principal Art Mazor told Bloomberg Law. Companies have already been testing the HR AI waters by delegating simple jobs to bots, such as answering questions about vacation days or medical plan options. For example, an employee might ask the HR chatbot Talla whether employees have Veterans Day off work, and the bot will respond with a yes or no.
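As a rough illustration of how that kind of question answering can work, consider the minimal sketch below. It is our own example, not Talla’s actual design: the policy table, function names, and simple keyword matching are assumptions, and a real product would rely on trained language models rather than hand-written rules.

```python
# Hypothetical sketch of a keyword-matched HR FAQ bot.
# Not Talla's actual design; a real product would use trained
# language models rather than simple keyword rules.

HR_POLICIES = {
    "veterans day": "Yes, Veterans Day is a paid company holiday.",
    "vacation days": "Full-time employees accrue 15 vacation days per year.",
    "medical plan": "Open enrollment for medical plans runs each November.",
}

def answer(question: str) -> str:
    """Return the first policy answer whose keyword appears in the question."""
    q = question.lower()
    for keyword, reply in HR_POLICIES.items():
        if keyword in q:
            return reply
    return "I don't know that one; let me route you to an HR representative."

if __name__ == "__main__":
    print(answer("Do employees have Veterans Day off work?"))
    # -> "Yes, Veterans Day is a paid company holiday."
```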

“In a sense, they become co-bots, working side-by-side with the HR professionals,” Mazor said about the rising prominence of chatbots in companies he works with. “I think we view this as a space of experimentation right now, and learning.”

Management has generally welcomed AI technology for human resources purposes, and employee advocates are cautiously hopeful about its usefulness. How the information collected by chatbots could eventually be used in court remains to be seen.

The hope for a new anti-harassment bot is that it can help break the ice with employees too confused or disconcerted about an incident to know where to begin.

Spot Steps In

Meet Spot, a platform inspired by “Spotlight” and rolled out in October by tech firm All Turtles. Employees can use it to report instances of discrimination or harassment via chat, anonymously or not. One of the best things about the product is what’s missing, the developer says: the subjective human element.

When an employee logs on to Spot, the bot takes the lead.

“Hi, I’m going to ask you some questions about what happened,” it says right off the bat. As prompted, the employee answers questions about the date, time, and place of the incident, the parties involved, how it made the employee feel, and whether it was reported to others. Spot apologizes in advance for repeating itself and sometimes asks the same thing at least twice in different ways.

What it never does is empathize, although it maintains a safe, pleasant tone. This might not seem earth-shattering, but removing human interference from the process yields more accurate results, All Turtles co-founder Jessica Collier told Bloomberg Law.

The employee then has the option of submitting the report, anonymously or not, or leaving it stored in his or her user profile, invisible to the company but time-stamped and available to submit later. If the employee opts to submit the report with a name attached, HR professionals can then step in.

Some of the other things Spot asked during a recent test of the system: “You may have already mentioned this, but did anyone witness the incident?” “Did you tell anyone about what happened?” “Before we finish is there any evidence?”
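To make the mechanics concrete, here is a minimal sketch of a scripted intake flow of this general shape. It is our own illustration, not Spot’s code: the fixed question list, the time-stamped draft that stays private until the employee chooses to share it, and the optional anonymous submission are assumptions drawn from the description above.

```python
# Hypothetical sketch of a scripted harassment-report intake flow,
# modeled loosely on the behavior described above; not Spot's code.

from dataclasses import dataclass, field
from datetime import datetime, timezone

QUESTIONS = [
    "When and where did the incident happen?",
    "Who was involved?",
    "How did it make you feel?",
    "Did you tell anyone about what happened?",
    "You may have already mentioned this, but did anyone witness the incident?",
    "Before we finish, is there any evidence?",
]

@dataclass
class Report:
    answers: list[str] = field(default_factory=list)
    created_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )
    submitted: bool = False   # invisible to the employer until True
    anonymous: bool = False

def run_intake(get_answer) -> Report:
    """Ask each scripted question in order; never improvise follow-ups."""
    report = Report()
    for question in QUESTIONS:
        report.answers.append(get_answer(question))
    return report

def submit(report: Report, anonymous: bool) -> Report:
    """Employee opts in to sharing; otherwise the draft stays stored."""
    report.submitted = True
    report.anonymous = anonymous
    return report

if __name__ == "__main__":
    draft = run_intake(lambda q: input(q + " "))
    # The time-stamped draft can be held privately and submitted later:
    submit(draft, anonymous=True)
```

The fixed question script is the point: because the bot never improvises a follow-up, it cannot ask a leading question, which is the design goal the next section describes.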

Keeping a Poker Face

Chat-based reporting mechanisms aim to remove the leading questions a human interviewer might inadvertently ask and the fear of potential judgment that can come with sharing a traumatic or emotional experience, Collier said.

“We’re not good at not asking leading questions,” Collier said. “It’s not malicious, it’s just human.”

Asking leading questions can prevent employees from retelling exactly what happened. That’s why an automated, evidence-based approach without the “warmth or nurturing” associated with human interaction is ideal for gaining details of an incident, Collier said.

“We don’t want people interpreting the bot as a therapy tool,” Collier said. “It’s there for you to lay out the details of what happened.”

“We use technology to kind of help people give as many details as they’re comfortable with,” she said. Something notably absent in Spot is the power of suggestion. Whereas most AI tools suggest language to ease the user’s experience, the way an iPhone finishes a word as the user types, Spot’s developers specifically shied away from that feature so as not to tip the scales while collecting the data.

Litigation Uncertainty

“Talking” to a chatbot isn’t all that different from reporting incidents to an anonymous hotline, other than the fact that artificial intelligence might be better at getting information because it doesn’t interfere with the process, according to management attorney Eric Meyer.

“I would like to think that the AI is smarter than the human at the other end of the reporting line,” he said. “Does it create liability? I guess theoretically it could, but in the same way that a human interaction could create liability for a company.”

The intake of information is straightforward, but a fumble could occur between gathering information and executing a plan or initiating an investigation, he said. Then it’s on the employer to clean up the mess, but that’s the situation with or without a chatbot.

Worker advocacy groups like Workplace Fairness have looked into the capabilities of reporting aids like Spot to help whistleblowers, but it is still early in the process, Brantner said.

“We have been monitoring the rise of these with #MeToo,” Workplace Fairness senior adviser and attorney Paula Brantner told Bloomberg Law. “We’re trying to figure out if we can establish relationships.”

While hopeful, Brantner doesn’t want to see chatbots replace human interaction or listen only for “magic words” that might trigger a red flag for litigation potential in an interview.

“They’re important for a start, but I would hate to see them replace talking to someone who can really give you sound advice,” she said. “The question is, do the machines have the sophistication to ferret out conduct that might not be actionable harassment now, but the company would want to know about?”

The technology could also help in the courtroom by documenting incidents before statutes of limitations run out, she said.

“There will potentially be cases that can be litigated that couldn’t be before because they came forward in a timely fashion,” Brantner said.