Almost a month ago now, FACEIT teased a new feature called ‘Justice’, which left the platform’s users scratching their heads. However, following a presentation by the company’s CEO at a conference, we may now know what Justice actually is.
Niccolo Maisto took to the stage for a short presentation at the Google Customer Innovations Series to spread the word about Minerva, FACEIT's AI banning system that detects toxicity in matches played on the platform. As The Loadout reported earlier in the year, Minerva has already been hard at work removing problematic players from matches by reading game chats to identify toxic or abusive behaviour. However, as Minerva is a machine learning system, it can sometimes miss the context of messages.
As Maisto revealed at the conference, Justice will be a community-led add-on to Minerva, in which actual players on the server will review cases where the AI has detected toxicity and dished out a ban. Those players will then be able to say whether Minerva made the right or wrong call in judging the behaviour as toxic or as an example of griefing, which will in turn aid the machine learning process.
Twitter user Canz__ was the first to spot the screenshot of Justice during Maisto's presentation.
— ᑕᗩᑎᘔ (@Canz__) November 21, 2019
You can also see the full FACEIT presentation below (skip to 27:30 for Maisto talking about Justice specifically).
During the presentation, Maisto says: “One of the main concerns our community expressed when we announced Minerva was that they didn’t really understand who was setting the threshold. Is it just some machine ruling on behaviour, is it a man behind the curtain playing judge on our behaviour and making the decisions?”
He goes on to explain that allowing players to review cases will remove some of that ambiguity, as well as help to train Minerva.
There is currently no set date for when Justice will go live.