Gaming platforms have started to realize that harm doesn’t come only from seeing or having access to age-inappropriate games; it can also come from seeing inappropriate or hurtful messages. Gamers, especially in highly competitive titles, have long used colorful language, to put it mildly, sometimes without considering whether the recipient is comfortable with it. Valve is now rolling out a chat filtering system for Steam that puts gamers in control while keeping Valve itself out of moderating those chats.
Chat or profanity filters usually work by blocking inappropriate messages outright, especially those from strangers. Valve is doing something rather different: it obscures individual words or phrases rather than entire messages, so recipients see symbols in place of the undesired words.
These word filters can come from three sources. Valve maintains a default list of the most common profanities, users can add or remove words as they see fit, and they can even import lists from other sources, which Valve positions as a way for communities and groups to define their own language guidelines.
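To make the mechanism concrete, here is a minimal sketch of how such a word-level filter could be assembled and applied. This is an illustration, not Valve's actual implementation: the function names, the placeholder word list, and the choice of heart symbols as the replacement characters are all assumptions for the example.

```python
import re

# Placeholder default list standing in for Valve's built-in profanity list.
DEFAULT_LIST = {"badword", "worseword"}

def build_filter_list(user_added, user_removed, extra_lists=()):
    """Merge the default list, the user's additions/removals, and any
    imported community lists into one set of filtered words."""
    words = set(DEFAULT_LIST) | set(user_added)
    for lst in extra_lists:  # e.g. a list shared by a community or group
        words |= set(lst)
    return words - set(user_removed)

def filter_message(text, words):
    """Obscure each filtered word with symbols, leaving the rest of the
    message intact (one symbol per character of the matched word)."""
    if not words:
        return text
    pattern = re.compile(
        r"\b(" + "|".join(map(re.escape, sorted(words))) + r")\b",
        re.IGNORECASE,
    )
    return pattern.sub(lambda m: "\u2665" * len(m.group(0)), text)
```

For example, `filter_message("that was a badword move", build_filter_list([], []))` obscures only the listed word, and removing "badword" from the list via `build_filter_list([], ["badword"])` lets it pass through untouched.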
Steam Chat Filtering also gives users control over where the filters get applied. The system works in Steam Chat on all platforms and in chat in supported games, and by default it lets messages from registered friends pass through unfiltered. You can change that, of course, including turning filtering off completely, and senders are never notified that their words have been filtered.
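The per-sender rules described above could be sketched as a simple decision function. Again, this is only an illustration of the described defaults; the setting names are hypothetical, not Steam's actual configuration keys.

```python
def should_filter(sender_is_friend, settings):
    """Decide whether to filter a message, per the described defaults:
    filtering is on overall, but friends are exempt unless the user
    opts in to filtering them too (setting names are illustrative)."""
    if not settings.get("filtering_enabled", True):
        return False  # user turned filtering off completely
    if sender_is_friend and not settings.get("filter_friends", False):
        return False  # friends pass through unfiltered by default
    return True
```

Note that the decision happens entirely on the recipient's side, which is consistent with the sender never being notified that anything was filtered.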
What Valve isn’t doing, however, is preventing harmful messages from reaching people in the first place. It says that it doesn’t want to stand in the way of marginalized groups trying to reclaim language for themselves, but others will undoubtedly see this as Valve placing the burden on users and communities while shirking its own responsibility.