Digital Playground

Co-authored by Nick Woodford, Content Manager, and Copywriter at Anzu, and Brenna Schaaf, Director, Marketing at Kidas

Video games have always been popular among kids, but the sheer number of titles available, their ease of access, and a growing lineup of immersive, engaging free-to-play games have made gaming an extremely popular pastime where kids are not only playing but hanging out with friends, creating, learning, and in some cases even making money. In a recent study, 93% of boys aged 8-11 and 79% of girls aged 12-15 said they had played a video game in the last month.

With so many young players taking to gaming platforms, game companies have a duty to keep this space safe for them and welcoming for everyone else. But what do they need to be aware of? And how can they safeguard their titles so they remain safe spaces for players everywhere? Below, we explore some of the key areas of focus, including concerns around in-game monetization, cyberbullying prevention, and the importance of working with the right partners.

Ensuring your communication channels are safe

When we think of negative communication within games, what often comes to mind is someone shouting obscenities through a headset at another player who has just killed them in the game. Game companies should be monitoring for these bad actors, especially when they know their games attract younger players.

Negative communication also happens through internal chat systems, where players interact via text messaging. Many multiplayer games, including Roblox, have safety precautions in place, with AI-based technology monitoring chats on the platform in real time. Third-party solutions are available to developers who don't have the resources to build their own security systems. One of these is ProtectMe by Kidas, whose context-understanding algorithm securely identifies instances of cyberbullying, predation, and toxic gaming behavior in chat systems.
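The context-understanding algorithms mentioned above are proprietary, but the general shape of a chat-moderation hook is easy to illustrate. The sketch below is a deliberately naive keyword filter, not how ProtectMe or Roblox actually work; all names and the term list are hypothetical:

```python
# Toy illustration of a real-time chat-moderation hook. Production systems
# use context-aware ML models rather than keyword lists; this sketch only
# shows where such a check sits in the message pipeline.

FLAGGED_TERMS = {"idiot", "loser", "kill yourself"}  # tiny hypothetical sample

def is_flagged(message: str) -> bool:
    """Return True if the message contains a flagged term."""
    text = message.lower()
    return any(term in text for term in FLAGGED_TERMS)

def moderate(message: str) -> str:
    """Replace a flagged message before it reaches other players."""
    return "[message removed]" if is_flagged(message) else message

print(moderate("gg, nice match!"))      # passes through unchanged
print(moderate("you're such a loser"))  # prints "[message removed]"
```

A real system would also weigh conversational context (repeated targeting of one player, grooming patterns over many sessions), which is precisely what simple keyword matching cannot capture.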

Choosing the right monetization method

Cyberbullying is an ongoing issue for game developers. With new technologies and new ways to interact within games, developers need to pay close attention to their platforms to stamp out problems. One growing area is in-game purchases, where kids are singled out for not having the same skins as their friends or for only sporting a free 'default' option. In one instance, a teacher from a UK primary school described one of his students being bullied for not owning a premium skin in Fortnite, saying he "begged his parents for [money] to buy a skin because no one would play with him".

Paid-for DLC, where players must spend money to access extra levels and content, can also leave kids excluded because they cannot reach the same areas of the game as their peers.

Parents and players have also heavily criticized loot boxes, but for a different reason. Studies show that this form of in-game monetization can encourage negative spending habits associated with gambling, and several governments have put restrictions in place to safeguard both children and adults from it. China spearheaded this push by requiring game companies to publicize the odds attached to loot boxes, a practice since adopted in the US by gaming platforms and major publishers in light of legal and public scrutiny over how loot boxes are used.

So what can we learn here? Game developers should carefully consider the right monetization methods for their titles. Once in play, they should monitor how those methods are used and what impact they have on their players' experience. Anzu helps developers implement blended in-game ads that complement the gameplay and add a sense of realism, allowing developers to monetize their titles without compromising the player experience.

Eliminating inappropriate content

In the past, rating a game was relatively straightforward, helping parents clearly understand what themes, storylines, and content their kids would be playing through.

The rise of user-generated content in open-world multiplayer experiences like Fortnite, Minecraft, and Roblox, where players can dip in and out of different games within one platform, has made this process more challenging. All of these platforms have strict guidelines to vet and verify the experiences created on them and ensure they are suitable for their audiences. Developers creating content for these worlds should make sure it complies with the relevant guidelines and that no area of the game could harm younger players.

The same applies to developers creating games for other platforms. With the sheer number of titles now available, it can be overwhelming and often extremely difficult for parents to vet every game their kids play. Developers have a duty to describe clearly what players can expect from their titles and which audiences they are suitable for. This is even more important when they know their games will likely appeal to younger audiences.

Looking to the future

As gaming continues to grow and kids spend even more time within these digital spaces, we all have a duty of care to protect upcoming generations from harmful and inappropriate behavior and content. One way for game developers to do this is to ensure they are working with the right partners who have these issues in mind and can help with any concerns or hurdles they may need to overcome.

Anzu and Kidas co-authored this article. Anzu is an award-winning in-game advertising solution that allows game developers to monetize their titles safely and securely by running blended in-game ads that sit within the background of games, complementing gameplay and enhancing the realism of the experience. Kidas is a solution that sends alerts when children are exposed to bullying, online predators, sexual content, hate speech, and other toxic behaviors within games, and its technology can be embedded in almost any mobile or PC game.