Microsoft Corporation formally submitted a comprehensive compliance report to the Australian eSafety Commissioner on April 23, 2026, detailing the company's latest strategies to combat online grooming and extremist radicalization within its gaming ecosystems. The submission follows a legal transparency notice issued under the Online Safety Act 2021, which required Microsoft and other major gaming entities to disclose internal data regarding user protection measures.

The report focuses extensively on Minecraft, a platform whose user base includes a substantial proportion of minors. Microsoft disclosed that it has increased its annual investment in artificial intelligence-driven moderation systems by 15 percent over the previous fiscal year. These systems are designed to identify and flag predatory behavior patterns before they result in direct contact between adults and minors. The company reported that its Safety by Design framework now incorporates real-time intervention capabilities across all first-party titles.

According to the filing, Microsoft's moderation infrastructure now utilizes advanced natural language processing to monitor in-game chat for indicators of grooming. This includes the detection of attempts to migrate conversations to unmoderated, third-party messaging platforms. Microsoft also confirmed its continued participation in the Tech Coalition and its use of shared hash databases with the National Center for Missing and Exploited Children to prevent the spread of known harmful material.
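The shared hash databases mentioned above work by comparing fingerprints of uploaded content against lists of known harmful material. A minimal sketch of the idea follows; the hash list and content here are hypothetical, and production systems typically use perceptual hashing (such as PhotoDNA) that tolerates minor alterations, rather than the exact cryptographic hashes shown:

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known harmful files,
# standing in for the shared hash lists referenced in the report.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def matches_known_material(data: bytes) -> bool:
    """Return True if this content's digest appears in the shared hash list."""
    digest = hashlib.sha256(data).hexdigest()
    return digest in KNOWN_HASHES

# The first input's digest is on the list; the second is not.
print(matches_known_material(b"test"))
print(matches_known_material(b"other file"))
```

Because only digests are exchanged, platforms can block known material without redistributing it, which is the design rationale behind industry hash-sharing programs.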

Addressing the risk of radicalization, Microsoft outlined the protocols managed by its Digital Safety Operations Center. This unit is tasked with monitoring for the dissemination of extremist ideologies and the use of gaming environments for recruitment by designated hate groups. The company stated it has implemented more stringent verification processes for user-generated content and private servers, known as Realms, to provide more granular control over community interactions.

Commissioner Julie Inman Grant stated that the transparency notices are essential for understanding how gaming platforms, which often operate with less public scrutiny than social media, manage safety risks. Under the Online Safety Act 2021, the Commissioner has the authority to issue civil penalties of up to AUD 782,500 per day for corporate non-compliance. Microsoft's response included specific data on the volume of reports received from Australian users and the average resolution time for safety-related complaints.

Microsoft Chief Digital Safety Officer Courtney Gregoire noted in the submission that the company is committed to evolving its safety architecture in response to increasingly sophisticated online threats. The report also highlighted the performance of the Xbox Family Settings app, which has seen a 22 percent increase in active users in Australia and allows parents to manage communication permissions and screen time.

The eSafety Commissioner’s office will now review the submission to determine if Microsoft’s measures align with the Basic Online Safety Expectations set by the Australian government. This review is part of a broader regulatory audit involving other major industry participants, including Sony Interactive Entertainment and Roblox Corporation.