Minecraft Strengthens Chat Moderation and Reporting
Minecraft has updated its chat moderation system with improved detection of harmful language and a clearer reporting process.
Published: 2026-01-20
What Changed
Minecraft has rolled out an updated chat moderation system for Java and Bedrock editions. The system now uses improved AI detection to identify and filter harmful language, grooming patterns, and requests for personal information in real time. Players can report individual chat messages more easily, and reported messages are reviewed by human moderators within 24 hours. Violations can result in temporary or permanent bans from online multiplayer.
What to Review
- Ensure your child's Minecraft account has the correct age set — parental controls are tied to age verification through Microsoft Family Safety.
- Review whether your child plays on public multiplayer servers where they interact with strangers.
- Check that chat reporting is enabled and that your child knows how to use it.
Parent Actions
1. Set up Microsoft Family Safety (account.microsoft.com/family) to manage your child's Minecraft account settings.
2. Discuss with your child which servers they play on and whether they are interacting with people they know in real life.
3. Show your child how to report concerning chat messages using the in-game reporting tool.
This is practical educational content to support families. For case-specific concerns about a child's safety, contact the NSPCC helpline on 0808 800 5000 or your local safeguarding team.
Last reviewed: 2026-03-29