Driving Civility and Safety for All Users
Roblox has spent almost two decades working to make the platform one of the safest online environments for our users, particularly the youngest users. Our guiding vision is to create the safest and most civil community in the world.
As our platform evolves and scales, forging a new future for communication and connection, our investment in preventative safety measures remains fundamental. That investment is essential to being the best in the world at delivering safe and civil online experiences. With each passing year, we implement new strategies and technologies that make our safety and moderation systems faster and more effective.
Every day, tens of millions of people of all ages have a safe and positive experience on Roblox, abiding by our Community Standards. For instance, as shown in our latest annual Transparency Report, Roblox users generated and uploaded approximately 205 billion total pieces of content to Roblox’s platform, every piece of which was reviewed by our content moderation tools. Only 0.0063% of that total content was flagged by our detection and reporting systems as violating our policies on issues such as bullying, hate speech, and violent extremism. That low rate is no coincidence: our policies are significantly stricter than those of social networks and user-generated content platforms, and they cover everything from profanity to ad standards.
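A quick back-of-the-envelope calculation, using only the two figures quoted above, shows what that flag rate means in absolute terms:

```python
# Sanity check on the moderation figures quoted above.
# Both numbers come directly from the text; nothing else is assumed.
total_uploads = 205_000_000_000   # ~205 billion pieces of content
flag_rate = 0.0063 / 100          # 0.0063% flagged as violating policy

flagged = total_uploads * flag_rate
print(f"Flagged items: ~{flagged / 1e6:.1f} million")  # ~12.9 million
```

In other words, even a flag rate of well under a hundredth of one percent still corresponds to roughly 12.9 million pieces of flagged content at this scale, which is why automated review of every upload matters.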
We continuously measure a wide range of internal metrics—using both automated and human systems—spanning everything from bullying to inappropriate language. This approach guides our work in fostering the safest and most civil interactions on the internet and, at the most basic level, lets us say that safety issues are not widespread or systemic on Roblox. A commitment to this work is critical: we are all part of families ourselves and know that even one incident involving the health or safety of a child is horrible. We sympathize deeply with the families and individuals who have been impacted.
We have built a platform with safety at the foundation
Every team at Roblox helps ensure that products are designed with safety in mind. In addition to the 10% of our full-time employees and thousands of contractors who focus exclusively on trust and safety, hundreds more Roblox employees work to build and enhance the technology on which our Trust & Safety Team depends. Moreover, our leadership is committed to ensuring that these teams are always properly resourced. We spend hundreds of millions of dollars each year to meet our safety mission.
Our recent advances in AI have improved safety on Roblox. Large language models, multimodal AI, and custom models that identify safety issues in spoken language have increased moderation accuracy and efficiency. To ensure the integrity of our efforts, we deploy AI-driven automation only once we’ve shown that it produces more accurate results than humans when applied at scale. While AI has automated some human efforts, it has also allowed our staff to focus on more complex investigations, including enforcing our policy that prohibits grooming minors and soliciting their personal information.
We have a multi-prong and interconnected approach to safety
- Significant updates and features undergo a rigorous Trust by Design process to solicit feedback and identify potential safety issues well ahead of development.
- We proactively moderate all communication on the platform through a combination of AI and human review. If any signs of critical harm are identified, they are escalated to our investigation team.
- Inappropriate words and phrases are filtered from all text communication using rigorous industry-leading automated filters.
- Our voice communication system gives real-time feedback to users who violate our policies, which has made behavior on Roblox significantly more civil.
- For users under 13, our filters block sharing of personal information and attempts to take conversations off Roblox, where safety standards and moderation are less stringent.
- We do not allow users to exchange images or videos through voice or text messaging on Roblox.
- All content uploaded to Roblox, including images, video and audio files, 3D models, and text, undergoes a comprehensive review process using multimodal AI augmented by human review.
- Our safety team constantly shares insights and guidance throughout the company to improve the safety of our platform and products.
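As an illustration only, the layered approach described above—automated filters first, AI review next, and human escalation for potential critical harm—can be sketched as follows. Every function name, rule, and threshold here is hypothetical and does not reflect Roblox's actual systems:

```python
# Hypothetical sketch of a layered moderation pipeline, loosely mirroring
# the approach described above. None of these names, thresholds, or rules
# reflect Roblox's real implementation.
from dataclasses import dataclass

BLOCKED_TERMS = {"<profanity>", "<slur>"}  # stand-in for industry-grade filters

@dataclass
class Decision:
    allowed: bool
    escalate_to_humans: bool
    reason: str

def text_filter(message: str) -> bool:
    """Layer 1: fast automated filter on text content."""
    return not any(term in message.lower() for term in BLOCKED_TERMS)

def ai_risk_score(message: str) -> float:
    """Layer 2: placeholder for an AI classifier's harm score in [0, 1]."""
    # A real system would call a trained model here; this is a toy rule.
    return 0.9 if "meet me off-platform" in message.lower() else 0.1

def moderate(message: str, critical_threshold: float = 0.8) -> Decision:
    if not text_filter(message):
        return Decision(False, False, "blocked by text filter")
    if ai_risk_score(message) >= critical_threshold:
        # Layer 3: signs of critical harm go to a human investigation team.
        return Decision(False, True, "escalated for human review")
    return Decision(True, False, "allowed")

print(moderate("hello there!").reason)          # allowed
print(moderate("meet me off-platform").reason)  # escalated for human review
```

The design point the sketch captures is the ordering: cheap deterministic filters run on everything, probabilistic AI review catches subtler harm, and only the highest-risk signals consume human investigators' time.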
We ensure that users have easy, consistent access to features for flagging content that is abusive, vulgar, or otherwise inappropriate for our online community. The Report Abuse feature, located prominently throughout Roblox—including within experiences—lets users easily report inappropriate content or behavior, and we encourage them to use it.
Parents can limit or disable online chat capabilities, customize access to experiences based on age recommendations, and select options for spending limits. Feedback from users, parents, and other members of our community is one of the best tools we have to improve our products. We continually explore different ways to update our parental control systems to make them even more useful.
We are deeply invested in addressing the industry-wide challenge of online safety
Online safety and civility—particularly the safety of young users—is an industry-wide issue, and we actively collaborate with law enforcement, key industry organizations, and policymakers to address it. We recognize that there are bad actors out there, and we are deeply troubled by any report of child endangerment. That is why Roblox, for example, was the first company, and remains one of the few, to support the California Age-Appropriate Design Code Act, and why we more recently signed a letter of support for California SB 933, which updates state law to expressly prohibit AI-generated child sexual abuse material.
We work with law enforcement to track down perpetrators
Over the years, we have formed deep and lasting relationships with law enforcement at the international, federal, and state levels, and we regularly share best practices. We provide clear guidance about how law enforcement can contact our team, including resources we make available online to connect law enforcement directly to the Roblox Law Enforcement Portal. We also proactively report potential safety threats to law enforcement via an integration with the FBI’s EXTRACT team, which is designed to handle reports from platforms like ours.
We proactively report potentially risky content to the National Center for Missing & Exploited Children (NCMEC). NCMEC’s CyberTipline is a designated reporting mechanism for the public and electronic service providers (ESPs) to report instances of suspected child sexual exploitation. In 2023, Roblox reported 13,316 incidents to NCMEC (Roblox errs on the side of safety and, as a result, has a lower reporting threshold than other platforms). To put the volume of incidents on Roblox into context, in 2023, the CyberTipline received more than 35 million reports relating to incidents on other platforms from other ESPs.
We are expanding our industry-wide relationships, research, and collaborations to find new solutions together
We also communicate and collaborate across our sector. Because increasing the civility and safety of online communication is an industry-wide issue, we are in constant contact with other platforms, NGOs, and academic and industry consortiums working to address bad actors. Roblox, for instance, has been an active member of the Tech Coalition since 2018 and joined Lantern as a founding member. This first-of-its-kind cross-platform signal-sharing program enables participating companies to share information and act on bad actors who move from platform to platform.
"Roblox remains a valued member of the Tech Coalition, including as part of our Board of Directors. They have demonstrated leadership within our working groups, helping with critical industry-wide decisions such as the Tech Coalition's pursuit of Lantern and the development of sector roadmaps, among others. This work helps Roblox and the broader industry build capacity to fight online child sexual exploitation and abuse." Sean Litton, Executive Director, Tech Coalition
We also work to improve safety across the industry through open source and collaboration with partners. Last year, for example, the Digital Wellness Lab at Boston Children’s Hospital and leaders from Roblox convened more than 100 top safety and civility experts from around the globe as members of the Civility Working Group. Working across disciplines, the group shared ideas and workshopped a framework for the steps needed in technology innovation, policy, and education to make the online world a truly safe and civil space, especially for youth.
Our close collaboration with this wide range of public and private organizations, including parental advocacy groups, mental health organizations, and government agencies, also gives us valuable insights into the concerns of parents, policymakers, and others about online safety while we share our learnings and technology solutions.
We also advocate for policies that align with these learnings and are in the best interests of young people.
We have been recognized for our safety and civility initiatives by numerous safety-focused industry and civil society organizations. These include Save the Children, the Digital Wellness Lab, ConnectSafely, the United Nations Office of the Special Representative of the Secretary-General on Violence Against Children, and the Family Online Safety Institute (FOSI).
"From our earliest encounters with Roblox, we've been impressed by their commitment to safety. It flows from their CEO and we have had the opportunity to provide constructive criticism through their Trust and Safety Advisory Board. Roblox has consistently maintained a leadership role when it comes to online safety in general and to the policies and tools they have created specifically for their online communities." Stephen Balkam, CEO, Family Online Safety Institute
We will continue to work tirelessly to keep our users safe and to remain vigilant against bad actors who might attempt to circumvent our safety systems. We recognize, too, that this critical work is never finished. We are committed to continually investing and adapting as threats evolve, and we are already working on the next generation of safety tools and features as we seek to continue leading on the future of safety and civility online.