Transparency is core to our mission and has been a critical part of Twitter from the start. In October 2018, we launched the first archive in the industry of potential foreign information operations we had seen on Twitter. The Consortium continues and expands on that access.

Through the Twitter Moderation Research Consortium (“TMRC” or the “Consortium”), Twitter shares large-scale datasets concerning platform moderation issues with a global group of members, comprising public interest researchers from across academia, civil society, NGOs, and journalism who study platform governance issues. Through the Consortium, Twitter will continue to support our existing disclosures of datasets of persistent platform manipulation campaigns, which consist of material posted in violation of our platform manipulation and spam policy. Over time, we intend to share similarly comprehensive data about other policy areas with the Consortium.

We’ve designed the Consortium as an industry-leading effort to increase transparency around Twitter’s content moderation policies and enforcement decisions, so that credible, public interest researchers can independently investigate, learn, and produce insights that inform the public, policymakers, and other researchers. Our goal is to provide increased transparency about more issues that impact the health of the platform, while grappling with the considerable safety, security, and integrity challenges in this space.

Consortium membership is by application. We hope that expanded transparency through disclosures to the Consortium can help us all learn and build the necessary societal defenses and capacities to protect public conversation.