Collaborative Notes: X Launches AI Feature To Augment Fact-checking System
Keneci Network @kenecifeed
X has launched a new experimental feature called Collaborative Notes, designed to integrate artificial intelligence into its Community Notes system. The initiative allows eligible human contributors to request AI-generated drafts for Community Notes, which are then refined through community feedback.
The AI, powered by Grok, creates an initial draft when a user requests a note on a post. This draft is then reviewed, rated, and improved by other contributors using suggestions and ratings. The system evaluates incoming feedback to determine if a revised version constitutes a meaningful improvement before publishing it.
Only top Community Notes writers with a high “Writing Impact” score can initiate a Collaborative Note at launch. The feature is currently limited to English-language posts.
Every revision is recorded in a public history for transparency, and contributors are notified when their suggestions are incorporated.
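The workflow described above — an AI draft that is published only when community-rated revisions clear a "meaningful improvement" bar, with a revision history and contributor notifications — can be sketched as follows. This is a purely illustrative model: the class, threshold, and scoring mechanism are assumptions for clarity, not X's actual implementation.

```python
from dataclasses import dataclass, field

# Assumed margin by which a revision must outscore the current draft
# to count as a "meaningful improvement" (hypothetical value).
IMPROVEMENT_THRESHOLD = 0.1

@dataclass
class CollaborativeNote:
    """Toy model of a Collaborative Note's lifecycle (not X's real code)."""
    post_id: str
    text: str                                      # current draft (AI-generated at first)
    history: list = field(default_factory=list)    # prior versions, kept for transparency
    contributors: set = field(default_factory=set) # users whose input was used

    def propose_revision(self, author: str, new_text: str,
                         old_score: float, new_score: float) -> bool:
        """Publish a revision only if raters score it meaningfully higher."""
        if new_score - old_score < IMPROVEMENT_THRESHOLD:
            return False                  # not a meaningful improvement; keep draft
        self.history.append(self.text)    # preserve the prior version
        self.text = new_text
        self.contributors.add(author)     # author would be notified their input was used
        return True

# Usage: an AI draft is refined once by the community.
note = CollaborativeNote(post_id="123", text="AI draft: claim lacks context.")
accepted = note.propose_revision("user42", "Claim lacks context; see source X.",
                                 old_score=0.5, new_score=0.8)
```

Under these assumptions, a proposed edit that only marginally outscores the current draft is rejected, while a clear improvement replaces the published text and pushes the old version into the history.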
The feature aims to address long-standing criticism of the slow pace of note publication. According to X’s Keith Coleman, who oversees Community Notes, continuous learning from community input offers a “new way to make models smarter in the process.” This hybrid model combines AI speed with human judgment, allowing notes to evolve in real time as new feedback arrives.
“The idea: when you request a note, AI drafts one — then the community refines it together through ratings and suggestions. You can watch it get better in real time. It’s a whole new way for the public to work with AI — and each other,” X's @CommunityNotes handle wrote.
The system follows the same transparency principles as existing Community Notes, including open-source code and public data access. X is also exploring new metrics, such as “Collaborative Impact” points, to track the influence of user suggestions.
This marks a significant evolution from earlier experiments, such as the 2025 pilot that allowed AI bots to generate notes independently. In that model, AI-written notes were subject to human rating but not direct editing. The new Collaborative Notes feature represents a more integrated, iterative process in which humans and AI co-create content.
The move aligns with broader research advocating for hybrid human-LLM systems in content moderation. A 2025 study co-authored by X and institutions like MIT, Stanford, and Harvard supports this approach, arguing it can expand reach and speed while preserving trust.
X emphasizes that AI is not meant to replace humans but to augment the community’s ability to scale fact-checking efforts — especially for high-visibility or rapidly spreading content.