N.B. this post is also available in German below.
🔗Introduction
Right now, the world needs secure communication more than ever. Waves of security breaches such as the “Salt Typhoon” compromise of the telephone network’s wiretap system have led the FBI to advise US citizens to switch to end-to-end-encrypted communication. Geopolitical shifts painfully highlight the importance of privacy-preserving communication for vulnerable minorities who fear being profiled or targeted. Meanwhile, the International Rules-Based Order is at risk like never before.
We built Matrix to provide secure communication for everyone - to be the missing communication layer of the Open Web. This is not hyperbole: Matrix is literally layered on top of the Web - letting organisations run their own servers while communicating in a wider network. As a result, Matrix is “decentralised”: the people who built Matrix do not control those servers; they are controlled by the admins who run them - and just as the Web will outlive Tim Berners-Lee, Matrix will outlive us.
Matrix itself is a protocol (like email), defined as an open standard maintained by The Matrix.org Foundation C.I.C. - a UK non-profit incorporated in 2018 to act as the steward of the protocol, to coordinate its evolution, and to work on keeping the public Matrix network safe. The Foundation is funded by donations from its members (both individuals and organisations), and also operates the Matrix.org homeserver instance used by many as their initial home on the network.
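To make the “layered on top of the Web” point concrete, here is a minimal sketch of talking to a homeserver with nothing but standard HTTPS and JSON. It queries the unauthenticated /_matrix/client/versions endpoint of the client-server API; matrix.org is used purely as an example, and any homeserver URL would do.

```python
# Minimal illustration that Matrix is plain HTTPS + JSON layered on the Web:
# ask a homeserver which versions of the client-server spec it supports.
# The /_matrix/client/versions endpoint requires no authentication.

import json
import urllib.request

HOMESERVER = "https://matrix.org"  # example only - any homeserver URL works here

with urllib.request.urlopen(f"{HOMESERVER}/_matrix/client/versions") as resp:
    versions = json.load(resp)["versions"]

print(versions)  # e.g. ['r0.6.1', ..., 'v1.11', ...]
```

The server-to-server (federation) API follows the same pattern of HTTPS and JSON, which is what lets independently run homeservers participate in one wider network.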
Much like the Web, Matrix is a powerful technology available to the general public, which can be used both for good and evil.
The vast majority of Matrix’s use is constructive: enabling collaboration for open source software communities such as Mozilla, KDE, GNOME, Fedora, Ubuntu, Debian, and thousands of smaller projects; providing a secure space for vulnerable user groups; supporting secure collaboration throughout academia (particularly in DACH); protecting healthcare communication in Germany; protecting national communication in France, Germany, Sweden and Switzerland; and providing secure communication for NATO, the US DoD and Ukraine. You can see the scope and calibre of the Matrix ecosystem from the talks at The Matrix Conference in September.
However, precisely the same capabilities which benefit privacy-sensitive organisations mean that a small proportion of members of the public will try to abuse the system.
We have been painfully aware of the risk of abuse since the outset of the project, and rather than abdicating responsibility in the way that many encrypted messengers do, we’ve worked steadily at addressing it. In the early days, even before we saw significant abuse, this meant speculating on approaches to combat it (e.g. our FOSDEM 2017 talk and subsequent 2020 blog post proposing decentralised reputation; now recognisable in Bluesky’s successful Ozone anti-abuse system and composable moderation). However, these posts were future-facing at the time - and these days we have different, concrete anti-abuse efforts in place.
In this post, we’d like to explain where things are at, and how they will continue to improve in future.
🔗What we do today
The largest share of the Foundation’s funding goes to our full-time Safety team, a commitment we expanded at the end of 2024. On a daily basis, the team triage and investigate reports, identify and remove harmful content from the Matrix.org server, and remove the users who share that material. They also build tooling to prevent, detect and remove harmful content, and to protect the people who work on user reports and investigations.
The humans who make up the Foundation Trust & Safety team are dedicated professionals who put their own mental health and happiness in jeopardy every day, reviewing harmful content added by people abusing the service we provide. Their work exposes them to harms including child sexual exploitation and abuse (CSEA), terrorist content, non-consensual intimate imagery (NCII), harassment, hate, deepfakes, fraud, misinformation, illegal pornography, drugs, firearms, spam, suicide, human trafficking and more. It’s a laundry list of the worst that humanity has to offer. The grim reality is that all online services have to deal with these problems, and to balance the work to detect and remove that content with the rights of their users. We’re committed to that work, and to supporting the Trust & Safety team to the best of our ability — we are very grateful for their sacrifice.