Today, as part of a press stunt, Twitter CEO Dick Costolo and CNBC invited Twitter users to ask questions with the hashtag #askcostolo.
A significant chunk of tweets on that tag concerned abuse and harassment on Twitter. This problem disproportionately affects marginalized groups, including women and people of color.
Concerns about harassment were not substantively addressed. Twitter seems helpless to stop the madness. Maybe that's because its team includes so few people from the marginalized groups I mentioned.
Policing a global community of Twitter’s scale is tough work. No one knows how to do it yet. But protecting users from abusive content is a software problem.
Here's a handful of things Twitter could try to protect its users. Allow users to optionally:
Block all users whose accounts are less than 30 days old
This is easy—it takes an arrow out of the quiver of serial harassers who use alternate accounts generated as needed.
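The rule is a one-liner. A minimal sketch, assuming a hypothetical API that exposes each author's account creation timestamp (`account_created_at` is an assumed field, not anything Twitter actually publishes this way):

```python
from datetime import datetime, timedelta, timezone

def is_too_new(account_created_at, min_age=timedelta(days=30), now=None):
    """True if the account is younger than the blocking user's chosen threshold.

    account_created_at: timezone-aware creation timestamp (hypothetical field).
    min_age: the per-user threshold; 30 days is just the example from above.
    """
    now = now or datetime.now(timezone.utc)
    return now - account_created_at < min_age
```

The threshold belongs to the person being harassed, not to the platform: each user picks their own `min_age`, including zero.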
Block all users whose follower counts are below a threshold the user sets
Google used the social proof of “back links” to establish credibility and ranking for content over 16 years ago. This is old hat by now. Users should be able to block anyone who can’t convince other people to follow them.
Rings of followers created just to subvert this will have to be detected.
Again, hire a Google engineer. They’ve cracked this one.
Block new users whose @replies include any words the user decides
Users who are on the receiving end of harassment face startlingly unimaginative adversaries. The same slurs and threats are used over and over. Brand new account with no followers using the n-word? Block!
That’s stupidly easy to express algorithmically.
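Stupidly easy as in: a sketch of the whole filter fits in a few lines. The word list comes from the user, not the platform; `author_age_days` is an assumed input for illustration. Word-boundary matching sidesteps the classic Scunthorpe problem of innocent substrings.

```python
import re

def should_auto_block(reply_text, blocked_words, author_age_days, max_age_days=30):
    """True when a reply from a young account contains any word on the
    user's personal blocklist (word-boundary, case-insensitive match)."""
    if author_age_days >= max_age_days or not blocked_words:
        return False
    pattern = r"\b(?:" + "|".join(re.escape(w) for w in blocked_words) + r")\b"
    return re.search(pattern, reply_text, re.IGNORECASE) is not None
```

Note the conjunction of signals: an established account using a listed word passes through, so ordinary conversation among people you know is untouched.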
Block any user who has been blocked by more than N people I’m following
Let's also share the load. If all your friends have blocked someone, there's a decent chance you'll want to as well.
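Shared block lists reduce to a counting query. A minimal sketch, assuming access to a hypothetical `blocks_by_user` mapping (each user to the set of accounts they've blocked):

```python
def blocked_by_my_circle(candidate, my_follows, blocks_by_user, threshold):
    """True if more than `threshold` of the accounts I follow have
    already blocked `candidate`.

    my_follows: accounts I follow.
    blocks_by_user: user -> set of accounts that user has blocked
                    (an assumed data source for this sketch).
    """
    votes = sum(candidate in blocks_by_user.get(friend, set())
                for friend in my_follows)
    return votes > threshold
```

As with every rule here, `threshold` is the user's dial: set it high to only inherit near-unanimous blocks, or to 0 to block on a single friend's say-so.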
Auto-blocks are opaque
There should be no feedback when a behavior triggers these measures. The harasser should believe that everything is working as normal.
This is, at best, a box of bandaids. But it arms every user with substantially more tools than they have today to control and enjoy their experience with the platform.
A company that produces a turd of an iPhone app with over 30 dedicated engineers can surely spare a couple souls to work on basic content filtering mechanics.