Between January 6 and January 20, we saw that Social Media Policies and Community Standards could be enforced. Although it was a bit of ‘backs against the wall’ action, social media companies such as Twitter, Facebook, Twitch, Reddit, Google, YouTube and others flexed their muscles and banned or restricted some accounts for violations of their community standards, such as spreading misinformation and inciting violence. The CyberCivics kids say, “it’s about time,” and we agree. Better late than never, if it gets us taking action.
Why are we so frustrated with social media companies’ ability to censor, or monitor, content (depending on your point of view)? Do we expect them to be the content and behavior police? Or are we offloading our responsibilities as citizens and parents by expecting the Facebooks and YouTubes to identify and remove misinformation or inciting and abusive content?
I think we are. Here’s why.
Social media policies need to start at home, and they need to reflect our values: those early lessons we teach our kids of honesty, fairness, determination, consideration, and love. It’s time we rolled up our sleeves and pitched in on the fight. Social media isn’t going away, nor should it. Let’s reclaim it. If ever there was a time to be an ‘upstander’ rather than a bystander in combating inappropriate, abusive or fraudulent content, it is now, when we can build on the lessons most recently learned.
The last two weeks raised a couple of questions for me:
- Did it have to get to outright lies and incitement of violence before social media platforms took action? Whatever happened to trying to prevent simple threats and bullying? and
- How many people have actually read the platforms’ Community Standards or think about what’s OK when it comes to online behaviors?
Community Standards are available for all the platforms if you care to look. Facebook, for example, publishes its Community Standards (https://www.facebook.com/communitystandards/) and outlines its approach to balancing free speech and giving everyone a voice with the responsibilities of good citizenship, such as safety, privacy, authenticity, and respect. No easy task. And as every parent knows, laying out guidelines is not the same thing as enforcing them. But what about OUR social media policies and community standards?
Who carries the burden of enforcement? The NY Times reported that Zuckerberg did not feel that Facebook should police content. Many disagreed with his position and argued that social media companies should not only monitor content but be held accountable for it. Aside from the problems of operationalizing all those terms for a court of law, there’s ‘what should be’ and ‘what is,’ what’s aspirational and what’s achievable. Not legally, but in practice.
The Pew Research Center (Perrin & Anderson, 2019) reported that nearly 75% of Facebook users visit the site every day, a number unchanged from 2018. Users are apparently undaunted by concerns about privacy, political censorship, abuse and fake news. They carry on reading, posting and sharing, even though a majority of users believe all of these problems exist online.
If you consider the volume of content being posted to social media platforms, you can start to see the challenge. Every day, 350 million photos are uploaded to Facebook, more than 400,000 hours of video are posted to YouTube (roughly 300 hours every minute), 500 million tweets are sent, and 95 million photos and videos are shared on Instagram, and those are only a few of the major platforms. Even the best algorithms and most vigilant review teams stand little chance of getting it all, much less getting it right. It is further complicated by the fact that content is subjective, culturally bound and idiomatic, and that no human, however ‘perfect’ and evolved, is without some form of cognitive bias. It’s how our brains work.
What does this mean?
Social media policies have to start at home or at the very least at school. Social Media Policies and Community Standards need to be taught as part of the basic lessons and values parents work so hard to teach their children: responsibility, respect, safety and self-regulation. Digital citizenship, digital etiquette and digital safety all translate from offline values to the digital landscape.
Social media companies need to do their best to monitor abuses, but there are always going to be a lot more of us than of them. We need to partner with social media companies and take some of the responsibility for flagging inappropriate and abusive content. We are the experts in the subjective content we see. Kids are surprisingly proactive reporters of misdeeds, having not yet become jaded by the ‘ways of the world.’ Yet we hear from kids all the time that nothing is done when they report hate speech, bullying and other forms of abuse. The appropriate burden on social media companies is to beef up their ability not just to flag but to verify and respond to reports of abuse.
—
Perrin, A., Anderson, M. (2019) Share of U.S. adults using social media, including Facebook, is mostly unchanged since 2018. Pew Research Center. https://www.pewresearch.org/fact-tank/2019/04/10/share-of-u-s-adults-using-social-media-including-facebook-is-mostly-unchanged-since-2018/
Photo by Robin Worrall @robin_rednine on Unsplash.com