You’re probably already media literate – trust your instincts

This week is Media Literacy Week in the U.S., and that means lots of people will talk about how important it is to be able to tell the difference between facts and fake news, especially online.

It’s true. It’s really important. But it’s always been difficult, and it’s becoming harder and harder as we get more and more of our information via social media. This problem is becoming increasingly apparent as we inch closer to the 2020 presidential election. If you aren’t obsessively reading about this (in which case, I envy you) you might have missed that Mark Zuckerberg, head of Facebook, recently said his platform will not be taking action against political ads that contain lies.

In statements last week, he said he’s really concerned about the “erosion of truth,” but he just can’t let Facebook be the arbiter of right and wrong by taking down political ads that contain false statements. One of his primary arguments is that the FCC requires radio and television stations to give candidates equal time, though Zuckerberg also likes to claim Facebook is not a media company. But it’s this “we’re not the arbiter of truth” piece that feels most troubling to me.

It’s a very familiar argument, similar to what I’ve heard from individuals who’ve decided they don’t trust any mainstream media source: “We can’t trust one arbiter of truth, so we really can’t trust any, and we’ll never know what’s ‘true,’ so why bother worrying about it?” Usually I would get a message like this after gently suggesting to an acquaintance or distant family member that a link to InfoWars or NaturalNews or Prager U might be misleading.

Sometimes journalists get a little resentful about this stuff, which, as a journalist, I get. But I also get not wanting to be condescended to about what’s “true,” and I get that there are so many information sources out there, it can be truly impossible to sift through it all without spending a lot of time and energy. I also get that some people might have that time and energy, but choose to spend it finding things that confirm what they already believe – it’s a free country, so I won’t try to talk you out of it.

But… I think that deep down most people really do care about facts, and really don’t like being lied to. Yes, politics is dirty, and media can be too. But throwing the proverbial baby out with the bathwater in these two areas is dangerous. Media is meant to hold the powerful accountable. Facebook can’t decide if it’s a member of the media, or one of the powerful, or both. It might feel like we, lowly civilians, can’t figure that out for them or do anything about it, but what I want you to think about during this Media Literacy Week is that we can.

Media literacy doesn’t have to imply you’re illiterate about the media, or that you need to take some kind of formal class or workshop to understand what’s going on. For most people – people who want to know what’s true but are just a little overwhelmed – it’s about trusting your instincts.

Does a headline seem too good or bad or crazy to be true? It probably is. You can check by looking at the URL, reading the story, and clicking on links within it.

Are you skeptical of the way something is being framed? That’s great insight. You can read articles by other publications about the same topic to round out your exposure to the story and see what makes sense to you.

You’re still going to suffer from confirmation bias – we all want to believe what we want to believe. But I think being intentional about this, recognizing when we’re maybe understanding something based more on our wishes than the facts in front of us, will make all the difference.

It’s true – existentially, it’s hard to know what’s objectively, 100%, no-doubt true. But that’s not what media literacy is about. It’s about knowing what happened, who did it, and maybe why. Sometimes answering those questions takes more than one tweet or article or even one year of reporting and reading. That’s okay – that’s how it’s always been. Getting comfortable with not knowing some things for sure, but being pretty confident you’re following along, is half the battle.

Resources:

  • Subscribe to The Flip Side, a newsletter that shows you how the right, left, and center are covering various big news items (especially political stuff). It doesn’t always make me feel like I know what’s true for certain, but it helps me understand better the way things are being framed and why.
  • Take this News Literacy Quiz. Fun fact – I didn’t pass the first time I took it myself!
  • Read these 8 ways to tell if a website is reliable.
  • Subscribe to the news sources you use most, and/or sign up for their newsletters so you get the information right in your inbox, rather than through the filter of your social media feed.


The Facebook Supreme Court


Yesterday Facebook officially launched its Oversight Board, an independent body that will make decisions about what can and cannot be posted on Facebook and hear appeals from people whose posts have been taken down. It’s been compared to the Supreme Court, the top appeals court in the United States justice labyrinth.

Like the Supreme Court, Facebook says the Oversight Board will create precedent, meaning earlier decisions will be used to shape later ones, so they aren’t reinventing the wheel every time. Also like the Supreme Court, the Board will try to come to consensus, but when everyone can’t agree, the majority will make the decision and those who dissent can include their reasons in the final decision.

Unlike the Supreme Court, though, the Oversight Board’s members won’t be nominated by the president… I mean CEO, Mark Zuckerberg. He’s only appointing the two co-chairs, and it will be up to them to choose the rest of the 11-person board (it will get bigger as time goes on, according to the charter).

According to Facebook:

The purpose of the board is to protect free expression by making principled, independent decisions about important pieces of content and by issuing policy advisory opinions on Facebook’s content policies.

How will they choose what pieces of content are “important” enough to get an official ruling? The process is laid out in a post in Facebook’s newsroom. Cases referred to the Board will be those that involve “real-world impact, in terms of severity, scale and relevance to public discourse,” and that are “disputed, the decision is uncertain and/or the values involved are competing.”

I’m spitballing here, but my guess is that means it wouldn’t include your aunt posting Confederate flag memes to her 12 followers, but it might include a politician who posts the same to their thousands of followers. My guess is that other cases will include things like body positivity posts that have been reported and taken down, like this one on Instagram (which is owned by Facebook).

In a blog post introducing the Board, Zuckerberg said it will start with “a small number of cases,” and admitted there’s still a lot of work to be done before it’s operational. I couldn’t find a method of actually submitting a case, for example.

The big question I ask myself when I see things like this: Do I think it is an empathetic use of technology? Do I think it shows an understanding of – and compassion for – users’ experiences and concerns? And do I think it will encourage users to be more empathetic themselves?

In some ways yes; almost; and maybe.

I do not think Zuckerberg ever expected to be tasked with arbitrating free speech on the internet. But he’s here now, and he’s getting a lot of pressure from politicians of all stripes to do something about harassment, privacy violations, and alleged censorship. Not to mention the fact that some lawmakers (and constituents, and former Facebook employees) want to break up the company’s ostensible monopoly on social media discourse. It’s all eyes on Zuck. His response to the free speech stuff has long been that it’s not his job to make those decisions. He has said he wants governments to make it clearer what’s okay to post online and what’s not. But by virtue of global politics and Facebook’s size and influence, the company is already making these decisions every day whether he likes it or not.

So I think a Supreme Court-style Oversight Board that can make binding decisions he cannot veto is smart. I think it could assuage some of his critics and make certain people feel more comfortable using the platform. I think it’s more self-preservation than empathy, but I think the effect could be an empathetic one if all goes well. But I also think it’s a HUGE undertaking that could go sideways pretty easily.

An internet appeals court is a real, tangible thing Facebook can give us, and it can have real, tangible results – controversial though they will be. Assurance that we won’t be manipulated by Macedonian trolls or bullied by classmates, or that we can post about our lives and ideas without unwittingly entering the thunderdome, is a lot harder to give.