Inside Facebook's race to separate news from junk


    JUDY WOODRUFF: Facebook is under pressure to crack down on false news, fake accounts
    and inflammatory content that can be manipulated to influence the public.
    This week, the social media giant announced that it deleted 865 million posts in the first
    three months of this year.
    Most of it was spam.
    The company also quickly disabled more than a half-a-billion fake accounts.
    But that isn't everything.
    Tonight, we take a look at how Facebook tries to tackle the content it won't delete.
    Miles O'Brien has been examining the problem of junk news and Facebook.
    For the record, the "NewsHour" has partnered with Facebook on projects in the past.
    This is the last report in a special series.
    It's part of our weekly look at the Leading Edge of science and technology.
    MILES O'BRIEN: Inside Facebook headquarters in Silicon Valley, they are trying to rethink
    the column of babble that the News Feed has become.
    MAN: So, I don't think this is actually necessarily all that bad of a design, even though it doesn't
    look that great.
    MILES O'BRIEN: Here, they are trying to figure out how to rate the quality of the news we like
    and share, more clearly identify the source, offer users some context, and make sure the
    cream rises to the top of the feed, while the junk sits at the bottom.
    TESSA LYONS, Product Manager, Facebook: We don't want a false story to get distribution,
    so we demote it.
    MILES O'BRIEN: Tessa Lyons is product manager of News Feed integrity.
    She works with two competing goals in mind, keep the platform free and open to a broad
    spectrum of ideas and opinions and reduce the spread of misinformation.
    Why not just delete it?
    TESSA LYONS: Well, it's an interesting question.
    And I think, look, there's real tension here.
    We believe that we're not arbiters of truth, and that people don't want us to be arbiters
    of truth.
    We also believe that censoring and fully removing information, unless it violates our community
    standards, is not the expectation from our community.
    So we work to reduce the damage that it can do and ensure that, when it is seen, it's
    surrounded by additional context.
    MILES O'BRIEN: Even though nearly half of all Americans now get their news from Facebook,
    the company insists it is a technology enterprise, not in the business of making editorial judgments.
    So they are outsourcing the work.
    The most clear-cut problem: content that is demonstrably false.
    To grapple with that, they have turned to some 20 third-party fact-checkers globally, including
    one of the Internet's original arbiters of fact from fiction, Snopes.
    We caught up with managing editor Brooke Binkowski, a former newspaper and radio reporter.
    She works at home or at a neighborhood coffee shop in San Diego.
    And is this a typical day?
    Busy day?
    Yes, OK.
    BROOKE BINKOWSKI, Managing Editor, This is a busier-than-normal day.
    MILES O'BRIEN: Facebook reached out to Snopes to be among its outside fact-checkers in 2016.
    BROOKE BINKOWSKI: It's gone from probably eight-hour days for all of us to more like
    12-, 15-hour days now because there's just so much to tackle, and we all are true believers.
    We all think that it's important, what we're doing.
    MILES O'BRIEN: Facebook says when a story is debunked by fact-checkers, it reduces its
    reach by 80 percent.
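    Facebook's stated 80 percent figure amounts to a simple demotion in whatever score orders the feed. The sketch below is illustrative only, not Facebook's actual code; the function and field names are hypothetical, and only the 80 percent reduction comes from the company's statement.

    ```python
    # Illustrative sketch: demoting a fact-checked story in a feed ranking.
    # The 0.2 multiplier reflects the stated 80 percent reduction in reach.
    def ranking_score(base_score, debunked_by_fact_checkers):
        """Return the score used to order a story in the feed."""
        if debunked_by_fact_checkers:
            return base_score * 0.2  # an 80 percent reduction
        return base_score

    stories = [
        {"title": "Verified report", "base": 1.0, "debunked": False},
        {"title": "Debunked hoax", "base": 1.5, "debunked": True},
    ]
    ranked = sorted(stories,
                    key=lambda s: ranking_score(s["base"], s["debunked"]),
                    reverse=True)
    print([s["title"] for s in ranked])  # the hoax drops below the verified report
    ```

    Note that the hoax is not deleted; it simply sinks in the ordering, which matches the "demote, don't remove" policy Lyons describes above.
    
    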
    But Binkowski remains skeptical.
    BROOKE BINKOWSKI: I still am not convinced that it's making a huge difference.
    MILES O'BRIEN: Part of the problem?
    Old-fashioned shoe-leather reporting, making calls to sources, doing the research, sometimes
    even taking a trip to a real library, takes time.
    BROOKE BINKOWSKI: Mark Twain famously said, what is it, a lie can travel halfway around
    the world while the truth is still putting on its shoes.
    I think it's gotten so bad now that a lie can travel like three times around the world,
    completely change, affect the perspectives and the votes of thousands of people, and
    wreak havoc all over the place while the truth is still kind of getting out of bed.
    It's just happened much faster, and it's overwhelming.
    MILES O'BRIEN: At Facebook, they are keenly aware of this, but they see no easy fix, either
    from humans or machines.
    Is it possible to match the rate at which people use the product?
    TESSA LYONS: If we're always waiting for individual facts to be reviewed, that is going to be
    slow in each case.
    But, by working with the fact-checkers, if we're able to understand the pages and the domains
    that repeatedly spread this type of information, then there's more work that we can do upstream
    to stop it earlier.
    MILES O'BRIEN: They are also looking upstream for spammers who create content that is
    factually correct, but misleading, incomplete or polarizing.
    It's often called clickbait.
    To try and defend against this, Facebook is using an artificial intelligence technique
    called machine learning classification.
    The idea?
    Feed the computer reams of clickbait examples, so it can find patterns and learn how to spot
    this material and send it to the bottom of the News Feed.
    So, with artificial intelligence, you can make the algorithm smart enough to identify
    what is clickbait from a spammer?
    TESSA LYONS: This is using a machine learning classifier in order to help us scale this
    problem, because, at two billion people, we want to ensure that our solutions don't require
    individual manual review, but, rather, they can scale across the platform.
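    The technique Lyons names, machine-learning classification, means training a model on labeled examples of clickbait so it can score new headlines without manual review. A minimal sketch of the idea, using a naive Bayes text classifier built only from the Python standard library; the training headlines and labels are invented for illustration and are not Facebook's data or model:

    ```python
    # Minimal naive Bayes text classifier: learn word patterns from labeled
    # headlines, then score unseen headlines as "clickbait" or "news".
    import math
    from collections import Counter

    def tokenize(text):
        return text.lower().split()

    def train(examples):
        """examples: list of (headline, label) pairs."""
        counts = {}          # label -> Counter of word frequencies
        totals = Counter()   # label -> number of training examples
        for text, label in examples:
            counts.setdefault(label, Counter()).update(tokenize(text))
            totals[label] += 1
        return counts, totals

    def classify(counts, totals, text):
        """Pick the label with the highest log-probability (Laplace-smoothed)."""
        vocab = set()
        for c in counts.values():
            vocab.update(c)
        n = sum(totals.values())
        best_label, best_score = None, float("-inf")
        for label, word_counts in counts.items():
            score = math.log(totals[label] / n)            # class prior
            denom = sum(word_counts.values()) + len(vocab)
            for word in tokenize(text):
                score += math.log((word_counts[word] + 1) / denom)
            if score > best_score:
                best_label, best_score = label, score
        return best_label

    training = [
        ("you won't believe what happened next", "clickbait"),
        ("this one weird trick doctors hate", "clickbait"),
        ("ten shocking photos that will blow your mind", "clickbait"),
        ("senate passes annual budget resolution", "news"),
        ("city council approves new transit plan", "news"),
        ("federal reserve holds interest rates steady", "news"),
    ]
    counts, totals = train(training)
    print(classify(counts, totals, "you won't believe this shocking trick"))  # clickbait
    ```

    A production system would use far richer features and vastly more data, but the scaling property Lyons describes is visible even here: once trained, the classifier scores any number of headlines without a human reviewing each one.
    
    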
    MILES O'BRIEN: Producer Cameron Hickey has been developing his own tool to identify the
    junk as part of our reporting.
    In doing so, he has seen the limits of machine learning and the persistence of the adversary.
    CAMERON HICKEY: Using software to recognize patterns and then do something based on those
    patterns that you recognize is only as good as a pattern remaining consistent.
    And the whole point of this problem is that the people who are trying to publish content
    like this, they're adaptive, right?
    So as soon as you shut down one avenue, they move to another avenue.
    MILES O'BRIEN: Historically, junk news producers have taken advantage of the fact that most
    everything that appears in the Facebook News Feed looks the same, whether it's heavily
    researched and vetted journalism or pure junk.
    In the days when we bought newspapers and magazines at newsstands or supermarkets, it
    was easier to identify the difference between quality and junk.
    Facebook is developing ways to give its users some clues in labs like this one.
    GRACE JACKSON, Facebook: So, what's going to happen is, there's going to be red dots
    that pop up on the screen.
    I just want you to follow them with your eye.
    MILES O'BRIEN: Grace Jackson is a quantitative user experience researcher at Facebook.
    She is showing me how she tracks eye movements as a user reads a News Feed.
    She's testing to see how easily I recognize visual cues that an article has been fact-checked
    or links are added for context.
    I blew by it.
    MILES O'BRIEN: I think you need a little more there.
    GRACE JACKSON: This was our original design that we had tested and learned that a lot
    of people skipped right over it and never saw the entry point over here.
    MILES O'BRIEN: Well, it's not obvious that's a click point, right?
    GRACE JACKSON: Exactly.
    MILES O'BRIEN: The eye-tracking data helps product designer Jeff Smith as he ponders
    new ways to give users clues about the credibility of information.
    JEFF SMITH, Facebook: We're in this new space and age where we're trying to design new
    mechanisms that give people those credibility cues that used to be there via the publisher on
    the supermarket aisle or the newsstand.
    MILES O'BRIEN: He's working on a design that more clearly identifies articles that have
    been debunked, provides context, related articles, and information about the publisher.
    JEFF SMITH: I'm trying to give the user as much information as possible in a way that
    they can easily sort of digest and understand, without getting in the way.
    MILES O'BRIEN: The Facebook News Feed algorithm is finely tuned to hold our attention, originally
    without an emphasis on the quality of the content.
    But the company says it is trying to change that.
    ALEX HARDIMAN, Facebook: We want to make sure that the news people see is high quality.
    And we didn't have that stance before.
    And so it's a pretty radical departure in terms of the way that we have been thinking
    about news, and I think a really worthwhile one.
    MILES O'BRIEN: Alex Hardiman is the director of products for news.
    She is leading Facebook's effort to identify news sources that are credible, trustworthy
    and authentic.
    They're turning to their users for the answer, asking them to rate news sources they trust,
    and feeding those rankings into the News Feed algorithm to determine what sources should
    rise to the top.
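    The approach described here, averaging user trust ratings per source and folding the result into the ranking, can be sketched as follows. Everything in this snippet is hypothetical: the rating scale, the neutral default, and the multiplicative weighting are assumptions for illustration, not Facebook's published algorithm.

    ```python
    # Illustrative sketch: fold crowd-sourced source-trust ratings into a feed score.
    from statistics import mean

    trust_ratings = {  # source -> user ratings on an assumed 0-1 scale
        "Wire Service": [0.9, 0.8, 0.95],
        "Hoax Blog": [0.2, 0.1, 0.15],
    }

    def source_trust(source):
        """Average trust rating; unrated sources get a neutral default."""
        ratings = trust_ratings.get(source)
        return mean(ratings) if ratings else 0.5

    def feed_score(engagement, source):
        # Weight raw engagement by how much users collectively trust the source.
        return engagement * source_trust(source)

    # Equal engagement, very different placement in the feed:
    print(feed_score(100, "Wire Service") > feed_score(100, "Hoax Blog"))  # True
    ```

    The design choice worth noting is that trust here modulates rather than replaces engagement, which is consistent with Hardiman's framing: the signals come from publishers and users, not from an in-house newsroom.
    
    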
    With so many people getting their news from Facebook, why don't you have a newsroom?
    ALEX HARDIMAN: Because, for us, thinking through what quality means doesn't require us to have
    a newsroom.
    We're really trying to make sure that we pull in great information from publishers and from
    the people who use Facebook to make these decisions.
    MILES O'BRIEN: But, if given the choice, will users of Facebook choose quality over junk?
    The hyperpartisan content publisher we found in California, Cyrus Massoumi, is skeptical.
    CYRUS MASSOUMI, Truth Examiner: They aren't New York Times readers, necessarily.
    Maybe some of them are, but the majority of them just want a 250- to 350-word article
    which will sort of like get them a little bit fired up.
    MILES O'BRIEN: And the numbers back him up.
    His Truth Examiner page has 3.8 million fans, and the stories he publishes generate a much
    higher rate of likes and shares than The New York Times and The Washington Post.
    CYRUS MASSOUMI: Nobody wants to read that stuff when they're on their phone, which
    is what everybody's doing when they're on Facebook.
    Like, nobody pulls out their phone and goes like, aha, I would love to read this 5,000-word
    profile of the endangered giraffe in the Congo.
    MILES O'BRIEN: As long as users continue to like and share junk news, should Facebook
    redefine its role as a publisher?
    BROOKE BINKOWSKI: I think that they really need to come to terms with the fact that they
    are a media company, on top of everything else, because, right now, they keep saying
    they're tech, they're tech, they're tech.
    They're trying to avoid it all coming crashing down when they finally say, we're media, because
    then all those questions will come: Well, why didn't you do this?
    Why didn't you do it that way?
    Why didn't you listen to this?
    MILES O'BRIEN: Is it time for the company to take a little more responsibility about
    what is in the News Feed, what is on the trending stories in an active editorial way?
    ALEX HARDIMAN: I say absolutely yes to responsibility.
    The tactics, as to whether or not that means an active editorial way, I would say we don't know.
    So, the first part, yes, we have a responsibility to make sure that the news people see on
    Facebook is high-quality.
    MILES O'BRIEN: The problem is far from solved, and the 2018 midterm elections are looming.
    Even as the political races heat up, here at Facebook, they're running their own race,
    with no finish line in sight.
    I'm Miles O'Brien for the "PBS NewsHour" in Menlo Park, California.
    JUDY WOODRUFF: Fascinating.
    And you can watch all of the stories in Miles' series about junk news online at