Published: Sat, January 20, 2018
Science | By Aurelio Ontiveros

Facebook's big newsfeed rethink: 5 key lessons for brands


Facebook is announcing a second major tweak to its algorithm, saying it will prioritize news based on survey results of trustworthiness.

Facebook founder Mark Zuckerberg said Friday (Jan. 19) that, starting next week, the social media site will begin prioritizing news sites.

Users' timelines will now include about 4 percent news, down from 5 percent before the shift.

"You could find music; you could find news; you could find information, but you couldn't find and connect with the people that you cared about, which as people is actually the most important thing".

"We do not plan to release individual publishers' trust scores because they represent an incomplete picture of how each story's position in each person's feed is determined".

"There's too much sensationalism, misinformation and polarization in the world today", Zuckerberg wrote.

While Zuckerberg's tone sounds like he's doing right by us, he's also doing right by Facebook. "We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem", wrote Zuckerberg. Users will see more news from sources the community thinks are trustworthy, he said.


Here's how this will work.

The surveys will ask a "diverse and representative" sample of Facebook users if they've heard of a news outlet and how much they trust it.

The new ranking system, he said, would hopefully separate news organizations that are trusted only by their own readers or viewers from ones that are broadly trusted across society.

What makes legitimate news on Facebook? Ahead of the US presidential election in 2016, Facebook was criticized for bias because its human curators of a "Trending Topics" section were only allowed to pick links from a set of sources Facebook designated as trusted, which excluded some conservative sites.

This means that Facebook users will see less content on their newsfeeds from brands and publishers.

Facebook wasn't the only one putting time and money into getting news onto the network; media outlets spent millions chasing coveted eyeballs by tailoring content and publishing directly to the news feed.
