
Timesdelhi.com

June 16, 2019
Category archive

Facebook News Feed

Facebook changes algorithm to promote worthwhile & close friend content


Facebook is updating its News Feed ranking algorithm to incorporate data from surveys about who you say your closest friends are and which links you find most worthwhile. Today Facebook announced it has trained new classifiers on patterns linking these surveys with usage data so it can better predict what to show in the News Feed. The change could hurt Pages that share click-bait and favor those sharing content that leaves people feeling satisfied afterwards.

For close friends, Facebook surveyed users about which people they were closest to. It then examined how those answers matched up with signals like who you're tagged in photos with, who you constantly interact with, whether you like the same posts and check in to the same places, and more. If it recognizes those signals in other people's friendships, it can be confident who someone's closest friends are. You won't see more friend content in total, but more from your best pals instead of distant acquaintances.
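Facebook hasn't published how these signals are combined, but the basic idea of ranking friends by closeness can be sketched as a weighted score over interaction signals. The signal names and weights below are illustrative assumptions, not Facebook's actual (survey-trained) model:

```python
# Hypothetical sketch of ranking friends by "closeness" signals.
# Signal names and weights are assumptions for illustration only.

SIGNAL_WEIGHTS = {
    "photo_cotags": 3.0,     # tagged in photos together
    "interactions": 1.5,     # likes/comments exchanged
    "shared_checkins": 2.0,  # checked in at the same places
}

def closeness_score(signals: dict) -> float:
    """Combine per-friend signal counts into a single closeness score."""
    return sum(SIGNAL_WEIGHTS.get(name, 0.0) * count
               for name, count in signals.items())

def rank_friends(friends: dict) -> list:
    """Return friend ids sorted by descending closeness score."""
    return sorted(friends,
                  key=lambda f: closeness_score(friends[f]),
                  reverse=True)
```

In the real system the weights would be learned by fitting usage data against the survey answers rather than hand-tuned.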

A Facebook News Feed survey from 2016, shared by Varsha Sharma

For worthwhile content, Facebook conducted surveys via the News Feed to find out which links people said were good uses of their time. Facebook then looked at which types of link posts, which publishers and how much engagement the posts got, and matched that to the survey results. That lets it determine that if a post has a similar style and engagement level, it's likely to be worthwhile and should be ranked higher in the feed.
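One simple way to picture this: use the survey-labeled posts to estimate, per feature (publisher, post type, engagement bucket), how often posts with that feature were called worthwhile, then score new posts by those rates. This is a toy sketch with assumed feature names, not Facebook's actual classifier:

```python
# Toy sketch: estimating how likely a link post is "worthwhile" from
# survey-labeled examples. Feature names are illustrative assumptions.

from collections import defaultdict

def train(labeled_posts):
    """labeled_posts: list of ({feature: value}, worthwhile_bool) pairs.
    Returns per-(feature, value) counts of [worthwhile, total]."""
    counts = defaultdict(lambda: [0, 0])
    for features, worthwhile in labeled_posts:
        for key in features.items():
            counts[key][1] += 1
            if worthwhile:
                counts[key][0] += 1
    return counts

def worthwhile_score(counts, features):
    """Average the 'worthwhile' rate across the post's known features."""
    rates = [counts[key][0] / counts[key][1]
             for key in features.items() if key in counts]
    return sum(rates) / len(rates) if rates else 0.5  # 0.5 = no signal
```

A production system would use a trained classifier over many more signals, but the matching logic (style and engagement of new posts compared against surveyed ones) is the same idea.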

The change aligns with CEO Mark Zuckerberg’s recent comments declaring that Facebook’s goal isn’t total time spent, but time well spent with meaningful content you feel good about. Most recently, that push has been about demoting unsafe content. Last month Facebook changed the algorithm to minimize clickbait and links to crappy ad-filled sites that receive a disproportionately high amount of their traffic from Facebook. It cracked down on unoriginality by hiding videos ripped off from other creators, and began levying harsher demotions to repeat violators of its policies. And it began to decrease the distribution of “borderline content” on Facebook and Instagram that comes close to but doesn’t technically break its rules.

While many assume Facebook just juices News Feed to be as addictive in the short-term as possible to keep us glued to the screen and viewing ads, that would actually be ruinous for its long-term business. If users leave the feed feeling exhausted, confused, and unfulfilled, they won’t come back. Facebook’s already had trouble with users ditching its text-heavy News Feed for more visual apps like Instagram (which it luckily bought) and Snapchat (which it tried to). While demoting click-bait and viral content might decrease total usage time today, it could preserve Facebook’s money-making ability for the future while also helping to rot our brains a little less.

Facebook prototypes a swipeable hybrid carousel of feed posts & Stories


Feed and Stories unite! Facebook is so eager to preempt the shift to Stories that it might even let us use the same interface of horizontally swipeable cards to sift through News Feed posts. If users won’t scroll down any more, Facebook’s ad business could take a huge hit. But by allowing traditional feed posts and ads to appear amidst Stories in the same carousel you’re more prone to swipe through, it could squeeze more views and dollars out of that content. This would help Facebook gracefully transition to the post-News Feed era while it teaches advertisers how to use the full-screen Stories ad format.

In this image, you can see a user in mid-swipe through the hybrid carousel between a News Feed story about a friend updating their profile photo to an animated GIF-style video on the left and a Stories video on the right.

We’re awaiting comment from Facebook about this. There’s a chance it was just caused by a bug, like the briefly side-scrollable Instagram feed that popped up in December, or that it will never be publicly tested, let alone launch. But given the significance of Facebook potentially reimagining navigation of its main revenue stream, we considered it worth covering immediately. After all, Facebook predicts that Stories sharing will surpass feed sharing across all social apps sometime this year. It already has 300 million daily users across Stories on Facebook and Messenger, plus another 500 million on Instagram Stories and 450 million on WhatsApp Status.

This swipeable hybrid carousel was first spotted by reverse engineering specialist and frequent TechCrunch tipster Jane Manchun Wong. She discovered the unreleased feature inside the Android version of Facebook and screen-recorded the new navigation method. In this prototype, when a News Feed post’s header or surrounding space is tapped, users see a full-screen version of the post. From there they can swipe left to reveal the next piece of content in the hybrid carousel, which can include traditional News Feed posts and News Feed ads as well as purposefully vertical Stories and Stories ads. Users can tap to Like, react to or comment on feed posts while still in the carousel interface.

If Facebook moved forward with offering this as an optional way to browse its social network, it would hedge the business against the biggest behavior change it’s seen since the move from desktop to mobile. Vertically-scrolling News Feeds are useful for browsing text-heavy content, but the navigation requires more work. Users have to stop and start scrolling precisely to get a whole post in view, and it takes longer to move between pieces of content.

In contrast, swipeable Stories carousels offer a more convenient lean-back navigation style where posts always appear fully visible. All it takes to advance to the next full-screen piece of content is a single tap, which is easier on your joints. This allows rapid-fire fast-forwarding through friends’ lives, which works well with more visual, instantly digestible content. While cramming text-filled News Feed posts into that full-screen format may not be ideal, at least they might get more attention. If Facebook combined all this with unskippable Stories ads like Snapchat is increasingly using, the medium shift could lure more TV dollars to the web.

The hybrid posts and Stories carousel can contain traditional image-plus-caption News Feed posts and News Feed ads as well as Stories

Facebook has repeatedly warned that it’s out of space for more ads in the News Feed, and that users are moving their viewing time to Stories, where advertisers are still getting acclimated. When Facebook made it clear on its Q2 2018 earnings call that this could significantly reduce revenue growth, its share price dropped 20 percent, vaporizing $120 billion in value. Wall Street is rightfully concerned that the Stories medium shift could upend Facebook’s massive business.

Stories is a bustling up-and-coming neighborhood. News Feed is a steadily declining industrial city where Facebook’s money is earned, but one that’s on its way to becoming a ghost town. A hybrid Stories/posts carousel would build a superhighway between them, connecting where Facebook users want to spend time with where the municipality generates the taxes necessary to keep the lights on.

Facebook News Feed now downranks sites with stolen content


Facebook is demoting trashy news publishers and other websites that illicitly scrape and republish content from other sources with little or no modification. Today it exclusively told TechCrunch that it will show links less prominently in the News Feed if they combine this new signal about content authenticity with either clickbait headlines or landing pages overflowing with low-quality ads. The move comes after Facebook’s surveys and in-person interviews with users revealed that they hate scraped content.

If ill-gotten intellectual property gets less News Feed distribution, it will receive less referral traffic, earn less ad revenue and there’ll be less incentive for crooks to steal articles, photos and videos in the first place. That could create an umbrella effect that improves content authenticity across the web.

And just in case the scraped profile data stolen from 29 million users in Facebook’s recent massive security breach ended up published online, Facebook would already have a policy in place to make links to it effectively disappear from the feed.

Here’s an example of the type of site that might be demoted by Facebook’s latest News Feed change. “Latet Nigerian News” scraped one of my recent TechCrunch articles and surrounded it with tons of ads.

An ad-filled site that scraped my recent TechCrunch article. This site might be hit by a News Feed demotion

“Starting today, we’re rolling out an update so people see fewer posts that link out to low-quality sites that predominantly copy and republish content from other sites without providing unique value. We are adjusting our Publisher Guidelines accordingly,” Facebook wrote in an addendum to its May 2017 post about demoting sites stuffed with crappy ads. Facebook tells me the new publisher guidelines will warn news outlets to add original content or value to reposted content, or invoke the social network’s wrath.

Personally, I think the importance of transparency around these topics warrants a new blog post from Facebook as well as an update to the original post linking forward to it.

So how does Facebook determine if content is stolen? Its systems compare the main text content of a page with all other text content to find potential matches. The degree of matching is used to predict whether a site stole its content. Facebook then uses a combined classifier that merges this prediction with how clickbaity a site’s headlines are, plus the quality and quantity of ads on the site.
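Facebook hasn’t disclosed its matching system, but the “degree of matching” idea can be illustrated with a classic near-duplicate detection technique: break each page’s text into word shingles and measure their Jaccard overlap. The threshold here is an arbitrary assumption:

```python
# Minimal sketch of near-duplicate text detection via word shingles
# and Jaccard similarity. Facebook's actual system is not public;
# this only illustrates scoring the "degree of matching".

def shingles(text: str, k: int = 3) -> set:
    """Return the set of k-word shingles (overlapping word windows)."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a: set, b: set) -> float:
    """Overlap between two shingle sets: 0.0 (disjoint) to 1.0 (identical)."""
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

def likely_scraped(candidate: str, original: str,
                   threshold: float = 0.8) -> bool:
    """Flag a page whose main text heavily matches another page's text."""
    return jaccard(shingles(candidate), shingles(original)) >= threshold
```

In the combined classifier the article describes, a score like this would be merged with clickbait-headline and ad-quality signals before any demotion decision.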

Facebook would make a martyr by banning Infowars


Alex Jones’ Infowars is a fake news-peddler. But Facebook deleting its Page could ignite a fire that consumes the network. Still, some critics are asking why it hasn’t done so already.

This week Facebook held an event with journalists to discuss how it combats fake news. The company’s recently appointed head of News Feed John Hegeman explained that, “I guess just for being false, that doesn’t violate the community standards. I think part of the fundamental thing here is that we created Facebook to be a place where different people can have a voice.”

In response, CNN’s Oliver Darcy tweeted: “I asked them why InfoWars is still allowed on the platform. I didn’t get a good answer.” BuzzFeed’s Charlie Warzel meanwhile wrote that allowing the Infowars Page to exist shows that “Facebook simply isn’t willing to make the hard choices necessary to tackle fake news.”

Facebook’s own Twitter account tried to rebuke Darcy by tweeting, “We see Pages on both the left and the right pumping out what they consider opinion or analysis – but others call fake news. We believe banning these Pages would be contrary to the basic principles of free speech.” But harm can be minimized without full-on censorship.

There is no doubt that Facebook hides behind political neutrality. It fears driving away conservative users for both business and stated mission reasons. That strategy is exploited by those like Jones who know that no matter how extreme and damaging their actions, they’ll benefit from equivocation that implies ‘both sides are guilty,’ with no regard for degree.

Instead of being banned from Facebook, Infowars and sites like it that constantly and purposely share dangerous hoaxes and conspiracy theories should be heavily down-ranked in the News Feed.

Effectively, they should be quarantined, so that when they or their followers share their links, no one else sees them.

“We don’t have a policy that stipulates that everything posted on Facebook must be true — you can imagine how hard that would be to enforce,” a Facebook spokesperson told TechCrunch. “But there’s a very real tension here. We work hard to find the right balance between encouraging free expression and promoting a safe and authentic community, and we believe that down-ranking inauthentic content strikes that balance. In other words, we allow people to post it as a form of expression, but we’re not going to show it at the top of News Feed.”

Facebook already reduces the future views of posts by roughly 80 percent when they’re established as false by its third-party fact checkers like Politifact and the Associated Press. For repeat offenders, I think that reduction in visibility should be closer to 100 percent of News Feed views. What Facebook does do to those whose posts are frequently labeled as false by its checkers is “remove their monetization and advertising privileges to cut off financial incentives, and dramatically reduce the distribution of all of their Page-level or domain-level content on Facebook.”
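The demotion described above amounts to multiplying a post’s distribution by a penalty once fact-checkers rate it false. The ~80 percent figure comes from the article; the scoring mechanics below are an assumption for illustration:

```python
# Sketch of fact-check demotion as a ranking-score multiplier.
# The ~80% reduction is reported in the article; treating it as a
# simple multiplier on a ranking score is an illustrative assumption.

FALSE_RATING_PENALTY = 0.2  # keep ~20% of distribution = ~80% reduction

def adjusted_score(base_score: float, rated_false: bool,
                   repeat_offender: bool = False) -> float:
    """Apply the demotion a post receives after a 'false' rating."""
    if not rated_false:
        return base_score
    if repeat_offender:
        return 0.0  # the near-100% demotion argued for in the article
    return base_score * FALSE_RATING_PENALTY
```

The `repeat_offender` branch models the author’s proposal, not Facebook’s confirmed behavior; Facebook’s stated penalty for repeat offenders is demonetization plus a “dramatic” Page-level reduction.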

The company wouldn’t comment directly about whether Infowars has already been hit with that penalty, noting “We can’t disclose whether specific Pages or domains are receiving such a demotion (it becomes a privacy issue).” For any story fact checked as false, it shows related articles from legitimate publications to provide other perspectives on the topic, and notifies people who have shared it or are about to.

But that doesn’t solve for the initial surge of traffic. Unfortunately, Facebook’s limited array of fact-checking partners is strapped with so much work that it can only get to so many BS stories quickly. That’s a strong endorsement for more funding to be dedicated to organizations like Snopes, preferably by even-keeled non-profits, though the risks of governments or Facebook chipping in might be worth it.

Given that fact-checking will likely never scale to be instantly responsive to all fake news in all languages, Facebook needs a more drastic option to curtail the spread of this democracy-harming content on its platform. That might mean a full loss of News Feed posting privileges for a certain period of time. That might mean that links re-shared by the supporters or agents of these pages get zero distribution in the feed.

But it shouldn’t mean their posts or Pages are deleted, or that their links can’t be opened unless they clearly violate Facebook’s core content policies.

Why downranking and quarantine? Because banning would only stoke conspiratorial curiosity about these inaccurate outlets. Trolls will use the bans as a badge of honor, saying, “Facebook deleted us because it knows what we say is true.”

They’ll claim they’ve been unfairly removed from the proxy for public discourse that exists because of the size of Facebook’s private platform.

What we’ll have on our hands is “but her emails!” 2.0.

People who swallowed the propaganda of “her emails”, much of which was pushed by Alex Jones himself, assumed that Hillary Clinton’s deleted emails must have contained evidence of some unspeakable wrongdoing — something so bad it outweighed anything done by her opponent, even when the accusations against him had evidence and witnesses aplenty.

If Facebook deleted the Pages of Infowars and their ilk, it would be used as a rallying cry that Jones’ claims were actually clairvoyance. That he must have had even worse truths to tell about his enemies and so he had to be cut down. It would turn him into a martyr.

Those who benefit from Infowars’ bluster would use Facebook’s removal of its Page as evidence that it’s massively biased against conservatives. They’d push their political allies to vindictively regulate Facebook beyond what’s actually necessary. They’d call for people to delete their Facebook accounts and decamp to some other network that’s much more of a filter bubble than what some consider Facebook to already be. That would further divide the country and the world.

When someone has a terrible, contagious disease, we don’t execute them. We quarantine them. That’s what should happen here. The exception should be for posts that cause physical harm offline. That will require tough judgment calls, but knowingly inciting mob violence, for example, should not be tolerated. Some of Infowars’ posts, such as those about Pizzagate that led to a shooting, might qualify for deletion by that standard.

Facebook is already trying to grapple with this after rumors and fake news spread through forwarded WhatsApp messages have led to crowds lynching people in India and attacks in Myanmar. Peer-to-peer chat lacks the same centralized actors to ban, though WhatsApp is now at least marking messages as forwarded, and it will need to do more. But for less threatening yet still blatantly false news, quarantining may be sufficient. This also leaves room for counterspeech, where disagreeing commenters can refute posts or share their own rebuttals.

Few people regularly visit the Facebook Pages they follow. They wait for the content to come to them through the News Feed posts of the Page, and their friends. Eliminating that virality vector would severely limit this fake news’ ability to spread without requiring the posts or Pages to be deleted, or the links to be rendered unopenable.

If Facebook wants to uphold a base level of free speech, it may be prudent to let the liars have their voice. However, Facebook is under no obligation to amplify that speech, and the fakers have no entitlement for their speech to be amplified.

Image Credit: Getty – Tom Williams/CQ Roll Call, Flickr Sean P. Anderson CC

Facebook tests 30-day keyword snoozing to fight spoilers, triggers


Don’t want to know the ending to a World Cup game or Avengers movie until you’ve watched it, or just need to quiet an exhausting political topic like “Trump”? Facebook is now testing the option to “snooze” specific keywords so you won’t see them for 30 days in News Feed or Groups. The feature is rolling out to a small percentage of users today. It could make people both more comfortable browsing the social network when they’re trying to avoid something, and not feel guilty posting about sensitive topics.

The feature was first spotted in the Facebook app’s code on Sunday by Chris Messina, who told TechCrunch he found a string for “snooze keywords for 30 days.” We reached out to Facebook on Monday. It didn’t initially respond, but last night it provided details we could publish at 5am this morning ahead of an official announcement later today. The test follows the rollout of snoozing people, Pages and Groups last December.

To snooze a keyword, you first have to find a post that includes it. That kind of defeats the whole purpose since you might run into the spoiler you didn’t want to see. But when asked about that problem, a Facebook spokesperson told me the company is looking into adding a preemptive snooze option in the next few weeks, potentially in News Feed Preferences. It’s also considering a recurring snooze list so you could easily re-enable hiding your favorite sports team before any game you’ll have to watch on delay.

For now, though, when you see the word, you can hit the drop-down arrow on the post, which reveals an option to “snooze keywords in this post.” Tapping that reveals a list of nouns from the post you might want to nix, without common words like “the” in the way. So if you used the feature on a post that said “England won its World Cup game against Tunisia! Yes!”, it would pull out “World Cup,” “England” and “Tunisia.” Select all that you want to snooze, and posts containing them will be hidden for a month. Currently the feature only works on text, not images, and won’t suggest synonyms you might want to snooze as well.
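The described behavior maps to a simple data model: each snoozed keyword carries an expiry timestamp 30 days out, and a post is hidden if its text contains any unexpired keyword. This is a hypothetical sketch of that model, not Facebook’s implementation:

```python
# Hypothetical sketch of a 30-day keyword snooze filter, modeled on
# the behavior described in the article. The data model is assumed.

from datetime import datetime, timedelta

SNOOZE_DAYS = 30

class SnoozeList:
    def __init__(self):
        self._expiry = {}  # lowercase keyword -> expiry datetime

    def snooze(self, keyword: str, now: datetime) -> None:
        """Hide posts containing this keyword for the next 30 days."""
        self._expiry[keyword.lower()] = now + timedelta(days=SNOOZE_DAYS)

    def is_hidden(self, post_text: str, now: datetime) -> bool:
        """True if the post contains any keyword whose snooze is active."""
        text = post_text.lower()
        return any(kw in text and now < expires
                   for kw, expires in self._expiry.items())
```

A permanent mute, as the author argues Facebook should offer for abuse, would just be an entry with no expiry; the 30-day default only changes the timestamp, not the filtering logic.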

The spokesperson says the feature “was something that kept coming up” in Facebook interviews with users. The option applies to any organic content, but you can’t block ads with it, so if you snoozed “Deadpool” you wouldn’t see posts from friends about the movie but might still see ads to buy tickets. Facebook’s excuse for this is that ads belong to “a separate team, separate algorithm,” but surely it just doesn’t want to open itself up to users mass-blocking its revenue driver. The spokesperson also said that snoozing isn’t currently being used for other content or ad-targeting purposes.

We asked why users can’t permanently mute keywords, the way Twitter allowed in November 2016 or the way Instagram launched keyword blocking for your posts’ comments in September 2016. Facebook says “If we’re hearing from people that they want more or less time,” that might get added as the feature rolls out beyond a test. There is some sense to defaulting to only temporary muting, as users might simply forget they blocked their favorite sports team before a big game, and then wouldn’t see it mentioned forever after.

But when it comes to abuse, permanent muting is something Facebook really should offer. Instead it’s relied on users flagging abuse like racial slurs, and it recently revealed its content moderation guidelines. Some topics that are fine for others could be tough for certain people to see, though, and helping users prevent trauma probably deserves to be prioritized above stopping reality TV spoilers.
