Timesdelhi.com

April 22, 2018

Data experts on Facebook’s GDPR changes: Expect lawsuits

Make no mistake: Fresh battle lines are being drawn in the clash between data-mining tech giants and Internet users over people’s right to control their personal information and protect their privacy.

An update to European Union data protection rules next month — called the General Data Protection Regulation — is the catalyst for this next chapter in the global story of tech vs privacy.

A fairytale ending would remove that ugly ‘vs’ and replace it with an enlightened ‘+’. But there’s no doubt it will be a battle to get there — requiring legal challenges and fresh case law to be set down — as an old guard of dominant tech platforms marshal their extensive resources to try to hold onto the power and wealth gained through years of riding roughshod over data protection law.

Payback is coming though. Balance is being reset. And the implications of not regulating what tech giants can do with people’s data have arguably never been clearer.

The exciting opportunity for startups is to skate to where the puck is going — by thinking beyond exploitative legacy business models that amount to embarrassing black boxes whose CEOs dare not publicly admit what the systems really do — and come up with new ways of operating and monetizing services that don’t rely on selling the lie that people don’t care about privacy.

 

More than just small print

Right now the EU’s General Data Protection Regulation can take credit for a whole lot of spilt ink as tech industry small print is reworded en masse. Did you just receive a T&C update notification about a company’s digital service? Chances are it’s related to the incoming standard.

The regulation is generally intended to strengthen Internet users’ control over their personal information, as we’ve explained before. But its focus on transparency — making sure people know how and why data will flow if they choose to click ‘I agree’ — combined with supersized fines for major data violations represents something of an existential threat to ad tech processes that rely on pervasive background harvesting of users’ personal data to be siphoned as biofuel for their vast, proprietary microtargeting engines.

This is why Facebook is not going gentle into a data processing goodnight.

Indeed, it’s seizing on GDPR as a PR opportunity — shamelessly stamping its brand on the regulatory changes it lobbied so hard against, including by taking out full page print ads in newspapers…

This is of course another high gloss plank in the company’s PR strategy to try to convince users to trust it — and thus to keep giving it their data. Because — and only because — GDPR gives consumers more opportunity to lock down access to their information and close the shutters against countless prying eyes.

But the pressing question for Facebook — and one that will also test the mettle of the new data protection standard — is whether or not the company is doing enough to comply with the new rules.

One important point re: Facebook and GDPR is that the standard applies globally, i.e. for all Facebook users whose data is processed by its international entity, Facebook Ireland (and thus within the EU); but not necessarily universally — with Facebook users in North America not legally falling under the scope of the regulation.

Users in North America will only benefit if Facebook chooses to apply the same standard everywhere. (And on that point the company has stayed exceedingly fuzzy.)

It has claimed it won’t give US and Canadian users second tier status vs the rest of the world where their privacy is concerned — saying they’re getting the same “settings and controls” — but unless or until US lawmakers spill some ink of their own there’s nothing but an embarrassing PR message to regulate what Facebook chooses to do with Americans’ data. It’s the data protection principles, stupid.

Zuckerberg was asked by US lawmakers last week what kind of regulation he would and wouldn’t like to see laid upon Internet companies — and he made a point of arguing for privacy carve outs to avoid falling behind, of all things, competitors in China.

Which is an incredibly chilling response when you consider how few rights — including human rights — Chinese citizens have. And how data-mining digital technologies are being systematically used to expand Chinese state surveillance and control.

The ugly underlying truth of Facebook’s business is that it also relies on surveillance to function. People’s lives are its product.

That’s why Zuckerberg couldn’t tell US lawmakers to hurry up and draft their own GDPR. He’s the CEO saddled with trying to sell an anti-privacy, anti-transparency position — just as policymakers are waking up to what that really means.

 

Plus ça change?

Facebook has announced a series of updates to its policies and platform in recent months, which it’s said are coming to all users (albeit in ‘phases’). The problem is that most of what it’s proposing to achieve GDPR compliance is simply not adequate.

Coincidentally many of these changes have been announced amid a major data mishandling scandal for Facebook, in which it’s been revealed that data on up to 87M users was passed to a political consultancy without their knowledge or consent.

It’s this scandal that led Zuckerberg to be perched on a booster cushion in full public view for two days last week, dodging awkward questions from US lawmakers about how his advertising business functions.

He could not tell Congress there wouldn’t be other such data misuse skeletons in its closet. Indeed the company has said it expects it will uncover additional leaks as it conducts a historical audit of apps on its platform that had access to “a large amount of data”. (How large is large, one wonders… )

But whether Facebook’s business having enabled — in just one example — the clandestine psychological profiling of millions of Americans for political campaign purposes ends up being the final, final straw that catalyzes US lawmakers to agree their own version of GDPR is still tbc.

Any new law will certainly take time to formulate and pass. In the meanwhile GDPR is it.

The most substantive GDPR-related change announced by Facebook to date is the shuttering of a feature called Partner Categories — in which it allowed the linking of its own information holdings on people with data held by external brokers, including (for example) information about people’s offline activities.

Evidently finding a way to close down the legal liabilities and/or engineer consent from users to that degree of murky privacy intrusion — involving pools of aggregated personal data gathered by goodness knows who, how, where or when — was a bridge too far for the company’s army of legal and policy staffers.

Other notable changes it has so far made public include consolidating settings onto a single screen vs the confusing nightmare Facebook has historically required users to navigate just to control what’s going on with their data (remember the company got a 2011 FTC sanction for “deceptive” privacy practices); rewording its T&Cs to make it more clear what information it’s collecting for what specific purpose; and — most recently — revealing a new consent review process whereby it will be asking all users (starting with EU users) whether they consent to specific uses of their data (such as processing for facial recognition purposes).

As my TC colleague Josh Constine wrote earlier in a critical post dissecting the flaws of Facebook’s approach to consent review, the company is — at the very least — not complying with the spirit of the GDPR.

Indeed, Facebook appears pathologically incapable of abandoning its long-standing modus operandi of socially engineering consent from users (doubtless fed via its own self-reinforced A/B testing ad expertise). “It feels obviously designed to get users to breeze through it by offering no resistance to continue, but friction if you want to make changes,” was his summary of the process.

But, as we’ve pointed out before, concealment is not consent.

To get into a few specifics, pre-ticked boxes — which is essentially what Facebook is deploying here, with a big blue “accept and continue” button designed to grab your attention as it’s juxtaposed against an anemic “manage data settings” option (which if you even manage to see it and read it sounds like a lot of tedious hard work) — aren’t going to constitute valid consent under GDPR.

Nor is this what ‘privacy by default’ looks like — another staple principle of the regulation. On the contrary, Facebook is pushing people to do the opposite: Give it more of their personal information — and fuzzing why it’s asking by bundling a range of usage intentions.

The company is risking a lot here.

In simple terms, seeking consent from users in a way that’s not fair because it’s manipulative means consent is not being freely given. Under GDPR, it won’t be consent at all. So Facebook appears to be seeing how close to the wind it can fly to test how regulators will respond.

Safe to say, EU lawmakers and NGOs are watching.

 

“Yes, they will be taken to court”

“Consent should not be regarded as freely given if the data subject has no genuine or free choice or is unable to refuse or withdraw consent without detriment,” runs one key portion of GDPR.

Now compare that with: “People can choose to not be on Facebook if they want” — which was Facebook’s deputy chief privacy officer, Rob Sherman’s, paper-thin defense to reporters for the lack of an overall opt out for users to its targeted advertising.

Data protection experts who TechCrunch spoke to suggest Facebook is failing to comply with, not just the spirit, but the letter of the law here. Some were exceedingly blunt on this point.

“I am less impressed,” said law professor Mireille Hildebrandt discussing how Facebook is railroading users into consenting to its targeted advertising. “It seems they have announced that they will still require consent for targeted advertising and refuse the service if one does not agree. This violates [GDPR] art. 7.4 jo recital 43. So, yes, they will be taken to court.”

“Zuckerberg appears to view the combination of signing up to T&Cs and setting privacy options as ‘consent’,” adds cyber security professor Eerke Boiten. “I doubt this is explicit or granular enough for the personal data processing that FB do. The default settings for the privacy settings certainly do not currently provide for ‘privacy by default’ (GDPR Art 25).

“I also doubt whether FB Custom Audiences work correctly with consent. FB finds out and retains a small bit of personal info through this process (that an email address they know is known to an advertiser), and they aim to shift the data protection legal justification on that to the advertisers. Do they really then not use this info for future profiling?”

That looming tweak to the legal justification of Facebook’s Custom Audiences feature — a product which lets advertisers upload contact lists in a hashed form to find any matches among its own user-base (so those people can be targeted with ads on Facebook’s platform) — also looks problematic.
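To make the mechanics concrete, here is a minimal sketch of how hashed contact-list matching generally works: the advertiser normalizes and hashes each email address before upload, and the platform compares those hashes against hashes of its own users’ addresses. This is an illustration of the general technique, not Facebook’s actual implementation, and all the names and addresses below are invented.

```python
import hashlib

def normalize_and_hash(email: str) -> str:
    """Lowercase and trim the address, then hash it so the raw email never leaves the uploader."""
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

# Hypothetical advertiser contact list (plaintext exists only on the advertiser's side).
advertiser_list = ["Jane.Doe@example.com", " bob@example.org ", "nomatch@example.net"]
hashed_upload = {normalize_and_hash(e) for e in advertiser_list}

# Hypothetical platform-side index mapping hashed user emails to internal user IDs.
platform_users = {
    normalize_and_hash("jane.doe@example.com"): "user_1001",
    normalize_and_hash("carol@example.com"): "user_1002",
    normalize_and_hash("bob@example.org"): "user_1003",
}

# The match step: the platform learns which of its users appear in the uploaded list.
matched_user_ids = [uid for h, uid in platform_users.items() if h in hashed_upload]
print(matched_user_ids)  # ['user_1001', 'user_1003']
```

Note that even though the emails are hashed, the platform still ends up holding a new derived fact for every match, exactly the “small bit of personal info” Boiten flags above: that a given user’s address is known to a given advertiser.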

Here the company seems to be intending to try to claim a change in the legal basis, pushed out via new terms in which it instructs advertisers to agree they are the data controller (and it is merely a data processor). And thereby seek to foist a greater share of the responsibility for obtaining consent to processing user data onto its customers.

However such legal determinations are simply not a matter of contract terms. They are based on the fact of who is making decisions about how data is processed. And in this case — as other experts have pointed out — Facebook would be classed as a joint controller with any advertisers that upload personal data. The company can’t use a T&Cs change to opt out of that.

Wishful thinking is not a reliable approach to legal compliance.

 

Fear and manipulation of highly sensitive data

Over many years of privacy-hostile operation, Facebook has shown it has a major appetite for even very sensitive data. And GDPR does not appear to have blunted that.

Let’s not forget, facial recognition was a platform feature that got turned off in the EU, thanks to regulatory intervention. Yet here Facebook is now trying to use GDPR as a route to process this sensitive biometric data for international users after all — by pushing individual users to consent to it by dangling a few ‘feature perks’ at the moment of consent.

Veteran data protection and privacy consultant, Pat Walshe, is unimpressed.

“The sensitive data tool appears to be another data grab,” he tells us, reviewing Facebook’s latest clutch of ‘GDPR changes’. “Note the subtlety. It merges ‘control of sharing’ such data with FB’s use of the data “to personalise features and products”. From the info available that isn’t sufficient to amount to consent for such sensitive data and nor is it clear folks can understand the broader implications of agreeing.

“Does it mean ads will appear in Instagram? WhatsApp etc? The default is also set to ‘accept’ rather than ‘review and consider’. This is really sensitive data we’re talking about.”

“The face recognition suggestions are woeful,” he continues. “The second image — is using an example… to manipulate and stoke fear — “we can’t protect you”.

“Also, the choices and defaults are not compatible with [GDPR] Article 25 on data protection by design and default nor Recital 32… If I say no to facial recognition it’s unclear if other users can continue to tag me.”

Of course it goes without saying that Facebook users will keep uploading group photos, not just selfies. What’s less clear is whether Facebook will be processing the faces of other people in those shots who have not given (and/or never even had the opportunity to give) consent to its facial recognition feature.

People who might not even be users of its product.

But if it does that it will be breaking the law. Yet Facebook does indeed profile non-users — despite Zuckerberg’s claims to Congress not to know about its shadow profiles. So the risk is clear.

It can’t give non-users “settings and controls” not to have their data processed. So it’s already compromised their privacy — because it never gained consent in the first place.

New Mexico Representative Ben Lujan made this point to Zuckerberg’s face last week and ended the exchange with a call to action: “So you’re directing people that don’t even have a Facebook page to sign up for a Facebook page to access their data… We’ve got to change that.”

WASHINGTON, DC – APRIL 11: Facebook co-founder, Chairman and CEO Mark Zuckerberg prepares to testify before the House Energy and Commerce Committee in the Rayburn House Office Building on Capitol Hill April 11, 2018 in Washington, DC. This is the second day of testimony before Congress by Zuckerberg, 33, after it was reported that 87 million Facebook users had their personal information harvested by Cambridge Analytica, a British political consulting firm linked to the Trump campaign. (Photo by Chip Somodevilla/Getty Images)

But nothing in the measures Facebook has revealed so far, as its ‘compliance response’ to GDPR, suggest it intends to pro-actively change that.

Walshe also critically flags how — again, at the point of consent — Facebook’s review process deploys examples of the social aspects of its platform (such as how it can use people’s information to “suggest groups or other features or products”) as a tactic for manipulating people to agree to share religious affiliation data, for example.

“The social aspect is not separate to but bound up in advertising,” he notes, adding that the language also suggests Facebook uses the data.

Again, this whiffs a whole lot more than it smells like GDPR compliance.

“I don’t believe FB has done enough,” adds Walshe, giving a view on Facebook’s GDPR preparedness ahead of the May 25 deadline for the framework’s application — as Zuckerberg’s Congress briefing notes suggested the company itself believes it has. (Or maybe it just didn’t want to admit to Congress that U.S. Facebook users will get lower privacy standards vs users elsewhere.)

“In fact I know they have not done enough. Their business model is skewed against privacy — privacy gets in the way of advertising and so profit. That’s why Facebook has variously suggested people may have to pay if they want an ad free model & so ‘pay for privacy’.”

“On transparency, there is a long way to go,” adds Boiten. “Friend suggestions, profiling for advertising, use of data gathered from like buttons and web pixels (also completely missing from “all your Facebook data”), and the newsfeed algorithm itself are completely opaque.”

“What matters most is whether FB’s processing decisions will be GDPR compliant, not what exact controls are given to FB members,” he concludes.

US lawmakers also pumped Zuckerberg on how much of the information his company harvests on people who have a Facebook account is revealed to them when they ask for it — via its ‘Download your data’ tool.

His answers on this appeared to intentionally misconstrue what was being asked — presumably in a bid to mask the ugly reality of the true scope and depth of the surveillance apparatus he commands. (Sometimes with a few special ‘CEO privacy privileges’ thrown in — like being able to selectively retract just his own historical Facebook messages from conversations, ahead of bringing the feature to anyone else.)

‘Download your Data’ is clearly partial and self-serving — and thus it also looks very far from being GDPR compliant.

 

Not even half the story

Facebook is not even complying with the spirit of current EU data protection law on data downloads. Subject Access Requests give individuals the right to request not just the information they have voluntarily uploaded to a service, but also the personal data the company holds about them — including a description of that personal data, the reasons it is being processed, and whether it will be given to any other organizations or people.

Facebook not only fails to include people’s browsing history in the info it provides when you ask to download your data — which, incidentally, its own cookies policy confirms it tracks (via social plug-ins and tracking pixels embedded on millions of popular websites) — it also does not include a complete list of advertisers on its platform that have your information.

Instead, after a wait, it serves up an eight-week snapshot. But even this two month view can still stretch to hundreds of advertisers per individual.

If Facebook gave users a comprehensive list of the advertisers with access to their information, the number of third party companies would clearly stretch into the thousands. (In some cases thousands might even be a conservative estimate.)

There’s plenty of other information harvested from users that Facebook also intentionally fails to divulge via ‘Download your data’. And — to be clear — this isn’t a new problem either. The company has a very long history of blocking these types of requests.

In the EU it currently invokes an exception in Irish law to circumvent fuller compliance — which, even setting GDPR aside, raises some interesting competition law questions, as Paul-Olivier Dehaye told the UK parliament last month.

“‘All your Facebook data’ isn’t a complete solution,” agrees Boiten. “It misses the info Facebook uses for auto-completing searches; it misses much of the information they use for suggesting friends; and I find it hard to believe that it contains the full profiling information.”

“‘Ads Topics’ looks rather random and undigested, and doesn’t include the clear categories available to advertisers,” he further notes.

Facebook wouldn’t comment publicly about this when we asked. But it maintains its approach towards data downloads is GDPR compliant — and says it has reviewed what it offers with regulators to get feedback.

Earlier this week it also put out a wordy blog post attempting to defuse this line of attack by pointing the finger of blame at the rest of the tech industry — saying, essentially, that a whole bunch of other tech giants are at it too.

Which is not much of a moral defense even if the company believes its lawyers can sway judges with it. (Ultimately I wouldn’t fancy its chances; the EU’s top court has a robust record of defending fundamental rights.)

 

Think of the children…

What its blog post didn’t say — yet again — was anything about how all the non-users it nonetheless tracks around the web are able to have any kind of control over its surveillance of them.

And remember, some Facebook non-users will be children.

So yes, Facebook is inevitably tracking kids’ data without parental consent. Under GDPR that’s a majorly big no-no.

TC’s Constine had a scathing assessment of even the on-platform system that Facebook has devised in response to GDPR’s requirements on parental consent for processing the data of users who are between the ages of 13 and 15.

“Users merely select one of their Facebook friends or enter an email address, and that person is asked to give consent for their ‘child’ to share sensitive info,” he observed. “But Facebook blindly trusts that they’ve actually selected their parent or guardian… [Facebook’s] Sherman says Facebook is “not seeking to collect additional information” to verify parental consent, so it seems Facebook is happy to let teens easily bypass the checkup.”

So again, the company is being shown doing the minimum possible — in what might be construed as a cynical attempt to check another compliance box and carry on its data-sucking business as usual.

Given that intransigence it really will be up to the courts to bring the enforcement stick. Change, as ever, is a process — and hard won.

Hildebrandt is at least hopeful that a genuine reworking of Internet business models is on the way, though — albeit not overnight. And not without a fight.

“In the coming years the landscape of all this silly microtargeting will change, business models will be reinvented and this may benefit both the advertisers, consumers and citizens,” she tells us. “It will hopefully stave off the current market failure and the uprooting of democratic processes… Though nobody can predict the future, it will require hard work.”

News Source = techcrunch.com

Can data science save social media?

The unfettered internet is too often used for malicious purposes and is frequently woefully inaccurate. Social media — especially Facebook — has failed miserably at protecting user privacy and blocking miscreants from sowing discord.

That’s why CEO Mark Zuckerberg was just forced to testify about user privacy before both houses of Congress. And now governmental regulation of Facebook and other social media appears to be a fait accompli.

At this key juncture, the crucial question is whether regulation — in concert with Facebook’s promises to aggressively mitigate its weaknesses — can correct the privacy abuses while continuing to fulfill Facebook’s goal of giving people the power to build transparent communities, bringing the world closer together.

The answer is maybe.

What has not been said is that Facebook must embrace data science methodologies initially created in the bowels of the federal government to help protect its two billion users. Simultaneously, Facebook must still enable advertisers — its sole source of revenue — to get the user data required to justify their expenditures.

Specifically, Facebook must promulgate and embrace what is known in high-level security circles as homomorphic encryption (HE), often considered the “Holy Grail” of cryptography, and data provenance (DP). HE would enable Facebook, for example, to generate aggregated reports about its user psychographic profiles so that advertisers could still accurately target groups of prospective customers without knowing their actual identities.

Meanwhile, data provenance – the process of tracing and recording true identities and the origins of data and its movement between databases – could unearth the true identities of Russian perpetrators and other malefactors or at least identify unknown provenance, adding much needed transparency in cyberspace.

Both methodologies are extraordinarily complex. IBM and Microsoft, in addition to the National Security Agency, have been working on HE for years but the technology has suffered from significant performance challenges. Progress is being made, however. IBM, for example, has been granted a patent on a particular HE method – a strong hint it’s seeking a practical solution – and last month proudly announced that its rewritten HE encryption library now works up to 75 times faster. Maryland-based ENVEIL, a startup staffed by the former NSA HE team, has broken the performance barriers required to produce a commercially viable version of HE, benchmarking millions of times faster than IBM in tested use cases.

How Homomorphic Encryption Would Help Facebook

HE is a technique used to operate on and draw useful conclusions from encrypted data without decrypting it, simultaneously protecting the source of the information. It is useful to Facebook because its massive inventory of personally identifiable information is the foundation of the economics underlying its business model. The more comprehensive the datasets about individuals, the more precisely advertising can be targeted.

HE could keep Facebook information safe from hackers and inappropriate disclosure, but still extract the essence of what the data tells advertisers. It would convert data into encrypted strings of numbers, do math with those strings, and then decrypt the results to get the same answer it would have gotten if the data weren’t encrypted at all.
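As a toy illustration of that encrypt-compute-decrypt loop, the snippet below uses the open-source python-paillier library (the `phe` package), which implements the Paillier cryptosystem. Paillier is only additively homomorphic, a far simpler relative of the fully homomorphic schemes discussed above, but it shows the core property: a party can sum ciphertexts it cannot read, and only the key holder learns the aggregate. The purchase amounts are invented.

```python
from phe import paillier  # pip install phe

# The data owner generates the keypair and keeps the private key.
public_key, private_key = paillier.generate_paillier_keypair()

# Hypothetical per-user purchase amounts (in cents) that should stay private individually.
purchases = [1250, 0, 8799, 2310]

# Each value is encrypted before being handed to the party doing the analysis.
encrypted = [public_key.encrypt(x) for x in purchases]

# The analyst adds the ciphertexts together without ever seeing the underlying numbers.
encrypted_total = encrypted[0]
for ciphertext in encrypted[1:]:
    encrypted_total = encrypted_total + ciphertext

# Only the key holder can decrypt, and only the aggregate is revealed.
total = private_key.decrypt(encrypted_total)
print(total)                    # 12359
print(total == sum(purchases))  # True
```

An aggregated report like the ad-click-to-purchase figures described below follows the same pattern: the useful statistic comes out at the end while the individual rows stay encrypted.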

A particularly promising sign for HE emerged last year, when Google revealed a new marketing measurement tool that relies on this technology to allow advertisers to see whether their online ads result in in-store purchases.

Unearthing this information requires analyzing datasets belonging to separate organizations, notwithstanding the fact that these organizations pledge to protect the privacy and personal information of the data subjects. HE skirts this by generating aggregated, non-specific reports about the comparisons between these datasets.

In pilot tests, HE enabled Google to successfully analyze encrypted data about who clicked on an advertisement in combination with another encrypted multi-company dataset that recorded credit card purchase records. With this data in hand, Google was able to provide reports to advertisers summarizing the relationship between the two databases to conclude, for example, that five percent of the people who clicked on an ad wound up purchasing in a store.

Data Provenance

Data provenance has a markedly different core principle. It’s based on the fact that digital information is atomized into 1’s and 0’s with no intrinsic truth. The dual digits exist only to disseminate information, whether accurate or widely fabricated. A well-crafted lie can easily be indistinguishable from the truth and distributed across the internet. What counts is the source of these 1’s and 0’s. In short, is it legitimate? What is the history of the 1’s and 0’s?

The art market, as an example, deploys DP to combat fakes and forgeries of the world’s greatest paintings, drawings and sculptures. It uses DP techniques to create a verifiable chain of custody for each piece of artwork, preserving the integrity of the market.

Much the same thing can be done in the online world. For example, a Facebook post referencing a formal statement by a politician, with an accompanying photo, would have provenance records directly linking the post to the politician’s press release and even the specifics of the photographer’s camera. The goal – again – is ensuring that data content is legitimate.
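One simple way to picture such a chain of custody is as a list of hash-linked provenance records, where each entry commits to a hash of the one before it, so tampering with an earlier link invalidates everything downstream. The sketch below mirrors the camera, press release and post example above; the record fields and values are invented for illustration and do not correspond to any particular provenance product.

```python
import hashlib
import json
import time

def record_hash(record: dict) -> str:
    """Deterministic digest of a provenance record (keys sorted so the hash is stable)."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode("utf-8")).hexdigest()

def append_record(chain: list, content: dict) -> None:
    """Append a new provenance entry that commits to the previous entry's hash."""
    prev = record_hash(chain[-1]) if chain else None
    chain.append({"content": content, "prev_hash": prev, "recorded_at": time.time()})

def verify_chain(chain: list) -> bool:
    """Re-derive every link; altering any earlier entry breaks the later hashes."""
    return all(chain[i]["prev_hash"] == record_hash(chain[i - 1]) for i in range(1, len(chain)))

# Hypothetical chain for the example above: camera -> press release -> social post.
chain = []
append_record(chain, {"source": "camera", "model": "EOS-5D", "image_digest": "<sha256 of image bytes>"})
append_record(chain, {"source": "press_release", "url": "https://example.org/statement"})
append_record(chain, {"source": "social_post", "platform": "Facebook", "post_id": "12345"})

print(verify_chain(chain))             # True
chain[0]["content"]["model"] = "fake"  # tamper with the origin record...
print(verify_chain(chain))             # ...and the downstream links no longer verify: False
```

Unknown provenance then shows up naturally as a chain whose earliest entries are missing or fail to verify.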

Companies such as Wal-Mart, Kroger, British-based Tesco and Swedish-based H&M, an international clothing retailer, are using or experimenting with new technologies to provide provenance data to the marketplace.

Let’s hope that Facebook and its social media brethren begin studying HE and DP thoroughly and implement it as soon as feasible. Other strong measures — such as the upcoming implementation of the European Union’s General Data Protection Regulation, which will use a big stick to secure personally identifiable information – essentially should be cloned in the U.S. What is best, however, are multiple avenues to enhance user privacy and security, while hopefully preventing breaches in the first place. Nothing less than the long-term viability of social media giants is at stake.

News Source = techcrunch.com

A flaw-by-flaw guide to Facebook’s new GDPR privacy changes

Facebook is about to start pushing European users to speed through giving consent for its new GDPR privacy law compliance changes. It asks users to review how Facebook uses data from around the web to target them with ads, the sensitive profile info they share, and facial recognition. But with a design that encourages rapidly hitting the “Agree” button, a lack of granular controls, a laughably cheatable parental consent request for teens, and an aesthetic overhaul of Download Your Information that doesn’t make it any easier to switch social networks, Facebook shows it’s still hungry for your data.

The new privacy change and terms of service consent flow will appear starting this week to European users, though they’ll be able to dismiss it for now, at least until the May 25th GDPR compliance deadline Facebook vowed to uphold in Europe. Meanwhile, Facebook says it will roll out the changes and consent flow globally over the coming weeks and months, though with some slight regional differences. And finally, all teens worldwide that share sensitive info will have to go through the weak new parental consent flow.

Facebook brought a group of reporters to the new Building 23 at its Menlo Park headquarters to preview the changes. But feedback was heavily critical as journalists grilled Facebook’s deputy chief privacy officer Rob Sherman. Questions centered on how Facebook makes accepting the updates much easier than reviewing or changing them, but Sherman stuck to talking points about how important it was to give users choice and information.

“Trust is really important and it’s clear that we have a lot of work to do to regain the trust of people on our service,” he said, giving us deja vu about Mark Zuckerberg’s testimonies before Congress. “We know that people won’t be comfortable using Facebook if they don’t feel that their information is protected.”

Trouble At Each Step Of Facebook’s Privacy Consent Flow

There are a ton of small changes so we’ll lay out each with our criticisms.

Facebook’s consent flow starts well enough with the screen above offering a solid overview of why it’s making changes for GDPR and what you’ll be reviewing. But with just an ‘X’ up top to back out, it’s already training users to speed through by hitting that big blue button at the bottom.

Sensitive Info

First up is control of your sensitive profile information, specifically your sexual preference, religious views, and political views. As you’ll see at each step, you can hit the pretty blue “Accept And Continue” button regardless of whether you’ve scrolled through the information. But if you hit the ugly grey “Manage Settings” button, you have to go through an interstitial where Facebook makes its argument to try to deter you from removing the info before letting you make and save your choice. It feels obviously designed to get users to breeze through it by offering no resistance to continue, but friction if you want to make changes.

Facebook doesn’t let advertisers target you based on this sensitive info, which is good. The only exception is that in the US, political views alongside political Pages and Events you interact with inform your overarching personality categories that can be targeted with ads. But your only option here is either to remove any info you’ve shared in these categories so friends can’t see it, or allow Facebook to use it to personalize the site. There’s no option to keep this stuff on your profile but not let Facebook use it.

Facial Recognition

The Face Recognition step won’t actually give users in the European Union a choice, as the government has banned the feature. But everyone else will get to choose whether to leave their existing setting, which defaults to on, or turn off the feature. Here the lack of granularity is concerning. Users might want to see warnings about possible impersonators using their face in their profile pics, but not be suggested as someone to tag in their friends’ photos. Unfortunately, it’s all or nothing. While Facebook is right to make it simple to turn on or off completely, granular controls that unfold for those that want them would be much more empowering.

Data Collection Across The Web

A major concern that’s arisen in the wake of Zuckerberg’s testimonies is how Facebook uses data collected about you from around the web to target users with ads and optimize its service. While Facebook deputy chief privacy officer Rob Sherman echoed Zuckerberg in saying that users tell the company they prefer relevant ads, and that this data can help thwart hackers and scrapers, many users are unsettled by the offsite collection practices. Here, Facebook lets you block it from targeting you with ads based on data about your browsing behavior on sites that show its Like and share buttons, conversion Pixel, or Audience Network ads. The issue is that there’s no way to stop Facebook from using that data to personalize your News Feed or optimize other parts of its service.

New Terms Of Service

Facebook recently rewrote its Terms Of Service and Data Use Policy to be more explicit and easy to read. It didn’t make any significant changes other than noting the policy now applies to its subsidiaries like Instagram, WhatsApp, and Oculus. That’s all clearly explained here, which is nice. But the fact that the button to reject the new Terms Of Service isn’t even a button, it’s a tiny ‘see your options’ hyperlink, shows how badly Facebook wants to avoid you closing your account. When Facebook’s product designer for the GDPR flow was asked if she thought this hyperlink was the best way to present the alternative to the big ‘I Accept’ button, she disingenuously said yes, eliciting scoffs from the room of reporters. It seems obvious that Facebook is trying to minimize the visibility of the path to account deletion rather than making it an obvious course of action if you don’t agree to its terms.

I requested Facebook actually show us what was on the other side of that tiny ‘see my options’ link, and this is what we got. First, Facebook doesn’t mention its temporary deactivation option, just the scary permanent delete option. Facebook recommends downloading your data before deleting your account, which you should. But the fact that you’ll have to wait (often a few hours) before you can download your data could push users to delay deletion and perhaps never resume. And only if you keep scrolling do you get to another tiny “I’m ready to delete my account” hyperlink instead of a real button.

Parental Consent

GDPR also implements new regulation about how teens are treated, specifically users between the ages of 13 (the minimum age required to sign up for Facebook) and 15. If users in this age range have shared their religious views, political views, or sexual preference, Facebook requires them to either remove it or get parental consent to keep it. But the system for attaining and verifying that parental consent is a joke.

Users merely select one of their Facebook friends or enter an email address, and that person is asked to give consent for their ‘child’ to share sensitive info. But Facebook blindly trusts that they’ve actually selected their parent or guardian, even though it has a feature for users to designate who their family is, and the kid could put anyone in the email field, including an alternate address they control. Sherman says Facebook is “not seeking to collect additional information” to verify parental consent, so it seems Facebook is happy to let teens easily bypass the checkup.

Privacy Shortcuts

To keep all users abreast of their privacy settings, Facebook has redesigned its Privacy Shortcuts in a colorful format that sticks out from the rest of the site. No complaints here.

Download Your Information

Facebook has completely redesigned its Download Your Information tool after keeping it basically the same for the past 8 years. You can now view your content and data in different categories without downloading it, which alongside the new privacy shortcuts is perhaps the only unequivocally positive and unproblematic change amidst today’s announcements.

And Facebook now lets you select certain categories of data, date ranges, JSON or HTML format, and image quality to download. That could make it quicker and easier if you just need a copy of a certain type of content but don’t need to export all your photos and videos for example. Thankfully, Facebook says you’ll be able to now export your media in a higher resolution than the old tool allowed.

But the big problem here was the subject of my feature piece about Facebook’s lack of data portability. The Download Your Information tool is supposed to let you take your data and go to a different social network. But it only exports your social graph, aka your friends, as a text list of names. There are no links, usernames, or other unique identifiers unless friends opt in to letting you export their email or phone number, so good luck finding the right John Smith on another app. The new version of Download Your Information works exactly the same, rather than offering any interoperable format that would let you find your friends elsewhere.

A Higher Standard

Overall, it seems like Facebook is complying with the letter of GDPR law, but with questionable spirit. Sure, privacy is boring to a lot of people. Too little info and they feel confused and scared. Too many choices and screens and they feel overwhelmed and annoyed. Facebook struck the right balance in some places here. But the subtly pushy designs seem intended to push people away from changing their defaults in ways that could hamper Facebook’s mission and business.

Making the choices even in visual weight, rather than burying the ways to make changes in grayed-out buttons and tiny links, would have been more fair. And it would have shown that Facebook has faith in the value it provides, such that users would stick around and leave features enabled if they truly wanted to.

When questioned about this, Sherman pointed the finger at other tech companies, saying he thought Facebook was more upfront with users. Asked to clarify if he thought Facebook’s approach was “better”, he said “I think that’s right”. But Facebook isn’t being judged by the industry standard because it’s not a standard company. It’s built its purpose and its business on top of our private data, and touted itself as a boon to the world. But when asked to clear a higher bar for privacy, Facebook delved into design tricks to keep from losing our data.

News Source = techcrunch.com

Virtual Instagram celebrity “Lil Miquela” has had her account hacked

The Instagram account for the virtual celebrity known as Lil Miquela has been hacked.

The multi-racial fashionista and advocate for multiculturalism, whose account is followed by nearly 1 million people, has had “her” account taken over by another animated Instagram account holder named “Bermuda”.

Welcome to the spring of 2018.

The hack of the @Lilmiquela account started earlier today, but the Bermuda avatar has long considered Miquela her digital nemesis and has taken steps to hack others of Miquela’s social accounts — like Spotify — before.

Because this is the twenty-first century — and given the polarization of the current political climate — it’s not surprising that the very real culture wars between proponents of pluralism and the Make America Great Again movement would take their fight to feuding avatars.

In posts on the Lil Miquela account, Bermuda proudly flaunts her artificial identity… and a decidedly pro-Trump message.

Unlike Miquela, whose account plays with the notion of a physical presence for a virtual avatar, Bermuda is very clearly a simulation. And one with political views that are diametrically opposed to those espoused by Miquela (whose promotion of openness and racial equality has been a feature that’s endeared the account to followers and fashion and culture magazines alike).

Miquela Sousa, a Brazilian-American from Downey, Calif., launched her Instagram account in 2016. Since the account’s appearance, Miquela has been a subject of speculation in the press and online.

Appearing on magazine covers, and consenting to do interviews with reporters, Miquela has been exploring notions of celebrity, influence and culture since her debut on the Facebook-owned photo-sharing site.

A person familiar with the Lil Miquela account said that Instagram was working on regaining control.

News Source = techcrunch.com
