Timesdelhi.com

June 25, 2019
Category archive

Mark Zuckerberg

Gender, race and social change in tech; Moira Weigel on the Internet of Women, Part Two

Tech ethics can mean a lot of different things, but surely one of the most critical, unavoidable, and yet somehow still controversial propositions in the emerging field of ethics in technology is that tech should promote gender equality. But does it? And to the extent it does not, what (and who) needs to change?

In this second of a two-part interview “On The Internet of Women,” Harvard fellow and Logic magazine founder and editor Moira Weigel and I discuss the future of capitalism and its relationship to sex and tech; the place of ambivalence in feminist ethics; and Moira’s personal experiences with #MeToo.

Greg E.: There’s a relationship between technology and feminism, and technology and sexism for that matter. Then there’s a relationship between all of those things and capitalism. One of the underlying themes in your essay “The Internet of Women,” that I thought made it such a kind of, I’d call it a seminal essay, but that would be a silly term to use in this case…

Moira W.: I’ll take it.

Greg E.: One of the reasons I thought your essay should be required basic reading in tech ethics is that you argue we need to examine the degree to which sexism is a part of capitalism.

Moira W.: Yes.

Greg E.: Talk about that.

Moira W.: This is a big topic! Where to begin?

Capitalism, the social and economic system that emerged in Europe around the sixteenth century and that we still live under, has a profound relationship to histories of sexism and racism. It’s really important to recognize that sexism and racism themselves are historical phenomena.

They don’t exist in the same way in all places. They take on different forms at different times. I find that very hopeful to recognize, because it means they can change.

It’s really important not to get too pulled into the view that men have always hated women, that there will always be this war of the sexes that, best case scenario, gets temporarily resolved in the depressing truce of conventional heterosexuality. The conditions we live under are not the only possible conditions—they are not inevitable.

A fundamental Marxist insight is that capitalism necessarily involves exploitation. In order to grow, a company needs to pay people less for their work than that work is worth. Race and gender help make this process of exploitation seem natural.

Certain people are naturally inclined to do certain kinds of lower status and lower waged work, and why should anyone be paid much to do what comes naturally? And it just so happens that the kinds of work we value less are seen as more naturally “female.” This isn’t just about caring professions that have been coded female—nursing and teaching and so on, although it does include those.

In fact, the history of computer programming provides one of the best examples. In the early decades, when writing software was seen as rote work and lower status, it was mostly done by women. As Mar Hicks and other historians have shown, as the profession became more prestigious and more lucrative, women were very actively pushed out.

You even see this with specific coding languages. As more women learn, say, JavaScript, it comes to be seen as feminized—as less impressive or valuable than Python, a “softer” skill. This perception, that women have certain natural capacities that should be free or cheap, has a long history that overlaps with the history of capitalism. At some level, it is a byproduct of the rise of wage labor.

To a medieval farmer it would have made no sense to say that when his wife bore the children who worked their farm, killed the chickens and cooked them, or did work around the house, that wasn’t “work,” [but when he] took the chickens to the market to sell them, that was. Right?

A long line of feminist thinkers has drawn attention to this in different ways. One slogan from the ’70s was, ‘whose work produces the worker?’ Women do—but neither companies nor the state, which profit from this process, expect to pay for it.

Why am I saying all this? My point is: race and gender have been very useful historically for getting capitalism things for free—and for justifying that process. Of course, they’re also very useful for dividing exploited people against one another, so that a white male worker hates his black coworker, or his “leeching” wife, rather than his boss.

Greg E.: I want to ask more about this topic and technology; you are a publisher of Logic magazine, which is one of the most interesting publications about technology to have come on the scene in the last few years.

Friend portability is the must-have Facebook regulation

Choice for consumers compels fair treatment by corporations. When people can easily move to a competitor, it creates a natural market dynamic coercing a business to act right. When we can’t, other regulations just leave us trapped with a pig in a fresh coat of lipstick.

That’s why as the FTC considers how many billions to fine Facebook or which executives to stick with personal liability or whether to go full-tilt and break up the company, I implore it to consider the root of how Facebook gets away with abusing user privacy: there’s no simple way to switch to an alternative.

If Facebook users are fed up with the surveillance, security breaches, false news, or hatred, there’s no western general purpose social network with scale for them to join. Twitter is for short-form public content, Snapchat is for ephemeral communication. Tumblr is neglected. Google+ is dead. Instagram is owned by Facebook. And the rest are either Chinese, single-purpose, or tiny.

No, I don’t expect the FTC to launch its own “Fedbook” social network. But what it can do is pave an escape route from Facebook so worthy alternatives become viable options. That’s why the FTC must require that Facebook offer truly interoperable data portability for the social graph.

In other words, the government should pass regulations forcing Facebook to let you export your friend list to other social networks in a privacy-safe way. This would allow you to connect with or follow those people elsewhere so you could leave Facebook without losing touch with your friends. The increased threat of people ditching Facebook for competitors would create a much stronger incentive to protect users and society.

The slate of potential regulations for Facebook currently being discussed by the FTC’s leadership includes a $3 billion to $5 billion fine or greater, holding Facebook’s CEO personally liable for violations of an FTC consent decree, creating new privacy and compliance positions (including one held by an executive who could be Zuckerberg himself), and creating an independent oversight committee to review privacy and product decisions, according to The New York Times and The Washington Post. More extreme measures like restricting how Facebook collects and uses data for ad targeting, blocking future acquisitions, or breaking up the company are still possible but seemingly less likely.

Facebook co-founder Chris Hughes (right) recently wrote a scathing call to break up Facebook.

Breaking apart Facebook is a tantalizing punishment for the company’s wrongdoings. Still, I somewhat agree with Zuckerberg’s response to co-founder Chris Hughes’ call to split up the company: that a breakup “isn’t going to do anything to help” directly fix Facebook’s privacy or misinformation issues. Given that Facebook likely wouldn’t try to make more acquisitions of big social networks under all this scrutiny, it’d benefit from voluntarily pledging not to attempt these buys for at least three to five years. Otherwise, regulators could impose that ban, which might be more politically attainable than a breakup and carry fewer messy downstream effects.

Yet without this data portability regulation, Facebook can pay a fine and go back to business as usual. It can accept additional privacy oversight without fundamentally changing its product. It can become liable for upholding the bare minimum letter of the law while still breaking the spirit. And even if it was broken up, users still couldn’t switch from Facebook to Instagram, or from Instagram and WhatsApp to somewhere new.

Facebook Kills Competition With User Lock-In

When faced with competition in the past, Facebook has snapped into action improving itself. Fearing Google+ in 2011, Zuckerberg vowed “Carthage must be destroyed” and the company scrambled to launch Messenger, the Timeline profile, Graph Search, photo improvements and more. After realizing the importance of mobile in 2012, Facebook redesigned its app, reorganized its teams, and demanded employees carry Android phones for “dogfooding” testing. And when Snapchat was still rapidly growing into a rival, Facebook cloned its Stories and is now adopting the philosophy of ephemerality.

Mark Zuckerberg visualizes his social graph at a Facebook conference

Each time Facebook felt threatened, it was spurred to improve its product for consumers. But once it had defeated its competitors, muted their growth, or confined them to a niche purpose, Facebook’s privacy policies worsened. Antitrust scholar Dina Srinivasan explains this in a summary of her paper “The Antitrust Case Against Facebook”:

“When dozens of companies competed in an attempt to win market share, and all competing products were priced at zero—privacy quickly emerged as a key differentiator. When Facebook entered the market it specifically promised users: “We do not and will not use cookies to collect private information from any user.” Competition didn’t only restrain Facebook’s ability to track users. It restrained every social network from trying to engage in this behavior . . .  the exit of competition greenlit a change in conduct by the sole surviving firm. By early 2014, dozens of rivals that initially competed with Facebook had effectively exited the market. In June of 2014, rival Google announced it would shut down its competitive social network, ceding the social network market to Facebook.

“For Facebook, the network effects of more than a billion users on a closed-communications protocol further locked in the market in its favor. These circumstances—the exit of competition and the lock-in of consumers—finally allowed Facebook to get consumers to agree to something they had resisted from the beginning. Almost simultaneous with Google’s exit, Facebook announced (also in June of 2014) that it would begin to track users’ behavior on websites and apps across the Internet and use the data gleaned from such surveillance to target and influence consumers. Shortly thereafter, it started tracking non-users too. It uses the “like” buttons and other software licenses to do so.”

This is why the FTC must seek regulation that not only punishes Facebook for wrongdoings, but that lets consumers do the same. Users can punch holes in Facebook by leaving, both depriving it of ad revenue and reducing its network effect for others. Empowering them with the ability to take their friend list with them gives users a taller seat at the table. I’m calling for what University of Chicago professors Luigi Zingales and Guy Rolnik termed a Social Data Portability Act.

Luckily, Facebook already has a framework for this data portability through a feature called Find Friends. You connect your Facebook account to another app, and you can find your Facebook friends who are already on that app.

But the problem is that in the past, Facebook has repeatedly blocked competitors from using Find Friends. That includes cutting off Twitter, Vine, Voxer, and MessageMe, while Phhhoto was blocked from letting you find your Instagram friends…six months before Instagram copied Phhhoto’s core back-and-forth GIF feature and named it Boomerang. Then there’s the issue that you need an active Facebook account to use Find Friends. That nullifies its utility as a way to bring your social graph with you when you leave Facebook.

Facebook’s “Find Friends” feature used to let Twitter users follow their Facebook friends, but Facebook later cut off access for competitors including Twitter and Vine, seen here.

The social network does offer a way to “Download Your Information” which is helpful for exporting photos, status updates, messages, and other data about you. Yet the friend list can only be exported as a text list of names in HTML or JSON format. Names aren’t linked to their corresponding Facebook profiles or any unique identifier, so there’s no way to find your friend John Smith amongst everyone with that name on another app. And less than 5 percent of my 2,800 connections had used the little-known option to allow friends to export their email address. What about the big “Data Transfer Project” Facebook announced 10 months ago in partnership with Google, Twitter, and Microsoft to provide more portability? It has released nothing so far, raising questions about whether it was vaporware designed to ward off regulators.

Essentially, this all means that Facebook provides zero portability for your friendships. That’s what regulators need to change. There’s already precedent for this. The Telecommunications Act of 1996 saw the FCC require phone service carriers to allow customers to easily port their numbers to another carrier rather than being assigned a new number. If you think of a phone number as a method by which friends connect with you, it would be reasonable for regulators to declare that the modern equivalent — your social network friend connections — must be similarly portable.

How To Unchain Our Friendships

Facebook should be required to let you export a truly interoperable friend list that can be imported into other apps in a privacy-safe way.

To do that, Facebook should allow you to download a version of the list that features hashed versions of the phone numbers and email addresses friends used to sign up. You wouldn’t be able to read that contact info or freely import and spam people. But Facebook could be required to share documentation teaching developers of other apps to build a feature that safely cross-checks the hashed numbers and email addresses against those of people who had signed up for their app. That developer wouldn’t be able to read the contact info from Facebook either, or store any useful data about people who hadn’t signed up for their app. But if the phone number or email address of someone in your exported Facebook friend list matched one of their users, they could offer to let you connect with or follow them.
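
A minimal sketch of how such hashed cross-checking could work. All function names and data here are hypothetical, and plain SHA-256 stands in for whatever construction the platforms would actually agree on; a real scheme would likely use a keyed or salted hash, since raw hashes of phone numbers can be brute-forced offline:

```python
import hashlib

def hash_contact(value: str) -> str:
    """Normalize and hash a phone number or email address.
    Plain SHA-256 is illustrative only; a production scheme would
    use a keyed hash agreed between the platforms so exported
    hashes can't be brute-forced offline."""
    normalized = value.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def export_friend_list(friends_contact_info):
    """What the exporting platform would produce: hashes only,
    no readable contact info."""
    return [hash_contact(contact) for contact in friends_contact_info]

def match_friends(exported_hashes, own_users):
    """What a competing app would do: hash its own users' sign-up
    contact info and intersect it with the imported list.
    Non-matching hashes reveal nothing useful and are discarded."""
    index = {hash_contact(contact): username
             for username, contact in own_users.items()}
    return [index[h] for h in exported_hashes if h in index]

# Hypothetical example data
exported = export_friend_list(["+15551234567", "jane@example.com"])
users = {"jane_d": "jane@example.com", "bob_k": "bob@example.com"}
print(match_friends(exported, users))  # -> ['jane_d']
```

The key property is that matching happens on one-way hashes: the receiving app learns only which of its existing users you already know, never the contact details of friends who haven’t signed up.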

This system would let you save your social graph, delete your Facebook account, and then find your friends on other apps without ever jeopardizing the privacy of their contact info. Users would no longer be locked into Facebook and could freely choose to move their friendships to whatever social network treats them best. And Facebook wouldn’t be able to block competitors from using it.

If the company wanted to go a step further, it could offer ways to make News Feed content preferences or Facebook Groups connections portable, such as by making it easier for Group members to opt in to joining a parallel email or text message mailing list. For researchers, Facebook could offer ways to export anonymized News Feed and activity data for study.

Portability would much more closely align the goals of users, Facebook, and the regulators. Facebook wouldn’t merely be responsible to the government for technically complying with new fines, oversight, or liability. It would finally have to compete to provide the best social app rather than relying on its network effect to handcuff users to its service.

This same model of data portability regulation could be expanded to any app with over 1 billion users, or even 100 million users to ensure YouTube, Twitter, Snapchat, or Reddit couldn’t lock down users either. By only applying the rule to apps with a sufficiently large user base, the regulation wouldn’t hinder new startup entrants to the market and accidentally create a moat around well-funded incumbents like Facebook that can afford the engineering chore. Data portability regulation combined with a fine, liability, oversight, and a ban on future acquisitions of social networks could set Facebook straight without breaking it up.

Users have a lot of complaints about Facebook that go beyond strictly privacy. But their recourse is always limited because for many functions there’s nowhere else to go, and it’s too hard to go there. By fixing the latter, the FTC could stimulate the rise of Facebook alternatives so that users rather than regulators can play king-maker.

Zuckerberg says breaking up Facebook “isn’t going to help”

With the look of someone betrayed, Facebook’s CEO has fired back at co-founder Chris Hughes and his brutal NYT op-ed calling for regulators to split up Facebook, Instagram, and WhatsApp. “When I read what he wrote, my main reaction was that what he’s proposing that we do isn’t going to do anything to help solve those issues. So I think that if what you care about is democracy and elections, then you want a company like us to be able to invest billions of dollars per year like we are in building up really advanced tools to fight election interference,” Zuckerberg told France Info while in Paris to meet with French President Emmanuel Macron.

Zuckerberg’s argument boils down to the idea that Facebook’s specific problems with privacy, safety, misinformation, and speech won’t be directly addressed by breaking up the company, and instead would actually hinder its efforts to safeguard its social networks. The Facebook family of apps would theoretically have fewer economies of scale when investing in safety technology like artificial intelligence to spot bots spreading voter suppression content.

Facebook’s co-founders (from left): Dustin Moskovitz, Chris Hughes, and Mark Zuckerberg

Hughes claims that “Mark’s power is unprecedented and un-American” and that Facebook’s rampant acquisitions and copying have made it so dominant that it deters competition. The call echoes other early execs like Facebook’s first president Sean Parker and growth chief Chamath Palihapitiya who’ve raised alarms about how the social network they built impacts society.

But Zuckerberg argues that Facebook’s size benefits the public. “Our budget for safety this year is bigger than the whole revenue of our company was when we went public earlier this decade. A lot of that is because we’ve been able to build a successful business that can now support that. You know, we invest more in safety than anyone in social media,” Zuckerberg told journalist Laurent Delahousse.

The Facebook CEO’s comments were largely missed by the media, in part because the TV interview was heavily dubbed into French with no transcript. But written out here for the first time, his quotes offer a window into how thoroughly Zuckerberg dismisses Hughes’ claims. “Well [Hughes] was talking about a very specific idea of breaking up the company to solve some of the social issues that we face,” Zuckerberg says before trying to decouple solutions from antitrust regulation. “The way that I look at this is, there are real issues. There are real issues around harmful content and finding the right balance between expression and safety, for preventing election interference, on privacy.”

Claiming that a breakup “isn’t going to do anything to help” is a more unequivocal refutation of Hughes’ claim than that of Facebook VP of communications and former UK deputy Prime Minister Nick Clegg. He wrote in his own NYT op-ed today that “what matters is not size but rather the rights and interests of consumers, and our accountability to the governments and legislators who oversee commerce and communications . . . Big in itself isn’t bad. Success should not be penalized.”

Mark Zuckerberg and Chris Hughes

Something certainly must be done to protect consumers. Perhaps that’s a break up of Facebook. At the least, banning it from acquiring more social networks of sufficient scale so it couldn’t snatch another Instagram from its crib would be an expedient and attainable remedy.

But the sharpest point of Hughes’ op-ed was how he identified that users are trapped on Facebook. “Competition alone wouldn’t necessarily spur privacy protection — regulation is required to ensure accountability — but Facebook’s lock on the market guarantees that users can’t protest by moving to alternative platforms” he writes. After Cambridge Analytica “people did not leave the company’s platforms en masse. After all, where would they go?”

That’s why given critics’ call for competition and Zuckerberg’s own support for interoperability, a core tenet of regulation must be making it easier for users to switch from Facebook to another social network. As I’ll explore in an upcoming piece, until users can easily bring their friend connections or ‘social graph’ somewhere else, there’s little to compel Facebook to treat them better.

Facebook talked privacy, Google actually built it

Mark Zuckerberg: “The future is private.” Sundar Pichai: ~The present is private~. While both CEOs made protecting user data a central theme of their conference keynotes this month, Facebook’s product updates were mostly vague vaporware while Google’s were either ready to ship or ready to demo. The contrast highlights the divergence in strategy between the two tech giants.

For Facebook, privacy is a talking point meant to boost confidence in sharing, deter regulators, and repair its battered image. For Google, privacy is functional, going hand-in-hand with on-device data processing to make features faster and more widely accessible.

Everyone wants tech to be more private, but we must discern between promises and delivery. Like “mobile”, “on-demand”, “AI”, and “blockchain” before it, “privacy” can’t be taken at face value. We deserve improvements to the core of how our software and hardware work, not cosmetic add-ons and instantiations no one is asking for.

At Facebook’s F8 last week, we heard from Zuckerberg about how “Privacy gives us the freedom to be ourselves” and he reiterated how that would happen through ephemerality and secure data storage. He said Messenger and Instagram Direct will become encrypted…eventually…which Zuckerberg had already announced in January and detailed in March. We didn’t get the Clear History feature that Zuckerberg made the privacy centerpiece of his 2018 conference, or anything about the Data Transfer Project that’s been silent for the 10 months since its reveal.

What users did get was a clumsy joke from Zuckerberg about how “I get that a lot of people aren’t sure that we’re serious about this. I know that we don’t exactly have the strongest reputation on privacy right now to put it lightly. But I’m committed to doing this well.” No one laughed. At least he admitted that “It’s not going to happen overnight.”

But it shouldn’t have to. Facebook made its first massive privacy mistake in 2007 with Beacon, which quietly relayed your off-site ecommerce and web activity to your friends. It’s had 12 years, a deal with the FTC promising to improve, countless screwups and apologies, the democracy-shaking Cambridge Analytica scandal, and hours of being grilled by Congress to get serious about the problem. That makes it clear that if “the future is private,” then the past wasn’t. Facebook is too late here to receive the benefit of the doubt.

At Google’s I/O, we saw demos from Pichai showing how “our work on privacy and security is never done. And we want to do more to stay ahead of constantly evolving user expectations.” Instead of waiting to fall so far behind that users demand more privacy, Google has been steadily working on it for the past decade since it introduced Chrome incognito mode. It’s changed directions away from using Gmail content to target ads and allowing any developer to request access to your email, though there are plenty of sins to atone for. Now when the company is hit with scandals, it’s typically over its frightening efficiency as with its cancelled Project Maven AI military tech, not its creepiness.

Google made more progress on privacy in low-key updates in the runup to I/O than Facebook did on stage. In the past month it launched the ability to use your Android device as a physical security key, and a new auto-delete feature rolling out in the coming weeks that erases your web and app activity after 3 or 18 months. Then in its keynote today, it published “privacy commitments” for Made By Google products like Nest detailing exactly how they use your data and your control over that. For example, the new Nest Home Max does all its Face Match processing on device so facial recognition data isn’t sent to Google. Failing to note there’s a microphone in its Nest security alarm did cause an uproar in February, but the company has already course-corrected.

That concept of on-device processing is a hallmark of the new Android Q operating system. Opening in beta to developers today, it comes with almost 50 new security and privacy features like TLS 1.3 support and MAC address randomization. Google Assistant will now be better protected, Pichai told a cheering crowd. “Further advances in deep learning have allowed us to combine and shrink the 100 gigabyte models down to half a gigabyte — small enough to bring it onto mobile devices.” This makes Assistant not only more private, but fast enough that it’s quicker to navigate your phone by voice than touch. Here, privacy and utility intertwine.

The result is that Google can listen to video chats and caption them for you in real-time, transcribe in-person conversations, or relay aloud your typed responses to a phone call without transmitting audio data to the cloud. That could be a huge help if you’re hearing or vision impaired, or just have your hands full. A lot of the new Assistant features coming to Google Pixel phones this year will even work in Airplane mode. Pichai says that “Gboard is already using federated learning to improve next word prediction, as well as emoji prediction across tens of millions of devices” by using on-phone processing so only improvements to Google’s AI are sent to the company, not what you typed.

Google’s senior director of Android Stephanie Cuthbertson hammered the idea home, noting that “On device machine learning powers everything from these incredible breakthroughs like Live Captions to helpful everyday features like Smart Reply. And it does this with no user input ever leaving the phone, all of which protects user privacy.” Apple pioneered much of the on-device processing, and many Google features still rely on cloud computing, but it’s swiftly progressing.

When Google does make privacy announcements about things that aren’t about to ship, they’re significant and will be worth the wait. Chrome will implement anti-fingerprinting tech and change cookies to be more private so only the site that created them can use them. And Incognito Mode will soon come to the Google Maps and Search apps.

Pichai didn’t have to rely on grand proclamations, cringey jokes, or imaginary product changes to get his message across. Privacy isn’t just a means to an end for Google. It’s not a PR strategy. And it’s not some theoretical part of tomorrow like it is for Zuckerberg and Facebook. It’s now a natural part of building user-first technology…after 20 years of more cavalier attitudes towards data. That new approach is why the company dedicated to organizing the world’s information has been getting so little backlash lately.

With privacy, it’s all about show, don’t tell.

How tech entrepreneurs think of Universal Basic Income

As tech has grown, policy debates have become an important pastime. Today’s tech industry aspires to replace human drivers with self-driving cars, secretaries with AI assistants, permanent jobs with gigs — and as a result, the human impact of tech has become an everyday conversation.

No other idea is as emblematic of this as Universal Basic Income, a policy that would distribute a monthly sum to every adult regardless of their income or employment status.

The conversation is widespread. Mark Zuckerberg and Elon Musk have said that UBI may be desirable or necessary. Y Combinator Research and Facebook co-founder Chris Hughes are running basic income studies. Tech-friendly presidential hopefuls Bernie Sanders and Andrew Yang support the idea.

But should the average tech entrepreneur or investor support UBI? The answer is not entirely clear.

The good news is that the tech industry is deeply familiar with risk, which is an important component of arguments for UBI. The bad news: risk isn’t the whole story, and both positive and negative evidence for the policy are currently thin.

The role of risk

Entrepreneurs understand the risk component of UBI because it’s the same risk they take in starting companies. Many entrepreneurs start with savings or seed funding that reduce their downside risk — and it’s not hard for them to imagine that others lack these resources. A UBI could solve the issue.
