Going Public on Privacy: Q&A With Dipayan Ghosh

Dipayan Ghosh would like to be an “algorithmic ethicist.”

If that sounds daunting and a bit indecipherable, no worries. He’s got a PhD from Cornell in electrical and computer engineering and is working with Harvard and New America on web civil rights issues.

Then there were the years spent working on tech policy in the Obama White House, trying to put legislative legs under the Consumer Privacy Bill of Rights, which, like many other efforts over the past few years, never became law. He also worked on the White House's eventual, and successful, support for Title II reclassification.

That was followed by a stint as privacy and policy advisor at Facebook. Now, at 27, he's with New America as part of a new public interest technology team of fellows.

Ghosh talked with Multichannel News Washington Bureau chief John Eggerton about network neutrality, the search for algorithmic diversity and how game theory could drive a marketplace model for privacy.


MCN: Tell us about yourself.
Dipayan Ghosh: I’m a computer scientist by training and did my PhD in electrical engineering and computer science. In graduate school, I grew more interested in information theory, which is the study of how you send information from point A to point B as securely and efficiently as possible.

I became particularly interested in the subfield of security and privacy. After the [Edward] Snowden disclosures, the Obama Administration was under a lot of pressure to respond to the public outcry. At the time there was a lot of scrutiny of what surveillance practices the government, more broadly, might employ, and of what protections there were, and should be in the future, on individual privacy.

MCN: So you decided to join that effort?
DG: Yes, it was a really interesting time to be there. John Podesta came in to lead the approach to these really difficult issues. The President commissioned a report, led by Podesta, that would be a comprehensive review of Big Data and privacy and their implications for individual rights across society.

During that time Podesta held meetings with every industry under the sun. We engaged with banks and Internet companies and a civil rights round table, which led to the report.

MCN: And remind us what the key recommendations were.
DG: Some of the big ones were around internet privacy standards, what became the Consumer Privacy Bill of Rights. There were suggestions on how to approach the international conversation around privacy and the safe harbor arrangements with Europe. It dealt with student privacy in the context of educational technology powered by the collection and analysis of student data, and how you can create ethical standards around that.

Another was around “algorithmic discrimination”: whether Big Data, as used by many sectors of private industry, is potentially creating discriminatory impacts across society, and how we can protect against that.

The civil rights round table brought that issue to the fore, and it really resonated with us as an issue that was important to get right.

MCN: And what has happened to the report since then?
DG: Out of that report came the Consumer Privacy Bill of Rights Act, a legislative proposal that came out of the White House in February 2015, as well as the Student Digital Privacy Act. That garnered bicameral, bipartisan support on the Hill, even though it hasn’t moved forward.

The conversation with Europe has progressed: we renegotiated safe harbor and made a lot of progress, though Europe is asking new questions given the new political situation in the U.S.

So, yes, I think there were a lot of follow-ups.

MCN: What else did you work on?
DG: We worked on net neutrality, which ultimately resulted in a recommendation by the President to reclassify [broadband providers as] common carriers under Title II. That recommendation came out on YouTube, of all places.

My appointment was across the National Economic Council and the Office of Science and Technology Policy. Many different groups across the White House were involved in net neutrality, and I think that led to some really progressive policies.

MCN: What made you move to Facebook?
DG: The safe harbor negotiations with Europe. I was particularly interested in moving to a company at the forefront of privacy. I worked on the privacy team within the policy organization at Facebook, partly on how we develop products in a way that is sensitive to users’ needs.

MCN: What are those needs and does this generation even want privacy?
DG: Obviously the conceptualization of privacy has evolved over time, from decades ago, when it meant being behind a wall, having your own physical space where you could reasonably expect that nobody was spying on you, to today, where there are a lot of different platforms and universes in which we need to be aware of how we are perceived by others, if we care about that perception.

And Facebook has shown great leadership in that space. Its privacy settings are top of the line. When I was studying this issue through an academic lens, I found it helps to think about it from a game theory perspective: pit the end user against the corporation, say a utility.

Both want to maximize their bottom line, and neither wants to compromise it. For the consumer it’s their utility; for the corporation it’s profits over time.

You have to consider many different things. It’s not just the up-front revenue you might have as a company, or the up-front savings you might have as a consumer. As a consumer, you might value the privacy and security of your information. As a company, you might value that information as well, so that you can route power more effectively, say, or monetize it in different ways, as in the advertising ecosystem.

At the end of the day, both want to maximize their value.

And I think you have to look at it from that perspective and say: by default, I want my Nash equilibrium [game theory speak for an outcome in which neither player has an incentive to unilaterally change its strategy] to be a solution where both the consumer and the company choose privacy rather than no privacy.

From there, you can work out the set of conditions you need to achieve that equilibrium, that market-stable situation in which both entities opt for privacy-aware design or privacy-preserving technology, for data security protections on a network, or for encryption.
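To make that concrete, here is a minimal sketch of the two-player game described above. The strategy labels, the payoff numbers, and the way regulation and education are encoded in them are illustrative assumptions, not figures from the interview:

```python
from itertools import product

STRATEGIES = ("privacy", "no_privacy")

def nash_equilibria(payoffs):
    """Return the strategy profiles where neither player can gain by
    unilaterally switching: the Nash equilibria of the 2x2 game."""
    equilibria = []
    for consumer, company in product(STRATEGIES, repeat=2):
        u_consumer, u_company = payoffs[(consumer, company)]
        consumer_best = all(payoffs[(alt, company)][0] <= u_consumer
                            for alt in STRATEGIES)
        company_best = all(payoffs[(consumer, alt)][1] <= u_company
                           for alt in STRATEGIES)
        if consumer_best and company_best:
            equilibria.append((consumer, company))
    return equilibria

# payoffs[(consumer_choice, company_choice)] = (consumer utility, company profit)
# Hypothetical numbers: without regulation or informed users, ignoring
# privacy is cheap for the company and invisible to the consumer.
baseline = {
    ("privacy", "privacy"): (2, 2),    ("privacy", "no_privacy"): (0, 3),
    ("no_privacy", "privacy"): (1, 2), ("no_privacy", "no_privacy"): (1, 3),
}

# Sensible regulation (fines for mishandling data) plus user education
# (consumers who notice and value privacy) shifts the payoffs so that
# mutual privacy becomes the stable outcome.
with_regulation = {
    ("privacy", "privacy"): (3, 3),    ("privacy", "no_privacy"): (2, 1),
    ("no_privacy", "privacy"): (1, 2), ("no_privacy", "no_privacy"): (1, 1),
}

print(nash_equilibria(baseline))         # [('no_privacy', 'no_privacy')]
print(nash_equilibria(with_regulation))  # [('privacy', 'privacy')]
```

The sketch encodes the point Ghosh makes next: regulation and user education don’t change the structure of the game, they change the payoffs, and with them the equilibrium the market settles into.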

I think that when you look at it from an economic perspective, reasonable and sensible regulation is a good thing, and user education is a good thing. People need to be aware of where their data is going and what the privacy framework in the industry is.

MCN: Could privacy become a value-added differentiator in the marketplace for ISPs?
DG: I think a company needs to ask all those questions and figure out the right approach for it. Now, some might insinuate that ISPs will take actions that are not good for privacy, especially in the absence of regulation.

But that is where advocates and other people in the public can come forward and really make known the concerns they have around these issues. I think that history has shown that can have a great impact on the courses of action that corporations take.

MCN: What are you working on now?
DG: I started this fellowship at New America, and I want to look at the intersection of privacy and civil rights: how privacy and anti-discrimination policy can be incorporated into algorithms.

MCN: Which means?
DG: There was an issue in the White House report called Big Data discrimination, or algorithmic discrimination, or algorithmic fairness, or bias; there are different ways people reference it. Essentially it is the idea that algorithms [sets of computer commands to accomplish a task] can be designed in such a way that leaves civil rights in the lurch. In other words, they could have a discriminatory impact on society if they were not designed responsibly.

MCN: Sort of like algorithmic red lining?
DG: Yes. One example referenced in the White House report involves an application called Street Bump, which basically crowdsources pothole information.

Boston partnered with them to learn where potholes were in real time, then go out and repair them. It turned out that the potholes getting repaired were in richer, younger neighborhoods, because the reports came disproportionately from smartphone owners. Obviously you don’t fault the people who designed the algorithm from the get-go; it’s very hard to perceive that this issue might surface until you observe it.

But then they went in and compensated for it, which is the most incredible thing. But if they hadn’t, that would have potentially created a mechanism for discriminatory impacts.

MCN: So, it requires factoring in human nature with algorithmic theory?
DG: Absolutely. I think that in general, we engineers design technology that will create the most value as efficiently as possible for the people we are designing it for.

Sometimes we might not be trained on the legal basis for certain civil rights laws or other legal or regulatory systems.

So, I think it is important to educate engineers, and people thinking about how to design algorithms, on how they can think about these issues; you could call it algorithmic ethics. Then have them monitor the inputs and outputs of their algorithms, and the effects and outcomes, so that if there is some evidence of distorted or disparate impact, it can be corrected.
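In practice, the monitoring Ghosh describes can start as simply as comparing favorable-outcome rates across groups. Here is a minimal sketch; the pothole audit log is invented for illustration, and the four-fifths threshold is a rule of thumb borrowed from U.S. employment law, not a standard endorsed in the interview:

```python
from collections import defaultdict

def selection_rates(decisions):
    """decisions: iterable of (group, favorable) pairs, favorable a bool.
    Returns {group: share of favorable outcomes}."""
    favorable = defaultdict(int)
    total = defaultdict(int)
    for group, ok in decisions:
        total[group] += 1
        favorable[group] += ok
    return {g: favorable[g] / total[g] for g in total}

def disparate_impact_ratio(decisions):
    """Lowest group rate divided by highest group rate; 1.0 means parity."""
    rates = selection_rates(decisions)
    return min(rates.values()) / max(rates.values())

# Hypothetical audit log for a pothole-repair pipeline:
# (neighborhood group, was the reported pothole repaired?)
log = ([("richer", True)] * 80 + [("richer", False)] * 20 +
       [("poorer", True)] * 40 + [("poorer", False)] * 60)

print(selection_rates(log))    # {'richer': 0.8, 'poorer': 0.4}
ratio = disparate_impact_ratio(log)
print(f"ratio = {ratio:.2f}")  # 0.50
if ratio < 0.8:                # four-fifths rule of thumb
    print("Flag for review: possible disparate impact")
```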

MCN: So, you are trying to be an algorithmic ethicist?
DG: That’s a really nice title. I think it is a really difficult area to get your head around because there are so many different companies that are designing algorithms and it is really hard to think about how to design ethical standards that apply across the board.

Perhaps the best way to think about it is, sector by sector, industry by industry, looking at what types of standards could be established for the use and implementation of algorithms so that you can factor out, as best you can, these discriminatory impacts as they manifest themselves.

MCN: Facebook has been part of the conversation about the proliferation of ‘fake news.’ Any thoughts?
DG: I think the company is moving in the right direction, starting to have a real conversation with a wide range of players who want to provide guidance on the direction the platform should take. The start of that conversation has been really positive.

It’s a good step forward.

John Eggerton

Contributing editor John Eggerton has been an editor and/or writer on media regulation, legislation and policy for over four decades, including covering the FCC, FTC, Congress, the major media trade associations, and the federal courts. In addition to Multichannel News and Broadcasting + Cable, his work has appeared in Radio World, TV Technology, TV Fax, This Week in Consumer Electronics, Variety and the Encyclopedia Britannica.