Fast Forward: 'Note to Self' Host Manoush Zomorodi


My guest today is Manoush Zomorodi, the Host of Note to Self that's produced by WNYC Studios. She just launched a new project called The Privacy Paradox, and we're going to talk about privacy, digital media, and the government. We are also going to make sure you know all of the things your phone knows about you.

Dan Costa: Before we get into The Privacy Paradox itself, let's talk about what you do, because you are a radio personality. You've got a podcast, so it goes out as a podcast as well. You can listen to it online. But then you do these experiments that are not what people might expect from a radio show.

Manoush Zomorodi: No, they're a little weird.

Explain this process of these experiments, these engagements.

Okay, so this is the third year that we're doing it. The last two Januaries, what we've done is we've taken a concept, and we've asked people, "Join us for one week, try a behavior change, and see what happens." The first one was called Bored and Brilliant. We had 20,000 people sign up and sort of rethink how they use their phone. We asked them to put their phone down a little bit to try and get bored, because I went down this rabbit hole where I was like, "Wait, what happens when we get bored?" I don't think I ever get bored anymore because I have my phone. We asked people to purposefully make time to get bored for a week to see if it would jump-start their creativity, and it worked. Dan, my favorite comments were from the teenagers who were like, "You know, I don't think I've ever had this sensation before." Because if you think about it, they've had phones their whole lives. They don't know what being bored actually is ...

They've never stood in a line and not been able to talk to their friends.

Right, without anything to do, had to space out. They've never had to let their minds wander. We did that for a week. It was great. People were really into it. There's a book coming out in September. That was Bored and Brilliant. And then last year we did something called Infomagical. Again, it was a week where every day there was a different challenge. What we were trying to do was help people deal with their information overload. This idea that I certainly had every night of just mindlessly scrolling, scrolling, scrolling. I was wondering, how much of this information do I retain in my brain?

We talked, again, to neuroscientists, cognitive psychologists, technologists about what the brain can do, how we can use our technology to help us, to enhance our lives rather than like make us feel like drooling idiots at the end of the day. That was great. That was 40,000 people, again, for a week. We made all these behavior changes, and 70% of them said that they felt they were able to handle information overload better than before. Of course, you know, this was last January. It feels like a million years ago because we weren't even talking about echo chambers and filter bubbles and ...

Pre-election.

.. alternative facts and all of those things. I feel like we need to reprise that one.

Yeah, you do.

And then here we are again. It's January, and, I mean, it feels like so long ago when I started thinking about this one, The Privacy Paradox, which is more about our digital civil liberties. It ended up being incredibly timely, but not in the way I had expected it to be. I sort of thought this was like a post-Ed Snowden thing.

I also really wanted to talk about this idea of the ad-based economy, the surveillance economy, as some of the experts I spoke to described it to me. It's been a weird couple of weeks, and so I feel like people feel a little like, "Ah, what do I ... I can't walk away from breaking news." But actually, I think what we're finding is that people want to sign up for this week of challenges, which is next week, because it's simple, easy, constructive things that they can try every day, and you can do something.

You've got the expert sources, traditional reporting, and then you've got all these people doing these experiments and giving you feedback.

Giving us feedback, that's the key.

That feedback then gets wrapped into your story, too.

Yes, exactly.

You've got real people that are living it in real time. That's going to happen next week. If somebody misses that first day or two, they'll still be able to catch up.

Exactly. What we've found is that you can do Infomagical, for example ... Like, today you could sign up for it. It triggers. I mean, ideally, you do it with ... You know, you wake up in the morning and you think to yourself, "Wow, tens of thousands of other people are waking up right now, and we're all going to do this weird thing today," and there's a sort of collective community element to it in addition to the on-demand element.

I'm trying to get the best of both worlds. What was a real surprise to me with these projects is that at the end, we have had so many clinical researchers come to us and say, "Uh, can you share your data with us? Because we think it might help us figure out what we need to be studying as we go forward." I'm like, "Yeah, our data's semi-scientific. It is not ... You know, we don't have a control group. We're not doing it ... certainly not in a lab like this one." They are really, I think, looking for a boots-on-the-ground view of how technology's changing us as human beings, and where they need to take the actual published research going forward.

The Privacy Paradox is a phrase that you sort of intuitively know what it means. But I don't think I've ever heard that phrase before. It perfectly encapsulates what you're talking about.

Well, I'd heard the phrase, and then I dug into it. This is a phrase, the privacy paradox, that behavioral economists use, specifically at Carnegie Mellon. They like to throw this one around.

Are they claiming ownership?

They do. They're allowed to. The idea is that, like, actually, no matter what you hear about millennials sharing the most intimate things on Instagram, it's not true. Americans value their privacy immensely. 74% of Americans, according to Pew Research, say that it is extremely important to them that they control who has ownership of their personal information and data. Then what happens, though, every day, you know how it is, it's ...

They give it away.

They give it away. Exactly. The question is, if we care so much about it, why are we giving it away? You know, it's really not the consumer's fault. It is ... You cannot be a person in the world in 2017 and not be on these platforms. For example, a listener emailed me and was like, "Well, I'm going back to work. I was a stay-at-home mom. I need to be relevant. I need to be searchable. I used to be really protective of my privacy, and I can't be, because I need to get a job. So how do I balance these two things, which is providing for my family, but also, like, a fundamental belief that I shouldn't have to reveal too much?"

Let's touch on this idea that the surveillance economy powers and pays for so much of our digital lives; it is all driven by personal information. Data powers it. Not just any data; it's your private information. It drives Google. It drives Facebook. It drives every media site that's out there, including PCMag.com.

That is our advertising model. It's why we can create free content. It's why Facebook can create a free service. It's why we have this great free search engine that indexes the entire world. Why do you want to mess with that?

Well, you know, I will say, what I found interesting was something I didn't quite understand before. Shoshana Zuboff ... I'm like name-dropping all kinds of people for you today, Dan. She is a retired Harvard researcher. She's like at the nexus of this and coined that term "surveillance economy." It's one thing if you're like, "Okay, here's my Social Security number." It's obvious that's personal information. But she talks about the digital exhaust that each of us has, or digital breadcrumbs. It's not the data, per se; it's the metadata. It's not the phone call I made, but what time I made the phone call, where I was when I made the phone call. It's our behavior that is being used, and we don't have control over that.

I'm perfectly fine with giving ... You know, we make that trade off every day, right? The privacy calculus, which is like, "Fine, I will give you all my personal information as long as you give me a great product, a free product, a very personalized product." All of us are going to ... You know, I might be more vigilant about that than somebody else, but at least I should know what you're using my information for, and at least I should have the option of deciding to take it back, which I can't. Once it's out there, that's it. Facebook has it forever.

As you surveyed the landscape, did you find that there are enough privacy controls? Like, there are privacy controls on Facebook. Most people don't know how to use them. Is it that there are controls that out there and people aren't using them, or is it that ultimately you don't have a lot of control in whether Facebook or Amazon or Apple or Microsoft have all this information about you?

Well, I mean, there's one study that says it would take us 22 days a year to read all the terms of service for every website that we go on. Don't bother reading them. There's no point. The legalese is perfectly written in ...

For lawyers.

.... incomprehensible ways. Exactly, to make no sense whatsoever. Also, I think we don't ... How can we agree to things that we don't know are happening? For example, our friends at ProPublica did a great piece of investigative research that found that Facebook has over 52,000 categories that they put their users in. One category was breastfeeding in public. Another was pretending to text awkwardly. Another one was grass, and I'm not sure if that was this kind, or the walk-with-your-shoes-off kind. I'm really not sure. That's in addition to, of course, your income, what gender you say you are ... you know, all of those things that you tell them. It also watches wherever else you go, and then puts you in all these different categories, which is fine. You're like, "Oh, it's just advertising," but there are some weird categories also, like ethnic affinity. They could say, "Dan Costa has an Asian-American ethnic affinity," which is kind of weird. They would show you things that they think you like ...

Don't judge me.

Absolutely. This is a judge-free space.

Let's get a question in from the audience.

Audience: How concerned should we be about always-listening voice assistants, like Alexa?

I'm freaked out by Alexa.

Do you have an Alexa?

I absolutely do not have an Alexa.

Really?

No way.

She's amazing. She makes your life better.

This is good. This is debate. Tell me about how great Alexa is.

Alexa is amazing.

First of all, I don't want some woman that I'm bossing around in my house. Like, not setting a good ...

Again, I feel a little judgment. I mean, it's hard to describe what she does when you bring her home, but just that voice-based interface, to be able to ask for things and have her deliver them. You ask the same questions you would ask if you were typing on your phone or at your keyboard, but being able to do it hands-free is very liberating.

Here's my problem, okay. Just last month, law enforcement asked Amazon to hand over the recordings that Alexa had made in the home of an alleged murderer. Now, no judgment on the alleged murderer, of course, but we're at a point where there are no legal protections for any of this stuff.

It's a fascinating case.

It is a fascinating case.

It wasn't necessarily that Alexa would have recordings of everything that happened in the house.

But if he had been asking for things like ... paper towels, maybe?

Or where can I buy black plastic bags?

And gloves.

And latex gloves.

Oh, we shouldn't laugh. This is terrible.

Those types of searches would be in the hits. If he did it on a desktop, those searches would also be findable and observable.

I mean, we're at this point, though, Dan. I think, to me, the point is like there are no laws. There's nothing. There's no regulation. Nobody has any say. We also don't even like ... Another one that I was reading about was, have you heard about ... We're doing ... Should I give it away?

I think you should.

Okay, it's a tool we're going to ask people to try next week. It's called Apply Magic Sauce. Do you know this one?

I do not.

Oh my god, it's unbelievable. It will look at your Facebook profile, and it will predict your personality. There's a great piece in Motherboard today, written by these German reporters, asking whether technology similar to Apply Magic Sauce was maybe used by Cambridge Analytica to see what people's personalities were on Facebook.

It looks at your Facebook profile ... I mean, you figure it could do a bunch of things like, you know, are you using curse words? Are you writing about certain products? Are you positive or negative?

What's your punctuation? How do you construct sentences? What words do you use? It called me 94% more hardworking and organized than the rest of the population. Not 93%; 94. Based on something I had written.

Did you find that accurate? Do you think that's an accurate assessment?

Well, that's the question, right? I don't know if it's accurate. Or is that my, you know ... to get like deep and so like psychoanalysis ... or is that my externalization of my personality? Is that okay that they're not reading what I write, but they're reading between the lines and then showing me things based on that? They don't tell me that in the privacy settings. We don't know if that's being used.

I think that's a really interesting point because there's obviously a lot of anxiety. People are very ... people are weirded out by this technology.

Yes. Creepy.

Because they're just trying to wrap their heads around it. I mean, if you take it and you go to the flip side, and you go, "Well, what does it matter?" What is at stake here? What are you afraid of happening? What are you afraid of being put into one of these buckets? What's the worse thing that happens?

Right. Well, I put that exact question to ... There's this great guy, Joseph Turow; he's at the University of Pennsylvania. He has studied marketing for decades, and his whole thing is, you are rehearsing right now to not have any privacy in your life at all. The Fourth Amendment, baby. The word privacy is not used, but it is what makes us Americans, this right to have self-determination, autonomy, free will. I mean, I feel like we're at a point where we have to go back to basics. What did we see at the airport over the weekend? We saw customs officials asking people to hand over their phones to show them their Twitter accounts before they entered this country. I think ...

Customs has been using social media profiles for a while.

Have they?

Domestically and internationally.

Oh, I didn't know that.

When that customs guy's sitting behind the desk, and he calls up your information, if you have a public account, it has that information up there.

Who gets to decide whether what I've written is "unacceptable," quote-unquote?

I think the thing that your show is doing, and what we've been talking a lot about, is that individuals need to be aware: what they put out there is not going to stay in one place. If it's public and it's online, it's going to be searchable, and it's going to be analyzable. Not just one at a time, but en masse.

Professor Turow said to me, he's like, "I mean, at the very least, we need to go from creepy to crappy," because if it's crappy, at least we understand it, and maybe we can try to do something about it. Creepy, we just don't even understand how it works, and that doesn't seem ... Like, we're using these things all day long. We should have a little bit more knowledge as to how they work and where our personal information is going.

You get into that in The Privacy Paradox. You reveal how a lot of these things work.

Yes, exactly.

Let's get another question in from the audience.

Audience: Will we just learn to accept future millennial politicians having unseemly things coming back from their distant digital pasts?

I mean, if Access Hollywood can find it, I mean, does it matter? As we saw, you can dig up the files, and it doesn't seem to matter.

Trump was a public figure, and he was getting ready for a media hit. He knew there were cameras and microphones on the bus. The larger question is that there's a whole generation that's had cameras on them since they were ten years old. Digital photos don't go anywhere. They're in these vast Google Photos libraries, uploaded to the cloud, and you can see people's college photos. Like, there are no pictures of me from college.

No, me neither. Thank God.

They would have to be photos from film.

Somebody's going to find some right now, by the way.

But now, anybody in the last ten years, there are pictures of them from college at every moment. That's going to stay with them forever.

Maybe we are just going to get used to that. I mean, that's what's so interesting to me: things that we have taken for granted for millennia ... like boredom, for example ... now, these are things that we have to name and introduce to a generation that will not have experienced them. Same with the European right to be forgotten ... like forgetting. Forgetting used to be a human thing, but now we have to have a European Union ruling, and tech companies have to comply with it ... It just used to be part of the human experience. So it's interesting to me, we're learning which of those things are like, "Okay, it's fine it's gone," and which ones are like, "Oh, crap." That really messes with society when there's no forgetting anything. I'm not ... I don't ...

This is a Black Mirror episode.

Yeah, well, we're in it.

Let's say somebody doesn't want to accept these terms. You want to reject the terms and conditions of Facebook, of Apple, of carrying a smartphone everywhere, what recourse do consumers have? Is it either in or out?

I don't think it is at all. I think that's unreasonable to ask. Do you know what I mean? I use Google Docs. You have to function. We want to ask people: decide for yourself, you know, so you don't feel icky all day ... You know, that feeling when you're like, "Ah, I agree, I agree." Don't feel icky. Actually, I'm going to reveal this. At the end of the project, Sir Tim Berners-Lee, the inventor of the World Wide Web, actually sits down with us and writes personal terms of service. Just sort of codifying what you're okay with-

For himself. This is what he's okay with; this is what he's not okay with.

We have made a fill-in-the-blank, like Mad Libs, for everyone to do. It's not like, "Oh, wait, is this one okay and this one's not?" No. You'll have it with you. Maybe, I don't know, print it out. Use it as your screensaver. Like, "Privacy is blank to ... " No judgment.

For some people, I think it'll be like, "Oh, I am out." There will be people, and my listeners have told me that. Other people are like, "You know what? I'm okay with this company, maybe Apple, because they do take a stand on privacy, but I'm not okay with maybe ... " I don't want to name names. You know, other folks.

What are your personal privacy limits? You carry a cell phone, but you have no interest in Alexa?

When you sign up for The Privacy Paradox, we've created a quiz. There's this dude, Alan Westin; he is the man, the sociologist, when it comes to people's feelings about privacy. We have taken his very scientific work and turned it into a fun quiz.

There you go.

Because that's what we do. I'm dying to know what you are. You take the quiz, and it tells you whether you are a believer, a realist, or a shrugger.

I'm a realist.

You're a realist.

For sure.

I'm definitely a realist, too.

I mean, every time I download an app, and I look at the permissions, I look at them, and I decide, "Is there any reason this app needs my GPS? Does it need access to my microphone? Does it have access to the hard drive?" And I will make a decision. There are plenty of apps I just ... If it's just a junky app that doesn't add any value to my life ...

See ya.

... I just pass by it.

Yep. Yep. I mean, keeping in mind, you're the executive editor of PC Mag.

Totally.

Okay. Normal, everyday people should be able to do that. We had an event last night, and we asked people to do that. Like, look at your apps. If there's an app that's asking you for access to your microphone, but they don't have anything to do with voice, why should they have it? Turn it off. And then people are like, "Oh." You know?

I'll tell ya, for app developers, the default is to get access to everything.

Of course.

Because they may want a feature later on, and they want to grow the app, but by default, they assume most people will just click through. And they do.

Right, right. That makes sense to me. They want as much information as possible. Sir Tim Berners-Lee ... Have you heard about this? He's working-

I've heard of him, yes.

No, no, what he's working on.

No.

This personal data storage ... they're called pods. It's like ... flipping what the web is. The idea would be, instead of you logging into Facebook, Facebook would log into you. You would grant them whatever information you felt okay with, and you would also be able to take back whatever you wanted.

That way you have your terms of service, and Facebook needs to comply with your terms of service instead of you having to comply with theirs. If they don't want to comply with your terms, then they can't access your personal information.

Precisely. It's a project at MIT. It's called Solid. He is looking for developers to help him. At first, I was like, "Oh my god, this is crazy town," but then you're like, "Well, this guy invented the web ... "

He did it once….

... so if anyone could do it again …

We got another question from the audience.

Audience: Should I trust Windows 10?

Manoush Zomorodi: What do you think, Dan?

Dan Costa: I think you can trust Windows 10 as much as you trust any operating system. Microsoft itself, I don't think there's anything inherently untrustworthy about it. When it comes to privacy concerns, I think there's so much other stuff going on. I think ... And there's Google's whole platform, but the thing that worries me is all the advertisers. It's third parties and companies that are tracking and then reselling your data, and that's where there's not a lot of regulation. That's not a Microsoft thing. It's not a Google thing. It's this third-party market that operates completely under everybody's radar.

Which, I mean, goes back decades, right, in this country, to junk mail, to people selling and swapping lists. I didn't know this, but Julia Angwin, the investigative reporter at ProPublica, was telling me that Facebook is actually the biggest purchaser of data from the six big data firms: they spend the most money, not only taking the information you put into Facebook as a user but then also purchasing information about you from the big data collection companies.

And the matching that goes on in that background where even if you give Facebook ... you know, you lock everything down on Facebook, you're on the service, but all they really know is your email address, they will then take that email address and match it to a cookie, match it to your credit report, because all the credit report agencies, they distribute all this information as well. There's this huge network of databases back there. You don't even have to give your information to Facebook. They can get it elsewhere.

I mean, I don't do Facebook like a normal person. Can I just say, Dan and I have a history here that we are dragging out in front of the camera.

Dan Costa: This is an old argument.

I kind of love it, because when Facebook ... What was that, like ten years ago?

Yeah, when they let old folks on.

I was like ... Yeah, when they let old people on, you were like, "You gotta do it." I was like, "No way. I like-

I said you had to do it for your career.

You did. I was like, "I don't like the idea of all my identities being smushed into one, like, nothingness." You were like, "Welcome to the new online world." I was like, "No, I'm not doing it." I mean, I'm sure I've suffered the consequences.

I was on your Facebook page today. There's content up there.

It's pretty crappy. There's something, but I'm not ... like, I don't have any friends. I'm okay with it.

You have real life friends.

I have real life friends like you, Dan.

We've talked about my big concerns around the sort of private-sector spying, snooping, and reselling of your personal information. There's a flip side, too, which is that all this information can also be used by the government. That is something I think we need to be more and more concerned about.

It is fascinating to me. Another person I spoke to, Laura Donohue, she's at Georgetown University, she's a Fourth Amendment legal expert. I was like, "When did they think that it was okay to take all of our information?" She said, "Oh, they look to the companies." They were like, "Well, if all the companies can have all this information, why can't we?"

If Experian has this information, why should we not have this information?

Yeah, totally. The UK's new Investigatory Powers Act means they are holding on to everyone's search terms for a year. Just sitting there, no matter what you've searched for. Everybody's.

It's only for a year.

That year requirement had to be legislated. Otherwise, it would be there forever.

Just keep it there, just in case.

You search for something when you're 21, that search record exists when you're 41 or 61.

I think the other paradox, to me, is that there are bad people who can be stopped with these techniques. There was this horrible pedophile ring that the FBI went online and infiltrated. Of course, that is the truth, but they didn't get a warrant. There are no legal protections for the other people. For every one great thing that happens, what is happening to the rest of our civil liberties? And at what point do we draw the line? I mean, I think this is like the bigger question. I think, you know, with the Supreme Court back in the news today, this is going to be the next big question. We know that they can't search our phones, but what else, you know? They're going to have to decide.

They can make you unlock your phone with a fingerprint.

With a fingerprint, but not make you hand over your PIN which is so weird.

If you have the passcode, then you're safe.

Why is that?

Your fingerprint is something that they can legally get from you when they take you into custody. Getting your passcode from you, though, becomes incriminating yourself, and therefore, that's why you should always have a passcode, even if you use your fingerprint to unlock your phone.

Even if you use that, the third-party doctrine applies. We tell the story. It's a very sexy story, actually. The third-party doctrine: a 1979 Supreme Court ruling said that the minute you make a phone call, you are handing over your personal information to the phone company, but not the contents of what you say on the phone. Here we are, handing over ... Yes, we have accounts with Gmail, but what about all the emails that we're writing? Is that similar ... Is that content?

Did you find any takeaways from the Project that were like, "Wow, I didn't know I could do this with that tool," or, "I didn't know I could protect myself that way"? What takeaways did you have that you were like, "Oh, I'm going to do this differently going forward"?

Well, I didn't know about digital fingerprinting. Does that seem lame that I didn't know that?


I don't think most people do.

We're asking people to try a tool from the EFF, the Electronic Frontier Foundation, called Panopticlick ... this is on day two ... I was like, "I have got all the blockers on. I'm going to get a clean bill of health." You literally just click a button that says, "Test me," and it tells you what is tracking you and where. I had all the ad blockers on. I wasn't being followed by cookies. But the digital fingerprinting was going on. For those of you who don't know digital fingerprinting, it's literally: what version of the browser are you on? What font do you use? What kind of computer are you on? How often do you get online at a certain time? It combines all these teeny tiny data points to figure out who you are, even if you've opted out or you're incognito or whatever else. I didn't know that was possible. Now I have Privacy Badger. Do you use Privacy Badger?
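[Editor's note: The mechanics Zomorodi describes can be made concrete with a toy sketch. This is not Panopticlick's actual algorithm, and the attribute names below are purely illustrative; it just shows how a handful of individually innocuous browser traits can be combined into a stable identifier, with no cookies involved.]

```python
import hashlib

def browser_fingerprint(attributes: dict) -> str:
    """Combine individually innocuous traits into one stable identifier."""
    # Sort the keys so the same set of traits always yields the same string
    canonical = "|".join(f"{key}={attributes[key]}" for key in sorted(attributes))
    # Hash it down to a short identifier that stays the same on every visit
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

# Hypothetical visitor: none of these values alone identifies a person,
# but together they are often close to unique.
visitor = {
    "user_agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12)",
    "screen": "2560x1440x24",
    "timezone": "America/New_York",
    "fonts": "Arial,Helvetica,Georgia",
    "language": "en-US",
}

print(browser_fingerprint(visitor))
```

Change any single trait (a new timezone, an extra font) and the fingerprint changes, which is also why real fingerprinting systems use fuzzier matching than a plain hash.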

I turn everything on and off at one point or another. I just have to systematically purge it all because I can't keep track of all the things that are running. Ghostery is another great one.

Ghostery is another one. I mean, you can't do it all, right? If we give up, at the risk of sounding like Joan of Arc or something, if we give up, it's not okay, especially, I think, with what we've seen be called into question over the last couple of weeks.

At the same time, if everybody starts using Privacy Badger, Ghostery, and Adblock Plus, the free web goes away. We are no longer going to be able to put out content for free because the ad model breaks down.

That's correct.

It's a little shaky right now.

It is. I would pay for these things. I am looking ... Can you help me? I am looking for a great email provider that is private, that will let me take out my emails, and that ... You're going to mock me for it. You know. Don't tell them-

You want your personal email server, because what could go wrong with that?

Right. Exactly. I mean, Gmail's probably the safest place for your email, except oh yeah, Google has everything anyway.

Email's pretty basic. There are a lot of different places you can get it. Also, then you get into the thing where you do most of your email at work anyway. You're trusting WNYC.

The minute you email someone who has an account anywhere else, then you've lost the game anyway. I would pay for that. I would pay for a Facebook that didn't do all those things. I mean, that's not fair to people who can't afford to pay for it.

There have been projects to start it, but you just can't get enough people on it to make it sustainable.

Right. I mean, I will say, The New York Times is showing that people will pay for news.

They'll pay The New York Times for news.

I think we will have a different conversation if we try to get them to pay for PCMag.com.

Do you?

We've talked about it, but I just don't think for what we do ... There are so many other people doing reviews. There are going to be a lot fewer reporters making money on the web because of these trends.

All right, so let's say that one won't work. What about ... Here's another one that I'm really into, another idea because we're going to float a lot of these next week. What about a Hippocratic Oath for developers? Like, ethical ... Doctors have it. Lawyers have it. Journalists even have it. Why not people who make the technology?

It would be for the developers or for ... I mean, because basically, you're talking about ad technology and tracking technology.

Maybe people won't stick to it, but at least there's something.

You were on a panel with Anil Dash last night. Did he bring this up?

He did.

Because he said, this was a big thing that he's talked about, which is that all these technologies that were birthed in Silicon Valley, amazing tools that have changed the world, there's very little consideration as to the ethics involved in doing this. I mean, Google, to its credit, said, "Look, don't be evil," in its mission statement.

They don't say it anymore, though.

That was very broad, though. In the day-to-day life of engineers, I don't think there are a lot of developers or coders thinking about what the impact of a tool is going to be once they let it loose into the world.

We have heard from some of our listeners, because we do have a bunch of techies who are fans of the show, and I've had people write in and say, "I am trying to start this conversation on the ground at my organization, that it's something that just becomes part of the process." Anil was saying how there's no CS program out there that has ethics as part of the discussion.

That's a great point.

I mean, at least let's talk about it, you know? Privacy requires a public conversation. Another paradox.

We're having that conversation right now.

Yes, we are.

We're educating people. We've talked about a few tools they can use to keep their information private. There's also an upside to this data revolution, these huge data sets, and you've covered that on your show as well: we've got really big data, and that big data can be used to solve really big problems.

Yes, absolutely. I got another treat and got to go to MIT and dork out. I went to visit Sandy Pentland's lab. He's one of the people using big data for social change, looking at cities. All kinds of data. Where does the sunlight land? Where are people tweeting from? Which corner has the most investment dollars? Then his idea is to get all the stakeholders of a city around the table and look at what they can know. For example, in Germany, he tested it out with a mayor. They were getting several thousand Syrian refugees. The question was: how can we make sure those refugees aren't ghettoized? How can we integrate them into the city? How can we get all the stakeholders, you know, the Department of Commerce, the housing people, the refugee people, the healthcare workers, and make this work?

That's exciting, that they can look at the data and try to make our cities healthier, make people live together in harmony. That's wonderful. Then, of course, he's like, "But, on the other hand, if you gave the data to a dictator who didn't invite any of the stakeholders, they would be able to use it to completely ruin people's lives and make sure that no health services went to a certain neighborhood, or there was no law enforcement in a certain neighborhood. Or maybe they'd be like, 'Oh, all the poor people live there. Let's send a lot of law enforcement there.'" You know, and techies love to say this: the technology itself is not to blame. It's neutral. It's the people who use it, and we're dependent on those people.

What terrifies you the most? I ask everybody what they are most afraid of when it comes to technology in the future. What is keeping you up at night?

You know, it's this quote from this philosopher at the Oxford Internet Institute. He was called on by Google to advise them. I mean, I love the fact that Google had an in-house philosopher.

He just said to me, "Why is privacy important? Because a life without shadow is a flat life." The thought of us not being able to find space mentally to think through problems, to come to ideas without fear of judgment, or that we ... We opine so quickly now, and a lot of these issues are extremely complicated, and they take time and conversation, and they take privacy and solitude. That worries me. You know, I have kids. I want them to be able to sit and figure something out before they have to tell someone about it, or they get shot down, or that they have to seek validation for it. Joyfulness. I want us to live in a place where we care for each other, and there's decent ... there's kindness in the world. Is that weird and Oprah-esque?

I don't think it's too bad.

I'm over 40, Dan. This is what happens.

On the positive side, what excites you the most, either things you use in your daily life or things you're just super excited about happening? What are you optimistic about?

I mean, I worry that people are like, "Oh, she's anti-technology." Oh my god, no. I love my phone. The artistic ability that we have, like my son is starting to make comic books online. We can do so many cool, cool, wonderful things. I love ... I mean, again, my producer and I, we will FaceTime and work on Google Docs and use every known capability so that we can be home with our kids for dinner. You know? It has made it possible for me to be a working mother and truly have a big career and also tuck my kids into bed at night. That's amazing. That's an amazing thing that it has given me. I wouldn't have been able to do that ten years ago.

I told you it was going to help you out.

You did say. I know.

Tell the audience how they can participate next week, or really at any time, to join the project.

Join us. It's going to be fun. It's going to be intense, I think, too. Privacyparadox.org is the place to go. What will happen is you will be asked for your email. We will not share it with anyone. We even have our own privacy statement, but it is in plain English. Very simple. No cookies on there either, by the way. Then what happens is you will trigger a newsletter, which has ... You can get as much or as little background to the technology and the science as you want, and will also give you tips and things that you can try. Monday, again, it's Bruce Schneier, the cryptographer. He's awesome. We have a little thing that we want everyone to get on. We want everyone to try Signal.

We've covered Signal many times.

It's good stuff.

We've given it a great review.

Yeah. I think it's going to be good. I think that what we want to do is ... And what we've seen happen is, like, change can happen if we all do this together and support each other. We're not saying you have to be like, throw away your phone. What we're saying is, "Just try different things." There are alternatives. There are ways of modifying the technology so that it feels like it aligns with your values and beliefs.

Well, you've sold me. I'm going to participate. Can people find you on Twitter?

Yeah, @manoushz.

