Dr. Google, or: How I Learned to Stop Worrying and Love the Loss of Online Privacy


by Sylwester Ratowt

Computer algorithms that have access to information about us already influence how we think of ourselves. Soon we will turn over to them the creation of our own identities, because it will be convenient. How I think of myself, how I want others to perceive me, what is the real me… I want my devices to tell me that.

The DVDs I had on my shelf at one time defined how I thought of myself. It was a fine movie collection, and I was proud when visitors examined it (“Why yes, I do have Tarkovsky’s Solaris”). My Netflix queue did the same: it reflected my identity. Recently, however, I have been upset by the recommendations Netflix is giving me. “I am not a person who watches crappy Discovery Channel specials!” I protest. But apparently I am. (And, yeah, I don’t think I ever did finish watching that Solaris.)

The reason I get upset by those recommendations is that I think of them as a reflection of who I am, just as displaying DVDs used to showcase what kind of a person I was. How I think of myself is no longer defined by the movies I choose to put on display for others, but by a Netflix algorithm telling me what my taste in programming is.

I am no longer the khakis I buy. I am no longer the films I put on my Netflix queue. I am what Netflix recommends to me.

The idea of self-fashioning by choosing what we do is so crude. We still think of it as selecting which food market we frequent, which clothes we wear, which DVDs we display. We also do it by thoughtfully deciding which like or +1 buttons to push, what to pin or retweet. But that is such a labor- and time-intensive way of creating our image. If only the algorithms could create a portrayal of who I am based on my activities, it would be so much easier. The algorithms already know what I do, because I do it with my devices. They know where I go and whom I meet. We all carry personal tracking devices in our pockets. Wouldn’t such a computer-generated identity, created from who I really am, be more authentic, and therefore more persuasive, than one I cobble together myself? Wouldn’t it facilitate human interactions?

I would like my friends to be able to search for “What does Sylwester think is the best novel of the decade?” and for Google to return just the right obscure book by just the right obscure author. In a world where such searches are possible, who needs skinny jeans and ironic t-shirts?

(But the search might show what is beneath the veneer of pretense. So be it; radical honesty.)

It is all about what we get in return.

I can go to a cocktail party, make small talk, and eventually run into someone interesting, but probably not (I am not very good at the small-talk thing). But if an app knew all about me and the other guests, it could suggest, “Go over there and talk to Alex” (it would show me a picture and everything). That would be useful and convenient. Isn’t that what a good host would do?

“Hi, I’m Sylwester. My app told me I should talk to you, Alex. So… you think we should take Zeno more seriously? Do tell!”
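
A toy sketch of what that matchmaking might look like (in Python; the guest profiles, the names, and the interest-overlap scoring are all invented for illustration, not how any real app works):

    # Hypothetical party matchmaker; profiles and scoring rule are invented.
    def shared_interest_score(mine, theirs):
        """Count the interests two guest profiles have in common."""
        return len(mine & theirs)

    def suggest_guest(my_interests, guests):
        """Return the guest whose profile overlaps most with mine."""
        return max(guests,
                   key=lambda name: shared_interest_score(my_interests, guests[name]))

    me = {"Zeno", "Tarkovsky", "obscure novels"}
    party = {
        "Alex": {"Zeno", "ancient philosophy", "jazz"},
        "Pat": {"reality TV", "skinny jeans"},
    }
    print(suggest_guest(me, party))  # -> Alex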

Good or bad, it would work well. It would save us time, and make the old-fashioned way of doing things seem so inefficient (today some people think that LinkedIn is useful. Ha!). No matter how strange it seems to us right now, we get used to things, especially convenient things, very quickly. We have no problem changing our rituals and norms if convenience is at stake.

I already offload remembering phone numbers and memorizing directions to my phone. This carries a high cost. If my phone dies, I cannot call my mother. If the GPS signal is gone, I am lost in the mountains. People have gotten physically injured by blindly following their devices. But for so many of us, the convenience outweighs the costs.

It is closer than you think.

Our online experience is already somewhat personalized. The results of a search are based on what the search algorithms know about us: our past browsing history, our location, which links we have followed. As our online activities get even more personalized, we will increasingly think of a Google search result, or a Facebook friend suggestion, as a reflection of who we are. We already get offended when an online service suggests that we like a politician from the opposite party. I want to scream at Facebook, “Don’t you read my posts? I am a liberal! How dare you suggest that I like that conservative!” or whatever.

I want the algorithms to figure out who I am and present this identity to me and to others. It sounds scary, but when it happens it will be a convenience we will gladly indulge in.

***

We need to fight for privacy from other people, but not from computer algorithms.

Our intuitive privacy concerns are based on a habit of interacting with humans. It used to be that if there was some information about me out there, it was a person who knew it. We were right to be distrustful of people. But now, if there is information out there about me, no person needs to know it. It is a computer algorithm that has access to the data.

The algorithms crunch data, providing useful information for us. Sure, they can be used to do evil; anything can. But we already share sensitive information with people: with family, friends, doctors, therapists, clergy, strangers. We know that people lie, betray, forget. People do evil, large and small, all the time. I trust an algorithm that places ads on my screen more than I trust you, my dear reader.

As far as human access to information goes, our intuition about the loss of privacy is correct. We must not let humans, with their good intentions, have access to our data, be it a boyfriend, a government agent, an Apple employee, or a coworker.

Right now, however, we commingle humans and algorithms: we think that if an algorithm has access, humans have access (and that very well might be the case). But the barrier we should be fighting to maintain is the one between other humans and the algorithms, not a barrier between the algorithms and our personal data.

{Additionally: we must be able to choose with whom and when we share our information; we must have the option to be anonymous; and we must be able to change our online identities and to maintain multiple identities.

In the meantime: a good resource here.}

 

5 Responses to Dr. Google, or: How I Learned to Stop Worrying and Love the Loss of Online Privacy

  1. ryan says:

    yes, but are we ready for radical honesty? am I ready for everyone at the cocktail party to know I relax at home by watching reality TV shows about loggers? (which are awesome.)

    • admin says:

      What if, in return for sharing this information, you found out that five other people at the party also enjoyed the logger shows? Would that make you more willing to open up? Also, remember that you should always have an option to hide whatever you don’t want shared.

  2. Djinn_en_Thanik says:

    To an extent, Facebook performs this service (just at a much slower pace). I had a DJ friend, nice enough guy, who turned into a vitriol-spewing right-wing fascist on the internet.

    How much better if my phone had PINGED the first time I met the guy and said to me, “No way, Jose. That dude is a total nutjob.” Nutjob being defined by pre-existing parameters already established in my phone (-50 points for liking Rick Santorum. That’s liking, not licking).

    • I guess that would be part of how the system works: it would learn that you value particular aspects of a person and would “score” those higher than others in making its recommendation. In this way it could reinforce already-existing divides in our society. A more optimistic interpretation would be that it could alert you that you shouldn’t talk politics with him, but that he might have some interesting things to say about the music. Judging by how we tend to behave on the internet, both possibilities would play out. The question is: is there a way to design the technology to promote focusing on commonalities rather than differences?
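
      A back-of-the-envelope sketch of that kind of weighted scoring (in Python; every attribute and point value here is made up for the sake of the example, not how any real system works):

          # Hypothetical weighted scoring; attributes and points are invented.
          MY_WEIGHTS = {
              "likes Rick Santorum": -50,   # the rule from the comment above
              "spins good records": 30,
              "rants about politics online": -10,
          }

          def compatibility(attributes, weights=MY_WEIGHTS):
              """Sum the weights of every attribute the person matches."""
              return sum(points for attr, points in weights.items() if attr in attributes)

          dj_friend = {"spins good records", "likes Rick Santorum",
                       "rants about politics online"}
          print(compatibility(dj_friend))  # -30: talk music with him, not politics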

  3. Pingback: A Brief Note on the Grayness of Privacy » 23. {insert footnote}
