Edge.org
To arrive at the edge of the world's knowledge, seek out the most complex and sophisticated minds, put them in a room together, and have them ask each other the questions they are asking themselves.
Published on Edge.org (http://www.edge.org)

2013 : WHAT *SHOULD* WE BE WORRIED ABOUT?

David Pizarro
Psychologist, Cornell University
Human Intuitions Will Stifle Technological Progress

It is increasingly clear that human intuitions—particularly our social and moral intuitions—are ill-equipped to deal with the rapid pace of technological innovation. We should be worried that this will hamper the adoption of technologies that might otherwise be of practical benefit to individuals, and of great benefit to society.

Here's an example: For quite some time it has been possible for my email provider to generate targeted advertisements based on the content of my email. But it can now also suggest a calendar entry for an upcoming appointment mentioned in an email, track my current location as the time of the appointment approaches, alert me about when I should leave, and initiate driving directions that will get me there in time.

When talking about these services, it feels natural to say that Google "reads my email" and that it "knows where I have to be." We can't help but interpret this automated information through the lens of our social intuitions, and we end up perceiving agency and intentionality where there is none.

So even if we know for a fact that no human eyes have actually seen our emails, it can still feel, well... creepy. It's as if we're not quite convinced that there isn't someone in the back going through our stuff, following us around, and possibly talking about us behind our backs. It is no surprise that many view these services as a "violation of privacy," even when there is no agent doing the "violating." And the adoption of these technologies has suffered as a result.

These social intuitions also interfere with the adoption of technologies that offer more than mere convenience. For instance, the technology for self-driving cars exists now, and it holds the promise that thousands of lives might be saved each year through reduced traffic collisions. But the technology depends fundamentally on the ability to track one's precise location at all times. This is just "creepy" enough that a great many people will likely avoid the technology and opt for the riskier option of driving themselves.

Of course, we are not necessarily at the whims of our psychological intuitions. Given enough time we can (and do) learn to set them aside when necessary. However, I doubt that we can do so quickly enough to match the current speed of technological innovation.

  • John Brockman, Editor and Publisher
  • Russell Weinberger, Associate Publisher
  • Karina Knoll, Editorial Assistant
 
  • Contact Info: editor@edge.org
 
Edge.org is a nonprofit private operating foundation under Section 501(c)(3) of the Internal Revenue Code.
Copyright © 2012 by Edge Foundation, Inc. All Rights Reserved.

 

