Edge 288—June 4, 2009
(11,675 words)

THE THIRD CULTURE

THE IMPENDING DEMISE OF THE UNIVERSITY
By Don Tapscott

THE REALITY CLUB

Stewart Brand, Alun Anderson, Laurence C. Smith on "Will We Decamp for the Northern Rim?" By Laurence C. Smith

ARTICLES OF NOTE

THE NEW YORK TIMES
Wisdom in a Cleric’s Garb; Why Not a Lab Coat Too?
By Dennis Overbye

NEWSWEEK
Can Admitting a Wrong Make It Right?
By Christopher Dickey

SCIENCE
RETROSPECTIVE:
John Maddox (1925–2009)
By Nicholas Wade

THE WALL STREET JOURNAL
Black Swan Fund Makes a Big Bet on Inflation
By Scott Patterson

THE NEW YORK TIMES
A Human Language Gene Changes the Sound of Mouse Squeaks
By Nicholas Wade

NEWSWEEK
Let's Talk About God
By Lisa Miller

NEW YORK TIMES
Would You Slap Your Father? If So, You're a Liberal
By Nicholas D. Kristof

NEW YORK TIMES
Guest Column: Guest Column: Loves Me, Loves Me Not (Do the Math)
By Steven Strogatz

NEW YORK TIMES
Why Are Humans Different From All Other Apes? It's the Cooking, Stupid
By Dwight Garner




In the industrial model of student mass production, the teacher is the broadcaster. A broadcast is by definition the transmission of information from transmitter to receiver in a one-way, linear fashion. The teacher is the transmitter and the student is a receptor in the learning process. The formula goes like this: "I'm a professor and I have knowledge. You're a student, you're an empty vessel and you don't. Get ready, here it comes. Your goal is to take this data into your short-term memory and through practice and repetition build deeper cognitive structures so you can recall it to me when I test you."... The definition of a lecture has become the process in which the notes of the teacher go to the notes of the student without going through the brains of either.

THE IMPENDING DEMISE OF THE UNIVERSITY
By Don Tapscott

Introduction

In his Edge feature "Gin, Television, and Cognitive Surplus", Clay Shirky noted that after WWII we were faced with something new: "free time. Lots and lots of free time. The amount of unstructured time among the educated population ballooned, accounting for billions of hours a year. And what did we do with that time? Mostly, we watched TV."

In "The End of Universal Rationality", Yochai Benkler explored the social implications of the Internet and network societies since the early 90s. Benkler has been looking at the social implications of the Internet and network societies since the early 90s. He saw the end of an era:

For those of us like me who have been working on the Internet for years, it was very clear you couldn't encounter free software and you couldn't encounter Wikipedia and you couldn't encounter all of the wealth of cultural materials that people create and exchange, and the valuable actual software that people create, without an understanding that something much more complex is happening than the dominant ideology of the last 40 years or so. But you could if you weren't looking there, because we were used in the industrial system to think in these terms.

Benkler believes that these "phenomena on the Net are not ephemeral". And he has spent the last 20 years trying to understand what is transpiring.

In a Reality Club discussion "On 'Is Google Making Us Stupid?' By Nicholas Carr", W. Daniel Hillis, Kevin Kelly, Nicholas Carr, Jaron Lanier, Douglas Rushkoff and others explored the future of the printed book.

And Shirky, in his recent piece "Newspapers and Thinking the Unthinkable" (with comments from Nicholas Carr, Martin Wattenberg, Fernanda Viégas, and Marc Frons), wrote:

When reality is labeled unthinkable, it creates a kind of sickness in an industry. Leadership becomes faith-based, while employees who have the temerity to suggest that what seems to be happening is in fact happening are herded into Innovation Departments, where they can be ignored en masse. This shunting aside of the realists in favor of the fabulists has different effects on different industries at different times. One of the effects on the newspapers is that many of their most passionate defenders are unable, even now, to plan for a world in which the industry they knew is visibly going away.

Enter Don Tapscott, who is looking at the challenges the digital revolution poses to the fundamental aspects of the University.

"Universities are finally losing their monopoly on higher learning", he writes. "There is fundamental challenge to the foundational modus operandi of the University — the model of pedagogy. Specifically, there is a widening gap between the model of learning offered by many big universities and the natural way that young people who have grown up digital best learn."

The old-style lecture, with the professor standing at the podium in front of a large group of students, is still a fixture of university life on many campuses. It's a model that is teacher-focused, one-way, one-size-fits-all and the student is isolated in the learning process. Yet the students, who have grown up in an interactive digital world, learn differently. Schooled on Google and Wikipedia, they want to inquire, not rely on the professor for a detailed roadmap. They want an animated conversation, not a lecture. They want an interactive education, not a broadcast one that might have been perfectly fine for the Industrial Age, or even for boomers. These students are making new demands of universities, and if the universities try to ignore them, they will do so at their peril.

Contrary to Nicholas Carr's proposition that Google is making us stupid, Tapscott counters with the following:

My research suggests these critics are wrong. Growing up digital has changed the way their minds work in a manner that will help them handle the challenges of the digital age. They're used to multi-tasking, and have learned to handle the information overload. They expect a two-way conversation. What's more, growing up digital has encouraged this generation to be active and demanding enquirers. Rather than waiting for a trusted professor to tell them what's going on, they find out on their own on everything from Google to Wikipedia.

There's a new kind of conversation taking place among the younger generation, and our universities have yet to embrace it. This is a topic worthy of a serious conversation by the Edge community, and I hope to present comments from contributors in future Edge editions.

John Brockman


DON TAPSCOTT is the author of 13 books on new technology in society, most recently Grown Up Digital. He recently completed a $4 million investigation of the Net Generation. He is Chairman of the think tank nGenera Insight and an Adjunct Professor at the Rotman School of Management, University of Toronto.

Don Tapscott's Edge Bio Page


THE IMPENDING DEMISE OF THE UNIVERSITY

For fifteen years, I've been arguing that the digital revolution will challenge many fundamental aspects of the University. I've not been alone. In 1998, none other than Peter Drucker predicted that big universities would be "relics" within 30 years.

Flash forward to today and you'd be reasonable to think that we have been quite wrong. University attendance is at an all-time high. The percentage of young people enrolling in degree-granting institutions rose over 115% from 1969-1970 to 2005-2007, while the percentage of 25- to 29-year-old Americans with a college degree doubled. The competition to get into the greatest universities has never been fiercer. At first blush the university seems to be in greater demand than ever.

Yet there are troubling indicators that the picture is not so rosy. And I'm not just talking about the decimation of university endowments by the current financial meltdown.

Universities are finally losing their monopoly on higher learning, as the web inexorably becomes the dominant infrastructure for knowledge — both as a container and as a global platform for knowledge exchange between people.

Meanwhile on campus, there is a fundamental challenge to the foundational modus operandi of the University — the model of pedagogy. Specifically, there is a widening gap between the model of learning offered by many big universities and the natural way that young people who have grown up digital best learn.

The old-style lecture, with the professor standing at the podium in front of a large group of students, is still a fixture of university life on many campuses. It's a model that is teacher-focused, one-way, one-size-fits-all and the student is isolated in the learning process. Yet the students, who have grown up in an interactive digital world, learn differently. Schooled on Google and Wikipedia, they want to inquire, not rely on the professor for a detailed roadmap. They want an animated conversation, not a lecture. They want an interactive education, not a broadcast one that might have been perfectly fine for the Industrial Age, or even for boomers. These students are making new demands of universities, and if the universities try to ignore them, they will do so at their peril.

The model of pedagogy, of course, is only one target of criticism directed toward universities.

The Many Challenges to the University

Most resources of large universities are directed towards research, not learning. The universities are not primarily institutes of higher learning, but institutes for science and research. In his book Rethinking Science, Michael Gibbons developed a scathing critique of the current model of science as conducted in the university.

Recently the questioning has heated up on other fronts. In the New York Times last month, Mark Taylor, chairman of Columbia University's religion department, whipped up a storm of academic controversy with a provocative Op-Ed page article called "End the University as We Know It".

"Graduate education," he began, "is the Detroit of higher learning. Most graduate programs in American universities produce a product for which there is no market (candidates for teaching positions that do not exist) and develop skills for which there is diminishing demand (research in subfields within subfields and publication in journals read by no one other than a few like-minded colleagues), all at a rapidly rising cost (sometimes well over $100,000 in student loans)." The key problem, he noted, began with Kant in his 1798 work, "The Conflict of the Faculties." Kant argued that universities should "handle the entire content of learning by mass production, so to speak, by a division of labor, so that for every branch of the sciences there would be a public teacher or professor appointed as its trustee."

Taylor argued that graduate education must be restructured at a fundamental level to move away from ultra-narrow scholarship. Among other things, he called for more cross-disciplinary inquiry, the creation of problem-focused programs with a sunset clause, more collaboration among educational institutions, and the abolition of tenure. One week later, the outcry from fellow academics filled the entire letters page of the Sunday New York Times. One of his own colleagues at Columbia said it was "alarming and embarrassing" to hear "crass anti-intellectualism" emerge from his own institution. Another academic accused Taylor of "poisoning the waters of higher education."

The Model of Pedagogy

Whatever the merits of Taylor's call to restructure higher education, I think he is right to call for a deep debate on how universities function in a networked society. Yet I think he misses the most fundamental challenge to the university as we know it. The basic model of pedagogy is broken. "Broadcast learning", as I've called it, is no longer appropriate for the digital age and for a new generation of students who represent the future of learning.

In the industrial model of student mass production, the teacher is the broadcaster. A broadcast is by definition the transmission of information from transmitter to receiver in a one-way, linear fashion. The teacher is the transmitter and the student is a receptor in the learning process. The formula goes like this: "I'm a professor and I have knowledge. You're a student, you're an empty vessel and you don't. Get ready, here it comes. Your goal is to take this data into your short-term memory and through practice and repetition build deeper cognitive structures so you can recall it to me when I test you."

The definition of a lecture has become the process in which the notes of the teacher go to the notes of the student without going through the brains of either.

As someone who gives many lectures a year, I appreciate the irony of this view. But I understand that my lectures are not a good way of learning. They play the limited role of interesting an audience, changing its view or possibly motivating it to do something different. But I dare say that 90 percent of what I've said is lost.

True, this broadcast model is enhanced in some disciplines through essays, labs and even seminar discussions.  And of course many professors are working hard to move beyond this model. However, it remains dominant overall.

Technology and the web provide an important element of a new model, but so far few have adopted it. If someone frozen 300 years ago miraculously came alive today and looked at the professions — a physician in an operating theater, a pilot in a jumbo-jet cockpit, an engineer designing an automobile in a CAD system — they would surely marvel at how technologies had transformed knowledge work. But if they walked into a university lecture hall, they would no doubt be comforted that some things have not changed.

The New Generation of Students

The broadcast model might have been perfectly adequate for the baby-boomers, who grew up in broadcast mode, watching 24 hours a week of television (not to mention being broadcast to as children by parents, as students by teachers, as citizens by politicians, and, when they entered the workforce, as employees by bosses). But young people who have grown up digital are abandoning one-way TV for the higher stimulus of interactive communication they find on the Internet. In fact television viewing is dropping and TV has become nothing more than ambient media for youth — akin to Muzak. Sitting mutely in front of a TV set — or a professor — doesn't appeal to or work for this generation. They learn differently: they learn best through non-sequential, interactive, asynchronous, multi-tasked and collaborative approaches.

Young Americans under 30 are the first to have grown up digital, at a time when cell phones, the Internet, texting and Facebook are as normal as the refrigerator. This interactive media immersion at a formative stage of life has affected their brain development and consequently the way they think and learn.

Some writers, of course, think that Google makes you stupid; it's so hard to concentrate and think deeply amid the overwhelming amount of information online, they contend. Mark Bauerlein, an English professor at Emory University, even calls this generation the "dumbest generation" in his recent book on the topic.

My research suggests these critics are wrong. Growing up digital has changed the way their minds work in a manner that will help them handle the challenges of the digital age. They're used to multi-tasking, and have learned to handle the information overload. They expect a two-way conversation. What's more, growing up digital has encouraged this generation to be active and demanding enquirers. Rather than waiting for a trusted professor to tell them what's going on, they find out on their own on everything from Google to Wikipedia.

If universities want to adapt their teaching techniques to their current audience, they should, as I've been saying for years, make significant changes to the pedagogy. And the new model of learning is appropriate not only for youth but, increasingly, for all of us. This generation's culture is the new culture of learning.

First, the professors who remain relevant will have to abandon the traditional lecture, and start listening to and conversing with their students — shifting from a broadcast style to an interactive one. Second, they should encourage students to discover for themselves, and learn a process of discovery and critical thinking instead of just memorizing the professor's store of information. Third, they need to encourage students to collaborate among themselves and with others outside the university. Finally, they need to tailor the style of education to their students' individual learning styles.

Because of technology this is now possible. But this is not fundamentally about technology per se. Rather it represents a change in the relationship between students and teachers in the learning process.

The Most Vulnerable Universities

The ability to engage young people at university obviously depends on the institution, and the individual professor. The great liberal arts colleges are doing a wonderful job of stimulating young minds because, with big endowments and small class sizes, students can have more of a customized collaborative experience. My son Alex graduated from Amherst College, a small undergraduate college with a student-teacher ratio of 8 to 1. His teachers included a Pulitzer Prize winner, a Nobel laureate and, overall, professors who live to work with students and enable them to learn.

But the same cannot be said of many of the big universities that regard their prime role to be a centre for research, with teaching as an inconvenient afterthought, and class sizes so large that the only way they want to "teach" is through lectures.

These universities are vulnerable, especially at a time when students can watch lectures online for free by some of the world's leading professors on sites like Academic Earth. They can even take the entire course online, for credit. According to the Sloan Consortium, a recent article in the Chronicle of Higher Education tells us, "nearly 20 per cent of college students — some 3.9 million people — took an online course in 2007, and their numbers are growing by hundreds of thousands each year. The University of Phoenix enrolls over 200,000 each year."

The New Model

Some leading educators are calling for this kind of massive change; one of these is Richard Sweeney, university librarian at the New Jersey Institute of Technology. He says the education model has to change to suit this generation of students. Smart but impatient, they like to collaborate and they reject one-way lectures, he notes. While some educators view this as pandering to a generation, Sweeney is firm: "They want to learn, but they want to learn only from what they have to learn, and they want to learn it in a style that is best for them."

There are shining examples of interactive education, though. Dr. Maria Terrell, who teaches calculus at Cornell University, used an interactive method that's part of a program called "Good Questions," which is funded by the National Science Foundation.

One strategy being used in this program is called just-in-time teaching; it is a teaching and learning strategy that combines the benefits of Web-based assignments and an active-learner classroom where courses are customized to the particular needs of the class. Warm-up questions, written by the students, are typically due a few hours before class, giving the teacher an opportunity to adjust the lesson "just in time," so that classroom time can be focused on the parts of the assignments that students struggled with. Harvard professor Eric Mazur, who uses this approach in his physics class, puts it this way: "Education is so much more than the mere transfer of information. The information has to be assimilated. Students have to connect the information to what they already know, develop mental models, learn how to apply the new knowledge, and how to adapt this knowledge to new and unfamiliar situations."

This technique produces real results. An evaluation study of 350 Cornell students found that those who were asked "deep questions" (that elicit higher-order thinking) with frequent peer discussion scored noticeably higher on their math exams than students who were not asked deep questions or who had little to no chance for peer discussion. Dr. Terrell explains: "It's when the students talk about what they think is going on and why, that's where the biggest learning occurs for them…. You can hear people sort of saying, 'Oh I see, I get it.' … And then they're explaining to somebody else … and there's an authentic understanding of what's going on. So much better than what would happen if I, as the teacher person, explain it. There's something that happens with this peer instruction."

Interactive education enables students to learn at their own pace. I saw this myself back in the mid-1970s when I was taking a statistics course for my graduate degree in educational psychology at the University of Alberta in Canada. It was one of the first classes conducted online — an educational groundbreaker from Dr. Steve Hunka, a visionary in computer-mediated education. This was before PCs, so we sat down in front of a computer terminal that was connected to a computer-controlled slide display. I could stop at any time and review, and test myself to see how I was doing. The exam was online too.

There were no lectures. Just as well: the statistics lecture is by definition a bust. There is no "one-size-fits-all" for statistics – everyone in the lecture hall is either bored or doesn't get it. Instead, we got face-to-face time with Dr. Hunka, who was freed up from being a transmitter of data to someone who customized a learning experience for each of us, one on one.

Back then, online learning was expensive, but today the tools on the Net make it a great way to teach and free up the teacher to design the learning experience and converse with the students on an individual and more meaningful basis. It works. The research evidence is very strong and dates back years: "Compared with students enrolled in conventionally taught courses, students who use well-crafted computer-mediated instruction ... generally achieve higher scores on summary examinations, learn their lessons in less time, like their classes more, and develop more positive attitudes towards the subject matter they're learning," according to an article as long ago as 1997 called "Technology in the Classroom: From Theory to Practice," which appeared in Educom Review. "These results hold for a broad range of students stretching from elementary to college students, studying across a broad range of disciplines, from mathematics to the social sciences to the humanities."

Challenging the Purpose of the University

The issue of pedagogy raises a deeper issue — the purpose of the university. In the old model, teachers taught and students were expected to absorb vast quantities of content. Education was about absorbing content and being able to recall it on exams. You graduated and you were set for life — just "keeping up" in your chosen field. Today when you graduate you're set for, say, 15 minutes. If you took a technical course, half of what you learned in the first year may be obsolete by the fourth year. What counts is your capacity to learn lifelong, to think, research, find information, analyze, synthesize, contextualize, and critically evaluate it; to apply research to solving problems; and to collaborate and communicate.

But now that students can find the information they're looking for in an instant, online or in the crania of others, this old model doesn't make any sense. It's not only what you know that really counts when you graduate; it's how you navigate in the digital world, and what you do with the information you discover. This new style of learning, I believe, will suit them.

Universities should be places to learn, not to teach.

Net Geners, immersed in digital technology, are keen to try new things, often at high speed. They want university to be fun and interesting. So they should enjoy the delight of discovering things for themselves. As Seymour Papert, one of the world's foremost experts on how technology can provide new ways to learn put it: "The scandal of education is that every time you teach something, you deprive a child of the pleasure and benefit of discovery."

A Challenge to Teaching

John Seely Brown is director emeritus of Xerox PARC and a visiting scholar at USC. He noticed that when a child first learns how to speak, she or he is totally immersed in a social context and highly motivated to engage in learning this new, amazingly complex system of language. It got him to thinking that "once you start going to school, in some ways you start to learn much slower because you are being taught, rather than what happens if you're learning in order to do things that you yourself care about…. Very often just going deeply into one or two topics that you really care about lets you appreciate the awe of the world … once you learn to honor the mysteries of the world, you're kind of always willing to probe things … you can actually be joyful about discovering something you didn't know … and you can expect always to need to keep probing. And so that sets the stage for lifelong inquiry."

Another fixture of old-style learning is the assumption that students should learn on their own. Sharing notes in an exam hall, or collaborating on some of the essays and homework assignments, was strictly forbidden. Yet the individual learning model is foreign territory for most Net Geners, who have grown up collaborating, sharing, and creating together online. Progressive educators are recognizing this. Students start internalizing what they've learned in class only once they start talking to each other, says Seely Brown: "The whole notion of passively sitting and receiving information has almost nothing to do with how you internalize information into something that makes sense to you. Learning starts as you leave the classroom, when you start discussing with people around you what was just said. It is in conversation that you start to internalize what some piece of information meant to you."

The lecture hall is a prime example of mass education. It came along with mass production, mass marketing, and the mass media. Schooling, says Howard Gardner, is a mass-production idea. "You teach the same thing to students in the same way and assess them all in the same way." Pedagogy is based on the questionable idea that optimal learning experiences can be constructed for groups of learners at the same chronological age. In this view, a curriculum is developed based on predigested information and structured for optimal transmission. If the curriculum is well structured and interesting, then large proportions of students at any given grade level will "tune in" and get engaged with the information. But too often, it doesn't work out that way.

Consider one of the smash hits on YouTube last year, a short video called "A Vision of Students Today".

Created by Michael Wesch, an assistant professor of cultural anthropology at Kansas State University, it is a stinging indictment of the education delivered by the standard large-scale American university. Wesch recruited 200 student collaborators to describe their view of the education they're receiving. Their verdict: Nothing much has changed since the early nineteenth century, when the blackboard was introduced as a brilliant new way to help students visualize information. They painted a grim picture of university life — huge classes, teachers who didn't know the students' names, students who didn't complete the assigned readings, multiple-choice exams that were a waste of intellectual capital.

I know many bright students who feel the same way. The big thing these days is to get an "A" without ever having gone to a lecture. When the crème de la crème of an entire generation is boycotting the formal model of pedagogy in our educational institutions, the writing is on the wall.

A Challenge to the Revenue Model

As the model of pedagogy is challenged, it's inevitable that the revenue model of universities will be too. The arrival of online education raises the question: If all that the big universities have to offer to students are lectures that you can get online for free — from other professors — why pay the tuition fees? If universities want to survive the arrival of free university-level education online, they need to change the way professors and students interact on campus. Some are taking bold steps to reinvent themselves, with help from the Internet. The Massachusetts Institute of Technology, for example, is offering free lecture notes, exams and videotaped lectures by MIT professors to the online world.

Anyone in the world can watch the entire series of lectures for some 30 courses, such as Walter Lewin's ever-popular introductory physics course, which gets viewed by over 40,000 people a month on OpenCourseWare, MIT's version of intellectual philanthropy. Universities worldwide have joined the movement.

A Challenge to Credentialing

Of course, universities play an important role in the sorting of individuals in society, through the admissions process and the awarding of degrees. One of the most important roles of the university is to screen human capital for future employers and, more broadly, to stratify society. Those who get good marks in high school and on their SATs, who are proven to be hard workers and have other talents, get into the best universities. Those who graduate — better still, with distinction — have a credential to get the most desirable jobs or entrance to graduate programs. They have proven they have a degree of discipline and that they're prepared to play by the rules.

But a credential, and even the prestige of a university, is rooted in its effectiveness as a learning institution. If these institutions are shown to be inferior learning environments to other alternatives, their capacity to credential will surely diminish.

How much longer will, say, a Harvard undergraduate degree, taught in large class sizes by teaching assistants, largely through lectures, be able to compete in status with small-class-size liberal arts colleges or with superior delivery systems that harness the new models of learning? Surely the proof of the pudding will change the status of the various recipes for learning.

A Challenge to the Campus


The university campus has been "a wonderful place for young people to go for four years to get older", as Princeton sociologist Marvin Bressler told me a decade ago. "While they're there, they're bound to learn something," he said.

But if campuses are seen as places where learning is inferior to other models, or, worse, places where learning is restricted and stifled, the role of the campus experience will be undermined as well.

Campuses that embrace the new models become more effective learning environments and more desirable places. Even something as simple as online lectures does not undermine the value of on-campus education; at MIT, it has enhanced it. The video lectures allow students to absorb the course content online — whenever it's convenient — and then get together to tinker, invent new things, or discuss the material. The experience has shown MIT that the real value of what it offers is not the lecture per se, but rather the whole package — the content tied to the human learning experience on campus, plus the certification. Universities, in other words, cannot survive on lectures alone.

Videotaping lectures can free up intellectual capital — on the part of both professors and students — so that on-campus time is spent thinking and inquiring and challenging each other, rather than just absorbing information.

A Challenge to the Relationship of the University to Other Institutions

"The time has come for some far reaching changes to the university, our model of pedagogy, how we operate, and our relationship to the rest of the world," says Luis M. Proenza, president of the University of Akron.

He asks a provocative question: Why should a university student be restricted to learning from the professors at the university he or she is attending? True, students can obviously learn from intellectuals around the world through books, or via the Internet. Yet in a digital world, why shouldn't a student be able to take a course from a professor at another university? Proenza thinks universities should use the Internet to create a global centre of excellence. In other words, choose the best courses you have and link them with the best at a handful of universities around the world to create an unquestionably best-in-class program for students. Students would get to learn from the world's greatest minds in their area of interest — either in the physical classroom, or online. This global academy would also be open to anyone online. This is a beautiful example of the collaboration I described in the book I co-authored, Wikinomics.

So why hasn't it happened yet? "It's the legacy of established human and educational infrastructure," says Proenza. The analogy is not the newspaper business, which has been weakened by the distribution of knowledge on the Internet, he notes. "We're more like health care. We're challenged by obstructive, non-market-based business models. We're also burdened by a sense that doctor knows best, or professor knows best."

"There are a lot of sacred cows," he said. Why, for example, are universities judged by the number of students they exclude, or by how much they spend? Why aren't they judged by how well they teach, and at what price?

The digital world, which has trained young minds to inquire and collaborate, is challenging not only the lecture-driven teaching traditions of the university, but also the very notion of a walled-in institution that excludes large numbers of people. Why not allow a brilliant grade 9 student to take first-year math, without abandoning the social life of his high school? Why not deploy the interactive power of the Internet to transform the university into a place of life-long learning, not just a place to grow up?

Old Paradigms Die Hard

Yet the Industrial Age model of education is hard to change. New paradigms cause dislocation, disruption, confusion, uncertainty. They are nearly always received with coolness or hostility. Vested interests fight change. And leaders of old paradigms are often the last to embrace the new.

Back in 1997 I presented my views to a group of about 100 university presidents at a dinner hosted by Ameritech in Chicago. After the talk I sat down at my table and asked the smaller group what they thought about my remarks. They responded positively. So I said to them, "Why is this taking so long?" "The problem is funds," one president said. "We just don't have the money to reinvent the model of pedagogy." Another educator put it this way: "Models of learning that go back decades are hard to change." Another got a chuckle around the table when he said, "I think the problem is the faculty — their average age is 57 and they're teaching in a 'post-Gutenberg' mode."

A very thoughtful man named Jeffery Bannister, who at the time was president of Butler College, was seated next to me. "Post-Gutenberg?" he said. "I don't think so! At least not at Butler. Our model of learning is pre-Gutenberg! We've got a bunch of professors reading from handwritten notes, writing on blackboards, and the students are writing down what they say. This is a pre-Gutenberg model — the printing press is not even an important part of the learning paradigm." He added, "Wait till these students who are 14 and have grown up learning on the Net hit the [college] classrooms — sparks are going to fly."

Bannister was right. A powerful force to change the university is the students. And sparks are flying today. There is a huge generational clash emerging in these institutions. It turns out that the critiques of the university from years ago were ideas in waiting — waiting for the new web and a new generation of digital natives who could effectively challenge the old model.

Changing the model of pedagogy for this generation is crucial for the survival of the university. If students turn away from a traditional university education, this will erode the value of the credentials universities award, their position as centres of learning and research, and their role as campuses where young people get a chance to "grow up."


"A captivating collection of essays ... a medley of big ideas." — Amanda Gefter, New Scientist

WHAT'S NEXT?
Dispatches on the Future of Science
Edited By Max Brockman

"If these authors are the future of science, then the science of the future will be one exciting ride! Find out what the best minds of the new generation are thinking before the Nobel Committee does. A fascinating chronicle of the big, new ideas that are keeping young scientists up at night." — Daniel Gilbert, author of Stumbling on Happiness

"A preview of the ideas you're going to be reading about in ten years." — Steven Pinker, author of The Stuff of Thought

"Brockman has a nose for talent." — Nassim Nicholas Taleb, author The Black Swan

"Capaciously accessible, these writings project a curiosity to which followers of science news will gravitate." — Booklist


On "WILL WE DECAMP FOR THE NORTHERN RIM?"
By Laurence C. Smith


Stewart Brand, Alun Anderson, Laurence C. Smith


STEWART BRAND
Founder, Whole Earth Catalog; cofounder, The Well; cofounder, Global Business Network; Author, Whole Earth Discipline: An Ecopragmatist Manifesto (October 15th)

One of the finest short essays I've seen.

I'm eager to hear Smith's perspective on the Northern Rim as a climate driver. As the permafrost melts and the boreal forest marches north, what happens with methane and CO2 emissions? What happens with snow and vegetation albedo? What happens with cloud and precipitation regimes? How are coastal areas different from the vast inlands?

And what does Smith think of ecologist Sergei Zimov's effort to restore the "mammoth tundra steppe" in northeastern Siberia?


ALUN ANDERSON
Senior Consultant (and former Editor-in-Chief and Publishing Director of New Scientist); Author, After the Ice: Life, Death and Politics in the New Arctic (November 15th)

This is a fun essay if you read it backwards. The real conclusion towards the end is that we are talking about a "conversion from land that is hardly livable to land that is somewhat livable" which is perhaps not such a big change.

I don't think that the small change in winter low temperatures as climate warms is the constraint on the development of the Arctic.

The fundamental constraints are the world price of natural resources and the strength of the environmental lobby (outside of Russia). There has been development in the Arctic already, long before the recent slight warming. It is just very, very expensive.

A good way to understand the remoteness of the Arctic is to consider Canada's territory of Nunavut. It is more than four times the size of California and has a population of 30,000 people. To put that in perspective, imagine if the entire population of the United States was just 150,000, or only 3,000 people lived in the whole of Great Britain. These nations would then have the same population density as Nunavut (might sound like heaven to some). There are no roads, railroads or useful ports in Nunavut. You face distance, cold, no infrastructure and an unbelievably sparse population. Warming undermines hunting and merely lengthens the ice-free season for supply by ship by a small period. That's it.
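As a rough check of that comparison, here is a sketch in LaTeX using approximate areas that are not given in the text (Nunavut roughly 2 million square kilometres, the United States roughly 9.8 million, Great Britain roughly 230,000):

% Approximate areas are assumptions for illustration, not figures from the text.
\begin{align*}
\text{Nunavut density} &\approx \frac{30{,}000\ \text{people}}{2{,}000{,}000\ \text{km}^2} \approx 0.015\ \text{people/km}^2\\
\text{US at that density} &\approx 9{,}800{,}000\ \text{km}^2 \times 0.015 \approx 150{,}000\ \text{people}\\
\text{GB at that density} &\approx 230{,}000\ \text{km}^2 \times 0.015 \approx 3{,}500\ \text{people}
\end{align*}

Both of the population figures in the comparison fall out of the same density, which is why the analogy holds.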

To take natural mineral resources out of these areas you have to start with high-value-for-weight materials like diamonds and gold, which can be flown out. But these mines don't really bring development, just short-lived boom-and-bust mining towns. When you move on to heavier stuff, like iron ore, that can run for many decades, you have staggering transport investment costs. Mary River on Baffin Island is home to perhaps the biggest deposit of high-grade iron ore in the world, but to get it out requires a railway across the tundra, a new port and a fleet of ice-breaking bulk carriers. The ambitious BaffinLand company is all ready and waiting for a $4.1 billion investment to get going and I hope they succeed. But such big opportunities are rare, and only a few, like the Red Dog Mine in Alaska and Norilsk in Russia, have been taken.

Oil and gas are the only large sources of long-term natural resource wealth across the Arctic. In Alaska, offshore development, where the big fields are, has come to a halt in the face of environmental groups concerned about the risks to wildlife and fisheries, already under strain from climate change. Only in Russia is really rapid development under way. That's where the innovation is right now, in the Shtokman field and out around Yamal. Is this really going to change as we face the realities of a warming world? Are we going to say, let's go for lots more high-priced Arctic oil in Alaska? More hydrocarbons please and don't worry about oil spills and polar bears? I doubt it, or if we do, it won't be for long. The race will go to the brilliant innovators who show us how to replace high-priced oil.

Still, I very much hope that some development will come to the Arctic, but no more people. Taking the arc of land from Alaska to Greenland, the Inuit lands, the situation of the indigenous people is very tough. There's a booming population (60 per cent of the population is under 25 in Nunavut, while in the US, 32 per cent of the population is under 25), high unemployment, staggering suicide rates for young men (it's not the dark cold winters; 80 per cent of suicides are in the 24-hour summer light), and low education levels. They desperately need jobs. One Inuit regional development official put it like this to me: "When Inuit are making a meaningful living, it's a lot better. You see the community being much more vibrant, everybody feels much better about themselves and life is good."

So I'd rather not think of the Arctic as a place southerners might settle but as a place where we southerners might help to bring a life that is good.

LINKS: See "Green Oil" and "The Changing Arctic: A Response to Freeman Dyson's 'Heretical Thoughts'"


LAURENCE C. SMITH
Professor and Vice Chairman of Geography and Professor of Earth and Space Sciences, UCLA



Brand and Anderson's remarks highlight beautifully the strange dichotomy that our northern high latitudes have with the rest of the world. They are remote, marginal, and thinly populated, yet they also have enormous potential to affect the rest of us. One way, as Anderson describes, is through sharing of their vast resource wealth. It's an iffy affair that depends crucially on the economics of transport, labor availability, and commodity prices, and often on environmental regulation.

The prospect of southern refugees pouring into the Arctic — or even wanting to — is minuscule; time will tell whether coming decades will see the rapid growth of human activities in the North. But the pressures are there, and climate change is just one of several, including demographic, political, and resource-based pressures. Aboriginal people are in a surprisingly good position to advance northern development and, with it, themselves.

Another way is through climate-change feedback loops, most famously the sea ice-albedo effect (shrinking sea ice > dark ocean exposed > less sunlight reflected to space > a net heat gain for the planet), but there are others, and Brand's questions highlight some of the biggest. Albedo (reflectivity) is important not only for sea ice but for land, where a shorter snow season — or new boreal forest poking darkly out of the snow — tends to reduce it, thus amplifying the warming. Clouds remain a huge uncertainty: they create opposite effects by turning away sunlight by day but trapping heat by night.

Cloud physics are poorly captured in our coarse-scale climate models and the future of cloud radiative forcing is a very active research subject. However, the models express near-unanimity when it comes to precipitation: that is going to increase. It probably already has, if only our lousy snow-gauges could measure it well enough. Other far-reaching effects of a warming North include global sea-level rise (from melting land-based glaciers, not sea ice), an easing of extreme winter cold (allowing northward penetration of southern biota, pests, and disease), and, as Brand notes, the potential unleashing of new greenhouse gas sources from thawing, carbon-rich soils.

Methane, a more potent but less voluminous greenhouse gas than CO2, is expected to rise under a warmer, wetter future thanks to accelerated anaerobic decomposition in wetlands and waterlogged soil. The fate of methane hydrates (a sort of ancient methane dry-ice, found in sea beds and some permafrost ground) is deeply controversial. CO2 release from thawing soils has, until just this week, been the biggest unknown of all: Might heating the frozen "carbon storage locker" of (relatively) undecomposed, organic-rich soils turn thawing tundra into a giant rotting compost heap, outgassing vast quantities of old sequestered carbon as freshly-made CO2 back to the atmosphere? Or would the simultaneous explosion in vegetation growth snatch back that carbon, storing it once again in the form of new biomass?

Because huge amounts of carbon constantly exit and enter northern soils — the net balance is but a tiny residual of two huge numbers of opposing sign — this question has frustrated us for years. But just this week, Ted Schuur and others may have discovered the answer. In their paper published in Nature (i) they learned, by radiocarbon dating the ages of carbon released from recent vs. not-so-recently thawed permafrost, that the vegetation grab-back is likely a temporary, transient effect. So it seems, Mr. Brand, the CO2 compost heap remains very much on the table.

__
(i) E.A.G. Schuur et al., The effect of permafrost thaw on old carbon release and net carbon exchange from tundra, Nature 459, 28 May 2009.





THE NEW YORK TIMES
June 2, 2009

ESSAY

Wisdom in a Cleric’s Garb; Why Not a Lab Coat Too?

By DENNIS OVERBYE


The movie “Angels & Demons” offers a chance to join an ancient discussion on religion and science.

There is a warm fuzzy moment near the end of the movie “Angels & Demons,” starring Tom Hanks and directed by Ron Howard.

Mr. Hanks as the Harvard symbologist Robert Langdon has just exposed the archvillain who was threatening to blow up the Vatican with antimatter stolen from a particle collider. A Catholic cardinal who has been giving him a hard time all through the movie and has suddenly turned twinkly-eyed says a small prayer thanking God for sending someone to save them.

Mr. Hanks replies that he doesn’t think he was “sent.”

Of course he was, he just doesn’t know it, the priest says gently. Mr. Hanks, taken aback, smiles in his classic sheepish way. Suddenly he is not so sure.

This may seem like a happy ending. Faith and science reconciled or at least holding their fire in the face of mystery. But for me that moment ruined what had otherwise been a pleasant two hours on a rainy afternoon. It crystallized what is wrong with the entire way that popular culture regards science. Scientists and academics are smart, but religious leaders are wise.

These smart alecks who know how to split atoms and splice genes need to be put in their place by older steadier hands.

It was as if the priest had patted Einstein on the head and chuckled, “Never mind, Sonny, some day you’ll understand." ...

...In a recent interview, Mr. Howard said that he didn’t think there was any conflict between science and religion. Both are after big mysteries, but whatever science finds, he said, “There’s still going to be that question: ‘And before that?’ ”

I don’t really mind that the movie and book have rewritten history, and the movie takes fewer liberties with science than much science fiction.

But I can’t help being bugged by that warm, fuzzy moment at the end, that figurative pat on the head. After all is said and done, it seems to imply, having faith is just a little bit better than being smart. ...




NEWSWEEK
June 1, 2009

Can Admitting a Wrong Make It Right?
To address the future of the Middle East, Obama must look to the past.
By Christopher Dickey

When the president of the United States of America stands before a huge crowd at Cairo University and makes his long-anticipated speech to the Muslim world Thursday, will he say that he's sorry? Will he, for instance, offer to make amends for the blind support some of his predecessors have shown for Israel's occupation of Arab lands? Will he ask forgiveness for the CIA coup in Iran that overthrew a democratically elected government there in 1953? Will Barack Obama try to talk directly to the people and apologize for the many decades Washington has spent supporting Arab dictators, including the one who rules in Egypt, the country where he is speaking?

Probably Obama will say none of these things, and wise voices argue that he should not. "Discussions of who is going to apologize for what and how the apologies are going to be worded and what they're supposed to convey is a prescription for getting sidetracked, bogged down and producing more antagonism," former U.S. national-security adviser Zbigniew Brzezinski told me a few weeks ago.

There would have to be reciprocity, after all. Will we hear the Iranians apologize for their long history of supporting suicide bombings, and the holding of American hostages in Tehran and Beirut in the 1980s? Or their training and equipping of militias that killed many American troops in Iraq? Would the Palestinians regret the repeated slaughter of innocent Israelis in blatant terrorist attacks?

You see the problem. Yet there is a body of evidence to suggest that the most vital element in Middle East peacemaking may lie in questions of language and symbols—what social anthropologist Scott Atran calls a "moral logic" based on "sacred values." And sometimes what that boils down to, essentially, is saying you're sorry. As Atran sees it, this is not really a theological question. It's more fundamental than fundamentalism. The need for dignity and respect—a craving for recognition and vindication—is at the heart of the region's most intractable conflicts.

Such issues defy conventional notions of cost and benefit, says Atran, who holds distinguished posts at the University of Michigan, John Jay College in New York and the National Center for Scientific Research in France. Working with fellow scholar Jeremy Ginges, Atran has interviewed Israelis and Arabs, leaders and followers, throughout the region. And he has found that among the hardliners who now tend to dominate the debate and dictate stalemate on all sides, the offer of money or other material benefits not only is rejected, it increases their anger and their recalcitrance. "Billions of dollars have been sacrificed to demonstrate the advantages of peace and coexistence," Atran and Ginges wrote earlier this year at the height of fighting in Gaza. "Yet still both sides opt for war."

Even when ballots replace bullets, these factors that Atran calls "intangible" remain important. An obvious reason that extremists have done so well in the region's elections in recent years, whether among the Arabs, Iranians or Israelis, is that they have addressed emotional and moral questions head on. ...



SCIENCE
May 22, 2009

RETROSPECTIVE:
John Maddox (1925–2009)
Nicholas Wade
Nicholas Wade, now at the New York Times, was at Nature from 1968 to 1971.

Scientific ideas are exciting, yet the scientific literature is far from lively. John Maddox's achievement was to sidestep the drabness of scientific writing by emphasizing the ideas that thrived beneath the leaden prose. In doing so he made Nature a compellingly interesting journal whose success forced others in the staid world of scientific publishing to adopt many of his ideas.

Though trained as a physical chemist, Maddox was a journalist at heart, having spent his formative early career as a science writer for the Manchester Guardian. On becoming editor of Nature in 1966, he recognized that the dry format of the scientific article could not be greatly changed but that the excitement of scientific ideas could be conveyed by other kinds of articles. Maddox expanded the News and Views section of Nature into a lively commentary on the scientific issues of the day.

Long is the list of original ideas that have been rejected by Nature or Science but later proved correct. My guess is that far fewer of these mistaken rejections occurred while Maddox was editor. He loved new ideas and was always ready to take a chance on a bold paper.

His other great virtue as an editor was that he thought like a publisher. Instead of waiting for interesting papers to come in, he went around asking for them, and people responded to his enthusiasm. The more visible Nature became under its unorthodox new editor, the more its prestige and circulation grew, especially in the United States. ...



THE WALL STREET JOURNAL
June 1, 2009

Black Swan Fund Makes a Big Bet on Inflation
By SCOTT PATTERSON

A hedge fund firm that reaped huge rewards betting against the market last year is about to open a fund premised on another wager: that the massive stimulus efforts of global governments will lead to hyperinflation.

The firm, Universa Investments L.P., is known for its ties to gloomy investor Nassim Nicholas Taleb, author of the 2007 bestseller "The Black Swan," which describes the impact of extreme events on the world and financial markets.

Funds run by Universa, which is managed and owned by Mr. Taleb's long-time collaborator Mark Spitznagel, last year gained more than 100% thanks to its bearish bets. Universa now runs about $6 billion, up from the $300 million it began with in January 2007. Earlier this year, Mr. Spitznagel closed several funds to new investors....

Mr. Taleb doesn't have an ownership interest in the Santa Monica, Calif., firm, but he has significant investments in it and helps shape its strategies.

The term "black swan," which has become a market catchphrase in the last few years, alludes to the once-widespread belief in the West that all swans are white. The notion was proven false when European explorers discovered black swans in Australia. A black-swan event, according to Mr. Taleb, is something that is extreme and highly unexpected. ...



THE NEW YORK TIMES
May 29, 2009

A Human Language Gene Changes the Sound of Mouse Squeaks
NICHOLAS WADE

People have a deep desire to communicate with animals, as is evident from the way they converse with their dogs, enjoy myths about talking animals or devote lifetimes to teaching chimpanzees how to speak. A delicate, if tiny, step has now been taken toward the real thing: the creation of a mouse with a human gene for language.

The gene, FOXP2, was identified in 1998 as the cause of a subtle speech defect in a large London family, half of whose members have difficulties with articulation and grammar. All those affected inherited a disrupted version of the gene from one parent. FOXP2 quickly attracted the attention of evolutionary biologists because other animals also possess the gene, and the human version differs significantly in its DNA sequence from those of mice and chimpanzees, just as might be expected for a gene sculpted by natural selection to play an important role in language.

Researchers at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, have now genetically engineered a strain of mice whose FOXP2 gene has been swapped out for the human version. Svante Paabo, in whose laboratory the mouse was engineered, promised several years ago that when the project was completed, "We will speak to the mouse." He did not promise that the mouse would say anything in reply, doubtless because a great many genes must have undergone evolutionary change to endow people with the faculty of language, and the new mouse was gaining only one of them. So it is perhaps surprising that possession of the human version of FOXP2 does in fact change the sounds that mice use to communicate with other mice, as well as other aspects of brain function. ...



NEWSWEEK
June 8, 2009

Let's Talk About God
By Lisa Miller
A new book redefines the faith debate.

The atheist writers Sam Harris, Richard Dawkins and Christopher Hitchens have presented us with a choice: either you don't believe in God or you're a dope. "It is perfectly absurd for religious moderates to suggest that a rational human being can believe in God, simply because that belief makes him happy," writes Harris in the 2005 "Atheist Manifesto" now posted on the Web site of his new nonprofit, The Reason Project. Their brilliance, wit and (general) good humor have made the new generation of atheists celebrities among people who like to consider themselves smart. We enjoy their books and their telegenic bombast so much that we don't mind their low opinion of us. Dopey or not, 90 percent of Americans continue to say they believe in God. ...

...Robert Wright's The Evolution of God, which comes out next week, is about to reframe this debate. Wright doesn't argue one side or the other of the "Is God real?" question. He leaves that aside. Instead, he grapples with God as an idea that has changed—evolved—through history. ...

...Though he never comes right out and declares that the human propensity for morality—and, by extension, truth and love—is given by God (or is God), he comes awfully close. In an imaginary debate with a scientist, he compares God to an electron. You know it's there, but you don't know anything real about what it looks like or what its properties are. Scientists believe in electrons because they see the effects of electrons on the world. "You might say," he writes in his afterword, "that love and truth are the two primary manifestations of divinity in which we can partake, and that by partaking in them we become truer manifestations of the divine. Then again, you might not say that. The point is just that you wouldn't have to be crazy to say it." (I can already hear Steven Pinker typing like mad.)

With those three sentences, Wright gives relief and intellectual ballast to those believers weary of the punching-bag tone of the recent faith-and-reason debates. ...



THE NEW YORK TIMES
May 28, 2009

OP-ED COLUMNIST

Would You Slap Your Father? If So, You're a Liberal

By NICHOLAS D. KRISTOF

...This came up after I wrote a column earlier this year called "The Daily Me." I argued that most of us employ the Internet not to seek the best information, but rather to select information that confirms our prejudices. To overcome that tendency, I argued, we should set aside time for a daily mental workout with an ideological sparring partner. Afterward, I heard from Jonathan Haidt, a psychology professor at the University of Virginia. "You got the problem right, but the prescription wrong," he said.

Simply exposing people to counterarguments may not accomplish much, he said, and may inflame antagonisms....

...Some evolutionary psychologists believe that disgust emerged as a protective mechanism against health risks, like feces, spoiled food or corpses. Later, many societies came to apply the same emotion to social "threats." Humans appear to be the only species that registers disgust, which is why a dog will wag its tail in puzzlement when its horrified owner yanks it back from eating excrement.

Psychologists have developed a "disgust scale" based on how queasy people would be in 27 situations, such as stepping barefoot on an earthworm or smelling urine in a tunnel. Conservatives systematically register more disgust than liberals. (To see how you weigh factors in moral decisions, take the tests at www.yourmorals.org.) ...



NEW YORK TIMES
May 26, 2009

THE WILD SIDE

Guest Column: Loves Me, Loves Me Not (Do the Math)
BY STEVEN STROGATZ

"In the spring," wrote Tennyson, "a young man's fancy lightly turns to thoughts of love." And so in keeping with the spirit of the season, this week's column looks at love affairs — mathematically. The analysis is offered tongue in cheek, but it does touch on a serious point: that the laws of nature are written as differential equations. It also helps explain why, in the words of another poet, "the course of true love never did run smooth."

To illustrate the approach, suppose Romeo is in love with Juliet, but in our version of the story, Juliet is a fickle lover. The more Romeo loves her, the more she wants to run away and hide. But when he takes the hint and backs off, she begins to find him strangely attractive. He, on the other hand, tends to echo her: he warms up when she loves him and cools down when she hates him.

What happens to our star-crossed lovers? How does their love ebb and flow over time? That's where the math comes in. By writing equations that summarize how Romeo and Juliet respond to each other's affections and then solving those equations with calculus, we can predict the course of their affair. The resulting forecast for this couple is, tragically, a never-ending cycle of love and hate. At least they manage to achieve simultaneous love a quarter of the time. ...
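The pair of equations behind that forecast is short enough to write down. What follows is a minimal sketch, not Mr. Strogatz's own code: Romeo's feelings change in proportion to Juliet's, Juliet's in opposition to Romeo's, and the response rates, starting feelings, and integration settings are arbitrary values chosen for illustration, since the column gives no numbers.

    # A rough simulation of the Romeo-and-Juliet model described above.
    # All numbers here are illustrative assumptions; the column gives none.
    a, b = 1.0, 1.0            # response rates (assumed)
    dt, steps = 0.01, 60_000   # Euler step size and horizon (assumed)

    R, J = 1.0, 0.0            # Romeo starts smitten, Juliet indifferent (assumed)
    both = 0
    for _ in range(steps):
        dR = a * J             # Romeo echoes Juliet's feelings
        dJ = -b * R            # Juliet retreats when Romeo advances
        R, J = R + dR * dt, J + dJ * dt
        if R > 0 and J > 0:    # count moments of simultaneous love
            both += 1

    print(f"in love at the same time {both / steps:.0%} of the time")  # ~25%

With these symmetric couplings the pair simply circles forever, the never-ending cycle the column predicts; unbalance the response rates, or let each lover react to his or her own feelings as well, and the cycle becomes a spiral toward mutual indifference or ever-wilder swings.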



THE NEW YORK TIMES
May 27, 2009

BOOKS OF THE TIMES
Why Are Humans Different From All Other Apes? It's the Cooking, Stupid
By DWIGHT GARNER

Catching Fire" is a plain-spoken and thoroughly gripping scientific essay that presents nothing less than a new theory of human evolution.

Human beings are not obviously equipped to be nature's gladiators. We have no claws, no armor. That we eat meat seems surprising, because we are not made for chewing it uncooked in the wild. Our jaws are weak; our teeth are blunt; our mouths are small. That thing below our noses? It truly is a pie hole.

To attend to these facts, for some people, is to plead for vegetarianism or for a raw-food diet. We should forage and eat the way our long-ago ancestors surely did. For Richard Wrangham, a professor of biological anthropology at Harvard and the author of "Catching Fire," however, these facts and others demonstrate something quite different. They help prove that we are, as he vividly puts it, "the cooking apes, the creatures of the flame."

The title of Mr. Wrangham's new book — "Catching Fire: How Cooking Made Us Human" — sounds a bit touchy-feely. Perhaps, you think, he has written a meditation on hearth and fellow feeling and s'mores. He has not. "Catching Fire" is a plain-spoken and thoroughly gripping scientific essay that presents nothing less than a new theory of human evolution, one he calls "the cooking hypothesis," one that Darwin (among others) simply missed. ...


"For those seeking substance over sheen, the occasional videos released at Edge.org hit the mark. The Edge Foundation community is a circle, mainly scientists but also other academics, entrepreneurs, and cultural figures. ... Edge's long-form interview videos are a deep-dive into the daily lives and passions of its subjects, and their passions are presented without primers or apologies. The decidedly noncommercial nature of Edge's offerings, and the egghead imprimatur of the Edge community, lend its videos a refreshing air, making one wonder if broadcast television will ever offer half the off-kilter sparkle of their salon chatter. — Boston Globe

Mahzarin Banaji, Samuel Barondes, Yochai Benkler, Paul Bloom, Rodney Brooks, Hubert Burda, George Church, Nicholas Christakis, Brian Cox, Iain Couzin, Helena Cronin, Paul Davies, Daniel C. Dennett, David Deutsch, Denis Dutton, Jared Diamond, Freeman Dyson, Drew Endy, Peter Galison, Murray Gell-Mann, David Gelernter, Neil Gershenfeld, Anthony Giddens, Gerd Gigerenzer, Daniel Gilbert, Rebecca Goldstein, John Gottman, Brian Greene, Anthony Greenwald, Alan Guth, David Haig, Marc D. Hauser, Walter Isaacson, Steve Jones, Daniel Kahneman, Stuart Kauffman, Ken Kesey, Stephen Kosslyn, Lawrence Krauss, Ray Kurzweil, Jaron Lanier, Armand Leroi, Seth Lloyd, Gary Marcus, John Markoff, Ernst Mayr, Marvin Minsky, Sendhil Mullainathan, Dennis Overbye, Dean Ornish, Elaine Pagels, Steven Pinker, Jordan Pollack, Lisa Randall, Martin Rees, Matt Ridley, Lee Smolin, Elizabeth Spelke, Scott Sampson, Robert Sapolsky, Dimitar Sasselov, Stephen Schneider, Martin Seligman, Robert Shapiro, Clay Shirky, Dan Sperber, Paul Steinhardt, Steven Strogatz, Seirian Sumner, Leonard Susskind, Nassim Nicholas Taleb, Timothy Taylor, Richard Thaler, Robert Trivers, Neil Turok, J. Craig Venter, Edward O. Wilson, Lewis Wolpert, Richard Wrangham, Philip Zimbardo

[Continue to Edge Video]



WHAT HAVE YOU CHANGED YOUR MIND ABOUT?
Edited by John Brockman
With An Introduction By BRIAN ENO

"The world's finest minds have responded with some of the most insightful, humbling, fascinating confessions and anecdotes, an intellectual treasure trove. ... Best three or four hours of intense, enlightening reading you can do for the new year. Read it now."
San Francisco Chronicle

"A great event in the Anglo-Saxon culture."
El Mundo


Praise for the online publication of
What Have You Changed Your Mind About?

"The splendidly enlightened Edge website (www.edge.org) has rounded off each year of inter-disciplinary debate by asking its heavy-hitting contributors to answer one question. I strongly recommend a visit." The Independent

"A great event in the Anglo-Saxon culture." El Mundo

"As fascinating and weighty as one would imagine." The Independent

"They are the intellectual elite, the brains the rest of us rely on to make sense of the universe and answer the big questions. But in a refreshing show of new year humility, the world's best thinkers have admitted that from time to time even they are forced to change their minds." The Guardian

"Even the world's best brains have to admit to being wrong sometimes: here, leading scientists respond to a new year challenge." The Times

"Provocative ideas put forward today by leading figures."The Telegraph

"As in the past, these world-class thinkers have responded to impossibly open-ended questions with erudition, imagination and clarity." The News & Observer

"A jolt of fresh thinking...The answers address a fabulous array of issues. This is the intellectual equivalent of a New Year's dip in the lake—bracing, possibly shriek-inducing, and bound to wake you up." The Globe and Mail

"Answers ring like scientific odes to uncertainty, humility and doubt; passionate pleas for critical thought in a world threatened by blind convictions." The Toronto Star

"For an exceptionally high quotient of interesting ideas to words, this is hard to beat. ...What a feast of egg-head opinionating!" National Review Online


WHAT ARE YOU OPTIMISTIC ABOUT?
Today's Leading Thinkers on Why Things Are Good and Getting Better
Edited by John Brockman
Introduction by DANIEL C. DENNETT



[2007]

"The optimistic visions seem not just wonderful but plausible." Wall Street Journal

"Persuasively upbeat." O, The Oprah Magazine

"Our greatest minds provide nutshell insights on how science will help forge a better world ahead." Seed

"Uplifting...an enthralling book." The Mail on Sunday


WHAT IS YOUR DANGEROUS IDEA?
Today's Leading Thinkers on the Unthinkable
Edited by John Brockman
Introduction by STEVEN PINKER
Afterword by RICHARD DAWKINS


[2006]

"Danger – brilliant minds at work...A brilliant bok: exhilarating, hilarious, and chilling." The Evening Standard (London)

"A selection of the most explosive ideas of our age." Sunday Herald

"Provocative" The Independent

"Challenging notions put forward by some of the world's sharpest minds" Sunday Times

"A titillating compilation" The Guardian

"Reads like an intriguing dinner party conversation among great minds in science" Discover


WHAT WE BELIEVE BUT CANNOT PROVE
Today's Leading Thinkers on Science in the Age of Certainty
Edited by John Brockman
Introduction by IAN MCEWAN


[2006]

"Whether or not we believe proof or prove belief, understanding belief itself becomes essential in a time when so many people in the world are ardent believers." LA Times

"Belief appears to motivate even the most rigorously scientific minds. It stimulates and challenges, it tricks us into holding things to be true against our better judgment, and, like scepticism -its opposite -it serves a function in science that is playful as well as thought-provoking. not we believe proof or prove belief, understanding belief itself becomes essential in a time when so many people in the world are ardent believers." The Times

"John Brockman is the PT Barnum of popular science. He has always been a great huckster of ideas." The Observer

"An unprecedented roster of brilliant minds, the sum of which is nothing short of an oracle—a book ro be dog-eared and debated." Seed

"Scientific pipedreams at their very best." The Guardian

"Makes for some astounding reading." Boston Globe

"Fantastically stimulating...It's like the crack cocaine of the thinking world.... Once you start, you can't stop thinking about that question." BBC Radio 4

"Intellectual and creative magnificence" The Skeptical Inquirer




Edge Foundation, Inc. is a nonprofit private operating foundation under Section 501(c)(3) of the Internal Revenue Code.

