MIND

The Crusade Against Multiple Regression Analysis

Richard Nisbett
[1.21.16]

A huge range of science projects is done with multiple regression analysis. The results are often somewhere between meaningless and quite damaging. ...

I hope that in the future, if I’m successful in communicating with people about this, there’ll be a kind of upfront warning in New York Times articles: These data are based on multiple regression analysis. This would be a sign that you probably shouldn’t read the article, because you’re quite likely to get non-information or misinformation.

RICHARD NISBETT is a professor of psychology and co-director of the Culture and Cognition Program at the University of Michigan. He is the author of Mindware: Tools for Smart Thinking; and The Geography of Thought. Richard Nisbett's Edge Bio Page


THE CRUSADE AGAINST MULTIPLE REGRESSION ANALYSIS

The thing I’m most interested in right now has become a kind of crusade against correlational statistical analysis—in particular, what’s called multiple regression analysis. Say you want to find out whether taking Vitamin E is associated with lower prostate cancer risk. You look at the correlational evidence and indeed it turns out that men who take Vitamin E have lower risk for prostate cancer. Then someone says, "Well, let’s see what happens if we do the actual experiment." And what happens when you do the experiment is that Vitamin E contributes to the likelihood of prostate cancer. How could there be such a difference? This happens a lot. The correlational—the observational—evidence tells you one thing; the experimental evidence tells you something completely different.
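How an unmeasured confounder can produce exactly this reversal is easy to see in a toy simulation (my own illustration, not Nisbett's data; the effect sizes and the statsmodels dependency are assumptions). Here a hidden trait, call it health-consciousness, both makes men more likely to take Vitamin E and lowers their cancer risk, so the observational regression and the randomized experiment give opposite signs:

# Toy simulation, not Nisbett's data: an unmeasured confounder ("health-consciousness")
# makes the observational association between Vitamin E and cancer look protective,
# even though the assumed causal effect in the simulation is harmful.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 100_000

health = rng.normal(size=n)                                  # unobserved confounder
takes_e = (health + rng.normal(size=n) > 0.5).astype(float)  # healthier men take Vitamin E more often

# Assumed ground truth: Vitamin E slightly raises risk (+0.3 on the logit scale),
# health-consciousness strongly lowers it (-1.0).
true_logit = -2.0 + 0.3 * takes_e - 1.0 * health
cancer = rng.binomial(1, 1.0 / (1.0 + np.exp(-true_logit)))

# Observational analysis that omits the confounder: Vitamin E looks protective (negative coefficient).
naive = sm.Logit(cancer, sm.add_constant(takes_e)).fit(disp=0)
print("observational coefficient on Vitamin E:", round(naive.params[1], 2))

# "Experiment": randomizing Vitamin E breaks its link with health, and the coefficient turns positive.
takes_e_rct = rng.binomial(1, 0.5, size=n).astype(float)
rct_logit = -2.0 + 0.3 * takes_e_rct - 1.0 * health
cancer_rct = rng.binomial(1, 1.0 / (1.0 + np.exp(-rct_logit)))
rct = sm.Logit(cancer_rct, sm.add_constant(takes_e_rct)).fit(disp=0)
print("randomized coefficient on Vitamin E:", round(rct.params[1], 2))

The same data-generating process yields a protective-looking coefficient when Vitamin E is self-selected and a harmful one when it is randomly assigned, which is the pattern Nisbett describes.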

L.A. Paul: "The Transformative Experience"

HeadCon '14
L.A. Paul
[11.18.14]

We're going to pretend that modern-day vampires don't drink the blood of humans; they're vegetarian vampires, which means they only drink the blood of humanely farmed animals. You have a one-time-only chance to become a modern-day vampire. You think, "This is a pretty amazing opportunity, do I want to gain immortality, amazing speed, strength, and power? But do I want to become undead, become an immortal monster and have to drink blood? It's a tough call." Then you go around asking people for their advice and you discover that all of your friends and family members have already become vampires. They tell you, "It is amazing. It is the best thing ever. It's absolutely fabulous. It's incredible. You get these new sensory capacities. You should definitely become a vampire." Then you say, "Can you tell me a little more about it?" And they say, "You have to become a vampire to know what it's like. You can't, as a mere human, understand what it's like to become a vampire just by hearing me talk about it. Until you're a vampire, you're just not going to know what it's going to be like."


[48:42 minutes]

L.A. PAUL is Professor of Philosophy at the University of North Carolina at Chapel Hill, and Professorial Fellow in the Arché Research Centre at the University of St. Andrews. L.A. Paul's Edge Bio page


THE TRANSFORMATIVE EXPERIENCE

My name is Laurie Paul, and I'm a professor of philosophy at the University of North Carolina at Chapel Hill. I'm a metaphysician. I'm especially interested in metaphysics and philosophy of mind. I have been developing what I think of as formal phenomenology. In other words, I'm especially interested in using formal techniques to engage with the nature of experience, and I've paid special attention to temporal experience. One thing I've been thinking a lot about lately is the notion of transformative experience, which I'll tell you a little bit about today.

The questions that have been occupying me involve questions that come up when we as individuals think about making big life decisions. Metaphorically, it's when we think about making decisions when we're at life's crossroads. As we live our lives, all of us experience a series of these crossroad-style big decisions.

Sarah-Jayne Blakemore: "The Teenager's Sense of Social Self"

HeadCon '14
Sarah-Jayne Blakemore
[11.18.14]

The reason that letter is nice is that it illustrates what's important to that girl at that particular moment in her life. It was less important that man had landed on the moon than things like what she was wearing, what clothes she was into, who she liked, who she didn't like. This is the period of life where that sense of self, and particularly the sense of social self, undergoes profound transition. Just think back to when you were a teenager. It's not that before then you don't have a sense of self; of course you do. A sense of self develops very early. What happens during the teenage years is that your sense of who you are—your moral beliefs, your political beliefs, what music you're into, fashion, what social group you're into—that's what undergoes profound change.


[36:22 minutes]

SARAH-JAYNE BLAKEMORE is a Royal Society University Research Fellow and Professor of Cognitive Neuroscience, Institute of Cognitive Neuroscience, University College London. Sarah-Jayne Blakemore's Edge Bio


THE TEENAGER'S SENSE OF SOCIAL SELF

I'm Sarah-Jayne Blakemore from University College London. Today I'm going to be talking about the adolescent brain, which is the focus of my lab's research. I'm going to talk about the history of this young area of science, and I'll also tell you about some of the current questions for the future in this area.

I did my PhD on schizophrenia, and I also did a post-doc on schizophrenia. I became interested in the fact that schizophrenia is a devastating psychiatric disease that has its onset right at the end of adolescence. Normally people develop schizophrenia, on average, between about 18 and 25 years of age. This is interesting because it's a developmental disorder, but it develops much later than most developmental disorders. I became interested in whether that might have something to do with brain development going wrong during the teenage years in people who go on to develop schizophrenia.

This was about 12 years ago. Back then, I delved into the literature and, to my surprise, little was known about how the human teenage brain develops. There were a handful of studies back in the year 2002, a small handful, but they were intriguing because even though there were only a few of them, they all pointed to significant and protracted development of the brain right throughout adolescence and into the 20s. This was an interesting finding because, prior to those papers, most neuroscientists would have assumed, and the dogma at the time I was an undergraduate and a graduate student was, that the human brain stops developing some time in childhood and doesn't change much after mid to late childhood.

Hugo Mercier: "Toward The Seamless Integration Of The Sciences"

HeadCon '14
Hugo Mercier
[11.18.14]

One of the great things about cognitive science is that it allowed us to continue that seamless integration of the sciences, from physics to chemistry to biology, and then to the mind sciences, and it's been quite successful at doing this in a relatively short time. But on the whole, I feel there's still a failure to continue this integration toward some of the social sciences, such as anthropology, to some extent, and sociology or history, which still remain very much shut off from what some would see as progress and further integration.


[39:34 minutes]

HUGO MERCIER, a Cognitive Scientist, is an Ambizione Fellow at the Cognitive Science Center at the University of Neuchâtel. Hugo Mercier's Edge Bio Page


TOWARD THE SEAMLESS INTEGRATION OF THE SCIENCES

I am Hugo Mercier. I'm a cognitive scientist, and I currently work at the University of Neuchâtel, in Switzerland, in the Cognitive Science Center. Today I want to talk about the integration of the cognitive and the social sciences, and in particular how the work of Dan Sperber can help us further that integration between the cognitive and the social sciences.  

One of the great things about cognitive science is that it allowed us to continue that seamless integration of the sciences, from physics to chemistry to biology, and then to the mind sciences, and it's been quite successful at doing this in a relatively short time. But on the whole, I feel there's still a failure to continue this integration toward some of the social sciences, such as anthropology, to some extent, and sociology or history, which still remain very much shut off from what some would see as progress and further integration.

Molly Crockett: "The Neuroscience of Moral Decision Making"

HeadCon '14
Molly Crockett
[11.18.14]

Imagine we could develop a precise drug that amplifies people's aversion to harming others; on this drug you won't hurt a fly, and everyone taking it becomes like a Buddhist monk. Who should take this drug? Only convicted criminals—people who have committed violent crimes? Should we put it in the water supply? These are normative questions. These are questions about what should be done. I feel grossly unprepared to answer these questions with the training that I have, but these are important conversations to have between disciplines. Psychologists and neuroscientists need to be talking to philosophers about this. These are conversations that we need to have because we don't want to get to the point where we have the technology but haven't had this conversation, because then terrible things could happen.

MOLLY CROCKETT is an associate professor in the Department of Experimental Psychology, University of Oxford; Wellcome Trust Postdoctoral Fellow, Wellcome Trust Centre for Neuroimaging. Molly Crockett's Edge Bio Page


THE NEUROSCIENCE OF MORAL DECISION MAKING

I'm a neuroscientist at the University of Oxford in the UK. I'm interested in decision making, specifically decisions that involve tradeoffs; for example, tradeoffs between my own self-interest and the interests of other people, or tradeoffs between my present desires and my future goals.

One thing that's always fascinated me, specifically about human decision making, is the fact that we have multiple conflicting motives in our decision process. And not only do we have these forces pulling us in different directions, but we can reflect on this fact. We can witness the tug of war that happens when we're trying to make a difficult decision. One thing that is great about our ability to reflect on this process is that it suggests that we can intervene somehow in our decisions. We can make better decisions—more self-controlled decisions, or more moral decisions.
 
The reason I've become interested in the neuroscience of decision making is because I have this sense that pulling apart the different moving parts of this process and looking under the hood will give us clues about where we might be able to intervene and shape our own decisions.

Simone Schnall: "Moral Intuitions, Replication, and the Scientific Study of Human Nature"

http://vimeo.com/106007484

In the end, it's about admissible evidence, and ultimately we need to hold all scientific evidence to the same high standard. Right now we're using a lower standard for the replications involving negative findings, when in fact this standard needs to be higher. To establish the absence of an effect is much more difficult than to establish its presence.
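A rough power calculation illustrates the asymmetry Schnall points to (the effect-size benchmarks below are my assumptions, not figures from her studies): a replication sized to detect a medium effect is far too small to justify the conclusion that an effect is absent.

# A minimal power sketch under assumed effect sizes (Cohen's d), not tied to any specific replication.
from statsmodels.stats.power import TTestIndPower

power = TTestIndPower()

# Per-group sample size to detect a medium effect (d = 0.5) with 80% power.
n_detect = power.solve_power(effect_size=0.5, alpha=0.05, power=0.8)

# To treat a null result as evidence of absence, the study should have been able to
# detect even a small effect (d = 0.2) with the same power, which takes far more subjects.
n_absence = power.solve_power(effect_size=0.2, alpha=0.05, power=0.8)

print(f"per-group n to detect d = 0.5: {n_detect:.0f}")      # roughly 64
print(f"per-group n to rule out d >= 0.2: {n_absence:.0f}")  # roughly 394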

The Paradox of Wu-Wei

Edward Slingerland
[5.2.14]

"One way to look at the trajectory of Chinese thought is that it's driven by this tension I call "the paradox of wu-wei." Wu-wei is effortless action or spontaneity. They all want you to be wu-wei, but none of them think you are right now. You've got to try to be wu-wei, but how do you try not to try? How do you try to be spontaneous? I call it the paradox of wu-wei, and I argue it's at the center of all their theorizing about other things. There are theories about human nature, there are theories about self-cultivation, there are theories about government. These are all ways of grappling with this central tension that's driving a lot of the theorizing."

EDWARD SLINGERLAND is Professor of Asian Studies and Canada Research Chair in Chinese Thought and Embodied Cognition at the University of British Columbia and the author of Trying Not to Try: The Art and Science of Spontaneity. Edward Slingerland's Edge Bio Page


THE PARADOX OF WU-WEI

My training was fairly traditional. I got degrees in sinology, the study of Chinese language, and religious studies. I finished my dissertation, which was a fairly traditional intellectual history of this concept of wu-wei, or effortless action, in early China, and it got accepted by Oxford University Press. I was supposed to clean it up and turn it in, and then everything started to go sideways. At my first job, at the University of Colorado, Boulder, I was about to turn in the manuscript when a graduate student in a class I was teaching handed me this book and said, "You might be interested in this." It was Lakoff and Johnson's Philosophy in the Flesh, which had just come out. This book blew my mind. It immediately solved all of these problems I had with what I was doing.

I had this problem where I was arguing with all these different stories and different texts and saying they're all about wu-wei, they're all about effortless action, but many of the stories don't use the term wu-wei. So how can I say they're really talking about the same concept if they're not using the word? My only solution at that point was just to put the stories side by side and go, "Eh?" Reading about metaphor theory changed everything. The basic argument that Lakoff and Johnson lay out is that we're not disembodied minds floating around somewhere. We are embodied creatures. A lot of our cognition is arising from our embodied interactions with the world, pre-linguistic interactions with the world. And so we build up these basic patterns: walking down a path, dealing with objects, dealing with containers that then structure our abstract thinking. A lot of even very abstract philosophical language is relying on very basic bodily experiences.

DISFLUENCY

A Conversation with
Adam Alter
[2.25.13]

We've shown that disfluency forms a cognitive roadblock, as I mentioned earlier, and then you think more deeply and work through the information more comprehensively. But the other thing it does is allow you to depart more from reality, from the reality you're in now.
 

Introduction

Adam Alter is interested in examining the concrete ways in which we are affected by subtle cues, such as symbols, culture, and colors. Why are Westerners easily fooled by the Müller-Lyer illusion of two lines with different arrows at their ends, while Bushmen from southern Africa are not? Why do certain colors have a calming effect on the intoxicated? Why is it that people with easy-to-pronounce names get ahead in life?

In this conversation, we get an overview of Alter's current line of work on how we experience fluent and disfluent information. Fluency implies that information comes at a very low cost, often because it is already familiar to us in some similar form. Disfluency occurs when information is costly, perhaps because it takes a lot of effort to understand a concept, or because a name is unfamiliar and therefore difficult to say. His work has interesting implications in the realms of market forces (stocks with pronounceable ticker codes tend to do better when they first enter the market than stocks with unpronounceable ones, for instance) and globalization, and is highly relevant in a world where cultures continue to meet and to merge.

Jennifer Jacquet

ADAM ALTER is an Assistant Professor of Marketing at Stern School of Business, NYU. He is the author of Drunk Tank Pink: And Other Unexpected Forces that Shape How We Think, Feel, and Behave. Adam Alter's Edge Bio Page 


[33:48 minutes]

[This conversation with Adam Alter was conducted in New York City by Edge Editor-at-Large, Jennifer Jacquet.]


DISFLUENCY

From the beginning of psychology, psychologists have been interested in two aspects of thinking. They've been interested in a lot of aspects, but one way of breaking thinking down is into these two. The first is to look at the content of our thoughts or our cognitions. What is the stuff that we're thinking about? That goes all the way back to the 1800s, with the introspectionists, who were interested in looking at what people were actually thinking about when they were introspecting. That's carried right through psychology from the 1800s to today.

Another aspect that's only been studied much more recently, which has formed the basis for a lot of my research now, is to look at not what people are thinking, but how that experience is for them. What is it like to think about these things? That topic is known as meta-cognition, and it's basically the idea that when we have any thoughts, not only do we have those thoughts, but there's an overlaid experience of how it feels to have those thoughts. Is it easy to generate those thoughts? Is it difficult to process them? Do we feel like we understand what we're thinking about well? Do we think we understand it poorly? I'll give you a couple of examples of this.


The Normal Well-Tempered Mind

Daniel C. Dennett
[1.8.13]

The vision of the brain as a computer, which I still champion, is changing so fast. The brain's a computer, but it's so different from any computer that you're used to. It's not like your desktop or your laptop at all, and it's not like your iPhone except in some ways. It's a much more interesting phenomenon. What Turing gave us for the first time (and without Turing you just couldn't do any of this) is a way of thinking about, in a disciplined way, and taking seriously, phenomena that have, as I like to say, trillions of moving parts. Until the late 20th century, nobody knew how to take seriously a machine with a trillion moving parts. It's just mind-boggling.


DANIEL C. DENNETT is University Professor, Professor of Philosophy, and Co-Director of the Center for Cognitive Studies at Tufts University. His books include Consciousness Explained; Darwin's Dangerous Idea; Kinds of Minds; Freedom Evolves; and Breaking the Spell. Daniel C. Dennett's Edge Bio Page


[50 minutes]

THE NORMAL WELL-TEMPERED MIND

http://vimeo.com/81869179

The vision of the brain as a computer, which I still champion, is changing so fast. The brain's a computer, but it's so different from any computer that you're used to. It's not like your desktop or your laptop at all, and it's not like your iPhone except in some ways. It's a much more interesting phenomenon. What Turing gave us for the first time (and without Turing you just couldn't do any of this) is a way of thinking about, in a disciplined way, and taking seriously, phenomena that have, as I like to say, trillions of moving parts.
