Quantum Hanky-Panky

Seth Lloyd [8.22.16]

Thinking about the future of quantum computing, I have no idea if we're going to have a quantum computer in every smartphone, or if we're going to have quantum apps, or "quapps," that would allow us to communicate securely and find funky stuff using our quantum computers; that's a tall order. It's very likely that we're going to have quantum microprocessors in our computers and smartphones that are performing specific tasks.

This is simply because that's where the technology inside our devices is heading anyway. If there are advantages to be had from quantum mechanics, then we'll take advantage of them, in the same way that energy moves around in a quantum mechanical way in photosynthesis. If there are advantages to be had from some quantum hanky-panky, then quantum hanky-panky it is.

SETH LLOYD, Professor, Quantum Mechanical Engineering, MIT; Principal Investigator, Research Laboratory of Electronics; Author, Programming the Universe. Seth Lloyd's Edge Bio Page

Right now, there's been a resurgence of interest in applying quantum mechanics and quantum information to questions of quantum gravity and what the fundamental theory of the universe actually is. It turns out that quantum information has a lot to offer people who are looking at problems like, for instance, what happens when you fall into a black hole? (By the way, my advice is don't do that if you can help it.) If you fall into a black hole, does any information about you ever escape from the black hole? These are questions that people like Stephen Hawking have been working on for decades, and quantum information has a lot to offer in answering them.

Summer Reading: Highlights From the Edge Archive

[7.18.16]

"Deliciously creative, the variety astonishes. Intellectual skyrockets of stunning brilliance. Nobody in the world is doing what Edge is doing...the greatest virtual research university in the world.
— Denis Dutton, Founding Editor, Arts & Letters Daily

[ED NOTE: It’s summer and a good time to reflect on twenty years of Edge. Each week through the rest of the season, we will revisit five highlights from the Edge archives worthy of your time and attention. — JB]


The question is, as scientists, can we take these ideas and do what we did in biology, at least based on networks and other ideas, and put this into a quantitative, mathematizable, predictive theory, so that we can understand the birth and death of companies and how that stimulates the economy?

[Continue...]


Dan Sperber came up with the argumentative theory because he was taking stock of what was happening in the world of psychology at large. A lot of people in psychology were accumulating evidence that the mind, and reasoning in particular, doesn't work so well. Reasoning produces a lot of mistakes. We are not very good at statistics, and we can't understand very basic logical problems.

We do all these irrational things, and despite mounting results, people are not really changing their basic assumption. They are not challenging the basic idea that reasoning is for individual purposes. The premise is that reasoning should help us make better decisions, get at better beliefs. And if you start from this premise, then it follows that reasoning should help us deal with logical problems and it should help us understand statistics. But reasoning doesn't do all these things, or it does all these things very, very poorly.
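To make concrete the kind of basic statistical problem the excerpt has in mind, here is a minimal sketch (my own illustration with made-up numbers, not an example from the original piece) of the classic base-rate problem, where unaided intuition usually lands far from what Bayes' rule actually gives:

```python
# Hypothetical illustration, not from the original article: the classic
# base-rate problem that most people answer badly when reasoning unaided.
# Suppose a disease affects 1 in 1,000 people, a test detects it 99% of
# the time, and the test gives a false positive 5% of the time. Given a
# positive result, how likely is it that the person is actually sick?

prior = 0.001          # P(disease)
sensitivity = 0.99     # P(positive | disease)
false_positive = 0.05  # P(positive | no disease)

# Total probability of a positive result, then Bayes' rule.
p_positive = sensitivity * prior + false_positive * (1 - prior)
posterior = sensitivity * prior / p_positive

print(f"P(disease | positive test) = {posterior:.3f}")  # about 0.019
```

Intuition often says something close to 99 percent; the arithmetic says roughly 2 percent, which is the kind of mistake the excerpt is pointing to.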

[Continue...]


Prediction is the very essence of science. We judge the correctness of a scientific theory by its ability to predict specific events. And from a more real-world, practical point of view, the primary purpose of science itself is to achieve a prediction capability that will give us some control over our lives and some protection from the environment around us. To avoid the dangers of the world, we must be able to predict where and especially when they will happen.

[Continue...]


Spinoza had argued that our capacity for reason is what makes each of us a thing of inestimable worth, demonstrably deserving of dignity and compassion. That each individual is worthy of ethical consideration is itself a discoverable law of nature, obviating the appeal to divine revelation. An idea that had caused outrage when Spinoza first proposed it in the 17th century, adding fire to the denunciation of him as a godless immoralist, had found its way into the minds of men who set out to create a government the likes of which had never before been seen on this earth.

Spinoza's dream of making us susceptible to the voice of reason might seem hopelessly quixotic at this moment, with religion-infested politics on the march. But imagine how much more impossible a dream it would have seemed on that day 350 years ago. And imagine, too, how much even sorrier our sorry world would have been without it.

[Continue...]


Brand first encountered systems-oriented ways of thinking at Stanford in a biology class taught by Paul Ehrlich. By the end of the decade, Ehrlich was famous for predicting in his book The Population Bomb (1968) that population growth would soon lead to ecological disaster. In the late 1950s, however, he was concentrating on the fundamentals of butterfly ecology and systems-oriented approaches to evolutionary biology. These preoccupations reflected the extraordinary influence of cybernetics and information theory on American biology following World War II. At the level of microbiology, information theory provided a new language with which to understand heredity. Under its influence, genes and sequences of DNA became information systems, bits of text to be read and decoded. In the 1950s, as Lily Kay has pointed out, microbiology became "a communication science, allied to cybernetics, information theory, and computers." Information theory also exerted a tremendous pull on biological studies of organisms and their interaction. Before World War II, biologists often focused on the study of individual organisms, hierarchical taxonomies of species, and the sexual division of labor. Afterward, many shifted toward the study of populations and the principles of natural selection in terms modeled on cybernetic theories of command and control.

[Continue...]


I believe that self-deception evolves in the service of deceit. That is, the major function of self-deception is to better deceive others. It both makes it harder for others to detect your deception and allows you to deceive with less immediate cognitive cost. So if I'm lying to you now about something you actually care about, you might pay attention to my shifty eyes if I'm consciously lying, or to the quality of my voice, or some other behavioral cue that's associated with conscious knowledge of deception and nervousness about being detected. But if I'm unaware of the fact that I'm lying to you, those avenues of detection will be unavailable to you.

[Continue...]


So, is a third culture possible, as defined by John Brockman, in which the natural sciences take part in making sense of ourselves and our actions?

[Continue...]


Researchers at UCLA found that cells in the human anterior cingulate, which normally fire when you poke the patient with a needle ("pain neurons"), will also fire when the patient watches another patient being poked. The mirror neurons, it would seem, dissolve the barrier between self and others. [1] I call them "empathy neurons" or "Dalai Lama neurons". (I wonder how the mirror neurons of a masochist or sadist would respond to another person being poked.) Dissolving the "self vs. other" barrier is the basis of many ethical systems, especially eastern philosophical and mystical traditions. This research implies that mirror neurons can be used to provide rational rather than religious grounds for ethics (although we must be careful not to commit the is/ought fallacy).

[Continue...]


There is no progress in evolution. The fact of evolutionary change through time doesn't represent progress as we know it. Progress isn't inevitable. Much of evolution is downward in terms of morphological complexity, rather than upward. We're not marching toward some greater thing.

[Continue...]


Silicon-based life and dust-based life are fiction and not fact. I use them as examples to illustrate an abstract argument. The examples are taken from science-fiction but the abstract argument is rigorous science. The abstract concepts are valid, whether or not the examples are real. The concepts are digital-life and analog-life. The concepts are based on a broad definition of life. For the purposes of this discussion, life is defined as a material system that can acquire, store, process, and use information to organize its activities. In this broad view, the essence of life is information, but information is not synonymous with life. To be alive, a system must not only hold information but process and use it. It is the active use of information, and not the passive storage, that constitutes life.
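To make the distinction between passive storage and active use of information concrete, here is a minimal sketch (my own illustration, not Dyson's; the class names and the thermostat-style example are assumptions) of a system that merely holds information next to one that uses it to organize its behavior:

```python
# Hypothetical illustration of the passive-storage vs. active-use distinction
# drawn in the excerpt; the names and the thermostat analogy are assumptions,
# not part of the original text.

class PassiveStore:
    """Holds information but never acts on it."""
    def __init__(self, setpoint):
        self.setpoint = setpoint  # stored, never used


class ActiveRegulator:
    """Acquires, stores, processes, and uses information to organize activity."""
    def __init__(self, setpoint):
        self.setpoint = setpoint

    def step(self, measured_temperature):
        # Acquire a measurement, compare it against stored information,
        # and act on the result: the "active use" the excerpt describes.
        if measured_temperature < self.setpoint:
            return "heater on"
        return "heater off"


regulator = ActiveRegulator(setpoint=20.0)
print(regulator.step(18.5))  # heater on
print(regulator.step(21.0))  # heater off
```

By the excerpt's broad definition, only the second system has the ingredients of life: it acquires information, compares it with what it has stored, and uses the result to organize what it does next.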

[Continue...]
