Edge 179 — April 7, 2006
(5,650 words)



EDGE WEB TRAFFIC REPORT

For the first three months of the year, Edge readership has grown at a rate that surprises even us. It's an indication that the third culture is alive and well and continues to grow. Thanks from the Edge community to our readers/subscribers for their interest, and to the many bloggers who continue to put out the word. For your interest, here is the Edge Web Traffic Report for the first three months of 2006:

1,436,699 Visitor Sessions — (Daily Average: 15,963)
5,122,517 Page Views — (Daily Average: 56,916)


THE TEMPLETON FOUNDATION: A SKEPTIC'S TAKE [4.07.06]
By John Horgan

I rationalized that taking the foundation's money did not mean that it had bought me, as long as I remained true to my views. Yes, I used the same justification as a congressman accepting a golf junket from the lobbyist Jack Abramoff. But I'd already written freelance pieces for two Templeton publications, so declining this more-lucrative gig seemed silly. In for a dime, in for a dollar.

[...continue]


Science will continue to surprise us with what it discovers and creates; then it will astound us by devising new methods to surprise us. At the core of science's self-modification is technology. New tools enable new structures of knowledge and new ways of discovery. The achievement of science is to know new things; the evolution of science is to know them in new ways. What evolves is less the body of what we know and more the nature of our knowing.

SPECULATIONS ON THE FUTURE OF SCIENCE
[4.7.06]
By Kevin Kelly



INTRODUCTION By Stewart Brand

Science, says Kevin Kelly, is the process of changing how we know things. It is the foundation of our culture and society. While civilizations come and go, science grows steadily onward. It does this by watching itself.

Recursion is the essence of science. For example, science papers cite other science papers, and that process of research pointing at itself invokes a whole higher level, the emergent shape of citation space. Recursion always does that. It is the engine of scientific progress and thus of the progress of society.

A particularly fruitful way to look at the history of science is to study how science itself has changed over time, with an eye to what that trajectory might suggest about the future. Kelly chronicled a sequence of new recursive devices in science...

[...continue]


EDGE QUBIT DINNER 2006
New York City — March 23, 2006

It is in the laws of how quantum systems register and process information that we are to find the measure of the universe. Men used to measure distance by the length between their elbow and outstretched fingers—the cubit. Now we measure distances by the information contained in the light emitted by two-level atoms—the quantum bit or qubit. — Seth Lloyd


(standing, from left:) Laura Chang, Editor, New York Times, Science Times; Steve Lohr, Technology Reporter, The New York Times; Seth Lloyd, Physicist, MIT, Programming the Universe; John Rennie, Editor, Scientific American; Jerry Adler, Science Reporter, Newsweek; JB; Tracy Day, Founder, New York Science Festival; John Horgan, science writer; Chris Anderson, TED Conferences; (seated, from left:) Brian Greene, Physicist, Columbia, The Fabric of the Cosmos; Adam Bly, Publisher, Seed; Bob Guccione, Jr., Publisher, Discover

Seth Lloyd flew down from Cambridge as the keynote speaker at the Edge Qubit dinner. He was supposed to talk about quantum search engine algorithms. But he forgot. Steve Lohr, technology correspondent at The New York Times, was there and noted:

"I've chatted with Seth before, of course, and even quoted him, but I've never talked to him at this length. He's a stitch. My personal favorite was his description of teaching quantum computing to first graders: 'Arms up, arms down.'"


HARVARD BOOK STORE PRESENTS...

Wednesday, April 12th, 6:30 PM
@ Longfellow Hall, Askwith Lecture Hall
13 Appian Way

Harper Perennial
$13.95


A discussion about Science in the Age of Certainty
with JOHN BROCKMAN, DANIEL C. DENNETT, DANIEL GILBERT, MARC D. HAUSER, ELIZABETH SPELKE & SETH LLOYD


We are excited to announce that on Wednesday, April 12th Harvard Book Store and Seed Magazine will cosponsor a discussion on Science in the Age of Certainty with John Brockman, Daniel C. Dennett, Daniel Gilbert, Marc D. Hauser, Elizabeth Spelke and Seth Lloyd. This event coincides with the publication of the new book What We Believe But Cannot Prove: Today's Leading Thinkers on Science in the Age of Certainty, edited by Mr. Brockman.

more details...


 


I rationalized that taking the foundation's money did not mean that it had bought me, as long as I remained true to my views. Yes, I used the same justification as a congressman accepting a golf junket from the lobbyist Jack Abramoff. But I'd already written freelance pieces for two Templeton publications, so declining this more-lucrative gig seemed silly. In for a dime, in for a dollar.

THE TEMPLETON FOUNDATION: A SKEPTIC'S TAKE [4.7.06]
by John Horgan

Introduction by John Brockman

In the previous edition of Edge, which reported on the celebration of the 30th anniversary of the publication of Richard Dawkins' The Selfish Gene, Ian McEwan noted the following:

"None of us, I think, in the mid-'70s, when The Selfish Gene was published, would have thought we'd be devoting so much mental space now to confront religion. We thought that matter had long been closed."*

But the matter is far from closed.

John Horgan, in his essay below, has something new to say on the subject as he explores what he considers to be troublesome aspects of the so-called "reconciliation of science and religion". He writes:

Since many Edgies, like me, have been beneficiaries in one way or the other of the Templeton Foundation, which promotes reconciliation of science and religion, I thought they might be interested in my critique of the foundation, which was just published by the Chronicle of Higher Education. It's already stirring up quite a ruckus.

Quite a few Edgies have been the beneficiaries of Templeton Foundation financial support, from $15,000 fees for attending a conference to the $1,500,000 Templeton Prize. It would be interesting to hear from some of these individuals in an Edge Reality Club discussion based on Horgan's essay.

JB

JOHN HORGAN is director of the Center for Science Writings at the Stevens Institute of Technology. He is the author of The End of Science; The Undiscovered Mind; and, most recently, Rational Mysticism: Dispatches From the Border Between Science and Spirituality.

John Horgan's Edge Bio Page


THE TEMPLETON FOUNDATION: A SKEPTIC'S TAKE

(JOHN HORGAN:)
A year ago, I faced an ethical dilemma. The John Templeton Foundation was inviting me to be one of the first batch of Templeton-Cambridge Journalism Fellows in Science and Religion. The 10 fellows were to spend several weeks at the University of Cambridge, listening to scientists and philosophers pontificate on topics related to science and religion. The fellowship not only sounded like fun, it also paid all expenses and threw in an extra $15,000 — a tempting sum for a freelancer, which I was at the time. On the other hand, as an agnostic increasingly disturbed by religion's influence on human affairs, I had misgivings about the foundation's agenda of reconciling religion and science.

So what did I do? I went to Cambridge, of course. I rationalized that taking the foundation's money did not mean that it had bought me, as long as I remained true to my views. Yes, I used the same justification as a congressman accepting a golf junket from the lobbyist Jack Abramoff. But I'd already written freelance pieces for two Templeton publications, so declining this more-lucrative gig seemed silly. In for a dime, in for a dollar.

Then in January, a journalist considering applying for a Templeton journalism fellowship called and asked me about my experience. I found myself trying a bit too hard to justify my acceptance of the fellowship, even as I told the journalist how much I'd enjoyed it. I decided to write this essay to exorcise my lingering guilt, and perhaps to help others wondering whether to join the large and fast-growing list of Templeton donees, which includes many of the world's leading scientists and institutions.

A devout Presbyterian born and raised in Tennessee, John M. Templeton launched the extremely successful Templeton mutual funds in the 1950s and became a billionaire. He started spending serious money to promote his religious values in 1972, when he established the Templeton Prize for Progress Toward Research or Discoveries About Spiritual Realities. The prize, which Templeton stipulated should exceed the Nobel Prize in monetary value, now totals almost $1.5-million and is awarded in Buckingham Palace. Previous winners include Mother Teresa, Billy Graham, Aleksandr Solzhenitsyn, and Charles W. Colson, the born-again Watergate convict. Over the past 20 years, most of the winners have been scientists who see inklings of the divine in nature, including Paul Davies, Freeman J. Dyson, John C. Polkinghorne, and Charles Hard Townes. This year's winner, John D. Barrow, a cosmologist at the University of Cambridge, continues in that vein.

Knighted by Queen Elizabeth in 1987, Templeton established the Templeton Foundation that same year to support a broad range of activities aimed at finding common ground between science and religion. So far the foundation has spent more than $250-million on prizes, academic programs, publications, broadcasts, lectures, conferences, and research on topics such as the neurobiology and genetics of religious belief; the evolutionary origins of altruism; and the medical benefits of prayer, church attendance, and forgiveness.

Sir John recently added $550-million to the foundation, boosting its endowment to $1.1-billion. Foundation officials plan to double their annual outlays, which now total $60-million for more than 300 projects.

By all accounts, Sir John is a charming, open-minded man, who emphasizes the importance of humility in all spheres of life. But at 93, he has yielded the day-to-day leadership of the foundation to his son John Jr., a pediatric surgeon who quit his practice in 1995 to become the organization's president. An evangelical Christian, "Jack" is the chairman of Let Freedom Ring Inc., which raises funds for conservative causes. He has reportedly contributed to both presidential campaigns of George Bush, whose relations with the scientific community are arguably the worst of any president in history.

Nevertheless, the nation's leading scientific organization — the American Association for the Advancement of Science — and scores of research universities are Templeton Foundation beneficiaries. Largely as a result of Templeton grants, some 90 American medical schools now offer courses on links between health and spirituality. Templeton funds have even trickled down to atheists like the physicist Steven Weinberg, who once proclaimed at a 1999 AAAS conference sponsored by the foundation, "I am all in favor of a dialogue between science and religion, but not a constructive dialogue."

Weinberg has not held his tongue as a result of pocketing Templeton cash, but other recipients have. In March 2003, I attended a Templeton-sponsored conference at Stanford University titled "Becoming Human: Brain, Mind, and Emergence" (my expenses were paid not by the foundation but by a magazine). The meeting was supposed to be a dialogue between neuroscientists, such as V.S. Ramachandran, Robert M. Sapolsky, and Antonio R. Damasio, and religious figures, including the theologian Nancey Murphy and the Australian archbishop George Pell. But the dialogue was nominal; each side listened politely to the other's presentations without really commenting on them. Several areligious scientists told me privately that they did not want to challenge the beliefs of religious speakers for fear of offending them and the Templeton hosts.

At least one scientist has publicly refused to accept money from the foundation. Sean M. Carroll, a physicist at the University of Chicago, declined an invitation to speak at a Templeton-sponsored conference held last fall, which featured 16 Nobel laureates and was endorsed by the American Physical Society. Carroll explained in his blog that "the entire purpose of the Templeton Foundation is to blur the line between straightforward science and explicitly religious activity, making it seem like the two enterprises are part of one big undertaking." An atheist, Carroll did not want his name to be "implicitly associated with an effort I find to be woefully misguided." Yet Carroll admitted that he had been tempted by the foundation's offer of a $2,000 honorarium.

Two years ago I faced a similar temptation, when an editor for a Templeton journal asked me to write an essay. Before accepting the assignment — which seemed reasonably interesting and, more important, paid well — I revealed my reservations about the foundation's religious agenda. The editor turned out to be an agnostic who shared my reservations, particularly about the leadership of Jack Templeton; money persuaded me and the editor to swallow our misgivings. I have now written three articles for Templeton publications, for a total of $8,800.

My ambivalence about the foundation came to a head during my fellowship in Cambridge last summer. The British biologist Richard Dawkins, whose participation in the meeting helped convince me and other fellows of its legitimacy, was the only speaker who denounced religious beliefs as incompatible with science, irrational, and harmful. The other speakers — three agnostics, one Jew, a deist, and 12 Christians (a Muslim philosopher canceled at the last minute) — offered a perspective clearly skewed in favor of religion and Christianity.

Some of the Christian speakers' views struck me as inconsistent, to say the least. None of them supported intelligent design, the notion that life is in certain respects irreducibly complex and hence must have a divine origin, and several of them denounced it. Simon Conway Morris, a biologist at Cambridge and an adviser to the Templeton Foundation, ridiculed intelligent design as nonsense that no respectable biologist could accept. That stance echoes the view of the foundation, which over the last year has taken pains to distance itself from the American intelligent-design movement.

And yet Morris, a Catholic, revealed in response to questions that he believes Christ was a supernatural figure who performed miracles and was resurrected after his death. Other Templeton speakers also rejected intelligent design while espousing beliefs at least as lacking in scientific substance.

The Templeton prize-winners John Polkinghorne and John Barrow argued that the laws of physics seem fine-tuned to allow for the existence of human beings, which is the physics version of intelligent design. The physicist F. Russell Stannard, a member of the Templeton Foundation Board of Trustees, contended that prayers can heal the sick — not through the placebo effect, which is an established fact, but through the intercession of God. In fact the foundation has supported studies of the effectiveness of so-called intercessory prayer, which have been inconclusive.

One Templeton official made what I felt were inappropriate remarks about the foundation's expectations of us fellows. She told us that the meeting cost more than $1-million, and in return the foundation wanted us to publish articles touching on science and religion. But when I told her one evening at dinner that — given all the problems caused by religion throughout human history — I didn't want science and religion to be reconciled, and that I hoped humanity would eventually outgrow religion, she replied that she didn't think someone with those opinions should have accepted a fellowship. So much for an open exchange of views.

Still I can't regret spending three weeks in Cambridge classrooms, pubs, and punts, jawing with brainy folks about the meaning of life. The highlight for me was getting to know the nine other fellows, who represented such big-time media as National Public Radio, ABC News, the BBC, The New York Times, and The Washington Post (I was the only freelancer there). About half were believers, and half were skeptics like me.

My conversations with the faithful deepened my appreciation of why some intelligent, well-educated people embrace religion. One reporter discussed the experience of speaking in tongues, and another described having an intimate relationship with Jesus. My convictions did not change, but others' did. At least one fellow said that his faith was wavering as a result of Dawkins's dissection of religion. And if the Templeton Foundation can help bring about even such a tiny step toward my vision of a world without religion, how bad can it be?

The foundation recently named 12 recipients of its 2006 journalism fellowship, and I suspect that some of the new fellows have doubts about jumping on the Templeton bandwagon. The foundation could assuage the misgivings of those and other grantees with a few simple acts.

First, the foundation should state clearly that it is not committed to any particular conclusion of the science-religion dialogue, and that one possible conclusion is that religion — at least in its traditional, supernatural manifestations — is not compatible with science. To demonstrate its open-mindedness, the foundation should award the Templeton Prize to an opponent of religion, such as Steven Weinberg or Richard Dawkins. At the very least, the foundation should post this essay on its Web site.

[Originally published by The Chronicle of Higher Education on April 7, 2006.]


 

[ED. NOTE: As part of the activities of the Long Now Foundation, Stewart Brand has organized a series of seminars which are held at Fort Mason in San Francisco. "The purpose of the series", Brand writes, "is to build a coherent, compelling body of ideas about long-term thinking, to help nudge civilization toward Long Now's goal of making long-term thinking automatic and common instead of difficult and rare."

Speakers in the series so far include a number of Edgies: Brian Eno, Jared Diamond, George Dyson, Kevin Kelly, Clay Shirky, and Bruce Sterling. All seminars are archived and freely downloadable.

The following Edge feature is based on Kevin Kelly's March 10th talk on "The Next 100 Years of Science: Long-term Trends in the Scientific Method." — JB ]


SPECULATIONS ON THE FUTURE OF SCIENCE [4.7.06]
By Kevin Kelly

Science will continue to surprise us with what it discovers and creates; then it will astound us by devising new methods to surprise us. At the core of science's self-modification is technology. New tools enable new structures of knowledge and new ways of discovery. The achievement of science is to know new things; the evolution of science is to know them in new ways. What evolves is less the body of what we know and more the nature of our knowing.

Introduction by Stewart Brand

Science, says Kevin Kelly, is the process of changing how we know things. It is the foundation of our culture and society. While civilizations come and go, science grows steadily onward. It does this by watching itself.

Recursion is the essence of science.  For example, science papers cite other science papers, and that process of research pointing at itself invokes a whole higher level, the emergent shape of citation space.  Recursion always does that.  It is the engine of scientific progress and thus of the progress of society.

A particularly fruitful way to look at the history of science is to study how science itself has changed over time, with an eye to what that trajectory might suggest about the future.  Kelly chronicled a sequence of new recursive devices in science...

2000 BC — First text indexes
200 BC — Cataloged library (at Alexandria)
1000 AD — Collaborative encyclopedia
1590 — Controlled experiment (Francis Bacon)
1600 — Laboratory
1609 — Telescopes and microscopes
1650 — Society of experts
1665 — Repeatability (Robert Boyle)
1665 — Scholarly journals
1675 — Peer review
1687 — Hypothesis/prediction (Isaac Newton)
1926 — Randomized design (Ronald Fisher)
1934 — Falsifiability (Karl Popper)
1937 — Controlled placebo
1946 — Computer simulation
1950 — Double blind experiment
1962 — Study of scientific method (Thomas Kuhn)

Projecting forward, Kelly had five things to say about the next 100 years in science...

1)  There will be more change in the next 50 years of science than in the last 400 years.

2)  This will be a century of biology.  It is the domain with the most scientists, the most new results, the most economic value, the most ethical importance, and the most to learn.

3)  Computers will keep leading to new ways of science.  Information is growing by 66% per year while physical production grows by only 7% per year.  The data volume is growing to such levels of  "zillionics" that we can expect science to compile vast combinatorial libraries, to run combinatorial sweeps through possibility space (as Stephen Wolfram has done with cellular automata), and to run multiple competing hypotheses in a matrix.  Deep realtime simulations and hypothesis search will drive data collection in the real world.

4)  New ways of knowing will emerge.  "Wikiscience" is leading to perpetually refined papers with a thousand authors.  Distributed instrumentation and experiment, thanks to minuscule transaction cost, will yield smart-mob, hive-mind science operating "fast, cheap, & out of control."  Negative results will have positive value (there is already a "Journal of Negative Results in Biomedicine"). Triple-blind experiments will emerge through massive non-invasive statistical data collection — no one, neither the subjects nor the experimenters, will realize an experiment was going on until later. (In the Q&A, one questioner predicted the coming of the zero-author paper, generated wholly by computers.)

5)  Science will create new levels of meaning.  The Internet already is made of one quintillion transistors, a trillion links, a million emails per second, 20 exabytes of memory.  It is approaching the level of the human brain and is doubling every year, while the brain is not.  It is all becoming effectively one machine.  And we are the machine.

"Science is the way we surprise God," said Kelly.  "That's what we're here for."  Our moral obligation is to generate possibilities, to discover the infinite ways, however complex and high-dimension, to play the infinite game.  It will take all possible species of intelligence in order for the universe to understand itself. Science, in this way, is holy.  It is a divine trip.

Stewart Brand

KEVIN KELLY helped launch Wired magazine in 1993, and served as its Executive Editor until January 1999. He is now Editor-At-Large for Wired. From 1984 to 1990, Kelly was publisher and editor of the Whole Earth Review. In the late 80s, Kelly conceived and oversaw the publication of four versions of the Whole Earth Catalogs. He was a founding board member of the WELL.

Kelly is the author of Out of Control and New Rules for the New Economy, and his writing has appeared in many national and international publications such as the New York Times, The Economist, Time, Harpers, Science, GQ, and Esquire.

Kevin Kelly's Edge Bio page


SPECULATIONS ON THE FUTURE OF SCIENCE

(KEVIN KELLY:) Science will continue to surprise us with what it discovers and creates; then it will astound us by devising new methods to surprise us. At the core of science's self-modification is technology. New tools enable new structures of knowledge and new ways of discovery. The achievement of science is to know new things; the evolution of science is to know them in new ways. What evolves is less the body of what we know and more the nature of our knowing.

Technology is, in its essence, new ways of thinking. The most powerful type of technology, sometimes called enabling technology, is a thought incarnate which enables new knowledge to find and develop new ways to know. This kind of recursive bootstrapping is how science evolves. As in every type of knowledge, it accrues layers of self-reference to its former state.

New informational organizations are layered upon the old without displacement, just as in biological evolution. Our brains are good examples. We retain reptilian reflexes deep in our minds (fight or flight) while the more complex structuring of knowledge (how to do statistics) is layered over those primitive networks. In the same way, older methods of knowing (older scientific methods) are not jettisoned; they are simply subsumed by new levels of order and complexity. But the new tools of observation and measurement, and the new technologies of knowing, will alter the character of science, even while it retains the old methods.

I'm willing to bet the scientific method 400 years from now will differ from today's understanding of science more than today's scientific method differs from the proto-science used 400 years ago. A sensible forecast of technological innovations in the next 400 years is beyond our imaginations (or at least mine), but we can fruitfully envision technological changes that might occur in the next 50 years.

Based on the suggestions of the observers above, and my own active imagination, I offer the following as possible near-term advances in the evolution of the scientific method.

Compiled Negative Results — Negative results are saved, shared, compiled and analyzed, instead of being dumped. Positive results may increase their credibility when linked to negative results. We already have hints of this in the recent decision of biomedical journals to require investigators to register early phase 1 clinical trials. Usually phase 1 trials of a drug end in failure and their negative results are not reported. As a public health measure, these negative results should be shared. Major journals have pledged not to publish the findings of phase 3 trials if their earlier phase 1 results were not reported, whether negative or not.
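The journals' pledge described above can be sketched as a toy registry. This is an illustrative sketch only; the trial identifier, the phase numbers, and the `register`/`acceptable_for_publication` helpers are invented for the example and do not reflect any real registry's interface.

```python
# A minimal trial registry in the spirit of the journals' rule:
# a late-phase report is accepted only if every earlier phase's
# result, negative or not, is already on file.
registry = {}

def register(trial_id, phase, result):
    """File a result (e.g. 'negative') for one phase of a trial."""
    registry.setdefault(trial_id, {})[phase] = result

def acceptable_for_publication(trial_id, phase):
    """A phase-N paper is publishable only if phases 1..N-1 have
    registered results in the compilation."""
    earlier = registry.get(trial_id, {})
    return all(p in earlier for p in range(1, phase))

register("NCT-001", 1, "negative")   # hypothetical trial identifier
register("NCT-001", 2, "positive")
```

The point of the sketch is only the gating rule: the negative phase 1 result is preserved and linked, rather than dumped, and the later positive result inherits credibility from it.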

Triple Blind Experiments – In a double blind experiment neither researcher nor subject is aware of the controls, but both are aware of the experiment. In a triple blind experiment all participants are blind to the controls and to the very fact of the experiment itself. This way of science depends on cheap, non-invasive sensors running continuously for years, generating immense streams of data. While ordinary life continues for the subjects, massive amounts of constant data about their lifestyles are drawn and archived. Out of this huge database, specific controls, measurements and variables can be "isolated" afterwards. For instance, the vital signs and lifestyle metrics of a hundred thousand people might be recorded in dozens of different ways for 20 years, and then later analysis could find certain variables (smoking habits, heart conditions) and certain ways of measuring that would permit the entire 20 years to be viewed as an experiment – one that no one knew was even going on at the time. This post-hoc analysis depends on the pattern recognition abilities of supercomputers. It removes one more variable (knowledge of the experiment) and permits greater freedom in devising experiments from the indiscriminate data.
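The post-hoc analysis described above can be suggested in a few lines: cohort and control are carved out of an archive of passively collected records only at analysis time. The record fields, the sample values, and the simple difference-of-means contrast below are hypothetical stand-ins for real sensor streams and real statistics.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Record:
    subject_id: int
    smoker: bool        # lifestyle variable, logged passively
    resting_hr: float   # vital sign from an always-on sensor

def post_hoc_contrast(records, predicate):
    """Define the 'experiment' after the fact: split the archive
    into two cohorts by a predicate chosen later, and compare
    group means."""
    treated = [r.resting_hr for r in records if predicate(r)]
    control = [r.resting_hr for r in records if not predicate(r)]
    return mean(treated) - mean(control)

archive = [Record(1, True, 78.0), Record(2, False, 66.0),
           Record(3, True, 74.0), Record(4, False, 70.0)]
effect = post_hoc_contrast(archive, lambda r: r.smoker)
```

No subject could have known which arm of the experiment they were in, because the experiment did not exist until the predicate was chosen.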


Combinatorial Sweep Exploration – Much of the unknown can be explored by systematically creating random varieties of it at a large scale. You can explore the composition of ceramics (or thin films, or rare-earth conductors) by creating all possible types of ceramic (or thin films, or rare-earth conductors), and then testing them in their millions. You can explore certain realms of proteins by generating all possible variations of that type of protein and then seeing if they bind to a desired disease-specific site. You can discover new algorithms by automatically generating all possible programs and then running them against the desired problem. Indeed all possible Xs of almost any sort can be summoned and examined as a way to study X. None of this combinatorial exploration was even thinkable before robotics and computers; now both of these technologies permit this brute-force style of science. The parameters of the emergent "library" of possibilities yielded by the sweep become the experiment. With sufficient computational power, together with a pool of proper primitive parts, vast territories unknown to science can be probed in this manner.
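As a minimal illustration of such a sweep, the sketch below enumerates a small grid of three-component "compositions" and tests every one. The `conductivity` figure of merit is invented for the example; in a real sweep that line would be a robotic synthesis-and-measurement step on an actual sample.

```python
from itertools import product

# Hypothetical figure of merit for a composition (a, b, c).
def conductivity(a, b, c):
    return -(a - 0.3) ** 2 - (b - 0.5) ** 2 - (c - 0.2) ** 2

fractions = [i / 10 for i in range(11)]          # 0.0, 0.1, ..., 1.0
# Enumerate every three-component composition summing to 1 ...
library = [(a, b, round(1 - a - b, 10))
           for a, b in product(fractions, repeat=2) if a + b <= 1]
# ... and test all of them: the library itself is the experiment.
best = max(library, key=lambda comp: conductivity(*comp))
```

The grid here has only dozens of entries; the same loop, handed to robots, runs over millions.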


Evolutionary Search – A combinatorial exploration can be taken even further. If new libraries of variations can be derived from the best of a previous generation of good results, it is possible to evolve solutions. The best results are mutated and bred toward better results. The best-testing protein is mutated randomly in thousands of ways, and the best of that bunch is kept and mutated further, until a lineage of proteins, each one more suited to the task than its ancestors, finally leads to one that works perfectly. This method can be applied to computer programs and even to the generation of better hypotheses.
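A bare-bones version of this mutate-and-keep-the-best loop, with a short letter sequence standing in for a protein; the target, the alphabet, and the match-counting fitness function are all hypothetical.

```python
import random

random.seed(0)                        # make the sketch reproducible
TARGET = "MKVLAT"                     # stand-in for a desired binder
ALPHABET = "ACDEFGHIKLMNPQRSTVWY"     # the 20 amino-acid letters

def fitness(seq):
    # Matches to the (hypothetical) ideal sequence.
    return sum(a == b for a, b in zip(seq, TARGET))

def mutate(seq):
    # Random single-point mutation.
    i = random.randrange(len(seq))
    return seq[:i] + random.choice(ALPHABET) + seq[i + 1:]

def evolve(generations=500, brood_size=50):
    best = "".join(random.choice(ALPHABET) for _ in TARGET)
    for _ in range(generations):
        # Derive a library of variants from the current best ...
        brood = [mutate(best) for _ in range(brood_size)]
        # ... and keep the fittest of parent and offspring.
        best = max(brood + [best], key=fitness)
        if fitness(best) == len(TARGET):
            break
    return best

champion = evolve()
```

Each generation is a small combinatorial library seeded by the previous winner, which is exactly the lineage-building the paragraph describes.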


Multiple Hypothesis Matrix – Instead of proposing a series of single hypotheses, in which each hypothesis is falsified and discarded until one theory finally passes and is verified, a matrix of many hypothesis scenarios is proposed and managed simultaneously. An experiment travels through the matrix of multiple hypotheses, some of which are partially right and partially wrong. Veracity is statistical; more than one thesis is permitted to stand with partial results. Just as data are assigned a margin of error, so too will hypotheses be. An explanation may be stated as: 20% is explained by one theory, 35% by another, and 65% by a third. A matrix also permits experiments with more variables and more complexity than before.
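One way such statistical veracity could work is sketched below: several rival hypotheses are scored against the same data and given normalized weights from their likelihoods, so partially right theories stand together rather than being falsified one at a time. The three toy hypotheses, the data, and the assumed Gaussian error model (sigma = 1) are all invented for illustration.

```python
import math

# Three rival (toy) hypotheses about the same phenomenon.
hypotheses = {
    "linear":    lambda x: 2.0 * x,
    "quadratic": lambda x: x * x,
    "constant":  lambda x: 3.0,
}

data = [(1, 2.1), (2, 4.2), (3, 5.9)]   # invented (x, y) observations

def log_likelihood(predict, sigma=1.0):
    # Gaussian error model: penalize squared residuals.
    return sum(-0.5 * ((y - predict(x)) / sigma) ** 2 for x, y in data)

logliks = {name: log_likelihood(h) for name, h in hypotheses.items()}
top = max(logliks.values())
raw = {name: math.exp(ll - top) for name, ll in logliks.items()}
total = sum(raw.values())
weights = {name: w / total for name, w in raw.items()}   # sum to 1
```

No hypothesis is discarded outright; each simply holds a share of the explanation, and new data redistributes the shares.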

Pattern Augmentation – Pattern-seeking software will recognize patterns in noisy results. In large bodies of information with many variables, algorithmic discovery of patterns will become necessary and common. Such tools already exist in specialized niches of knowledge (such as particle smashing), but more general rules and general-purpose pattern engines will enable pattern-seeking tools to become part of all data treatment.
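A deliberately minimal pattern-seeking pass, far simpler than the general-purpose engines anticipated above: it flags points in a noisy trace that stand out from both neighbors and exceed a noise floor. The trace values and threshold are hypothetical detector counts.

```python
def find_peaks(signal, threshold):
    """Flag indices that exceed the noise threshold and rise above
    both immediate neighbors -- a toy 'pattern' detector."""
    return [i for i in range(1, len(signal) - 1)
            if signal[i] > threshold
            and signal[i] > signal[i - 1]
            and signal[i] > signal[i + 1]]

# A noisy trace with two buried events (hypothetical counts).
trace = [0.2, 0.1, 0.4, 3.1, 0.3, 0.2, 0.1, 2.8, 0.4, 0.2]
peaks = find_peaks(trace, threshold=1.0)
```

Real pattern engines would learn what counts as an event; here the rule is hand-written, but the role in the data pipeline is the same.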

Adaptive Real Time Experiments – Results evaluated, and large-scale experiments modified in real time. What we have now is primarily batch-mode science. Traditionally, the experiment starts, the results are collected, and then conclusions are reached. After a pause the next experiment is designed in response, and then launched. In adaptive experiments, the analysis happens in parallel with collection, and the intent and design of the test is shifted on the fly. Some medical tests are already stopped or re-evaluated on the basis of early findings; this approach would extend that practice to other realms. Proper methods would be needed to keep the adaptive experiment objective.
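The stopping logic of an adaptive trial can be sketched in miniature: analysis runs in parallel with collection, and enrollment halts the moment an evidence threshold is crossed. The margin rule below is a crude illustrative stand-in for a proper sequential test, and the outcome stream is invented.

```python
def adaptive_trial(outcomes, stop_margin=4):
    """Analyze in parallel with collection: after each new outcome
    (+1 success, -1 failure), apply a stopping rule instead of
    waiting for the full batch."""
    score = 0
    for n, outcome in enumerate(outcomes, start=1):
        score += outcome
        if abs(score) >= stop_margin:    # evidence threshold crossed
            return ("stopped early", n, score)
    return ("ran to completion", len(outcomes), score)

# A run where successes accumulate fast enough to halt enrollment early.
result = adaptive_trial([+1, +1, -1, +1, +1, +1, +1, -1, +1, +1])
```

Keeping such a rule objective is exactly the hard part: the stopping margin must be fixed before the data arrive, or the early peeks bias the result.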

AI Proofs – Artificial intelligence will derive and check the logic of an experiment. Ever more sophisticated and complicated science experiments become ever more difficult to judge. Artificial expert systems will at first evaluate the scientific logic of a paper to ensure the architecture of the argument is valid. They will also ensure the paper includes the required types of data. This "proof review" will augment the peer review of editors and reviewers. Over time, as the protocols for an AI check become standard, AI can score papers and proposals for experiments for certain consistencies and structure. This metric can then be used to categorize experiments, to suggest improvements and further research, and to facilitate comparisons and meta-analysis. A better way to inspect, measure and grade the structure of experiments would also help develop better kinds of experiments.
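As a toy version of such a structural score, far short of real logic-checking: grade a submission on whether the required elements of the argument, plus supporting data, are present. The required-section list and the paper format are invented for illustration.

```python
# Toy structural check in the spirit of an automated "proof review."
REQUIRED_SECTIONS = {"hypothesis", "methods", "data", "analysis", "conclusion"}

def proof_review(paper):
    """Score a submission on the completeness of its argument
    architecture and the presence of supporting datasets."""
    sections = set(paper.get("sections", []))
    missing = sorted(REQUIRED_SECTIONS - sections)
    has_data = bool(paper.get("datasets"))
    score = (len(REQUIRED_SECTIONS) - len(missing) + has_data) \
            / (len(REQUIRED_SECTIONS) + 1)
    return {"missing": missing, "score": score}

paper = {"sections": ["hypothesis", "methods", "analysis", "conclusion"],
         "datasets": ["raw_counts.csv"]}
review = proof_review(paper)
```

A real proof review would check the inferential steps, not just the table of contents, but even this crude score yields the kind of comparable metric the paragraph envisions.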


Wiki-Science – The average number of authors per paper continues to rise. With massive collaborations, the numbers will boom. Experiments involving thousands of investigators collaborating on a "paper" will be commonplace. The paper is ongoing, and never finished. It becomes a trail of edits and experiments posted in real time — an ever-evolving "document." Contributions are not assigned. Tools for tracking credit and contributions will be vital. Responsibilities for errors will be hard to pin down. Wiki-science will often be the first word on a new area. Some researchers will specialize in refining ideas first proposed by wiki-science.
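
The "trail of edits" with credit tracking can be sketched as a minimal data structure. The class and its fields are hypothetical, just to make the idea concrete:

```python
# A minimal sketch of an ever-evolving "paper": the document is its edit
# history, and per-contributor credit is tallied directly from that trail.

class WikiPaper:
    def __init__(self):
        self.history = []          # (author, text) edits, in order

    def edit(self, author, text):
        self.history.append((author, text))

    def current_text(self):
        return self.history[-1][1] if self.history else ""

    def credit(self):
        """Tally how many edits each contributor has made."""
        counts = {}
        for author, _ in self.history:
            counts[author] = counts.get(author, 0) + 1
        return counts

paper = WikiPaper()
paper.edit("alice", "Draft of results")
paper.edit("bob", "Draft of results, with error bars")
paper.edit("alice", "Final revision")
print(paper.credit())  # {'alice': 2, 'bob': 1}
```

Even this toy version shows why responsibility is hard to pin down: the "current" paper is just the newest layer of many overlapping contributions.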

Defined Benefit Funding — Ordinarily science is funded by the experiment (results not guaranteed) or by the investigator (nothing guaranteed). The use of prize money for particular scientific achievements will play a greater role. A goal is defined, funding is secured for the first to reach it, and the contest is opened to all. A Turing Test prize, for example, would be awarded to the first computer to pass the Turing Test as a passable intelligence. Defined Benefit Funding can also be combined with prediction markets, which set up a marketplace of bets on possible innovations. The bet winnings can encourage funding of specific technologies.

Zillionics – Ubiquitous always-on sensors in bodies and environment will transform medical, environmental, and space sciences. Unrelenting rivers of sensory data will flow day and night from zillions of sources. The exploding number of new, cheap, wireless, and novel sensing tools will require new types of programs to distill, index and archive this ocean of data, as well as to find meaningful signals in it. The field of "zillionics" — dealing with zillions of data flows — will be essential in health, natural sciences, and astronomy. This trend will require further innovations in statistics, math, visualizations, and computer science. More is different. Zillionics requires a new scientific perspective in terms of permissible errors, numbers of unknowns, probable causes, repeatability, and significant signals.
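
Distilling an unrelenting stream can be sketched as keeping only running summaries while flagging readings that stand far from the running mean. The threshold and data are illustrative, not a real zillionics pipeline:

```python
# A sketch of stream distillation: never store the raw river of readings,
# only a count, a running total, and the readings flagged as anomalous
# (here, anything several times the running mean).

def stream_summary(stream, anomaly_factor=3.0):
    count, total = 0, 0.0
    anomalies = []
    for reading in stream:
        if count > 0 and reading > anomaly_factor * (total / count):
            anomalies.append(reading)
        count += 1
        total += reading
    return {"n": count, "mean": total / count, "anomalies": anomalies}

print(stream_summary([1.0, 1.2, 0.9, 1.1, 9.0, 1.0]))
```

The point is the shape of the computation: constant memory per sensor, so zillions of flows can be summarized without archiving every reading.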


Deep Simulations – As our knowledge of complex systems advances, we can construct more complex simulations of them. Both the successes and failures of these simulations will help us to acquire more knowledge of the systems. Developing a robust simulation will become a fundamental part of science in every field. Indeed the science of making viable simulations will become its own specialty, with a set of best practices, and an emerging theory of simulations. And just as we now expect a hypothesis to be subjected to the discipline of being stated in mathematical equations, in the future we will expect all hypotheses to be exercised in a simulation. There will also be the craft of taking things known only in simulation and testing them in other simulations — sort of a simulation of a simulation.
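
"Exercising a hypothesis in a simulation" can be illustrated with the smallest possible case: a hypothesis stated as an equation (logistic growth) is iterated as an executable model and its prediction checked. All parameters are invented:

```python
# A minimal simulation sketch: the hypothesis "growth slows as population
# nears capacity" is run as a discrete model rather than only stated as
# an equation; the trajectory is the testable output.

def simulate_logistic(population, rate, capacity, steps):
    """Iterate discrete logistic growth and return the full trajectory."""
    trajectory = [population]
    for _ in range(steps):
        population += rate * population * (1 - population / capacity)
        trajectory.append(population)
    return trajectory

traj = simulate_logistic(population=10.0, rate=0.5, capacity=100.0, steps=50)
print(round(traj[-1], 2))  # the simulated population settles near capacity
```

The hypothesis survives or fails according to whether the simulated trajectory matches what the system actually does.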

Hyper-analysis Mapping – Just as meta-analysis gathered diverse experiments on one subject and integrated their (sometimes contradictory) results into a large meta-view, hyper-analysis creates an extremely large-scale view by pulling together meta-analyses. The cross-links of references, assumptions, evidence and results are unraveled by computation, and then reviewed at a larger scale which may include data and studies adjacent but not core to the subject. Hyper-mapping tallies not only what is known in a particular wide field, but also emphasizes unknowns and contradictions based on what is known outside that field. It is used to integrate a meta-analysis with other meta-results, and to spotlight "white spaces" where additional research would be most productive.
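
The "white spaces" idea can be sketched as merging several meta-analyses and listing related topics that no study covers. The topic names and study identifiers below are invented for illustration:

```python
# A sketch of hyper-analysis: merge several meta-analyses (each mapping a
# topic to its set of studies) and spotlight "white spaces", i.e. topics
# referenced as related but with no studies of their own.

def merge_meta_analyses(meta_analyses):
    merged = {}
    for analysis in meta_analyses:
        for topic, studies in analysis.items():
            merged.setdefault(topic, set()).update(studies)
    return merged

def white_spaces(merged, related_topics):
    return sorted(t for t in related_topics if not merged.get(t))

merged = merge_meta_analyses([
    {"sleep": {"s1", "s2"}, "memory": {"s3"}},
    {"memory": {"s4"}, "diet": {"s5"}},
])
print(white_spaces(merged, ["sleep", "memory", "exercise"]))  # ['exercise']
```

The uncovered topic is exactly where the map says additional research would be most productive.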


Return of the Subjective – Science came into its own when it managed to refuse the subjective and embrace the objective. The repeatability of an experiment by another, perhaps less enthusiastic, observer was instrumental in keeping science rational. But as science plunges into the outer limits of scale – at the largest and smallest ends – and confronts the weirdness of the fundamental principles of matter/energy/information such as that inherent in quantum effects, it may not be able to ignore the role of the observer. Existence seems to be a paradox of self-causality, and any science exploring the origins of existence will eventually have to embrace the subjective, without becoming irrational. The tools for managing paradox are still undeveloped.

