"What I would like to argue is that we should stop using the idea of big data as a big rubric covering all these practices within businesses, like Google, that don't really have the structure to close the empirical loop and determine what part of their success is based on scientifically replicable and testable analytic results, versus science, where that's really all we care about. Science is never, in my opinion, going to become automatic, and it's very rarely easy."
"Computers and networks finally offer us the ability to write. And we do write with them. Everyone is a blogger now. Citizen bloggers and YouTubers believe we have now embraced a new "personal" democracy. Personal, because we can sit safely at home with our laptops and type our way to freedom.
But writing is not the capability being offered us by these tools at all. The capability is programming—which almost none of us really know how to do. We simply use the programs that have been made for us, and enter our blog text in the appropriate box on the screen. Nothing against the strides made by citizen bloggers and journalists, but big deal. Let them eat blog."
"We've already had a digital revolution; we don't need to keep having it. The next big thing in computers will be literally outside the box, as we bring the programmability of the digital world to the rest of the world. With the benefit of hindsight, there's a tremendous historical parallel between the transition from mainframes to PCs and now from machine tools to personal fabrication. By personal fabrication I mean not just making mechanical structures, but fully functioning systems including sensing, logic, actuation, and displays."
"Until the '60s, governments were not really involved in car design. Then people like Ralph Nader started noticing that a lot of people were being killed in cars and made it clear why this was happening. We have spent the last 35 years or so designing safety into cars, and it's had a pretty dramatic effect. ... We're in that same era now with security on computer systems. We know we have a problem and now we need to focus on design."
"Maybe there's something beyond computation, in the sense that we don't understand and can't describe what's going on inside living systems using computation alone. When we build computational models of living systems—such as a self-evolving system or an artificial immunology system—they're not as robust or rich as real living systems. Maybe we're missing something, but what could that something be?"
"It seems to me that what we're seeing in the software area, and this is the scary part for human society, is the beginning of a kind of dispossession. People are talking about this as dispossession that only comes from piracy, like Napster and Gnutella, where the rights of artists are being violated by people sharing their work. But there's another kind of dispossession, which is the inability to actually buy a product. The idea is here: you couldn't buy this piece of software, you could only license it on a day-by-day, month-by-month, year-by-year basis. As this idea spreads from software to music, films, and books, human civilization based on property fundamentally changes."
"When we ask ourselves what the effect will be of time coming into focus the way space came into focus during the 19th century, we can count on the fact that the consequences will be big. It won't cause the kind of change in our spiritual life that space coming into focus did, because we've moved as far outside as we can get, pretty much. We won't see any further fundamental changes in our attitude towards art or religion; all that has happened already. We're apt to see other incalculably large effects on the way we deal with the world and with each other, and looking back, this world today will look more or less the way 1800 did from the vantage point of 1900. Not just a world with fewer gadgets, but a world with a fundamentally different relationship to space and time. From the small details of our crummy software to the biggest and most abstract issues of how we deal with the world at large, this is a big story."
"One of the striking things about being a computer scientist in this age is that all sorts of other people are happy to tell us that what we do is the central metaphor of everything, which is very ego gratifying. We hear from various quarters that our work can serve as the best understanding, if not in the present then any minute now because of Moore's law, of everything from biology to the economy to aesthetics, child-rearing, sex, you name it. I have found myself being critical of what I view as this overuse of the computational metaphor. My initial motivation was that I thought there was naive and poorly constructed philosophy at work. It's as if these people had never read philosophy at all, and there was no sense of epistemological or other problems."
Silicon-based life and dust-based life are fiction and not fact. I use them as examples to illustrate an abstract argument. The examples are taken from science-fiction but the abstract argument is rigorous science. The abstract concepts are valid, whether or not the examples are real. The concepts are digital-life and analog-life. The concepts are based on a broad definition of life. For the purposes of this discussion, life is defined as a material system that can acquire, store, process, and use information to organize its activities. In this broad view, the essence of life is information, but information is not synonymous with life. To be alive, a system must not only hold information but process and use it. It is the active use of information, and not the passive storage, that constitutes life.
For the last twenty years, I have found myself on the inside of a revolution, but on the outside of its resplendent dogma. Now that the revolution has not only hit the mainstream, but bludgeoned it into submission by taking over the economy, it's probably time for me to cry out my dissent more loudly than I have before.