

Blogger, Betsy Devine: Funny ha-ha and/or funny peculiar


In the next five years, policy-makers around the world will embrace economic theories (e.g. those of Richard Layard) aimed at creating happiness. The Tower of Economic Babble is rubble. Long live the new, improved happiness economics!

Cash-strapped governments will love Layard's theory that high taxes on high earners make everyone happier. (They reduce envy in the less fortunate while saving those now super-taxed from their regrettable motivation to over-work.) It also makes political sense to turn people's attention from upside-down mortgages and looted pension funds to the more abstract happiness that governments claim they can increase.

Just a few ripple effects from the coming high-powered promotion of happiness:

• Research funding will flow to psychologists who seek advances in happiness creation.

• Bookstores will re-name self-help sections as "Happiness" sections, then vastly expand them to accommodate hedonic workbooks and gratitude journals in rival formats.

• In public schools, "happiness" will be the new "self-esteem," a sacred concept to which mere educational goals must humbly bow.

• People will pursue happiness for themselves and their children with holy zeal; people whose child or spouse displays public unhappiness will feel a heavy burden of guilt and shame.

Will such changes increase general citizen happiness? This question is no longer angels-on-head-of-pin nonsense; researchers now claim good measures for relative happiness.

The distraction value alone should benefit most of us. But in the short run, I at least would be happy to see that my prediction had come true.

Archaeologist, Journalist; Author, Artifacts


While our minds have been engaging with intangibles of the virtual world, where will our real bodies be taking us on planet earth, compassed with these new perspectives?

Today we are fluent at engaging with the 'other' side of the world. We chart a paradox of scale, from the extra-terrestrial to the international. Those places we call home are both intimately bounded and digitally exposed. Our observation requires a new grammar of both voyager and voyeur.

The generation growing up with both digital maps and terrestrial globes will have the technological means to shake up our orthodoxies, at the very moment that we need to be aware of every last sprawling suburb and shifting sandbank.

Our brave new map of the world is evolving as one created by us as individuals, as much as one which is geographically verifiable by a team of scientists. It will continue to be a mixture of landscapes mapped over centuries, and of unbounded digital terrains. One charted in well-thumbed atlases in libraries, and by global positioning systems prodded by fingers on the go. Armchair travellers gathering souvenirs through technology, knowing no bounds; a geography of personal space, extended by virtual reality; a sense of place plotted by chats over garden fences, as much as instant messaging exchanged between our digital selves across time zones drawn up in the steam age.

What will not change is our love of adventure. We will not lose our propensity to explore beyond our own horizons and to re-explore those at home. Our innate curiosity will be as relevant in the digital age as it was to the early Pacific colonisers, to 17th-century merchants heading east from Europe, and to America's west-drawn pioneers.

Our primaeval wanderlust will continue in meanderings off the path, and these are as necessary for our physical selves as the day dreams that bring forth innovation. We develop ways to secure our co-ordinates while also straining at the leash. And our maps move with us.

Dislocation is good. Take a traditional map of the world. Cut it in half. Bring the old Oceanic edges together, and look what happens to the Pacific.

Seeing the world differently changes everything.

CNRS, Paris


When asked what will change our future, the most straightforward reply that comes to mind is, of course, the Internet. But how the Internet will change things it has not already changed, and what the next revolution on the net will be, is a harder question. The Internet is a complex geography of information technology, networking, multimedia content, and telecommunication. This powerful alliance of different technologies has provided not only a brand-new way of producing, storing, and retrieving information, but a giant network of ranking and rating systems in which information is valued only insofar as it has already been filtered by other people.

My prediction for the Big Change is that the Information Age is being replaced by a Reputation Age, in which the reputation of an item (that is, how others value and rate it) will be the only way we have to extract information about it. This passion for ranking is a central feature of our contemporary practices of filtering information, both on and off the net (take, as two examples, one inside the net and one outside it: www.ebay.com and the recent financial crisis).

The next revolution will be a consequence of the impact of reputation on our practices of information gathering. Notice that this won't mean a world of collective ignorance, in which no one can know anything for sure except by relying on the judgment of someone else, in a sort of infinite chain of blind trust. The age of reputation will be a new age of knowledge gathering, guided by new rules and principles. This is possible now thanks to the tremendous potential of the social web to aggregate individual preferences and choices into intelligent outcomes. Let me explain more precisely how.

One of the main revolutions of Internet technology has been Google's introduction of the "PageRank" algorithm for retrieving information, that is, an algorithm that bases its search for relevant information on the structure of the links on the Web. Algorithms such as this extract the cultural information contained in each preference users express by linking one page to another, using a mathematical cocktail of formulas that gives a special weight to each of these connections. This determines which pages appear in the first positions of a search result.
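The link-structure ranking described above can be reduced to a few lines of code. What follows is an illustrative toy version only: the three page names are invented for the example, and real search engines layer many additional signals and optimizations on top of this basic idea.

```python
# Toy PageRank: a page's rank is fed by the ranks of the pages
# linking to it, damped so some rank is always spread evenly.
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:  # a page with no outlinks spreads its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        rank = new_rank
    return rank

# Invented example: "home" is linked to by both other pages,
# so it ends up first in the ranking.
web = {"home": ["about", "blog"], "about": ["home"], "blog": ["home", "about"]}
ranks = pagerank(web)
print(sorted(ranks, key=ranks.get, reverse=True))  # ['home', 'about', 'blog']
```

Each link acts as the weighted preference the paragraph describes: a link from a highly ranked page transfers more rank to its target than a link from an obscure one.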

Fears about these tools are understandably many, because our control over the design of the algorithms, and over the way weights are assigned to determine the rank, is very poor, nearly nonexistent. But let us imagine a new generation of search engines whose ranking procedures are generated simply by aggregating the individual preferences expressed about these pages: no big calculations, no secret weights. The results of a query are ordered according to the "grades" each page has received from the users who have crossed that page at least once and taken the time to rank it.

A social search engine based on the power of "soft" social computing will be able to take advantage of the reputation each site and page has accumulated simply through the votes users have cast on it. The new algorithms for extracting information will exploit the power of the judgments of the many to produce their results. This softer Web, controlled more by human experience than by complex formulas, will change our interaction with the net, as well as our fears and hopes about it. The potential of social filtering is a new way of extracting information by relying on the previous judgments of others.
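The grade-based ranking imagined here, with results ordered purely by the votes users have left on each page, can be sketched as follows. The page names and grades are invented for the example, and a real system would at minimum discount pages with very few votes.

```python
from collections import defaultdict

# Each entry is a (page, grade) pair left by a user who visited the page.
votes = [
    ("edge.org", 5), ("edge.org", 4),
    ("example.com", 3), ("example.com", 2), ("example.com", 5),
    ("spam.biz", 1),
]

totals = defaultdict(lambda: [0, 0])  # page -> [sum of grades, vote count]
for page, grade in votes:
    totals[page][0] += grade
    totals[page][1] += 1

# A page's "reputation" is just its average grade: no secret weights.
reputation = {page: s / c for page, (s, c) in totals.items()}
results = sorted(reputation, key=reputation.get, reverse=True)
print(results)  # ['edge.org', 'example.com', 'spam.biz']
```

The whole ranking procedure is transparent: anyone can recompute a page's position from the public votes, which is exactly the contrast with opaque weighting formulas that the essay draws.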

Hegel thought that universal history was made by universal judgments: our history will from now on be written in the language of "good" and "bad", that is, in terms of the judgments people express about the things and events around them, judgments that will become more and more crucial for each of us in extracting information about those events. According to Friedrich Hayek, civilization rests on the fact that we all benefit from knowledge we do not possess. That is exactly the kind of civilized cyber-world that social tools for aggregating judgments on the Web will make possible.

Research Curator, Utah Museum of Natural History, University of Utah; Host, Dinosaur Planet TV Series.


Evolution is the scientific idea that will change everything within the next several decades.

I recognize that this statement might seem improbable.  If evolution is defined generally, simply as change over time, the above statement borders on meaningless.  If regarded in the narrower, Darwinian sense, as descent with modification, any claim for evolution's starring role also appears questionable, particularly given that 2009 is the 150th anniversary of the publication of On the Origin of Species.  Surely Darwin's "Dangerous Idea," however conceived, has made its mark by now.  Nevertheless, I base my claim on evolution's probable impacts in two great spheres: human consciousness and science and technology.

Today, the commonly accepted conception of evolution is extremely narrow, confined largely to the realm of biology and a longstanding emphasis on mutation and natural selection.  In recent decades, this limited perspective has become further entrenched by the dominance of molecular biology and its "promise" of human-engineered cells and lifeforms.  Emphasis has been placed almost entirely on the generation of diversity — a process referred to as "complexification" — reflecting the reductionist worldview that has driven science for four centuries. 

Yet science has also begun to explore another key element of evolution — unification — which transcends the biological to encompass evolution of physical matter.  The numerous and dramatic increases in complexity, it turns out, have been achieved largely through a process of integration, with smaller wholes becoming parts of larger wholes.  Again and again we see the progressive development of multi-part individuals from simpler forms.  Thus, for example, atoms become integrated into molecules, molecules into cells, and cells into organisms.  At each higher, emergent stage, older forms are enveloped and incorporated into newer forms, with the end result being a nested, multilevel hierarchy. 

At first glance, the process of unification appears to contravene the second law of thermodynamics by increasing order over entropy.  Again and again during the past 14 billion years, concentrations of energy have emerged and self-organized as islands of order amidst a sea of chaos, taking the guise of stars, galaxies, bacteria, gray whales, and, on at least one planet, a biosphere.  Although the process of emergence remains somewhat of a mystery, we can now state with confidence that the epic of evolution has been guided by counterbalancing trends of complexification and unification.  This journey has not been an inevitable, deterministic march, but a quixotic, creative unfolding in which the future could not be predicted. 

How will a more comprehensive understanding of evolution affect science and technology?  Already a nascent but fast-growing industry called "biomimicry" taps into nature's wisdom, imitating sustainable, high-performance designs and processes acquired during four billion years of evolutionary R&D.  Water-repellent lotus plants inspire non-toxic fabrics.  Termite mounds inspire remarkable buildings that make use of passive cooling.  Spider silk may provide inspiration for a new, strong, flexible, yet rigid material with innumerable possible uses.  Ultimately, plant photosynthesis may reveal secrets to an unlimited energy supply with minimal waste products.

The current bout of biomimicry is just the beginning.  I am increasingly convinced that ongoing research into such phenomena as complex adaptive systems will result in a new synthesis of evolution and energetics — let's call it the "Unified Theory of Evolution" — that will trigger a cascade of novel research and designs.  Science will relinquish its unifocal downward gaze on reductionist nuts and bolts, turning upward to explore the "pattern that connects."  An understanding of complex adaptive systems will yield transformative technologies we can only begin to imagine.  Think about the potential for new generations of "smart" technologies, with the capacity to adapt, indeed to evolve and transform, in response to changing conditions. 

And what of human consciousness?  Reductionism has yielded stunning advances in science and technology.  However, its dominant metaphor, life-as-machine, has left us with a gaping chasm between the human and non-human worlds.  With "Nature" (the non-human world) reduced merely to resources, humanity's ever-expanding activities have become too much for the biosphere to absorb.  We have placed ourselves, and the biosphere, on the precipice of a devastating ecological crisis, without the consciousness for meaningful progress toward sustainability.  

At present, Western culture lacks a generally accepted cosmology, a story that gives life meaning.  One of the greatest contributions of the scientific enterprise is the epic of evolution, sometimes called the Universe Story.  For the first time, thanks to the combined efforts of astronomers, biologists, and anthropologists (among many others), we have a realistic, time-developmental understanding of the 14 billion year history of us.  Darwin's tree of life has roots that extend back to the Big Bang, and fresh green shoots reach into an uncertain future.  Far from leading to a view that the Universe is meaningless, this saga provides the foundation for seeing ourselves as fully embedded into the fabric of nature.  To date, this story has had minimal exposure, and certainly has not been included (as it should be) in the core of our educational curricula. 

Why am I confident that these transformations will occur in the near future?  In large part because necessity is the mother of invention.  We are the first generation of humans to face the prospect that humanity may have a severely truncated future.  In addition to new technologies, we need a new consciousness, a new worldview, and new metaphors that establish a more harmonious relationship between the human and the non-human.  Of course, the concept of "changing everything" makes no up-front value judgments, and I can envision evolution's net contribution as being either positive or negative, depending on whether the shift in human consciousness keeps pace with the radical expansion of new (and potentially even more exploitative) technologies.  In sum, our future R&D efforts need to address human consciousness in at least equal measure to science and technology. 

Professor of Biology, Amherst College; Author, Evolution of Infectious Disease


Thirteen decades ago, Louis Pasteur and Robert Koch led an intellectual revolution referred to as the germ theory of disease, which proposes that many common ailments are caused by microbes. Since then the accepted spectrum of infectious causation has broadened steadily and dramatically.  The diseases that are most obviously caused by infection were accepted as such by the end of the 19th century; almost all of them were acute diseases.  Acute diseases with a transmission twist, mosquito-borne malaria for example, were accepted a bit later, at the beginning of the 20th century.  Since the early 20th century, the spectrum has been broadened mostly by recognition of infectious causation of chronic diseases.  The first of these had distinctly infectious acute phases, which made infectious causation of the chronic disease more obvious. Infectious causation of shingles, for example, was made more apparent by its association with chicken pox.  Over the past thirty years, the spectrum of infectious causation has been broadened mostly through inclusion of chronic diseases without obvious acute phases.  With years or even decades between the onset of infection and the onset of such diseases, demonstration of infectious causation is difficult.

Technological advances have been critical to resolving the ambiguities associated with such cryptic infectious causation.  In the early 1990s Kaposi's Sarcoma Associated Herpes Virus was discovered using a molecular technique that stripped away the human genetic material from Kaposi's sarcoma cells and examined what remained.  A similar approach revealed Hepatitis C virus in blood transfusions.  In these cases there were strong epidemiological signs that an infectious agent was present.  When the cause was discovered, acceptance did not have to confront the barrier of entrenched opinions favoring other, non-infectious causes.  If such special interests are present, the evidence has to be proportionately more compelling.  Such is the case for schizophrenia, atherosclerosis, Alzheimer's disease, breast cancer, and many other chronic diseases, which are now the focus of vehement disagreements.

Advances in molecular/bioinformatic technology are poised to help resolve these controversies.  This potential is illustrated by two discoveries, which seem cutting-edge now but will soon be considered primitive first steps.  About a decade ago, one member of a Stanford team scraped spots on two teeth of another team member and amplified the DNA from the scrapings.  They found sequences that were sufficiently unique to represent more than 30 new species.  This finding hinted at the magnitude of the challenge: tens or perhaps even hundreds of thousands of viruses and bacteria may need to be considered to evaluate hypotheses of infectious causation.

The second discovery provides a glimpse of how this challenge may be addressed.  Samples from prostate tumors were tested on a micro-array that contained 20,000 DNA snippets from all known viruses.  The results documented a significant association with an obscure retrovirus related to one that normally infects mice. If this virus is a cause of prostate cancer, it causes only the small portion that occurs in men with a particular genetic background. Other viruses have been associated with prostate cancer in patients without this genetic background. So, not only may thousands of viruses need to be tested to find one correlated with a chronic disease, but even then it may be only one of perhaps many different infectious causes.

The problems of multiple pathogens and ingrained predispositions are now coming to a head in research on breast cancer.   Presently, three viruses have been associated with breast cancer: mouse mammary tumor virus, Epstein Barr virus, and human papillomavirus.  Researchers are still arguing about whether these correlations reflect causation.  If they do, these viruses account for somewhere between half and about 95% of breast cancer, depending on the extent to which they act synergistically. Undoubtedly array technology will soon be used to assess this possibility and to identify other viruses that may be associated with breast cancers.

There is a caveat. These technological advancements provide sophisticated approaches to identifying correlations between pathogens and disease.  They do not bridge the gulf between correlation and causation.  One might hope that with enough research all aspects of the pathological process could be understood, from the molecular level up to the whole patient.  But as one moves from the molecular to the macro level, the precision of interpretation becomes confounded by the complex web of interactions that intervene, especially in chronic diseases.  Animal models are generally inadequate for chronic human diseases because the disease in animals is almost never quite the same as the human disease. The only way out of this conundrum, I think, will be to complement the technological advancements in identifying candidate pathogens with clever clinical trials.  These clinical trials will need to use special states, such as temporary immune suppression, to identify those infections that are exacerbated concurrently with exacerbations of the chronic disease in question.  Such correlations will then need to be tested for causation by treating the exacerbated infection to determine whether suppression of the infection is associated with amelioration of the disease.

Why will this process change things?  For those of us who live in prosperous countries, infectious causes are implicated but not accepted in most common lethal diseases: cancers, heart attacks, stroke, Alzheimer's disease.  Infectious causes are also implicated in the vast majority of nonlethal, incapacitating illnesses of uncertain cause, such as arthritis, fibromyalgia, and Crohn's disease.  If infectious causes of these diseases are identified, medical history tells us that the diseases will tend to be resolved.

A reasonable estimate of the net effect would be a rise in healthy life expectancy by two or three decades, pushing lifespan up against the ultimate boundary of longevity molded by natural selection, probably an age range between 95 and 105 years.  Pushed up against this barrier, people could be expected to live healthy lives into their 90s and then go downhill quickly.  This demographic transition toward healthy survival will improve productivity, lower medical costs, and enhance quality of life.  In short, it will be one of medicine's greatest contributions.

Thomas Chair, Distinguished Professor, Department of Anthropology, University of Utah; Coauthor, The 10,000 Year Explosion


Cheap individual genotyping will give a new life to dating services and marriage arrangers. There is a market for sperm and egg donors today, but the information available to consumers about donors is limited. This industry will flourish as individual genotyping costs go down and knowledge of genomics grows.

Potential consumers will be able to evaluate not only whether or not a gamete provider has brown eyes, is tall or short, has a professional degree, but also whether the donor has the appropriate MHC genotypes, long or short androgen receptors, the desired dopamine receptor types, and so on. The list of criteria and the sophistication of algorithms matching consumers and donors will grow at an increasing rate in the next decade.

The idea of a "compatible couple" will have a whole new dimension. Consumers will have information about hundreds of relevant donor genetic polymorphisms to evaluate in the case of gamete markets. In marriage markets there will be evaluation by both parties. Where will all this lead? Three possibilities come immediately to mind:

A. Imagine that Sally is looking at the sperm donor market. Perhaps she is shopping for someone genetically compatible, for example with the right MHC types. She is a homozygote for the 7R allele of the DRD4 genetic locus so she is seeking a sperm donor homozygous for the 4R allele so she won't have to put up with a 7R homozygous child like she was. In other words whether Tom or Dick is a more desirable donor depends on characteristics specific to Sally.

B. But what if Sally values something like intelligence, which is almost completely unidimensional and of invariant polarity: nearly everyone values high intelligence. In this case Sally will evaluate Tom and Dick on simple scales that she shares with most other women; Tom will almost always be of higher value than Dick, and he will be able to obtain a higher price for his sperm.

C. Perhaps a new President has red-haired children. Suddenly Sally, along with most other women in the market, wants red-haired children because they are fashionable. Dick, with his red hair, is the sellout star of the sperm market but only for a short time. There is a cohort of children born with red hair, then the fad soon goes away as green eyes, say, become the new hot seller. Dick loses his status in the market and is forced to get a real job.

These three scenarios, or any mix of them, are possible futures for love and marriage among those prosperous enough to indulge in this market. Scenario A corresponds to traditional views of marriage: for everyone there is someone special and unique. Scenario B corresponds somewhat more closely to how marriage markets really work: every Sally prefers rich to poor, smart to dumb, and a BMW to a Yugo. Scenario C is close to one mechanism of what biologists call sexual selection: male mallards have green heads essentially because it is just the fashion. I would not wager much on which of these scenarios will dominate the coming gamete market, but I favor scenario B.
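Scenario A, in which a donor's value depends on the buyer's own genotype rather than on any universal scale, can be illustrated with a toy scoring function. Everything here is hypothetical: the loci, alleles, and scoring rule are invented for the example and are not a real compatibility model.

```python
# Hypothetical buyer-specific donor scoring (scenario A).
def donor_score(buyer, donor):
    score = 0
    # Reward MHC types the buyer does not already carry.
    score += len(set(donor["mhc"]) - set(buyer["mhc"]))
    # A 7R/7R buyer at the DRD4 locus seeking a 4R/4R donor.
    if buyer["drd4"] == ("7R", "7R") and donor["drd4"] == ("4R", "4R"):
        score += 2
    return score

sally = {"mhc": {"A1", "B8"}, "drd4": ("7R", "7R")}
tom   = {"mhc": {"A2", "B7"}, "drd4": ("4R", "4R")}
dick  = {"mhc": {"A1", "B8"}, "drd4": ("7R", "4R")}

# Tom outscores Dick *for Sally*; a buyer with a different genotype
# could rank the same two donors the other way around.
print(donor_score(sally, tom), donor_score(sally, dick))  # 4 0
```

The key property is that the score is a function of the pair, not of the donor alone; scenario B, by contrast, would score donors on a single scale shared by all buyers.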

Biologist, University of Minnesota; blogger, Pharyngula


The question, "what will change everything?" is in the wrong tense: it should be "what is changing everything right now?" We're in the midst of an ongoing revision of our understanding of what it means to be human—we are struggling to redefine humanity, and it's going to radically influence our future.

The redefinition began in the 19th century with the work of Charles Darwin, who changed the game by revealing the truth of human history. We are not the progeny of gods, we are the children of worms; not the product of divine planning, but of cruel chance and ages of brutal winnowing. That required a shift in the way we view ourselves that is still working its way through the culture. Creationism is an instance of a reaction against the dethroning of Homo sapiens. Embracing the perspective of evolution, however, allows us to see the value of other species and to appreciate our place in the system as a whole, and is a positive advance.

There are at least two more revolutions in the works. The first is in developmental biology: we're learning how to reprogram human tissues, enabling new possibilities in repair and regeneration. We are acquiring the tools that will make the human form more plastic, and it won't just stop with restoring damaged bodies to a prior state, but will someday allow us to resculpt ourselves, add new features and properties to our biology, and maybe, someday, even free us completely from the boundaries of the fixed form of a bipedal primate. Even now with our limited abilities, we have to rethink what it means to be human. Does a blastocyst really fit the definition? How about a five-week embryo, or a three-month-old fetus?

The second big revelation is coming from neuroscience. Mind is clearly a product of the brain, and the old notions of souls and spirits are looking increasingly ludicrous…yet these are nearly universal ideas, all tangled up in people's rationalizations for an afterlife, for ultimate reward and punishment, and their concept of self. If many object to the lack of exceptionalism in our history, if they're resistant to the idea that human identity emerges gradually during development, they're most definitely going to find the idea of soullessness and mind as a byproduct of nervous activity horrifying.

This will be our coming challenge, to accommodate a new view of ourselves and our place in the universe that isn't encumbered with falsehoods and trivializing myths. That's going to be our biggest change—a change in who we are.

Clinical Professor of Medicine, UCSF; Author, The Spectrum


We are entering a new era of personalized medicine.  One size does not fit all. 

One way to change your genes is to make new ones, as Craig Venter has elegantly shown. Another is to change your lifestyle: what you eat, how you respond to emotional stress, whether or not you smoke cigarettes, how much you exercise, and the experience of love and intimacy.

New studies show that these comprehensive lifestyle changes may change gene expression in hundreds of genes in only a few months—"turning on" (upregulating) disease-preventing genes and "turning off" (downregulating) genes that promote heart disease, oncogenes that promote breast cancer and prostate cancer, and genes that promote inflammation and oxidative stress. These lifestyle changes also increase telomerase, the enzyme that repairs and lengthens telomeres, the ends of our chromosomes that control how long we live.

As genomic information for individuals becomes more widely available—via the decoding of each person's complete genome (as Venter and Watson have done) or partially (and less expensively) via new personal genomics companies—this information will be a powerful motivator for people to make comprehensive lifestyle changes that may beneficially affect their gene expression and significantly reduce the incidence of the pandemic of chronic diseases. 

Edwin Howard Armstrong Professor of Computer Science, Columbia University; Author, Complexity and Information


On January 20, 2009, our nation's leaders will gather in Washington for the inauguration of the 44th President of the United States. How many more such public inaugurations will we see?

Due to the threat from increasing precision, range, and availability of weapons, will it be safe for our nation's leaders to gather in one public place? Such weapons are or will be available to a variety of nations, NGOs, and terrorist groups. It might well be in someone's interest to wipe out the nation's leadership in one blow.

Why am I willing to announce this danger publicly? Won't it give terrorists ideas? I've come to believe that terrorist groups as well as other nations are smart and will identify such opportunities themselves.

What can be done about the potential physical threat when our leaders are gathered in one place?

Digital Technologist; Managing Director, Co-Founder, area/code


In just a few years, we’ll see the first generation of adults whose every breath has been drawn on the grid. A generation for whom every key moment (e.g., birth) has been documented and distributed globally. Not just the key moments, of course, but also the most banal: eating pasta, missing the train, and having a bad day at the office. Ski trips and puppies.

These trips and puppies are not simply happening, they are becoming data, building up the global database of distributed memories. They are networked digital photos – 3 billion on Flickr, 10 billion on Facebook. They were blog posts, and now they are tweets, too (a billion in 18 months). They are Facebook posts, Dopplr journals, Last.FM updates.

Further, more and more of the traces we produce will be passive or semi-passive. Consider Loopt, which allows us to track ourselves and our friends through GPS. Consider voicemail transcription bots that transcribe the voice messages we leave into searchable text in email boxes, on into eternity. The next song you listen to will likely be stored in a database record somewhere. Next time you take a phonecam photo, it may well have the event’s latitude and longitude baked into the photo’s metadata.

The sharp upswing in all of this record-keeping – both active and passive – is redefining one of the core elements of what it means to be human, namely to remember. We are moving towards a culture that has outsourced this essential quality of existence to machines, to a vast and distributed prosthesis. This infrastructure exists right now, but very soon we’ll be living with the first adult generation whose entire lives are embedded in it.

In 1992, the artist Thomas Bayrle wrote that the great mistakes of the future would be that as everything became digital, we would confuse memory with storage. What’s important about genuine memory and how it differs from digital storage is that human memory is imperfect, fallible, and malleable. It disappears over time in a rehearsal and echo of mortality; our abilities to remember, distort and forget are what make us who we are.

We have built the infrastructure that makes it impossible to forget. As it hardens and seeps into every element of daily life, it will make it impossible to remember. Changing what it means to remember changes what it means to be.

There are a few people who already have perfect episodic memory, total recall: neurological edge cases. They are harbingers of the culture to come.  One of them, Jill Price, was profiled in Der Spiegel:

"In addition to good memories, every angry word, every mistake, every disappointment, every shock and every moment of pain goes unforgotten. Time heals no wounds for Price. 'I don't look back at the past with any distance. It's more like experiencing everything over and over again, and those memories trigger exactly the same emotions in me. It's like an endless, chaotic film that can completely overpower me. And there's no stop button.'"

This also describes the life of Steve Mann, who has been passively recording his life through wearable computers for many years. This is an unlikely future scenario, but like any caricature, it is based on human features that will be increasingly recognizable. The processing, recording and broadcasting prefigured in Mann’s work will be embedded in everyday actions like the twittering, phonecam shots and GPS traces we broadcast now. All of them entering into an outboard memory that is accessible (and searchable) everywhere we go.

Today is New Year’s Eve. I read today (on Twitter) that three friends, independent of each other, were looking back at Flickr to recall what they were doing a year ago. I would like to start the New Year being able to remember 2008, but also to forget it.

For the next generation, it will be impossible to forget it, and harder to remember. What will change everything is our ability to remember what everything is. Was. And wasn’t.
