The world's languages differ to the point of inscrutability. Knowing the English word "duck" doesn't help you guess the French "canard" or Japanese "ahiru." But there are commonalities hidden beneath the superficial differences. For instance, human languages tend to have parts of speech (like nouns and verbs). They tend to have ways to embed propositions in other ones. ("John knows that Mary thinks that Paul embeds propositions in other ones.") And so on. But why?
An influential and appealing explanation is known as Universal Grammar: core commonalities across languages exist because they are part of our genetic endowment. On this view, humans are born with an innate predisposition to develop languages with very specific properties. Infants expect to learn a language that has nouns and verbs, that has sentences with embedded propositions, and so on.
This could explain not only why languages are similar but also what it is to be uniquely human and indeed how children acquire their native language. It may also seem intuitively plausible, especially to people who speak several languages: If English (and Spanish… and French!) have nouns and verbs, why wouldn't every language? To date, Universal Grammar remains one of the most visible products of the field of Linguistics—the one minimally counterintuitive bit that former students often retain from an introductory Linguistics class.
But evidence has not been kind to Universal Grammar. Over the years, field linguists (they're like field biologists with really good microphones) have reported that languages are much more diverse than originally thought. Not all languages have nouns and verbs. Nor do all languages let you embed propositions in others. And so it has gone for basically every proposed universal linguistic feature. The empirical foundation has crumbled out from under Universal Grammar. We thought that there might be universals that all languages share, and we sought to explain them on the basis of innate biases. But as the purportedly universal features have revealed themselves to be nothing of the sort, the need to explain them in categorical terms has evaporated. As a result, the plausible content of Universal Grammar has grown progressively more modest. At present, there's evidence that nothing but perhaps the most general computational principles is part of our innate language-specific endowment.
So it's time to retire Universal Grammar. It had a good run, but there's nothing much it can bring us now in terms of what we want to know about human language. It can't reveal much about how language develops in children—how they learn to articulate sounds, to infer the meanings of words, to put together words into sentences, to infer emotions and mental states from what people say, and so on. And the same is true for questions about how humans have evolved or how we differ from other animals. There are ways in which humans are unique in the animal kingdom and a science of language ought to be trying to understand these. But again Universal Grammar, gutted by evidence as it has been, will not help much.
Of course, it remains important and interesting to ask what commonalities, superficial and substantial, tie together the world's languages. There may be hints there about how human language evolved and how it develops. But to ignore language's diversity is to set aside its most informative dimension.