The phrase "news that stays news" was originally how Ezra Pound, in 1934, defined literature—and so it's interesting to contemplate what, in the sciences, might meet that same standard. The answer might be the emerging science of literature itself.
Thinking about the means by which language works on the mind, Pound described a three-part taxonomy. First is phanopoeia—think "phantoms": the images that a word or phrase conjures in the reader's mind. Pound's own "petals on a wet black bough" is a perfect illustration. This, he says, is the poetic capacity most likely to survive translation. Second is melopoeia—think "melody": the music words make. This encompasses rhyme and meter, alliteration and assonance, the things we take to be the classic backbones of poetic form. Though fiendishly difficult to translate faithfully, he notes, it doesn't necessarily need to be, as this is the poetic capacity most likely to be appreciated even in a language you don't know.
Third and most enigmatic is a quality Pound called logopoeia, and described as "akin to nothing but language," "a dance of the intelligence among words." This has proved the most elusive to describe, but Pound later clarified that he meant something like verbal register: the particular patterns of usage and habit that attach to each word. Take a pair of words like "doo" and "stool." They can both denote the same thing; their sonic effects are about as near as any pair of words can be. And yet, their difference in register—one juvenile, the other clinical—is so strong that the words can't even be considered synonyms, as it's almost impossible to imagine a context in which one could be substituted for the other.
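If register really is a statistical fact about where a word tends to live, it can in principle be measured. Here is a minimal sketch of that idea in Python, with two toy word lists standing in for real genre-tagged corpora (the data is hypothetical and the measure, a crude log-odds ratio, is only one of many a linguist might choose): a word's register score is simply how much more often it turns up in informal text than in clinical text.

```python
# A crude register score: log-odds of a word appearing in informal
# versus formal text. The two "corpora" below are hypothetical toy
# stand-ins; a real analysis would use large genre-tagged collections
# (web forums versus clinical notes, say).
import math
from collections import Counter

informal = "eww the baby made a doo on the rug again".split()
formal = "the patient reported that the stool sample was normal".split()

def register_score(word, informal_tokens, formal_tokens):
    """Positive -> informal/juvenile register; negative -> formal/clinical.
    Add-one smoothing keeps words unseen in one corpus finite."""
    p_inf = (Counter(informal_tokens)[word] + 1) / (len(informal_tokens) + 1)
    p_for = (Counter(formal_tokens)[word] + 1) / (len(formal_tokens) + 1)
    return math.log(p_inf / p_for)

for w in ("doo", "stool"):
    print(w, round(register_score(w, informal, formal), 2))
# "doo" comes out positive, "stool" negative: same denotation,
# opposite registers -- a first, crude trace of logopoeia.
```

Scaled up to billions of words and hundreds of genres, the same arithmetic begins to map the register space that, as we'll see, translation has to navigate.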
Logopoeia proves to be one of the most dazzling of poetic effects—see, for instance, the contemporary poet Ben Lerner, who writes lines like "a beauty incommensurate with syntax had whupped my cracker ass"—but also the most fragile. It's almost impossible to translate faithfully, because every language divides its register space so differently. See, for instance, the French film The Class (Entre les murs), in which a teacher tells a pair of students they were behaving with «une attitude de pétasses». The English version subtitled the line as "acting like skanks," prompting a minor furor over whether that particular word was stern enough to serve as an admonishment that would get through to an unruly student, yet inoffensive enough for a teacher to say without expecting to jeopardize their job, yet offensive enough to do exactly that. What's more, an entire scene pivots on the fact that for the students at the school the word strongly implies "prostitute," but for the teacher it has no such pointed connotation. What word in English meets all five of those criteria? Maybe there is no such word.
Logopoeia, in fact, is so fragile that it doesn't even survive in its own language for long. The New York Times famously included the word "scumbag" in a 2006 crossword puzzle, a word almost charmingly inoffensive to its editorial staff and the majority of the public, but jaw-droppingly inappropriate to readers old enough to remember when the word couldn't be spoken in polite company, because it explicitly summoned the image of a used condom. Changes like this are everywhere in a living language. In 1990 it would have been unthinkable for my parents to say "yo," for instance. When they said it in 2000, it was painful and tone-deaf, a sad attempt to sound like a younger and/or cooler generation. By 2010 it was just about as normal as "hey." How could a reader, let alone a translator, some centuries hence, possibly be expected to know the logopoetic freight of every single word at the time of the piece's writing?
For the first time in human history we have the tools to answer this question. A century after logopoeia entered the humanities, it is becoming a science.
Computational linguists now have access to corpora large enough, and computational power sufficient, to see these forces in action: to observe words as they emerge, mutate, and evolve, and to quantify logopoeia, the subtlest and most ephemeral of linguistic effects.
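What might "observing" the drift of a word like "yo" actually look like? One standard move, sketched below with hypothetical one-line "corpora" standing in for decade-sliced text, is to collect the bag of words a term keeps company with in each period and compare the two bags. Because the dimensions of such count vectors are just vocabulary words, the two periods need no alignment; drift shows up directly as a drop in similarity.

```python
# A minimal sketch of measuring semantic drift: compare a word's
# bag-of-context-words across two periods. The one-line "corpora"
# are hypothetical; real studies use millions of words per decade.
import math
from collections import Counter

def context_vector(word, tokens, window=2):
    """Counts of words appearing within `window` tokens of `word`."""
    ctx = Counter()
    for i, t in enumerate(tokens):
        if t == word:
            ctx.update(tokens[max(0, i - window):i] + tokens[i + 1:i + window + 1])
    return ctx

def cosine(u, v):
    dot = sum(u[k] * v[k] for k in u.keys() & v.keys())
    norm = math.sqrt(sum(x * x for x in u.values())) \
         * math.sqrt(sum(x * x for x in v.values()))
    return dot / norm if norm else 0.0

corpus_1990 = "the kids on the corner shouted yo at every passing car".split()
corpus_2010 = "she opened the email with yo the way you might with hey".split()

drift = 1 - cosine(context_vector("yo", corpus_1990),
                   context_vector("yo", corpus_2010))
print(f"drift for 'yo': {drift:.2f}")  # 0 = stable usage, 1 = unrecognizable
```

Run the same comparison over, say, newswire from every decade of the twentieth century, and researchers can put a number, and a date, on shifts like "scumbag"'s slow drift into printability.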
This has changed our sense of what a word is. The question is far from academic.
When the FCC moved, in the mid-2000s, to release to the public a set of documents from a settlement with AT&T, the company argued in court that this constituted "an unwarranted invasion of personal privacy," on the grounds that it counted as a legal "person" in the eyes of the law. The Third Circuit agreed in 2009, and the FCC appealed. The case went to the Supreme Court to decide, in effect, whether "person" and "personal" are two forms of the same word or two independent terms that happen to share much of their orthography (and at least some of their sense).
In situations like this the court has traditionally turned to the Oxford English Dictionary. In this case, though, it turned instead to computational linguists, who analyzed an enormous corpus of real-world usage to investigate whether the two words appear in the same contexts, in the vicinity of the same words. They do not: the analysis found them divergent enough to constitute two independent terms, and thus not every "person" is necessarily entitled to "personal privacy." The documents were released.
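The particulars of the analysis before the Court aren't reproduced here, but the shape of the reasoning can be sketched. If "person" and "personal" were one word in two inflections, the company each keeps ought to overlap heavily; if they are independent terms, it shouldn't. The toy corpus below is hypothetical, and collocate overlap is just one of several measures a corpus linguist might use.

```python
# An illustrative sketch, not the Court's actual analysis: score how
# much the frequent collocates of two forms overlap. One word in two
# inflections should score high; independent terms should score low.
from collections import Counter

def top_collocates(word, tokens, window=2, top=5):
    """The `top` most frequent words within `window` tokens of `word`."""
    ctx = Counter()
    for i, t in enumerate(tokens):
        if t == word:
            ctx.update(tokens[max(0, i - window):i] + tokens[i + 1:i + window + 1])
    return {w for w, _ in ctx.most_common(top)}

# Hypothetical corpus snippet standing in for millions of words of usage.
corpus = ("any person who files suit may appeal and any person may petition "
          "the court to protect personal privacy and keep personal effects private").split()

a = top_collocates("person", corpus)
b = top_collocates("personal", corpus)
jaccard = len(a & b) / len(a | b)
print(f"collocate overlap: {jaccard:.2f}")
# Low overlap -- only the function word "and" is shared. "person"
# keeps the company of legal actors ("who", "files"); "personal"
# attaches to privacy and protection: two words, not one.
```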
"We trust," wrote Chief Justice John Roberts in the majority decision, "that AT&T will not take it personally."
The rapidly maturing science of computational linguistics, possible only in a big-data era, has finally given scholars of the word what the telescope gave astronomy and the microscope gave biology. That's big news.
And because words, more unstable than stars and squirrelier than paramecia, refuse to sit still, changing context subtly with every utterance, it's news that will stay so. Pound would, I think, agree.