
Manifesto

Anti-Intelligence, Anti-Entropy, and Information Death

Information theory developed in the context of communications to answer specific questions about the transfer of data across channels. In general, the questions concerned strings of symbols transmitted through a noisy environment to a receiver of some sort. Skip the technical details of the setup and concentrate instead on the idea of information lurking within strings of symbols appearing magically before your eyes: if you can guess the next symbol correctly every time, you need not read the message, because no information is imparted. In the crudest model the symbols are considered independent of one another and equally likely, so that the string fqrm would be as likely as farm. An English-speaking reader seeing the former would likely substitute the latter, since the information content of fqrm is too high in English: a q without a following u is unusual, and unusual is the hallmark of high information. Farm is expected far more often than fqrm. Besides the surprise element, semantic notions also urge substituting farm for fqrm. But information theory ignores the semantic content of strings of symbols, considering only their likelihood. (Exercise: what is the information content of being surprised at guessing the next symbol correctly every time if one has no idea of the semantic content of the message?)
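A minimal sketch of the bookkeeping, in Python. The transition probabilities below are invented for illustration, not measured English statistics; surprisal is simply the negative log-probability of each transition.

    import math

    # Invented next-letter probabilities P(next | previous); any pair
    # absent from the table receives a small floor probability.
    bigram = {
        ("f", "a"): 0.20, ("a", "r"): 0.15, ("r", "m"): 0.10,
        ("f", "q"): 0.0001, ("q", "r"): 0.0001,  # q without u: rare
    }
    FLOOR = 0.0001

    def surprisal_bits(word):
        """Sum of -log2 P(next | previous) over a word's transitions."""
        return sum(-math.log2(bigram.get(pair, FLOOR))
                   for pair in zip(word, word[1:]))

    print(surprisal_bits("farm"))  # about 8 bits: every step expected
    print(surprisal_bits("fqrm"))  # about 30 bits: the q is a surprise

Under the uniform model of the paragraph above, the two words would score identically; the gap opens only once English letter statistics enter.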

Consider partitioning the sequence of symbols into blocks separated by tokens such as the period or the comma; partition further into subsequences of these blocked segments within larger segments, separated by spaces and indented openings, and so on, until one builds a sequence of symbols blocked into ever-larger blocks, similar in form to what is now before your eyes. This is the process of writing, though it could be carried out by a machine using an algorithm that assigns symbols according to transformational rules. The sequence of symbols could be generated so as to have low information content in the sense of surprise, though it might have little or no semantic content. Who knows?
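A toy version of such a machine, assuming nothing but an invented table of transformational rules: every transition it emits is permitted, hence low in surprise, with no semantic intent whatsoever.

    import random

    # Invented rule table: each symbol lists its permitted successors.
    # A real machine would estimate this from a corpus of novelists.
    rules = {
        "the":       ["detective", "night", "partition"],
        "detective": ["entered"],
        "entered":   ["the"],
        "night":     ["fell", "entered"],
        "fell":      ["the"],
        "partition": ["moved"],
        "moved":     ["the"],
    }

    def write(start, length, seed=0):
        """Block symbols into an ever-growing string by rule alone."""
        rng = random.Random(seed)
        out = [start]
        for _ in range(length - 1):
            out.append(rng.choice(rules[out[-1]]))
        return " ".join(out)

    print(write("the", 12))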

Now consider how information and thermodynamics are related via entropy. Here is an example taken from Karl Petersen's Ergodic Theory: consider a box with a partition far to one side and a particle trapped in the small space, so you have a good idea of where the particle is. Now assume the partition can be moved by the particle's bombardment. Apply a constant heat source to the particle so that it constantly bashes itself against the sliding partition, gradually pushing it toward the other side. By bashing the wall, the particle does work, and so must reinvigorate itself with heat, thereby raising entropy. The increased entropy entails a loss of information, since as the partition moves you know less and less about where the particle resides. Uncertainty and disorder have increased together.
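The standard bookkeeping behind this picture (a textbook identity, not Petersen's own derivation): for a single particle at constant temperature whose accessible volume grows from V_1 to V_2, the isothermal relations read

    \Delta S = k \ln \frac{V_2}{V_1} > 0,
    \qquad
    \Delta H = \log_2 \frac{V_2}{V_1} \ \text{bits},

so the thermodynamic entropy gained and the extra bits needed to pin down the particle's position are the same quantity in different units, related by a factor of k ln 2 per bit.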

Consider a novelist bashing itself against a partition to produce words within a format as discussed above, but now according to some algorithm not so carefully delineated as the computer's. The algorithm is hard-boiled, or cozy mystery, or science fiction (something already shown not to exist outside scientific theory), or chick lit, or erotica, or literary fiction, and so on ad nauseam. It is based on the work of some other novelist, usually by way of worshipful emulation or of ceremonial magic, in Y's hope that aping an X with a lucrative following will bestow pecuniary success on Y. Now the partition doesn't move, or perhaps moves an infinitesimal bit, so that any rise in entropy is minuscule. The publishing world overloads the reading world with bloated dross, the output of myriad novelists in myriad partitions, exact duplicates of one another, Y and X or X and Y, no matter the order. (Mathematically speaking, the publishing industry is said to commute.) Entropy decreases; no surprises, nothing new allowed. The process instantiates its algorithm in constrained humans, optimized to support the mythos of the corporacracy, writers' workshops and writing programs teaching the canons of literary reputability that define the proper partition.

This is cold death, information death. The publishing world fills with pretentious information zombies, the writing dead. Better literature would be produced by random novelists. At least then there would be surprises.

Information-based literary criticism

The astute reader objects that the above is nothing more than metaphor. But information theory is not metaphor, not in any sense. Perhaps its use above seems metaphorical, but it can be made precise. The one slight difficulty is the notion of semantics, which is taken here as a primitive for literary use and can be stretched. Kathy Acker comes to mind. (Exercise: make the above discussion precise. Use Kathy Acker.)

Take as a precept that literature, here meaning novels or short stories of higher entertainment intended for those few able to grasp, profit from, and enjoy strings of words requiring cognitive enterprise, ought to raise the entropy level. The reason is that otherwise little to no information makes its way into the biocomputer perusing the strings. The creature dashing about inside the box needs to move the partition, widening its world, optimally breaking out of the box altogether. The peruser, not knowing whence come the strings, sprawls before the river of uncertainty opened in its imagining center. A mind is opened to a novel vision of the noumena. Entropy rises, a step toward the ah-ha.

The goal of literature being to open new visions by providing new worlds, to break through the standard vision of the machine, uncertainty must increase: more information becomes available, and the vision of reality, of certainty, shatters. Uncertainty of vision increases.

This is the opposite of religion and other forms of superstition, including dogmatic social science, which seek to narrow vision by providing certainty. They lower entropy, the end being cold information death, frozen in a static world vision. Science might seem to have a similar goal, a mistaken assumption these organized superstitions make, but the fact that science can throw away a theory in the face of conflicting evidence says the opposite. Science is a form of fiction that raises entropy by expanding the horizon, allowing startling new visions of reality. Quantum theory trumps any so-called "science fiction" in this regard. It is itself a higher form of science fiction, as is general relativity, expanding horizons and opening new potential world views. Entropy increases with science, and anything that takes lowering entropy to be scientific method misses the point of scientific method.

The method is to disprove, not prove. Nothing is proven by experiment; one repeatable failure destroys a theory forever. Not so in economics or sociology or religion.

The duality of information and entropy provides a basis for comparative literature. Notions such as mutual entropy and mutual information, conditional entropy and conditional information, can be defined and used to make such comparisons precise. This awaits further work.
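For reference, the standard definitions such a program would build on (Shannon's, nothing specific to literature): with two symbol sources X and Y,

    H(X \mid Y) = -\sum_{x,\,y} p(x, y) \log_2 p(x \mid y),
    \qquad
    I(X; Y) = H(X) - H(X \mid Y),

so two novels modeled as sources share I(X;Y) bits: near zero for genuinely independent voices, maximal for the exact duplicates of the partitioned publishing world.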

Poor Slothrop

Pynchon's Slothrop in Gravity's Rainbow undergoes continuous entropy increase. With every new uncovering of the code symbols surrounding his place in the cosmos he loses information, the interconnections growing boundless; the more he learns, the more uncertain he becomes, until he disintegrates. Meanwhile, all those entangled with him suffer a slow freeze, entropy decreasing until they are immobilized without possibility of surprise.