Stanford geneticist
Gerald Crabtree created a mild controversy recently with his thesis that the
living conditions of the modern world do not select for intelligence, and that people are, in evolutionary terms, becoming more stupid.
http://bmi205.stanford.edu/_media/crabtree-2.pdf
Health, he claims, and
resistance to illness have become more advantageous.
He uses the decline in
the olfactory sense as an analogy: with the rise of bipedalism, long ago, our sense of smell was radically diminished. In more recent times, the move from nomadic to agricultural societies reduced the selective pressure on our more active hunter-gatherer intelligence. For life in administered urban societies, in
short, the gifts of reason aren’t as important as our powers of endurance.
The rise of a phenomenon
like marathon running would seem to support him. What’s the point, really?
And is it smart—or
even healthy?
The
inspiration comes from Plutarch’s romanticized version of the
defeat of the
Persians at Marathon. The story goes that a Greek
courier delivered the news
to Athens, on foot (approximately 25
miles), pronounced the word νενικήκαμεν, a
verbal form related
to νίκη (nike, or victory),
and promptly dropped dead on the spot.
Far
from futile, the deed seems lofty, noble—highly unlikely!—and
characteristically Greek.
For us, however, there’s
nothing particularly beautiful or inspiring about long-distance running. On
the contrary: much more searing are the images of people breaking down and
staggering, like mad cows, to the finish line.
For the record, the
ancient Greeks never competed in “marathons.”
Quoting
Crabtree: "I would be willing to wager that if an average
citizen of Athens of 1,000 BC (sic) were to appear suddenly among us, he or she would be
among the brightest and most intellectually alive of our colleagues and
companions."
The
reaction to his findings is interesting on a number of levels. Many of his
peers have discredited his methods, but more telling, perhaps, are the
objections to his narrative.
In
general these can be summed up in the author’s alleged undervaluation of
modern accomplishments like writing, computers and space
exploration—i.e., technology.
In
defense of Crabtree's thesis, we might well question whether these and similar advances have made us smarter.
They’ve
certainly made us more sedentary, and effectively severed us from much
first-hand knowledge of an earlier more bio-diverse world—at the same time
seducing us with their own, highly-varied, often compelling, but in the end
merely spectral cosmoi: the
word-made-flesh of the Book; the virtual reality of the screen; and lastly, the
manic dreams of our physicists, facilitated by ever more powerful,
precisely-calibrated instruments of measurement and (mostly imagined) transport.
Any
meaningful discussion of the question of stupidity would need to confront some
of the conceptual antinomies built into our very definition of intelligence—as, for
example, the unquestioned value of literacy. Literacy has long been taken as an unambiguous marker of civilized advancement, yet one would surely be hard-pressed to argue that its cause is promoted by the endlessly proliferating toys of the tele-sphere.
If not
dumber, then surely different.
Fellow
geneticist Steve Jones, among many others, is harshly dismissive of Crabtree’s
findings: “Never mind the hypothesis,
give me the data, and there aren't any. I could just as well argue that mutations have
reduced our aggression, our depression and our penis length but no journal
would publish that. Why do they publish this?”
Who
gets to speak for
evolution? And on what terms? How much data is needed to substantiate a
narrative that, in this case at least (i.e., that environment plays some kind of discernible role in genetic make-up), seems intuitively sound?
Are
Crabtree’s critics proving his point?
Hard-bitten
techno-drones have always been skeptical—often rightly—of universalizing
accounts, postponing a story which in the end they never get around to
telling.
Much of
our best science, however, has grown precisely out of over-reach: the mad
quests of alchemy, for example. Or astrology.
The
eleven dimensions of string theory?
Though
data may sometimes tell stories, they never ask questions.
Do we
still know how to make wishes?
Observable
reality exerts a constant force on the hermetically-sealed world of pure
research.
It’s no
surprise that experimental conclusions, more often than not, support the world
view of the politico-cultural milieu that (like Foucault’s dispositif) both grounds and informs them.
Much of
the contemporary research on infants has, predictably, become a frontier for
essentializing propaganda missions: are people naturally selfish? Or are we
more inclined to share?
The winner gets to score cheap political points, without alienating their donor base.
And
what a dull question!
As with our stunted discourses on homosexuality, or addiction, or cancer, or crime, we assume we've backed the question into a corner by asking whether some condition is genetic or rooted in environment; but our view of the latter is so limited, our understanding of what is "natural" so shallow and uncritical, that we miss the chance to ask truly penetrating questions about our deeper relationship to the world and our place in nature.
One of the most disturbing developments of the first decade of the war on terror has been the nakedly ideological attempt to normalize our toxic, repressive, fear-based relationship to a huge portion of the world's population.
Who
gets to speak for the in-fant (i.e.,
the non-speaker), within the hyper-paranoid and self-righteous regimes of the
New Normal?
Babies,
to their credit, and for better or worse, continue to absorb and process our
aspirational models (both conscious and unconscious) in marvelously
unpredictable ways. Not so much mirrors as deep monadic entelechies—tiny
windows onto infinite horizons—our babies articulate, through growth, responses
to questions which we ourselves are incapable of even expressing.
If
Crabtree is correct, and if ontogeny recapitulates phylogeny (i.e., if the
history of the species can be re-read in the developmental phases of the individual), we
should, in theory, be able to discern (if not graph) the decline in
intelligence of which he speaks.
Our
question might then become: in what ways do we actively
limit the intellectual growth of our children?
This, as they say (all irony intended), is a no-brainer: adulthood has always been something of a farce—at least for men!
The chief pain that a
man normally suffers in his progress through this vale is that of
disillusionment; the chief pain that a woman suffers is that of parturition.
There is enormous significance in the difference. The first is artificial and
self-inflicted; the second is natural and unescapable.—H.L. Mencken
For women, historically, growing up has been
less farce than tragedy—suffice to say that, even in the more economically
advanced nations, formal education is a relatively recent acquisition. One
still hears the word co-ed used, if less frequently.
Female intelligence has
long been restricted, as a matter of public policy, a woman’s development often
curtailed as severely as that of any slave—using much of the same reasoning in
fact, i.e., from “nature.”
Mencken is more on the
mark with respect to men. The terms of male maturity have forever been couched
in a language of idealism, and specifically, the need to disabuse oneself of
its indulgences: men have been repeatedly admonished to face up to the “real world,”
which by definition shares little in common with what one was taught in school.
War and heroism (especially), but also democracy, marriage, and professional life. The
university has traditionally been conceived as a kind of shelter, a time to
consider the big questions before getting down to business, raising a family,
etc.
This is obviously more
true in America, but even in Augustan Rome, the poet Ovid was (unsuccessfully)
badgered by an overbearing father into a law career.
The more Greek, philosophical model, a life devoted to learning or poetry, the vita contemplativa held up by Aristotle as most worthy of a man (as being most like that of the gods), was for most Romans already something decidedly unserious.
And the Epicureans, among others, had preached withdrawal from public affairs as “damaging to the soul.”
The ancient world’s aristocratic bias, and a love for festivals (there were as many as 180 per year on the calendar in late republican Rome) took much of the edge off the blunting effects of that collective drudgery we call work.
While hardly an endorsement of abstract intellectual pursuits, the Roman view nonetheless universally recognized that virtue was something beyond the reach of the laborer—a statement that could stand as the fundamental credo of any aristocratic worldview.
There was no attempt to
ennoble brute effort—on the contrary, the slave’s world (and to a lesser
extent, the mentality of the prosperous freed slave, or libertus) was equated with
that of the child.
The demeaning use of the
word “boy” in reference to black males, a term I still heard as a child,
reflects some of this ancient attitude.
In education, our
national squeamishness about the obvious need for vocational training has
exerted a strong downward pull on our economy for generations.
The European model,
where the top 10% go to Gymnasium, or the Lycée,
and the rest to technical training, seems excessively cruel to Americans, I
suppose, but it’s certainly more practical.
And so, predictably
(most notably in the last twenty-five years), following Freud’s logic of the
return of the repressed, our universities, in response to mounting economic
pressures, have found themselves becoming increasingly . . . vocational!
The fundamental terms of
this discussion though have been with us since the dawn of philosophy, if not
civilization itself—the practical as opposed to the theoretical,
doing vs. contemplating.
Child psychologist
Alison Gopnik suggests, reasonably enough, that babies are constantly testing
out various hypotheses on their world, applying a highly sophisticated
version of the scientific method—infants are, she claims, our most theoretical
members, the “R&D division of the human species.”
Growing up, moving out,
dealing with the pressing exigencies of the real world—all of this is first rehearsed, as it were, by the child’s crossing of that all-important
threshold into consciousness, language, and experience.
And this crossing, too,
has its own model, in the “experience” of being born.
Liberating, no doubt, and often exhilarating, the passage into adulthood nonetheless has a fundamental, irreducibly traumatic moment built into it.
At the same time, and deriving from childhood innocence, all growing-up involves the most radical solicitation of the suppositions and pretenses of a given socio-cosmic
order.
As Dostoevsky puts it: children
are bold before the face of the Lord.
Posing, above all,
perhaps, that question that no God has ever been able to answer: why do
children have to suffer?
Like children, cultures
find it hard to breathe the rarefied air of theological speculation for very
long, but all reflect in their deepest strata on the pathos involved in
delivering young adults over to a cruel and often merciless world.
This pathos can turn
pathological: like a certain Father who sacrificed his Son—to save the world!
For Dostoevsky, according to Susan McReynolds, this transcendent plan of the Bible's "merchant God" reflects a troubling, even cynical, kind of economic exchange: an "eternal harmony . . . constructed through calculations of what can be purchased with innocent suffering."
As I write this, the news arrives that a gunman has killed twenty-six people, twenty of them children, at an elementary school in Newtown, Connecticut.
If we are not on the side of those whom society wastes in order to reproduce itself, asks Kristeva, then where are we?
In Fellini’s La
Dolce Vita, the intellectual Steiner kills himself and his two children,
fearing for the fate of their souls in a mechanized and materialistic post-war
Italy.
Euripides’ Medea likewise kills her children, as a means of revenge on her faithless husband Jason.
Children have a fairly
rough time of it in Greek myth, to say the least. Tales of child cannibalism
(inadvertent and otherwise) abound.
Phrixus and Helle are saved from sacrifice, at Zeus’ bidding, by a golden ram sent by their mother Nephele. They make their escape (not unlike Isaac) by means of a substitute: the ram, which is killed in their place.
Niobe’s fourteen children are killed by the Latonides, Apollo and Artemis, who in art are often shown resting peacefully with their mother.
Historically speaking,
war and violence have always been the domain of children, most profoundly at
Sparta, where boys were spirited off at the age of seven to a life of
trial and training.
Plutarch has a Spartan mother
tell her child, laconically, as he leaves for war (with reference to his
shield): (Come home) with it, or on it.
In the Old Testament,
the figure of Moloch is associated with propitiatory child sacrifice, a
practice which is fairly well-documented among the Semitic Carthaginians.
No cultures, though,
practiced immolation as fervidly or as religiously (!) as the Aztecs and Mayans.
In all instances, the
victim's degree of innocence seems to add special meaning to the ritual, an
effective way to appease the blood-lust of a savage and sadistic deity.
Roman games are more or
less secular, with just a hint of religious ceremony, but offer more bloodshed
than any analogous practice, religious or otherwise, of the so-called barbaric
cultures.
With Western civilization come (eventually!) the bloodless sacrifice of the Christian liturgy and the cathartic
spectacles of Greek tragedy. War, though, continues apace, right down to the
technological “precision” of our multiple contemporary drone offensives, the
slaughtering of innocents in Gaza, Pakistan, Yemen, Somalia, and elsewhere.
Palestinian children have, since the beginning of the Second Intifada, been killed at the rate of one every three days.
Unfortunately, violence
seems built into our DNA. Eradication is not an option. Exploration of “root
causes” goes nowhere, fairly quickly. And not just for political reasons.
Having little choice, we
send our children out, into the wide expanses. The threats that most will continue to
face, all over the world, are more truly frightening than any terrorist, or
monster under the bed.
"Success" in this context, such as it is, goes more and more to the clones, those beings built to mask, or mimic, the brutal laws of an increasingly algorithmic mechanism of prosperity, exclusion, expropriation, and victory.
We don’t identify with the virginal victims, any more than we ever did. Not that we could, if we wanted to.
http://www.wired.com/opinion/2012/12/the-next-warfare-domain-is-your-brain/
http://www.spiegel.de/international/world/pain-continues-after-war-for-american-drone-pilot-a-872726.html
