The Center for Tactical Magic strikes again

Anarchists in the Aisles? Stores Offer Unwitting Stage

By IAN URBINA
December 24, 2007 New York Times

This is the season of frenetic shopping, but for a devious few people it’s also the season of spirited shopdropping.

Otherwise known as reverse shoplifting, shopdropping involves surreptitiously putting things in stores, rather than illegally taking them out, and the motivations vary.

Anti-consumerist artists slip replica products packaged with political messages onto shelves while religious proselytizers insert pamphlets between the pages of gay-and-lesbian readings at book stores.

Self-published authors sneak their works into the “new releases” section, while personal trainers put their business cards into weight-loss books, and aspiring professional photographers make homemade cards — their Web site address included, of course — and covertly plant them into stationery-store racks.

“Everyone else is pushing their product, so why shouldn’t we?” said Jeff Eyrich, a producer for several independent bands, who puts stacks of his bands’ CDs — marked “free” — on music racks at Starbucks whenever the cashiers look away.

Though not new, shopdropping has grown in popularity in recent years, especially as artists have gathered to swap tactics at Web sites like Shopdropping.net, and groups like the Anti-Advertising Agency, a political art collective, do training workshops open to the public.

Retailers fear the practice may annoy shoppers and raise legal or safety concerns, particularly when it involves children’s toys or trademarked products.

“Our goal at all times is to provide comfortable and distraction-free shopping,” said Bethany Zucco, a spokeswoman for Target. “We think this type of activity would certainly not contribute to that goal.” She said she did not know of any shopdropping at Target stores.

But Packard Jennings does. An artist who lives in Oakland, Calif., he said that for the last seven months he had been working on a new batch of his Anarchist action figures, which he began shopdropping this week at Target and Wal-Mart stores in the San Francisco Bay Area.

“When better than Christmas to make a point about hyper-consumerism?” asked Mr. Jennings, 37, whose action figure comes with tiny accessories including a gas mask, bolt cutter, and two Molotov cocktails, and looks convincingly like any other doll on most toy-store shelves. By putting the figure in stores and filming shoppers as they try to buy it and interact with store clerks, Mr. Jennings said, he hoped to show that even radical ideology gets commercialized. He said that for safety reasons he retrieves the figures before customers take them home.

At Powell’s Books in Portland, Ore., religious groups have been hitting the magazines in the science section with fliers featuring Christian cartoons, while their adversaries have been moving Bibles from the religion section to the fantasy/science-fiction section.

This week an arts group in Oakland, the Center for Tactical Magic, began shopdropping neatly folded stacks of homemade T-shirts into Wal-Mart and Target stores in the San Francisco Bay Area. The shirts feature radical images and slogans; one shows the faces of Karl Marx, Che Guevara and Mikhail Bakunin, a Russian anarchist, above the words “Peace on Earth. After we overthrow capitalism.”

“Our point is to put a message, not a price tag, on them,” said Aaron Gach, 33, a spokesman for the group.

Mr. Jennings’s anarchist action figure met with a befuddled reaction from a Target store manager on Wednesday in El Cerrito, Calif.

“I don’t think this is a product that we sell,” the manager said as Mr. Jennings pretended to be a customer trying to buy it. “It’s definitely antifamily, which is not what Target is about.”

One of the first reports of shopdropping was in 1989, when a group called the Barbie Liberation Organization sought to make a point about sexism in children’s toys by swapping the voice hardware of Barbie dolls with that of GI Joe figures before putting the dolls back on store shelves.

Scott Wolfson, a spokesman for the federal Consumer Product Safety Commission, said he was not sure if shopdropping was illegal but that some forms of it could raise safety concerns because the items left on store shelves might not abide by labeling requirements and federal safety standards.

Ryan Watkins-Hughes, 28, a photographer from Brooklyn, teamed up with four other artists to shopdrop canned goods with altered labels at Whole Foods stores in New York City this week. “In the holidays, people get into this head-down, plow-through-the-shopping autopilot mode,” Mr. Watkins-Hughes said. “‘I got to get a dress for Cindy, get a stereo for Uncle John, go buy canned goods for the charity drive and get back home.’”

“Warhol took the can into the gallery. We bring the art to the can,” he said, adding that the labels consisted of photographs of places he had traveled combined with the can’s original bar code so that people could still buy them.

“What we do is try to inject a brief moment of wonder that helps wake them up from that rushed stupor,” he said, pausing to add, “That’s the true holiday spirit, isn’t it?”

Christopher Maag contributed reporting.

“Satire and Speculation”
by Paul Krassner
December 21, 2007

A few years ago, in my last album, right after the Abu Ghraib scandal broke, I talked about how furious senators and congressmen were, looking at such photos as a prisoner forced to wear women’s panties on his head and a naked prisoner with a dog collar attached to a leash held by a woman who is pointing at the man’s penis and laughing. Why were those legislators sputtering with such rage? Because THEY have to pay EXTRA for those services.

Now I asked Sam Leff, given his background as an anthropologist studying and writing about the hidden rituals of American sadomasochism, for his take on the CIA’s cover-up of torture videos.

“I have been watching with fascinated horror,” he said, “as America’s S/M patterns of culture have emerged into the open in the Abu Ghraib/Gitmo Bush administration. I’ve been flashing on some clear images of the fratboy reality underlying the White House torture tape controversy.

“Picture this. Bush and Karl Rove sitting around a big plasma screen (drinking beer?) and laughing their asses off watching helpless prisoners drowning under a waterboard, or naked getting cigarette burns, or maybe having analgesic balm applied to their genitals.

“Once the existence of the tapes became known, their cover story was that they were having a big discussion about whether to keep or destroy the torture tapes. Like that old pervert J. Edgar Hoover, the reality is they were getting off looking at them as sadistic porn, over and over, perhaps sharing them with the ‘frat brothers’ of their inner circle.”

Indeed, in November 2005, Garry Trudeau was queried by Editor & Publisher about his Doonesbury strip the previous Sunday which had George Bush defending the branding of Yale University fraternity initiates with a red-hot coat-hanger in 1967, and Trudeau replied that it was “Totally fact based. Bush’s comment in panel seven is a direct quote.” He was referring to the collegiate Bush saying, “Insignificant! There’s no scarring mark physically or mentally!”

Some pledges told the Yale Daily News that their branding was preceded by a physical beating. Said one: “By that time, my body was so numb [from the beatings] that the iron felt good, like a match was being held close to my body.” Bush, who was president of the fraternity, said that the resulting wound was “only a cigarette burn.” Or maybe enhanced pledging technique.

Paul Krassner is the author of One Hand Jerking: Reports From an Investigative Satirist, and publisher of the Disneyland Memorial Orgy poster, both available at paulkrassner.com.

The psychedelic secrets of Santa Claus

The Resurrection of Santa Claus (2000) by Jimmy Bursenos.

by Dana Larsen, Cannabis Culture Magazine (18 Dec, 2003)

Modern Christmas traditions are based on ancient mushroom-using shamans.

Although most people see Christmas as a Christian holiday, most of the symbols and icons we associate with Christmas celebrations are actually derived from the shamanistic traditions of the tribal peoples of pre-Christian Northern Europe.

The sacred mushroom of these people was the red and white amanita muscaria mushroom, also known as “fly agaric.” These mushrooms are now commonly seen in books of fairy tales, and are usually associated with magic and fairies. This is because they contain potent hallucinogenic compounds, and were used by ancient peoples for insight and transcendental experiences.

Most of the major elements of the modern Christmas celebration, such as Santa Claus, Christmas trees, magical reindeer and the giving of gifts, are originally based upon the traditions surrounding the harvest and consumption of these most sacred mushrooms.

The world tree

These ancient peoples, including the Lapps of modern-day Finland, and the Koyak tribes of the central Russian steppes, believed in the idea of a World Tree. The World Tree was seen as a kind of cosmic axis, onto which the planes of the universe are fixed. The roots of the World Tree stretch down into the underworld, its trunk is the “middle earth” of everyday existence, and its branches reach upwards into the heavenly realm.

The amanita muscaria mushrooms grow only under certain types of trees, mostly firs and evergreens. The mushroom caps are the fruit of the larger mycelium beneath the soil which exists in a symbiotic relationship with the roots of the tree. To ancient people, these mushrooms were literally “the fruit of the tree.”

The North Star was also considered sacred, since all other stars in the sky revolved around its fixed point. They associated this “Pole Star” with the World Tree and the central axis of the universe. The top of the World Tree touched the North Star, and the spirit of the shaman would climb the metaphorical tree, thereby passing into the realm of the gods. This is the true meaning of the star on top of the modern Christmas tree, and also the reason that the super-shaman Santa makes his home at the North Pole.

Ancient peoples were amazed at how these magical mushrooms sprang from the earth without any visible seed. They considered this “virgin birth” to have been the result of the morning dew, which was seen as the semen of the deity. The silver tinsel we drape onto our modern Christmas tree represents this divine fluid.

Reindeer games

The active ingredients of the amanita mushrooms are not metabolized by the body, and so they remain active in the urine. In fact, it is safer to drink the urine of one who has consumed the mushrooms than to eat the mushrooms directly, as many of the toxic compounds are processed and eliminated on the first pass through the body.

It was common practice among ancient people to recycle the potent effects of the mushroom by drinking each other’s urine. The amanita’s ingredients can remain potent even after six passes through the human body. Some scholars argue that this is the origin of the phrase “to get pissed,” as this urine-drinking activity preceded alcohol by thousands of years.

Reindeer were the sacred animals of these semi-nomadic people, as the reindeer provided food, shelter, clothing and other necessities. Reindeer are also fond of eating the amanita mushrooms; they will seek them out, then prance about while under their influence. Often the urine of tripped-out reindeer would be consumed for its psychedelic effects.

This effect goes the other way too, as reindeer also enjoy the urine of a human, especially one who has consumed the mushrooms. In fact, reindeer will seek out human urine to drink, and some tribesmen carry sealskin containers of their own collected piss, which they use to attract stray reindeer back into the herd.

The effects of the amanita mushroom usually include sensations of size distortion and flying. The feeling of flying could account for the legends of flying reindeer, and legends of shamanic journeys included stories of winged reindeer, transporting their riders up to the highest branches of the World Tree.

Santa Claus, super shaman

Although the modern image of Santa Claus was created at least in part by the advertising department of Coca-Cola, in truth his appearance, clothing, mannerisms and companions all mark him as the reincarnation of these ancient mushroom-gathering shamans.

One of the side effects of eating amanita mushrooms is that the skin and facial features take on a flushed, ruddy glow. This is why Santa is always shown with glowing red cheeks and nose. Even Santa’s jolly “Ho, ho, ho!” is the euphoric laugh of one who has indulged in the magic fungus.

Santa also dresses like a mushroom gatherer. When it was time to go out and harvest the magical mushrooms, the ancient shamans would dress much like Santa, wearing red and white fur-trimmed coats and long black boots.

These peoples lived in dwellings made of birch and reindeer hide, called “yurts.” Somewhat like a teepee, the yurt had a central smokehole that often doubled as an entrance. After gathering the mushrooms from under the sacred trees where they appeared, the shamans would fill their sacks and return home. Climbing down the chimney-entrances, they would share out the mushroom’s gifts with those within.

The amanita mushroom needs to be dried before being consumed; the drying process reduces the mushroom’s toxicity while increasing its potency. The shaman would guide the group in stringing the mushrooms and hanging them around the hearth-fire to dry. This tradition is echoed in the modern stringing of popcorn and other items.

The psychedelic journeys taken under the influence of the amanita were also symbolized by a stick reaching up through the smokehole in the top of the yurt. The smokehole was the portal where the spirit of the shaman exited the physical plane.

Santa’s famous magical journey, where his sleigh takes him around the whole planet in a single night, is developed from the “heavenly chariot,” used by the gods from whom Santa and other shamanic figures are descended. The chariot of Odin, Thor and even the Egyptian god Osiris is now known as the Big Dipper, which circles around the North Star in a 24-hour period.

In different versions of the ancient story, the chariot was pulled by reindeer or horses. As the animals grow exhausted, their mingled spit and blood falls to the ground, forming the amanita mushrooms.

St Nicholas and Old Nick

Saint Nicholas is a legendary figure who supposedly lived during the fourth century. His cult spread quickly, and Nicholas became the patron saint of many varied groups, including judges, pawnbrokers, criminals, merchants, sailors, bakers, travelers, the poor, and children.

Most religious historians agree that St Nicholas did not actually exist as a real person, and was instead a Christianized version of earlier Pagan gods. Nicholas’ legends were mainly created out of stories about the Teutonic god called Hold Nickar, known as Poseidon to the Greeks. This powerful sea god was known to gallop through the sky during the winter solstice, granting boons to his worshippers below.

When the Catholic Church created the character of St Nicholas, they took his name from “Nickar” and gave him Poseidon’s title of “the Sailor.” There are thousands of churches named in St Nicholas’ honor, most of which were converted from temples to Poseidon and Hold Nickar. (As the ancient pagan deities were demonized by the Christian church, Hold Nickar’s name also became associated with Satan, known as “Old Nick!”)

Local traditions were incorporated into the new Christian holidays to make them more acceptable to the new converts. To these early Christians, Saint Nicholas became a sort of “super-shaman” who was overlaid upon their own shamanic cultural practices. Many images of Saint Nicholas from these early times show him wearing red and white, or standing in front of a red background with white spots, the design of the amanita mushroom.

St Nicholas also adopted some of the qualities of the legendary “Grandmother Befana” from Italy, who filled children’s stockings with gifts. Her shrine at Bari, Italy, became a shrine to St Nicholas.

Modern world, ancient traditions

Some psychologists have discussed the “cognitive dissonance” which occurs when children are encouraged to believe in the literal existence of Santa Claus, only to have their parents’ lie revealed when they are older. By so deceiving our children we rob them of a richer heritage, for the actual origin of these ancient rituals is rooted deep in our history and our collective unconscious. By better understanding the truths within these popular celebrations, we can better understand the modern world, and our place in it.

Many people in the modern world have rejected Christmas as being too commercial, claiming that this ritual of giving is actually a celebration of materialism and greed. Yet the true spirit of this winter festival lies not in the exchange of plastic toys, but in celebrating a gift from the earth: the fruiting top of a magical mushroom, and the revelatory experiences it can provide.

Instead of perpetuating outdated and confusing holiday myths, it might be more fulfilling to return to the original source of these seasonal celebrations. How about getting back to basics and enjoying some magical mushrooms with your loved ones this solstice? What better gift can a family share than a little piece of love and enlightenment?

FURTHER LINKS AND REFERENCES:

The Hidden Meanings of Christmas, Mushrooms and Mankind, by James Arthur
Santa Claus & the Amanita Muscaria, by Jimmy Bursenos
Who put the Fly Agaric into Christmas?, Seventh International Mycological Congress, December 1999, Fungus of the Month
The Real Story of Santa, The Spore Print, Los Angeles Mycological Society, December 1998
Santa and those Reindeer: The Hallucinogenic Connection, The Physics of Christmas, by Roger Highfield
Fungi, Fairy Rings and Father Christmas, North West Fungus Group, 1998 Presidential Address, by Dr Sean Edwards
Fly Agaric, Tom Volk’s Fungus of the Month for December 1999
Father Christmas Flies on Toadstools, New Scientist, December 1986
Psycho-mycological studies of amanita: From ancient sacrament to modern phobia, by Jonathan Ott, Journal of Psychedelic Drugs; 1976
Santa is a Wildman, LA Times, Jeffrey Vallance

BOOKS WORTH READING:

Mushrooms and Mankind, by James Arthur
Soma: Divine Mushroom of Immortality, by Gordon Wasson
Mushrooms, Poisons and Panaceas, by Denis R. Benjamin

LAKOTAS SECEDE!

Descendants of Sitting Bull, Crazy Horse break away from US

December 20, 2007

WASHINGTON (AFP) — The Lakota Indians, who gave the world legendary warriors Sitting Bull and Crazy Horse, have withdrawn from treaties with the United States, leaders said Wednesday.

“We are no longer citizens of the United States of America and all those who live in the five-state area that encompasses our country are free to join us,” long-time Indian rights activist Russell Means told a handful of reporters and a delegation from the Bolivian embassy, gathered in a church in a run-down neighborhood of Washington for a news conference.

A delegation of Lakota leaders delivered a message to the State Department on Monday, announcing they were unilaterally withdrawing from treaties they signed with the federal government of the United States, some of them more than 150 years old.

They also visited the Bolivian, Chilean, South African and Venezuelan embassies, and will continue on their diplomatic mission and take it overseas in the coming weeks and months, they told the news conference.

Lakota country includes parts of the states of Nebraska, South Dakota, North Dakota, Montana and Wyoming.

The new country would issue its own passports and driving licences, and living there would be tax-free — provided residents renounce their US citizenship, Means said.

The treaties signed with the United States are merely “worthless words on worthless paper,” the Lakota freedom activists say on their website.

The treaties have been “repeatedly violated in order to steal our culture, our land and our ability to maintain our way of life,” the reborn freedom movement says.

Withdrawing from the treaties was entirely legal, Means said.

“This is according to the laws of the United States, specifically article six of the constitution,” which states that treaties are the supreme law of the land, he said.

“It is also within the laws on treaties passed at the Vienna Convention and put into effect by the US and the rest of the international community in 1980. We are legally within our rights to be free and independent,” said Means.

The Lakota relaunched their journey to freedom in 1974, when they drafted a declaration of continuing independence — an overt play on the title of the United States’ Declaration of Independence from England.

Thirty-three years have elapsed since then because “it takes critical mass to combat colonialism and we wanted to make sure that all our ducks were in a row,” Means said.

One duck moved into place in September, when the United Nations adopted a non-binding declaration on the rights of indigenous peoples — despite opposition from the United States, which said it clashed with its own laws.

“We have 33 treaties with the United States that they have not lived by. They continue to take our land, our water, our children,” Phyllis Young, who helped organize the first international conference on indigenous rights in Geneva in 1977, told the news conference.

The US “annexation” of native American land has resulted in once proud tribes such as the Lakota becoming mere “facsimiles of white people,” said Means.

Oppression at the hands of the US government has taken its toll on the Lakota, whose men have one of the shortest life expectancies — less than 44 years — in the world.

Lakota teen suicides are 150 percent above the norm for the United States; infant mortality is five times higher than the US average; and unemployment is rife, according to the Lakota freedom movement’s website.

“Our people want to live, not just survive or crawl and be mascots,” said Young.

“We are not trying to embarrass the United States. We are here to continue the struggle for our children and grandchildren,” she said, predicting that the battle would not be won in her lifetime.

Pentangle "Sweet Child" 40th Anniversary Concert at the Royal Festival Hall

from bertjansch.com:

On 29 June 2008, exactly 40 years to the day since the unique British folk/jazz ‘supergroup’ Pentangle recorded the live disc of their seminal double album, Sweet Child, at London’s Royal Festival Hall, the original band (Bert Jansch, John Renbourn, Jacqui McShee, Danny Thompson and Terry Cox) will reunite and return to the Royal Festival Hall to celebrate their legacy.

From their formation in swinging ’60s London, Pentangle were one of the most exciting and innovative groups in the world, genuinely pushing boundaries and exploring new musical avenues. Simultaneously stars of the underground and darlings of the mainstream, they enjoyed an unprecedented degree of success worldwide for an acoustic band, and their influence and musical impact are still revered and relevant today, as evidenced by the critical and commercial acclaim for The Time Has Come and their BBC Radio 2 Lifetime Achievement Award, presented in February 2007 at the BBC Folk Awards by Sir David Attenborough.

This concert is a once-in-a-lifetime opportunity for long-time fans to revisit, and for new fans to experience for the first time, the magic that is Pentangle.

Sunday 29 June 2008, Royal Festival Hall, Southbank Centre, Belvedere Road, London SE1 8XX. Doors 7.00pm. Tickets (£30/£25/£15) go on sale at 10.00am on Thursday 8 November from the Southbank Centre Ticket Office on 0871 663 2500 or http://www.southbankcentre.co.uk. (Transaction fees apply except for Southbank Centre members).

“Pentangle still rule the roost” – The Times (9 February 2007)

“Pentangle revolutionised 60s music” – MOJO (April 2007)

“The godfathers of English folk music…unquestionably the core template for today’s blooming nu-folk scene” – Metro (23 March 2007)

“Pentangle was electrifying, particularly at a time of unprecedented free thinking in music. Together they created an intoxicating instrumental force” – Jazzwise (April 2007)

“One of the most experimental and influential bands of the Sixties” – The Sun (23 February 2007)

“Pentangle rewrote the Britfolk rulebook…America has nothing to match them” – The Daily Mirror (9 March 2007)

“Britain’s Grateful Dead” – The Guardian (16 March 2007)

What we can expect as culture devolves towards Idiocracy.

Twilight of the Books: What will life be like if people stop reading?
by Caleb Crain
December 24, 2007 New Yorker

In 1937, twenty-nine per cent of American adults told the pollster George Gallup that they were reading a book. In 1955, only seventeen per cent said they were. Pollsters began asking the question with more latitude. In 1978, a survey found that fifty-five per cent of respondents had read a book in the previous six months. The question was even looser in 1998 and 2002, when the General Social Survey found that roughly seventy per cent of Americans had read a novel, a short story, a poem, or a play in the preceding twelve months. And, this August, seventy-three per cent of respondents to another poll said that they had read a book of some kind, not excluding those read for work or school, in the past year. If you didn’t read the fine print, you might think that reading was on the rise.

You wouldn’t think so, however, if you consulted the Census Bureau and the National Endowment for the Arts, who, since 1982, have asked thousands of Americans questions about reading that are not only detailed but consistent. The results, first reported by the N.E.A. in 2004, are dispiriting. In 1982, 56.9 per cent of Americans had read a work of creative literature in the previous twelve months. The proportion fell to fifty-four per cent in 1992, and to 46.7 per cent in 2002. Last month, the N.E.A. released a follow-up report, “To Read or Not to Read,” which showed correlations between the decline of reading and social phenomena as diverse as income disparity, exercise, and voting. In his introduction, the N.E.A. chairman, Dana Gioia, wrote, “Poor reading skills correlate heavily with lack of employment, lower wages, and fewer opportunities for advancement.”

This decline is not news to those who depend on print for a living. In 1970, according to Editor & Publisher International Year Book, there were 62.1 million weekday newspapers in circulation—about 0.3 papers per person. Since 1990, circulation has declined steadily, and in 2006 there were just 52.3 million weekday papers—about 0.17 per person. In January 1994, forty-nine per cent of respondents told the Pew Research Center for the People and the Press that they had read a newspaper the day before. In 2006, only forty-three per cent said so, including those who read online. Book sales, meanwhile, have stagnated. The Book Industry Study Group estimates that sales fell from 8.27 books per person in 2001 to 7.93 in 2006. According to the Department of Labor, American households spent an average of a hundred and sixty-three dollars on reading in 1995 and a hundred and twenty-six dollars in 2005. In “To Read or Not to Read,” the N.E.A. reports that American households’ spending on books, adjusted for inflation, is “near its twenty-year low,” even as the average price of a new book has increased.

More alarming are indications that Americans are losing not just the will to read but even the ability. According to the Department of Education, between 1992 and 2003 the average adult’s skill in reading prose slipped one point on a five-hundred-point scale, and the proportion who were proficient—capable of such tasks as “comparing viewpoints in two editorials”—declined from fifteen per cent to thirteen. The Department of Education found that reading skills have improved moderately among fourth and eighth graders in the past decade and a half, with the largest jump occurring just before the No Child Left Behind Act took effect, but twelfth graders seem to be taking after their elders. Their reading scores fell an average of six points between 1992 and 2005, and the share of proficient twelfth-grade readers dropped from forty per cent to thirty-five per cent. The steepest declines were in “reading for literary experience”—the kind that involves “exploring themes, events, characters, settings, and the language of literary works,” in the words of the department’s test-makers. In 1992, fifty-four per cent of twelfth graders told the Department of Education that they talked about their reading with friends at least once a week. By 2005, only thirty-seven per cent said they did.

The erosion isn’t unique to America. Some of the best data come from the Netherlands, where in 1955 researchers began to ask people to keep diaries of how they spent every fifteen minutes of their leisure time. Time-budget diaries yield richer data than surveys, and people are thought to be less likely to lie about their accomplishments if they have to do it four times an hour. Between 1955 and 1975, the decades when television was being introduced into the Netherlands, reading on weekday evenings and weekends fell from five hours a week to 3.6, while television watching rose from about ten minutes a week to more than ten hours. During the next two decades, reading continued to fall and television watching to rise, though more slowly. By 1995, reading, which had occupied twenty-one per cent of people’s spare time in 1955, accounted for just nine per cent.

The most striking results were generational. In general, older Dutch people read more. It would be natural to infer from this that each generation reads more as it ages, and, indeed, the researchers found something like this to be the case for earlier generations. But, with later ones, the age-related growth in reading dwindled. The turning point seems to have come with the generation born in the nineteen-forties. By 1995, a Dutch college graduate born after 1969 was likely to spend fewer hours reading each week than a little-educated person born before 1950. As far as reading habits were concerned, academic credentials mattered less than whether a person had been raised in the era of television. The N.E.A., in its twenty years of data, has found a similar pattern. Between 1982 and 2002, the percentage of Americans who read literature declined not only in every age group but in every generation—even in those moving from youth into middle age, which is often considered the most fertile time of life for reading. We are reading less as we age, and we are reading less than people who were our age ten or twenty years ago.

There’s no reason to think that reading and writing are about to become extinct, but some sociologists speculate that reading books for pleasure will one day be the province of a special “reading class,” much as it was before the arrival of mass literacy, in the second half of the nineteenth century. They warn that it probably won’t regain the prestige of exclusivity; it may just become “an increasingly arcane hobby.” Such a shift would change the texture of society. If one person decides to watch “The Sopranos” rather than to read Leonardo Sciascia’s novella “To Each His Own,” the culture goes on largely as before—both viewer and reader are entertaining themselves while learning something about the Mafia in the bargain. But if, over time, many people choose television over books, then a nation’s conversation with itself is likely to change. A reader learns about the world and imagines it differently from the way a viewer does; according to some experimental psychologists, a reader and a viewer even think differently. If the eclipse of reading continues, the alteration is likely to matter in ways that aren’t foreseeable.

Taking the long view, it’s not the neglect of reading that has to be explained but the fact that we read at all. “The act of reading is not natural,” Maryanne Wolf writes in “Proust and the Squid” (Harper; $25.95), an account of the history and biology of reading. Humans started reading far too recently for any of our genes to code for it specifically. We can do it only because the brain’s plasticity enables the repurposing of circuitry that originally evolved for other tasks—distinguishing at a glance a garter snake from a haricot vert, say.

The squid of Wolf’s title represents the neurobiological approach to the study of reading. Bigger cells are easier for scientists to experiment on, and some species of squid have optic-nerve cells a hundred times as thick as mammal neurons, and up to four inches long, making them a favorite with biologists. (Two decades ago, I had a summer job washing glassware in Cape Cod’s Marine Biological Laboratory. Whenever researchers extracted an optic nerve, they threw the rest of the squid into a freezer, and about once a month we took a cooler-full to the beach for grilling.) To symbolize the humanistic approach to reading, Wolf has chosen Proust, who described reading as “that fruitful miracle of a communication in the midst of solitude.” Perhaps inspired by Proust’s example, Wolf, a dyslexia researcher at Tufts, reminisces about the nuns who taught her to read in a two-room brick schoolhouse in Illinois. But she’s more of a squid person than a Proust person, and seems most at home when dissecting Proust’s fruitful miracle into such brain parts as the occipital “visual association area” and “area 37’s fusiform gyrus.” Given the panic that takes hold of humanists when the decline of reading is discussed, her cold-blooded perspective is opportune.

Wolf recounts the early history of reading, speculating about developments in brain wiring as she goes. For example, from the eighth to the fifth millennia B.C.E., clay tokens were used in Mesopotamia for tallying livestock and other goods. Wolf suggests that, once the simple markings on the tokens were understood not merely as squiggles but as representations of, say, ten sheep, they would have put more of the brain to work. She draws on recent research with functional magnetic resonance imaging (fMRI), a technique that maps blood flow in the brain during a given task, to show that meaningful squiggles activate not only the occipital regions responsible for vision but also temporal and parietal regions associated with language and computation. If a particular squiggle was repeated on a number of tokens, a group of nerves might start to specialize in recognizing it, and other nerves to specialize in connecting to language centers that handled its meaning.

In the fourth millennium B.C.E., the Sumerians developed cuneiform, and the Egyptians hieroglyphs. Both scripts began with pictures of things, such as a beetle or a hand, and then some of these symbols developed more abstract meanings, representing ideas in some cases and sounds in others. Readers had to recognize hundreds of symbols, some of which could stand for either a word or a sound, an ambiguity that probably slowed down decoding. Under this heavy cognitive burden, Wolf imagines, the Sumerian reader’s brain would have behaved the way modern brains do when reading Chinese, which also mixes phonetic and ideographic elements and seems to stimulate brain activity in a pattern distinct from that of people reading the Roman alphabet. Frontal regions associated with muscle memory would probably also have gone to work, because the Sumerians learned their characters by writing them over and over, as the Chinese do today.

Complex scripts like Sumerian and Egyptian were written only by scribal élites. A major breakthrough occurred around 750 B.C.E., when the Greeks, borrowing characters from a Semitic language, perhaps Phoenician, developed a writing system that had just twenty-four letters. There had been scripts with a limited number of characters before, as there had been consonants and even occasionally vowels, but the Greek alphabet was the first whose letters recorded every significant sound element in a spoken language in a one-to-one correspondence, give or take a few diphthongs. In ancient Greek, if you knew how to pronounce a word, you knew how to spell it, and you could sound out almost any word you saw, even if you’d never heard it before. Children learned to read and write Greek in about three years, somewhat faster than modern children learn English, whose alphabet is more ambiguous. The ease democratized literacy; the ability to read and write spread to citizens who didn’t specialize in it. The classicist Eric A. Havelock believed that the alphabet changed “the character of the Greek consciousness.”

Wolf doesn’t quite second that claim. She points out that it is possible to read efficiently a script that combines ideograms and phonetic elements, something that many Chinese do daily. The alphabet, she suggests, entailed not a qualitative difference but an accumulation of small quantitative ones, by helping more readers reach efficiency sooner. “The efficient reading brain,” she writes, “quite literally has more time to think.” Whether that development sparked Greece’s flowering she leaves to classicists to debate, but she agrees with Havelock that writing was probably a contributive factor, because it freed the Greeks from the necessity of keeping their whole culture, including the Iliad and the Odyssey, memorized.

The scholar Walter J. Ong once speculated that television and similar media are taking us into an era of “secondary orality,” akin to the primary orality that existed before the emergence of text. If so, it is worth trying to understand how different primary orality must have been from our own mind-set. Havelock theorized that, in ancient Greece, the effort required to preserve knowledge colored everything. In Plato’s day, the word mimesis referred to an actor’s performance of his role, an audience’s identification with a performance, a pupil’s recitation of his lesson, and an apprentice’s emulation of his master. Plato, who was literate, worried about the kind of trance or emotional enthrallment that came over people in all these situations, and Havelock inferred from this that the idea of distinguishing the knower from the known was then still a novelty. In a society that had only recently learned to take notes, learning something still meant abandoning yourself to it. “Enormous powers of poetic memorization could be purchased only at the cost of total loss of objectivity,” he wrote.

It’s difficult to prove that oral and literate people think differently; orality, Havelock observed, doesn’t “fossilize” except through its nemesis, writing. But some supporting evidence came to hand in 1974, when Aleksandr R. Luria, a Soviet psychologist, published a study based on interviews conducted in the nineteen-thirties with illiterate and newly literate peasants in Uzbekistan and Kyrgyzstan. Luria found that illiterates had a “graphic-functional” way of thinking that seemed to vanish as they were schooled. In naming colors, for example, literate people said “dark blue” or “light yellow,” but illiterates used metaphorical names like “liver,” “peach,” “decayed teeth,” and “cotton in bloom.” Literates saw optical illusions; illiterates sometimes didn’t. Experimenters showed peasants drawings of a hammer, a saw, an axe, and a log and then asked them to choose the three items that were similar. Illiterates resisted, saying that all the items were useful. If pressed, they considered throwing out the hammer; the situation of chopping wood seemed more cogent to them than any conceptual category. One peasant, informed that someone had grouped the three tools together, discarding the log, replied, “Whoever told you that must have been crazy,” and another suggested, “Probably he’s got a lot of firewood.” One frustrated experimenter showed a picture of three adults and a child and declared, “Now, clearly the child doesn’t belong in this group,” only to have a peasant answer:

Oh, but the boy must stay with the others! All three of them are working, you see, and if they have to keep running out to fetch things, they’ll never get the job done, but the boy can do the running for them.

Illiterates also resisted giving definitions of words and refused to make logical inferences about hypothetical situations. Asked by Luria’s staff about polar bears, a peasant grew testy: “What the cock knows how to do, he does. What I know, I say, and nothing beyond that!” The illiterates did not talk about themselves except in terms of their tangible possessions. “What can I say about my own heart?” one asked.

In the nineteen-seventies, the psychologists Sylvia Scribner and Michael Cole tried to replicate Luria’s findings among the Vai, a rural people in Liberia. Since some Vai were illiterate, some were schooled in English, and others were literate in the Vai’s own script, the researchers hoped to be able to distinguish cognitive changes caused by schooling from those caused specifically by literacy. They found that English schooling and English literacy improved the ability to talk about language and solve logic puzzles, as literacy had done with Luria’s peasants. But literacy in Vai script improved performance on only a few language-related tasks. Scribner and Cole’s modest conclusion—“Literacy makes some difference to some skills in some contexts”—convinced some people that the literate mind was not so different from the oral one after all. But others have objected that it was misguided to separate literacy from schooling, suggesting that cognitive changes came with the culture of literacy rather than with the mere fact of it. Also, the Vai script, a syllabary with more than two hundred characters, offered nothing like the cognitive efficiency that Havelock ascribed to Greek. Reading Vai, Scribner and Cole admitted, was “a complex problem-solving process,” usually performed slowly.

Soon after this study, Ong synthesized existing research into a vivid picture of the oral mind-set. Whereas literates can rotate concepts in their minds abstractly, orals embed their thoughts in stories. According to Ong, the best way to preserve ideas in the absence of writing is to “think memorable thoughts,” whose zing insures their transmission. In an oral culture, cliché and stereotype are valued, as accumulations of wisdom, and analysis is frowned upon, for putting those accumulations at risk. There’s no such concept as plagiarism, and redundancy is an asset that helps an audience follow a complex argument. Opponents in struggle are more memorable than calm and abstract investigations, so bards revel in name-calling and in “enthusiastic description of physical violence.” Since there’s no way to erase a mistake invisibly, as one may in writing, speakers tend not to correct themselves at all. Words have their present meanings but no older ones, and if the past seems to tell a story with values different from current ones, it is either forgotten or silently adjusted. As the scholars Jack Goody and Ian Watt observed, it is only in a literate culture that the past’s inconsistencies have to be accounted for, a process that encourages skepticism and forces history to diverge from myth.

Upon reaching classical Greece, Wolf abandons history, because the Greeks’ alphabet-reading brains probably resembled ours, which can be readily put into scanners. Drawing on recent imaging studies, she explains in detail how a modern child’s brain wires itself for literacy. The ground is laid in preschool, when parents read to a child, talk with her, and encourage awareness of sound elements like rhyme and alliteration, perhaps with “Mother Goose” poems. Scans show that when a child first starts to read she has to use more of her brain than adults do. Broad regions light up in both hemispheres. As a child’s neurons specialize in recognizing letters and become more efficient, the regions activated become smaller.

At some point, as a child progresses from decoding to fluent reading, the route of signals through her brain shifts. Instead of passing along a “dorsal route” through occipital, temporal, and parietal regions in both hemispheres, reading starts to move along a faster and more efficient “ventral route,” which is confined to the left hemisphere. With the gain in time and the freed-up brainpower, Wolf suggests, a fluent reader is able to integrate more of her own thoughts and feelings into her experience. “The secret at the heart of reading,” Wolf writes, is “the time it frees for the brain to have thoughts deeper than those that came before.” Imaging studies suggest that in many cases of dyslexia the right hemisphere never disengages, and reading remains effortful.

In a recent book claiming that television and video games were “making our minds sharper,” the journalist Steven Johnson argued that since we value reading for “exercising the mind,” we should value electronic media for offering a superior “cognitive workout.” But, if Wolf’s evidence is right, Johnson’s metaphor of exercise is misguided. When reading goes well, Wolf suggests, it feels effortless, like drifting down a river rather than rowing up it. It makes you smarter because it leaves more of your brain alone. Ruskin once compared reading to a conversation with the wise and noble, and Proust corrected him. It’s much better than that, Proust wrote. To read is “to receive a communication with another way of thinking, all the while remaining alone, that is, while continuing to enjoy the intellectual power that one has in solitude and that conversation dissipates immediately.”

Wolf has little to say about the general decline of reading, and she doesn’t much speculate about the function of the brain under the influence of television and newer media. But there is research suggesting that secondary orality and literacy don’t mix. In a study published this year, experimenters varied the way that people took in a PowerPoint presentation about the country of Mali. Those who were allowed to read silently were more likely to agree with the statement “The presentation was interesting,” and those who read along with an audiovisual commentary were more likely to agree with the statement “I did not learn anything from this presentation.” The silent readers remembered more, too, a finding in line with a series of British studies in which people who read transcripts of television newscasts, political programs, advertisements, and science shows recalled more information than those who had watched the shows themselves.

The antagonism between words and moving images seems to start early. In August, scientists at the University of Washington revealed that babies aged between eight and sixteen months know on average six to eight fewer words for every hour of baby DVDs and videos they watch daily. A 2005 study in Northern California found that a television in the bedroom lowered the standardized-test scores of third graders. And the conflict continues throughout a child’s development. In 2001, after analyzing data on more than a million students around the world, the researcher Micha Razel found “little room for doubt” that television worsened performance in reading, science, and math. The relationship wasn’t a straight line but “an inverted check mark”: a small amount of television seemed to benefit children; more hurt. For nine-year-olds, the optimum was two hours a day; for seventeen-year-olds, half an hour. Razel guessed that the younger children were watching educational shows, and, indeed, researchers have shown that a five-year-old boy who watches “Sesame Street” is likely to have higher grades even in high school. Razel noted, however, that fifty-five per cent of students were exceeding their optimal viewing time by three hours a day, thereby lowering their academic achievement by roughly one grade level.

The Internet, happily, does not so far seem to be antagonistic to literacy. Researchers recently gave Michigan children and teen-agers home computers in exchange for permission to monitor their Internet use. The study found that grades and reading scores rose with the amount of time spent online. Even visits to pornography Web sites improved academic performance. Of course, such synergies may disappear if the Internet continues its YouTube-fuelled evolution away from print and toward television.

No effort of will is likely to make reading popular again. Children may be browbeaten, but adults resist interference with their pleasures. It may simply be the case that many Americans prefer to learn about the world and to entertain themselves with television and other streaming media, rather than with the printed word, and that it is taking a few generations for them to shed old habits like newspapers and novels. The alternative is that we are nearing the end of a pendulum swing, and that reading will return, driven back by forces as complicated as those now driving it away.

But if the change is permanent, and especially if the slide continues, the world will feel different, even to those who still read. Because the change has been happening slowly for decades, everyone has a sense of what is at stake, though it is rarely put into words. There is something to gain, of course, or no one would ever put down a book and pick up a remote. Streaming media give actual pictures and sounds instead of mere descriptions of them. “Television completes the cycle of the human sensorium,” Marshall McLuhan proclaimed in 1967. Moving and talking images are much richer in information about a performer’s appearance, manner, and tone of voice, and they give us the impression that we know more about her health and mood, too. The viewer may not catch all the details of a candidate’s health-care plan, but he has a much more definite sense of her as a personality, and his response to her is therefore likely to be more full of emotion. There is nothing like this connection in print. A feeling for a writer never touches the fact of the writer herself, unless reader and writer happen to meet. In fact, from Shakespeare to Pynchon, the personalities of many writers have been mysterious.

Emotional responsiveness to streaming media harks back to the world of primary orality, and, as in Plato’s day, the solidarity amounts almost to a mutual possession. “Electronic technology fosters and encourages unification and involvement,” in McLuhan’s words. The viewer feels at home with his show, or else he changes the channel. The closeness makes it hard to negotiate differences of opinion. It can be amusing to read a magazine whose principles you despise, but it is almost unbearable to watch such a television show. And so, in a culture of secondary orality, we may be less likely to spend time with ideas we disagree with.

Self-doubt, therefore, becomes less likely. In fact, doubt of any kind is rarer. It is easy to notice inconsistencies in two written accounts placed side by side. With text, it is even easy to keep track of differing levels of authority behind different pieces of information. The trust that a reader grants to the New York Times, for example, may vary sentence by sentence. A comparison of two video reports, on the other hand, is cumbersome. Forced to choose between conflicting stories on television, the viewer falls back on hunches, or on what he believed before he started watching. Like the peasants studied by Luria, he thinks in terms of situations and story lines rather than abstractions.

And he may have even more trouble than Luria’s peasants in seeing himself as others do. After all, there is no one looking back at the television viewer. He is alone, though he, and his brain, may be too distracted to notice it. The reader is also alone, but the N.E.A. reports that readers are more likely than non-readers to play sports, exercise, visit art museums, attend theatre, paint, go to music events, take photographs, and volunteer. Proficient readers are also more likely to vote. Perhaps readers venture so readily outside because what they experience in solitude gives them confidence. Perhaps reading is a prototype of independence. No matter how much one worships an author, Proust wrote, “all he can do is give us desires.” Reading somehow gives us the boldness to act on them. Such a habit might be quite dangerous for a democracy to lose.

Free New Balances for everybody!: Inside Victory Records, the No Limit of punk rock


Hollow Victory

Bands accuse the “anti-corporate” label of being corporate at its worst.

By Denise Grollmus

October 3, 2007 Cleveland Scene

This was it — the moment Hawthorne Heights had been waiting for.

It was 2003 and the five Dayton natives were about to take the stage for a Victory Records showcase in Chicago. Over the previous two years, Hawthorne Heights had been living like most fledgling bands, working minimum-wage convenience-store jobs between grueling tours in a battered van, barely compensated for performing their emo anthems in basements and half-empty clubs across the country. They would return to Ohio tired and broke, without a nibble of interest from record labels, agents, or managers.

After months of sending off demos and bombarding industry types with e-mails, Tony Brummel, owner of Victory Records, finally gave them a chance.

It was divine vindication for the earnest rockers. Victory was the pinnacle of indie cool, the nation’s second-largest independent label. Its roster read like the who’s who of modern American hardcore, from Bad Brains to Hatebreed, Earth Crisis to Grade. Brummel had become the official spokesman for angsty teens everywhere, constantly railing against the evils of “faceless” labels with a Maoist urgency. “Victory is you, it’s me, it is the street, the music,” he reminded his loyal followers. “. . . [It] cannot be bought or sold. You either embrace it or get the hell out of the way.”

For the pierced and inked, the label was the embodiment of what was right, pure, real.

The band took the stage before a handful of Victory staffers, who were dressed in their best black hoodies and studded belts. The group’s three guitarists shredded into a cacophony of fast, screechy riffs as singer J.T. Woodruff transitioned between flat-out screaming and painfully honest poetry. The crowd stood with arms crossed, says a former staffer. They never showed too much enthusiasm at these events, lest they unnecessarily lift the band’s hopes.

After the show, the Daytonites packed up and headed home, with only the promise of a phone call.

One former staffer remembers driving back from the showcase with Brummel. “All he kept saying was that they’re from a small town in Ohio, untainted by all the industry bullshit,” says the ex-staffer, who asked that his name not be used for fear of endangering his current job. “That was the most outlying aspect of the band for him — their naivety and purity.”

Brummel soon signed the group to a four-record deal. The next year, The Silence in Black and White was released.

As the band hit the road nonstop, touring with acts like Fall Out Boy, Victory pumped millions of dollars into marketing the record. Commercials for the album aired as frequently on MTV as ads for Fructis shampoo. Brummel paid handsomely for special promotions, making sure it was the most visible CD in record stores across the country. He was investing the kind of money reserved for major-label powers like Justin Timberlake and Aerosmith — not unknowns on an indie that touts its “anti-corporate” sensibilities.

It worked. The Silence in Black and White sold over 800,000 copies and sat on the Billboard 200 for 60 weeks, a feat unheard of for an indie act. It was Victory’s best-selling debut. Hawthorne Heights had gone from slinging cigs at a Dayton convenience store to being adored by 14-year-old girls everywhere.

When the band’s follow-up record, If Only You Were Lonely, came out in 2006, Victory pushed even harder. The CD debuted at No. 3 on Billboard. The band’s van was quickly replaced with a decked-out tour bus.

But beneath the newfound stardom festered a less jubilant tale.

Last year, the band posted a “manifesto” on its MySpace page, announcing that it was leaving Victory “in part due to the actions of the man who sits at the head of the label, Tony Brummel.” Victory’s owner, the band asserted, “cares more about his ego and bank account than the bands themselves.”

Hawthorne Heights complained that Brummel hadn’t paid a cent in royalties, despite selling 1.2 million records. The group also claimed that Brummel’s aggressive marketing schemes had tarnished its image. It described working with Brummel as “being in an abusive relationship” in which he constantly threatened to cut off promotion of their records if the band questioned his moves. “We were afraid, as many of the bands on Victory are, to stick our neck out for fear of being ‘beaten,’” the manifesto said.

Success quickly devolved into a lawyer fight that’s still being waged today. Hawthorne Heights is seeking $1 million in damages, accusing Brummel of not simply withholding its royalties, but of “heavy-handed, overly aggressive, unethical and illegal schemes and tactics.”

Brummel has dismissed the band’s claims, stating that the case is “really just about greed,” according to court documents. But Hawthorne Heights isn’t the only band to have rebelled against Brummel. Many of Victory’s best-sellers — including Taking Back Sunday, Atreyu, Hatebreed, and Thursday — left the label after bitter fights over alleged unpaid royalties.

And it’s not just bands that say Brummel has become the corporate archvillain he so publicly loathes. Former employees speak of the Victory owner as a control freak prone to unhinged outbursts.

“There was an air of creepy big-brother surveillance,” says Kristin Bustamante, a former Victory saleswoman. “He bred a culture of fear in his employees. You were scared to leave and scared to stand up. It’s like an abusive relationship. It was, bar none, the worst experience of my life.”

Adds another former staffer: “All I can say is thank God I wasn’t in one of his bands.”