September 30, 2011

The Wall Street Protests

It began on September 17, and has now run for thirteen days. There are, perhaps, only a couple of hundred protesters, but young people especially have shown up to protest the greed and corruption endemic to the American economy, epitomized by Wall Street financiers. The mainstream news media have given them almost no coverage; the NYPD has overreacted with predictable violence. Many have been arrested and hauled away, but the protest continues. Unions have pledged their support, and the movement is now spreading to other cities across the nation. Finally, it would seem, someone is standing up and saying No. Even more, “Go fuck yourself.”

I agree with Chris Hedges when he says that these folks are the best of society; I also think the crowd at Goldman Sachs and their ilk are the worst: bloodsuckers and leeches, to put it as politely as I can. Personally, I hope the protest grows from 200 to 2 million, and affects every city in the land. I hope it succeeds…but this is where I start to have certain problems. What is, in fact, the goal? What would success look like in this case? It’s not altogether clear; and beyond a desire to have an economy not run by vampires, by a gangster elite, the protesters’ message is rather muddy.

On one level, it would be great if the protesters could put it on their signs, and say it directly to the American public: socialism; we want a socialist economy. It’s not exactly the way to win friends and influence people in the U.S., and I’m not sure that is what they really want anyway. But there’s at least this: they want a fairer society, one that does not have a huge gulf between the top 1% and the rest of us. Some form of redistribution of wealth, one that would presumably resurrect aspects of the New Deal that the GOP has striven to destroy since Ronald Reagan (and actually, before). After all, we have millions now thrown out of their homes, millions with no prospect of a job, millions living in tent cities and on bread lines, millions without any health insurance, and so on. Reinstating things such as the Glass-Steagall Act of 1933, real union strength, collective bargaining, workers’ benefits—all of this would be to the good, and I’m assuming that this is on the protesters’ agenda.

The problem is that we did have all this once, and to be sure, it was a much fairer and healthier society; but it was still capitalism. This, as most historians will tell you, was FDR’s historic role: he wanted to save capitalism, and he did. In the end, the mental framework, that of a society and way of life based on greed, was still the same. It was just that with the New Deal there were some constraints in place, and it is those that were unraveled in the ensuing decades. But as I argue in Why America Failed, greed has been the touchstone of the American experiment since 1584, since the earliest colonization of the continent (for its resources); it didn’t suddenly emerge 400 years later with Ronald Reagan and Gordon Gekko. Asked, on one occasion, what it was that the working man wanted, labor leader Samuel Gompers was quite explicit: “More.” Socialism doesn’t envision a different type of system; it envisions the same system with the goodies spread around more evenly.

That some labor unions have indicated their support for the protesters is therefore not surprising. Nor am I condemning them: in the face of Reaganism and Gekkoism run riot, fighting against a 1%-99% split in the wealth is obviously necessary. But when the dust settles, it will still be the United States, with the same 400-year-old ideology; even if we could get the New Deal back, the slogan would still be More. Even so-called progressives think the American Dream is where it’s at. They see no problem with “growth” at all. They just want to extend its benefits to everyone. But suppose—radical thought—that the American Dream were the problem, not the solution? Unfortunately, the ideology of the Dream, of an endless frontier, casts a long shadow over all of us, so that grasping this possibility is quite difficult even for the most intelligent Americans.

Case in point: an article in the 10 October 2011 edition of The Nation by Robert Borosage and Katrina vanden Heuvel entitled “Can a Movement Save the American Dream?” The authors rightly describe how the very rich have screwed the rest of us out of the A.D., and argue that we need to restore it—redistribute wealth and benefits so that every American can live it. But again, there is no recognition that this Dream is conceptually grounded in the notion of a world without limits; that it is the core of what America is and has always been about; and that it is (as a result) the rock upon which we are now foundering. In spite of the identification (or excoriation) of this ideological pathology by a rather long list of eminent historians, including David Potter, Louis Hartz, C. Vann Woodward, Richard Hofstadter, William Appleman Williams, and Jackson Lears, “progressives” just don’t get it, any more than neoliberals do. Writing in the New Republic nearly twenty years ago, Lears stated that “myths of progress continue to mesmerize intellectuals at all points on the political spectrum, from The Nation to the National Review.” Thus Williams repeatedly pointed out that the Dream was based on a program of endless economic expansion, which eventually made imperialism, and thus the suffering of millions, inevitable. Cornell University economist Douglas Dowd made his own opinion of our way of life explicit in a book he published in 1974: The Twisted Dream. As the anthropologist Gregory Bateson argued many years ago, there is a great difference between the “ethics of maxima” and the “ethics of optima,” and the U.S. is definitely addicted to the former: “growth.” A more accurate word for it might be “cancer.” In recent times, only Jimmy Carter had the courage to tell the American people that this was the vision of those who were spiritually empty, and his audience wasted no time in voting him out of office in favor of a man who told them they could and should have it all; that the A.D. was Life Itself.

So I don’t really know what the protesters’ goals are, and I’m not sure they do either, beyond shipping Lloyd Blankfein out to Antarctica, to live among the penguins. The problem is that historically speaking, protest against the system is not really against the system as such. We like to talk in terms of a multicultural society, but women, blacks, Hispanics, union leaders, you name it: they all really share the same vision. The goal is to get my group a bigger cut of the pie; it’s not to suggest that the pie is rotten. The environmental movement excepted, there is very little thinking in America about getting beyond “growth” and “progress,” beyond a purely materialist-consumerist society, and this certainly applies to the poor as well. As John Steinbeck is famously said to have remarked, in the U.S. the poor regard themselves as “temporarily embarrassed millionaires.”

One protest leader who did understand the spiritual dimension lacking in all this was Martin Luther King. The story might be apocryphal, but one black colleague of mine told me that just before he died, King said to Harry Belafonte that he sometimes had the uneasy feeling that his activism was only serving to “herd people into a burning church.” Sure, he was saying: we might be able to get black people a larger share of the pie, of the American Dream; but the pie is an inferno, a hellish way of life.

Are the protesters saying that?

©Morris Berman, 2011

September 28, 2011

Jonathan Swift Revisited

Readers of this blog may remember a post I did a while ago entitled “Fork in the Road,” briefly referring to the deleterious effects of screens on the brain. The bulk of the article, however, dealt with the effects of anti-depressant drugs, as discussed by Marcia Angell in two essays in the New York Review of Books that pulled no punches on the subject. One thing that particularly impressed me was the impact of the “better living through chemistry” model of mental health on our children. Between 1987 and 2007, the number of children classified as mentally disabled multiplied by a factor of 35, such that mental illness is now the leading cause of disability in this segment of the population. Cruising the Net, one finds numerous studies on the effects of Prozac on infants born to mothers who took the drug during pregnancy or while breastfeeding: autism, heart defects, poor feeding, insomnia. Not the greatest way to come into the world, it would seem.

Even beyond this is the fact that a certain percentage of American preschoolers—and I was not able to determine what that figure currently is—are on anti-depressant drugs. I find the idea of a three-year-old on Zoloft absolutely chilling, in a Brave New World kind of way. This has got to be a terrible mistake; it’s got to be a way of destroying an infant’s self, so that dependency and psychological disorientation become the “normal” way of being in the world for these poor kids. Research on adults points the same way: the use of anti-psychotic drugs is associated with atrophy of the prefrontal cortex, and after only a few weeks of drug use the brain begins to function in a different way. How much more powerful and long-lasting must these effects be in the case of toddlers?

The real motivation for getting very young children hooked on these meds is, of course, money: the use of such drugs from a very early age pretty much guarantees Big Pharma an endless supply of customers. It is not, à la Jonathan Swift (“A Modest Proposal”), that there is some kind of plot out there to destroy our children, to wreck their intellectual and emotional functioning from age two or even earlier. But if that is the result, does it matter? If the percentage of the under-four age group on anti-depressants continues to grow, then it might be said that, deliberately or not, we are eating our children alive. The jury is still out on all this, but the indications are certainly not encouraging.

When it comes to screens, however, so dramatically represented in American society by things such as Facebook and Twitter, there doesn’t seem to be much doubt: these are killers. As Sherry Turkle shows in her most recent book, Alone Together, the much-touted idea of “virtual community” proved to be a fraud. What we really have is increased alienation and depression. All of these social media and accompanying devices peddle a phony intimacy, because if you are at home alone with a screen, that’s where you actually are. Let’s take a look at some of the evidence.

In 1998 a research team at Carnegie Mellon University published an empirical study entitled “Internet Paradox,” demonstrating that within the first year or two online, people were experiencing less social engagement and poorer psychological well-being. The researchers also found that a greater use of the Internet was associated with less family communication, a reduction in local social circles, and an increase in loneliness, as well as higher rates of depression. The authors of the study concluded by suggesting that by using the Net, people were “substituting poorer quality social relationships for better relationships, that is, substituting weak ties for strong ones,” with consequent negative effects. One thinks of Mark Zuckerberg, poor rich asshole, destroying the one real friendship he had (with Facebook cofounder Eduardo Saverin), so that he could acquire a million meaningless ones.

A more recent study, conducted at the University of Michigan for the period 1979-2009, revealed a 48% decrease in empathy among college students during this time, and a 34% decrease in the ability to see things from another person’s perspective. Most of these declines, it turns out, occurred over the past decade, and the general interpretation is that this is related to the isolation involved in the use of personal technology and popular social networking sites that have become so much a part of student life. The study suggested that this was not surprising “in a world filled with rampant technology revolving around personal needs and self-expression.” But it is also the nature of the technology that is at issue, because the Internet and other electronic media are based on speed and distraction, on rapidly shifting attention. It turns out that the higher emotions, such as empathy and compassion, emerge from neural processes that are inherently slow. Various studies have shown that the more distracted we become, the less able we are to experience such emotions, or see things from the perspective of others. In a word, these technologies may be undermining our moral sense. At the very least, it becomes hard to argue that they are promoting community.

It also seems to be the case that the use of screens is creating a different type of human being, partly as a result of the rewiring of the brain that these devices engender. Much of the evidence for this argument has been collected and expanded upon by Nicholas Carr in The Shallows: What the Internet Is Doing to Our Brains. Marshall McLuhan had argued that the brain takes on the characteristics of the technology it uses, and we now see this in the cultural shift from print media to screens. For the Internet’s emphasis (and of course, that of Facebook and Twitter) is on searching and skimming, not on genuine reading or contemplation. As a result, given what we now know about the relative plasticity of the brain, the ability to reflect or to grasp the nuance of a situation is pushed to the margins. The Net, Carr says, is literally rerouting the pathways in our brains, making our thought processes increasingly shallow. It breaks up the content of a text into searchable chunks, and surrounds it with other content. This is why a page online is very different from a page of print. Concentration and attention are high for the latter, low for the former. Then there are the various links, which encourage us not to devote our attention to any single thing but rather to jump from item to item. Our attachment to any single item is thus provisional and fragmented. The Net and its related technologies amount, in short, to an “ecosystem of interruption technologies.”

Print, on the other hand, has (or should I say had?) a quality of calm attentiveness. “The quiet was part of the meaning,” as the poet Wallace Stevens once put it. When a printed text is transferred to an electronic device, says Carr, it turns into something like a website; the calm attentiveness disappears. Instead, the Net & Co. deliver repetitive, intense, and addictive stimuli, promoting very superficial understanding. Basically, you don’t really read on a screen; it’s a different kind of activity: browsing, scanning, keyword spotting, and so on. And the better you get at this, the less able you are to think deeply or creatively. We are, Carr concludes (quoting the playwright Richard Foreman), turning into “pancake people”—spread wide and thin. Facebook and Twitter are turning out such folks by the IHOP-load.

The lack of interest in printed material, and the corresponding upswing in interest in screens, is of course especially pronounced among the young. In 2009 the average American teenager was sending or receiving 2,272 text messages a month(!). Meanwhile, the amount of time the average American between twenty-five and thirty-four years of age devoted to reading print in 2008 was forty-nine minutes a week. As Maryanne Wolf of Tufts University cogently puts it, “the digital world may be the greatest threat yet to the endangered reading brain as it has developed over the past five thousand years.” Collectively, adds author Christine Rosen, this is the endpoint of the tragedy we are now witnessing:

“Literacy, the most empowering achievement of our civilization, is to be replaced by a vague and ill-defined screen savvy. The paper book, the tool that built modernity, is to be phased out in favor of fractured, unfixed information. All in the name of progress.”

There is little room in this world, Carr points out, for “the pensive stillness of deep reading or the fuzzy indirection of contemplation.” In such a world, he goes on to say, “Ambiguity is not an opening for insight but a bug to be fixed.” The cultural impact follows upon the individual one, then: what we are witnessing is the replacement of a complex inner diversity with a new kind of self, one devoid of any sense of cultural inheritance. Screens are generating the emptiest people in the history of the world, and as in The Matrix, there is no way for these folks to get outside themselves and perceive this. This is the “frenzy” of technological society famously referred to by Martin Heidegger. In the pathological climate of “techno-social Darwinism,” as Rosen calls it, there is no time for stillness. All of these brave new people lack the ability to be alone with their thoughts, or to appreciate the importance of silence. I have found that even the brightest people don’t get it, have no idea what George Steiner meant when he called modernity “the systematic suppression of silence.” Silence, after all, is the source of all self-knowledge, and of much creativity as well. But it is hardly valued by societies that confuse creativity with productivity, and incessant noise with aliveness. As a result, we don’t notice that fundamental aspects of being human are disappearing. During his time at Yale, William Deresiewicz asked his students what place solitude had in their lives. In response, they seemed to be puzzled that anyone would want to be alone. “Young people today,” he concluded, “seem to have no desire for solitude, have never heard of it, [and] can’t imagine why it would be worth having. In fact, their use of technology…seems to involve a constant effort to stave off the possibility of solitude.” The world of creativity, of imagination, of depth of the self, is closing down.

The similarity of all this to toddlers on anti-depressants is thrown into stark relief when you realize that the corporate goal is to hook children as early as possible. Last month, Rullingnet Corp. (based in Canada) launched Vinci, a 7” touch-screen tablet for the under-four age group. It is the first tablet designed for babies as young as one week old—the product of a technological mindset that I can only call “creepy,” although the company’s tag line is, ironically enough, “Inspire the genius.” “We are just leveraging their curiosity,” says the inventor of the device. (Notice how a word from corporate finance gets imported into the world of child-rearing. It was leveraging that brought on the crash of 2008.) In fact, a recent study conducted by Parenting magazine and BlogHer found that 29% of Generation-X moms say their children were using laptops by age two, and the figure rises to 34% for moms of Generation-Y. In the first month of its release, Rullingnet sold 600 Vincis.

In chapter 3 of Why America Failed I argue that technology has always functioned as America’s hidden religion, and that if you deprive Americans of their gadgets, they become depressed or enraged. What can one say when many users of Apple’s iPhone refer to it as “the Jesus phone”? This is not an accident. Technology in America has been associated with unlimited progress and therefore with utopia, with redemption, and in now giving touch-screens to one-week-old babies, we are imprinting them in the same way that, say, a baptism might. But the reality of Facebook, Twitter, Vinci and the like is a story of false redemption. As the sociologist Zygmunt Bauman writes, what is omitted from public discussion today is the fact that almost every technological “advance” in recent years has deepened the “continuing decomposition and crumbling of social bonds and communal cohesion.” It goes way beyond the dumbing down of the culture, in other words (horrific as that is); it also involves increasing human disconnectedness, social atomization, rudeness, incivility. One effect of spending most of your time in a virtual world is that of “absent presence”: you treat the world as a mere backdrop, and devalue those around you. These are the hallmarks of a superficial, narcissistic society, one which possesses no inherent meaning, and whose Twittered citizens possess none either. With techno-imprinting now beginning at the age of one week, things can only get worse. For there is no getting around it: eating our children alive means we are eating our society alive as well.

©Morris Berman, 2011

September 27, 2011

Zucker-Punched

(Partial disclaimer: in some cases my publicist didn’t say exactly what I have her saying, so I hope both she, and the reader, will allow me a bit of poetic license here. I do think I’ve preserved the spirit of our exchange, however. She’s just trying to drag me into the 22nd century, whereas I retain a certain fondness for the late 17th.)

OK, gather round, you DAA-ers; time to give you the low-down on publicity for WAF. It seems that Wiley, my publisher, finally came to the realization that in order to make money, you’ve got to spend it. Since Western Europe figured this out around A.D. 1500, I had hoped from the beginning that Wiley would catch on to the fundamentals of capitalism a bit more quickly. No such luck (perhaps a bad case of cultural lag, hard to say). I kept sending them messages on the subject, reviewing the work of Ricardo, Smith, J.S. Mill, and Karl Marx for them, discussing the theory of surplus value, adding in Milton Friedman, Paul Samuelson, and the like, but they seemed to be clinging to the theory of clinging: if we hang on to our money, we’ll be OK. So they kept squeezing quarters till the eagles screamed. But finally—it may have been the winter storms in New Jersey, or the flooding that subsequently occurred there, or maybe a stray lightning bolt—they woke up one morning and said, “Let’s give the poor shmuck (i.e., me) a publicist.” When I heard that they had actually hired someone, and were even going to pay her—i.e., actual money; this was not a barter in New Jersey corn or whatever—I had to lie down for a couple of hours just to recover from the shock. Maybe there is a god, I thought; maybe he likes my books. (I was heavily sedated at the time.)

Anyway, that’s Step 1 in this strange adventure. Step 2: my new publicist says to me: what’s really crucial these days are the social media. Things such as magazines, reviews, bookstore presentations, radio and TV interviews—all of that has shrunk in influence, been marginalized. Americans don’t really read that much anymore (as you’ve documented in your previous books); instead, they Twitter and Facebook, so that’s where you’ve gotta be.

Me: But if they don’t read anymore, aren’t they the wrong audience for us? I mean, let’s say you take stuff off my blog and put it on Twit and FB (since I’m not going to Twit or Face myself, because I have no interest in those social media, which I think should really be called anti-social media, and which I think were designed for addicted, narcissistic morons whose main interest in life revolves around stuff like the fotos Kim Kardashian posted of her psoriasis, not to mention her rear end). Those folks aren’t going to run out and read WAF; no way! For one thing, it has polysyllabic words in it, not to mention—gasp—concepts. And then these media reduce one’s attention span to that of a gnat. It’s not merely that these people don’t read books anymore; they can’t.

Publicist: Not so fast, shmendrick. For better or worse, most Americans now get their information from the web, and this even includes a few intellectuals. The social media reach millions; there’s no such thing now as book promotion without them. We need the folks who are reading your book to be out there talking about it, and one place we can be sure to find them is online. In short, adapt or die, boychik.

Me: But what about the bookstores? Isn’t anyone going to show up to hear me at bookstores?

Publicist (shaking her head, in the sense of ‘What a yokel’): You’ll be lucky if you pull in 5 people in Seattle and 10 in LA. Don’t forget your famous appearance in downtown Philly in 2006: 3 people showed up for your talk, and one of them fell asleep during it. The bookstore also had you billed as “Dean of Optometry at UC Fullerton,” or something like that. It can’t get much worse than that, can it?

Me: Yeah, that was indeed a humbling moment, I have to admit. So your idea is that for the next two months I post various rants and raves on any subject I want, including Twit and FB and Kim’s behind, and then you feed these things into Twit and FB, in the hope that someone who reads them will also want to read WAF? Shit, I’d rather chew on razor blades. As far as I’m concerned, Twit and FB are further examples of the collapse of American culture, of our national decline. As someone recently said, screen people are “pancake people”—all breadth and no depth.

Publicist: Perhaps, but it still makes for good PR. Even anti-PR is good PR.

Me: Were you aware that a Canadian company just released a computer tablet for toddlers, designed for babies as young as one week old? It’s not enough that we are killing our infants with Prozac and Zoloft; now we are also going to do them in with screens and touch pads.

Publicist: That’s good! Write about that! Tell your blogfolks (all 65 of them; what a huge following you’ve managed to accumulate!) that the US and Canada, through meds and hi-tech, are deliberately trying to kill our children. I mean, even if it isn’t an actual conspiracy, it seems like they’re doing a good job of it, no? You remember that essay by Jonathan Swift, right? About how Ireland should start cooking and eating its children? Well, do a new post and call it “Jonathan Swift Revisited.” That’ll get the pancakes all a-Twittering.

Me: (Heavy sigh)

Publicist: Frankly, I’m a believer in Bermanism: any culture that is designing computer screens for one-week-old babies, and feeding anti-depressants to toddlers, has no future at all. What could be more obvious? When they grow up, they won’t even know what a book is, fer chrissakes.

Me: Jesus…Well, this seems like a fool’s errand, but you’re the publicist, what can I say.

Publicist: You got that right. Now get busy, shmuck. And don’t forget to give your readers the crucial contact info:

Facebook: Whyamericafailed

Twitter: @Yamericafailed