Monday, April 30, 2012
May Day Directory:
Occupy General Strike In Over 125 Cities
See HERE for what we believe to be the most comprehensive list yet compiled of cities where Occupy May Day events are being planned, as well as other resources. Note: This is a living document. Check back for updates!
Saturday, April 28, 2012
American Obscenity: Corporate CEOs make 380x the wage of the average American worker!
The AFL-CIO's Executive Paywatch website has been updated with 2011 data and easy-to-digest graphics that you can share with others. One stand-out fact: The average CEO of an S&P 500 Index company earned a staggering 380 times the average American worker's wage. Those same CEOs saw their compensation packages increase 13.9% in 2011.
By comparison, as you can see above, the average corporate CEO made “just” 42 times what American workers made during the “golden years” of the Reagan administration.
Don’t get me wrong, I’m sure these corporate CEOs, these god-men, are worth every penny they make, but this is getting to be rather untenable, don’t you think? Keep in mind that many (most) of these guys are paid in stock options that are only subject to a 15% capital gains tax when they cash out!
Via Daily Kos Labor:
The cream always rises to the top, right? It might be time to update that saying, but not with metaphorical crème fraîche, more like royal jelly made by the rank-and-file worker bees who then have the privilege of watching a Mr. Creosote-sized queen bee ravenously devour the fruits of their labor!

The highest-paid CEO in the country was Apple's Timothy Cook, whose total compensation was nearly $378 million. That's more than 11,000 times the average worker's income of $34,053. The 100th highest-paid CEO, Heinz's W.R. Johnson, had total compensation of more than $18 million, 543 times the average worker's income.
What we can’t know is how much CEOs make compared with the workers in their own companies; however, that’s something the Dodd-Frank Wall Street reform bill will soon require companies to disclose. And it turns out it might well be good for companies if transparency pushed them to bring CEO pay a little more in line with average worker pay.
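For what it's worth, the ratios in that quote are easy to check for yourself. Here's a minimal sketch using only the figures quoted above; the $18.5 million figure for Heinz is an assumption (the article says only "more than $18 million") chosen because it reproduces the quoted 543x:

```python
# Sanity-checking the pay ratios quoted above.
# Assumed figures: Cook ~$378M, Johnson ~$18.5M ("more than $18 million"),
# average worker income $34,053.
AVERAGE_WORKER_INCOME = 34_053

ceo_pay = {
    "Timothy Cook (Apple)": 378_000_000,
    "W.R. Johnson (Heinz)": 18_500_000,
}

for name, pay in ceo_pay.items():
    ratio = pay / AVERAGE_WORKER_INCOME
    print(f"{name}: roughly {ratio:,.0f}x the average worker's income")

# Prints roughly 11,100x for Cook and 543x for Johnson, matching the
# "more than 11,000 times" and "543 times" figures in the quote.
```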
It’s always fun to use Huey Long’s famous barbecue example from his Depression-era “Share the Wealth” speech in a situation like this:
Imagine that Timothy Cook was seated at one banquet table teeming with gourmet food, a feast that would cause Caligula to blush, obviously, and one that would represent his executive pay relative to the 11,000 other people (not Apple's mistreated Chinese factory workers, who make $450 a MONTH, but average Americans making $34k a year) seated beside him at a table that is exactly the same size, but with, relatively speaking, far more meager portions of food compared to the gentleman from Cupertino.
To be clear, I don't have anything against Apple's CEO, but for fuck's sake, in the abstract, does Timothy Cook himself not see something inherently obscene about the shitty working conditions in Apple's Chinese factories and ONE guy at the top making nearly $400 million a year? Apple Inc. has the highest value of any corporation in the history of man, but the people who are actually making the products receive a pittance for renting their lives out 12 hours a day, seven days a week, while he takes home a pay packet like that?
It goes to show, AGAIN, how absolutely correct Karl Marx was. That $400 million isn't getting reinvested in the factories or improving conditions for the workers in any way. It's going to ONE GUY. ONE GUY!
Imagine how many AMERICAN JOBS Apple Inc. could afford to create—they could open computer factories with well-paying jobs all across America—if the fellow at the top of the food chain there was making, oh, say, only $10 million a year and the rest of it "trickled down."
Who reading this would pity him?
The reason the rank and file workers in this country are taking home so little is because the CEOs and the stockholders are taking so much! DUH. The system is rigged. It’s not that fucking difficult to understand!
Friday, April 27, 2012
'Vegan Is Love': Children's Book By Ruby Roth Causes Controversy
from the Huffington Post:
A children's book that will be released next week is stirring up controversy among parents. It's called "Vegan Is Love," and according to the publisher, it is a young readers' introduction "to veganism as a lifestyle of compassion and action." The details, however, including images of animals behind bars in crowded cages and graphic passages about animal testing, are being called unsuitable for children; the book is intended for kids as young as 6 years old.
The pro-vegan message of the book isn't in dispute. While there is debate about whether an animal-product-free diet from birth is appropriate, nutritionists (and activists including Alicia Silverstone) agree that a vegan regimen can be healthy for little kids as long as their meals include enough supplemental nutrients and proteins. That said, the tone and wording in "Vegan Is Love" have experts concerned.
Child psychologist Jennifer Hart Steen told Matt Lauer on the "Today" show this morning that, "there’s so much fear presented in the book and if you would just give it to a child as a children's book they don't understand it. So now they're just going to be afraid."
Nicole German, a registered dietitian, wrote on her blog that "Vegan Is Love" might scare impressionable children into becoming vegan and that "without proper guidance, that child could become malnourished."
The author, Ruby Roth, is raising her 7-year-old stepdaughter, Akira, whose favorite food is kale, to be vegan. Roth told "Today" that it is not her intention to instill fear. "If it's too scary to talk about, the reality of where those pieces of meat come from, then it's certainly too scary to eat," she said. Instead, the book is supposed to encourage "compassion and action," Roth told ABC.
The book promotes a no-meat, no-dairy diet, but also suggests that kids should boycott the zoo, the circus and aquariums because "animals belong to this earth just as we do." Hart Steen worries that the title, "Vegan Is Love," can send a message to kids that, if you don't follow this lifestyle, you don't get to feel love or "you're clearly creating hate or bad feelings."
Dr. David Katz, HuffPost blogger and director of the Yale Prevention Center, supports Roth's efforts and told ABC that childhood might be "the best time to create awareness and change behavior accordingly."
I ordered it on Amazon more than a month ago and am still waiting to get it for my son. We enjoyed her last book (That's Why We Don't Eat Animals: A Book About Vegans, Vegetarians, and All Living Things), although I must admit it was a bit dark for a preschooler. Judging by the title, this one will make its positive points well.
Thursday, April 26, 2012
How a culture of fear thrives in attention economies, and what that means for "radical transparency" and the Zuckerberg doctrine
Danah boyd's "The Power of Fear in Networked Publics" is a speech delivered at SXSW and Webstock New Zealand (that's where this video comes from). Danah first defines a culture of fear ("the ways in which fear is employed by marketers, politicians, technology designers [e.g., consider security narratives] and the media to regulate the public"), then shows how "attention economics" can exploit fear to bring in attention ("there is a long history of news media leveraging fear to grab attention") and how this leads fear to dominate many of our debates:

Every day, I wake up to news reports about the plague of cyberbullying. If you didn't know the data, you'd be convinced that cyberbullying is spinning out of control. The funny thing is that we have a lot of data on this topic, data dating back for decades. Bullying is not on the rise and it has not risen dramatically with the onset of the internet. When asked about bullying measures, children and teens continue to report that school is the place where the most serious acts of bullying happen, where bullying happens the most frequently, and where they experience the greatest impact. This is not to say that young people aren't bullied online; they are. But rather, the bulk of the problem actually happens in adult-controlled spaces like schools.... Online, interactions leave traces.... The scale of visibility means that fear is magnified.

And that's where her critique of "radical transparency" starts:

Increasingly, the battles over identity are moving beyond geek culture into political battles. The same technologies that force people into the open are being used to expose people who are engaged in political speech. Consider, for example, how crowdsourcing is being used to identify people in a photograph. It just so happens that these people were engaged in a political protest.
Radical transparency is particularly tricky in light of the attention economy. Not all information is created equal. People are far more likely to pay attention to some kinds of information than others. And, by and large, they're more likely to pay attention to information that causes emotional reactions. Additionally, people are more likely to pay attention to some people. The person with the boring life is going to get far less attention than the person that seems like a trainwreck. Who gets attention – and who suffers the consequences of attention – is not evenly distributed.
And, unfortunately, oppressed and marginalized populations who are already under the microscope tend to suffer far more from the rise of radical transparency than those who already have privilege. The cost of radical transparency for someone who is gay or black or female is different in Western societies than it is for a straight white male. This is undoubtedly a question of privacy, but we should also look at it through the prism of the culture of fear.
The whole paper and the video are both worth your attention. "The Power of Fear in Networked Publics".
Monday, April 23, 2012
THE FORTY-YEAR ITCH
Is much of our culture really determined by things that happened four decades ago? In Comment this week, Adam Gopnik proposes a Golden Forty-Year Rule:

When the new season of "Mad Men" began, just a few weeks ago, it carried with it an argument about whether the spell it casts is largely a product of its beautifully detailed early-sixties setting or whether, as Matthew Weiner, its creator, insisted, it's not backward-looking at all but a product of character, story line, and theme. So it seems time to pronounce a rule about American popular culture: the Golden Forty-Year Rule. The prime site of nostalgia is always whatever happened, or is thought to have happened, in the decade between forty and fifty years past. (And the particular force of nostalgia, one should bear in mind, is not simply that it is a good setting for a story but that it is a good setting for you.)
To cases. In the nineteen-forties—the first decade in which all the major components of mass culture were up and running, even early television—the beloved focus of nostalgia was the innocent aughts of the early century, a time imagined as one of perky girls in long dresses and shy boys in straw hats. “Meet Me in St. Louis,” a film made in 1944 about a fair held in 1904, was perhaps the most lovable of the many forties entertainments set in the aughts, from “The Magnificent Ambersons” to “Take Me Out to the Ball Game,” a musical made in 1948 about a song written in 1908. The nineteen-fifties saw lots of movies about the First World War—“The Seven Little Foys,” anyone?—and kicked off our Titanic romance, with “A Night to Remember.” The decade also brought the revival of the jazz of the teens, with the essentially serious music of Joe Oliver and Jelly Roll Morton recast by middle-aged white men in straw boaters and striped jackets as something softer, called Dixieland.
Twenties nostalgia ran right through the nineteen-sixties, beginning with the 1960 TV series "The Roaring 20's." In 1966, the very year "Mad Men" has now arrived at, the song that won the Grammy Award for best contemporary recording wasn't "Good Vibrations" or "Paint It, Black" but the New Vaudeville Band's twenties megaphone number "Winchester Cathedral." Each of the four last great Beatles albums included a twenties-pastiche number: "When I'm Sixty-Four," "Honey Pie," "Your Mother Should Know," "Maxwell's Silver Hammer." (Indeed, though Sgt. Pepper's Lonely Hearts Club Band was supposedly taught to play "twenty years ago today," its look comes right out of the gazebo-and-brass-band postwar lull.)
The seventies’ affection for the thirties—“The Sting,” “Paper Moon,” and so on—was one of the tonic notes of the decade, while the eighties somehow managed to give the Second World War a golden glow (“Raiders of the Lost Ark,” “Empire of the Sun,” “Hope and Glory,” “Biloxi Blues”), helped along by women working on the assembly line (“Swing Shift”). In the nineties, nostalgia for the fifties took a distinctly sumptuary turn: think of the revivalist fad for Hush Puppies and Converse All Stars, or the umpteen variations that the Gap rang on its “Kerouac Wore Khakis” campaign. In “Men in Black,” a perfect piece of nineties entertainment, Tommy Lee Jones and Will Smith showed how skinny ties could help defeat even the fiercest extraterrestrials.
Our own aughts arrived with the sixties as their lost Eden, right on schedule. That meant too many sixties-pastiche rock bands to mention (think only of Alex Turner, of Arctic Monkeys, sounding exactly like John Lennon), with the plangent postmodern twist that in some cases the original article was supplying its own nostalgia: there were the Stones and the Beach Boys on long stadium tours, doing their forty-year-old hits as though they were new. With the arrival of “Mad Men,” in 2007 (based on a pilot written earlier in the decade), sixties nostalgia was raised to an appropriately self-conscious and self-adoring forty-year peak.
That takes us to the current day, and, at last, to the reasons behind the rule. What drives the cycle isn’t, in the first instance, the people watching and listening; it’s the producers who help create and nurture the preferred past and then push their work on the audience. Though pop culture is most often performed by the young, the directors and programmers and gatekeepers—the suits who control and create its conditions, who make the calls and choose the players—are, and always have been, largely forty-somethings, and the four-decade interval brings us to a period just before the forty-something was born. Forty years past is the potently fascinating time just as we arrived, when our parents were youthful and in love, the Edenic period preceding the fallen state recorded in our actual memories. Although the stars of “Meet Me in St. Louis” were young, and its audience old and young both, Vincente Minnelli, its director, was born in 1903, just a year before the World’s Fair he made into a paradise. Matthew Weiner, born in 1965, is the baby in his own series. (The key variable behind the Beatles’ fondness for the twenties was the man they were pleasing and teasing: their great producer and arranger, George Martin, born in 1926.)
The forty-year rule is, of course, not immutable, and its cycle carries epicycles within it: the twenty-year cycle, for instance, by which the forty-somethings recall their teen-age years, producing in the seventies a smaller wave of fifties nostalgia to dance demurely alongside the longing for the thirties. But it is the forty-years-on reproduction of a thing that most often proves more concentrated and powerful than the original. Dixieland gets played more than archival jazz; people think that con men listened to Scott Joplin in Cicero in the nineteen-thirties. In the sixties, nobody quite knew that people were smoking or drinking; they just smoked and drank, often miserably, if the novels of the time are to be believed.
And so, if we can hang on, it will be in the twenty-fifties that the manners and meanings of the Obama era will be truly revealed: only then will we know our own essence. A small, attentive child, in a stroller on some Brooklyn playground or Minneapolis street, is already recording the stray images and sounds of this era: Michelle’s upper arms, the baritone crooning sound of NPR, people sipping lattes (which a later decade will know as poison) at 10 A.M.—manners as strange and beautiful as smoking in restaurants and drinking Scotch at 3 P.M. seem to us. A series or a movie must already be simmering in her head, with its characters showing off their iPads and staring at their flat screens: absurdly antiquated and dated, they will seem, but so touching in their aspiration to the absolutely modern. Forty years from now, we’ll know, at last, how we looked and sounded and made love, and who we really were. It will be those stroller children’s return on our investment, and, also, of course, a revenge taken on their time. ♦
Sunday, April 22, 2012
Sunday Sermon: EVERYTHING YOU KNOW ABOUT FREE MARKET CAPITALISM IS WRONG
from DangerousMinds
One of the big “sacred cows” of libertarian “free market” Capitalism is the supposed “invisible hand” of the marketplace keeping supply and demand in line with the price of a particular commodity or service.
The problem is, it’s just a myth, albeit a persistent one.
Jonathan Schlefer writes at the Harvard Business Review that there is no evidence for the invisible hand:

One of the best-kept secrets in economics is that there is no case for the invisible hand. After more than a century trying to prove the opposite, economic theorists investigating the matter finally concluded in the 1970s that there is no reason to believe markets are led, as if by an invisible hand, to an optimal equilibrium — or any equilibrium at all. But the message never got through to their supposedly practical colleagues who so eagerly push advice about almost anything. Most never even heard what the theorists said, or else resolutely ignored it.
Of course, the dynamic but turbulent history of capitalism belies any invisible hand. The financial crisis that erupted in 2008 and the debt crises threatening Europe are just the latest evidence. Having lived in Mexico in the wake of its 1994 crisis and studied its politics, I just saw the absence of any invisible hand as a practical fact. What shocked me, when I later delved into economic theory, was to discover that, at least on this matter, theory supports practical evidence.
Adam Smith suggested the invisible hand in an otherwise obscure passage in his Inquiry Into the Nature and Causes of the Wealth of Nations in 1776. He mentioned it only once in the book, while he repeatedly noted situations where “natural liberty” does not work. Let banks charge much more than 5% interest, and they will lend to “prodigals and projectors,” precipitating bubbles and crashes. Let “people of the same trade” meet, and their conversation turns to “some contrivance to raise prices.” Let market competition continue to drive the division of labor, and it produces workers as “stupid and ignorant as it is possible for a human creature to become.”
That's Adam Smith talking there, about 75 years before Marx and Engels wrote The Communist Manifesto! Just saying….
The search by classical economists for a concrete and mathematically verifiable theory of economic equilibrium continued throughout the decades, but apparently no one could ever really find much evidence for it:

Leon Walras, of the University of Lausanne in Switzerland, thought he had succeeded in 1874 with his Elements of Pure Economics, but economists concluded that he had fallen far short. Finally, in 1954, Kenneth Arrow, at Stanford, and Gerard Debreu, at the Cowles Commission at Yale, developed the canonical "general-equilibrium" model, for which they later won the Nobel Prize. Making assumptions to characterize competitive markets, they proved that there exists some set of prices that would balance supply and demand for all goods. However, no one ever showed that some invisible hand would actually move markets toward that level. It is just a situation that might balance supply and demand if by happenstance it occurred. [Emphasis added].
In 1960 Herbert Scarf of Yale showed that an Arrow-Debreu economy can cycle unstably. The picture steadily darkened. Seminal papers in the 1970s, one authored by Debreu, eliminated “any last forlorn hope,” as the MIT theorist Franklin Fisher says, of proving that markets would move an economy toward equilibrium. Frank Hahn, a prominent Cambridge University theorist, sums up the matter: “We have no good reason to suppose that there are forces which lead the economy to equilibrium.”
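If the distinction between "an equilibrium exists" and "markets are led toward it" seems abstract, here is a toy illustration in Python. It is not the Arrow-Debreu model or Scarf's example, just the textbook cobweb model with made-up numbers: a market-clearing price exists, yet a naive lagged price-adjustment process spirals away from it.

```python
# Toy cobweb model: an equilibrium price exists, but naive price adjustment
# (producers respond to LAST period's price) moves away from it whenever
# supply reacts more steeply than demand (d > b). All numbers are invented
# for illustration only.
a, b = 100.0, 1.0   # demand:  Qd(p) = a - b*p
c, d = 10.0, 2.0    # lagged supply: Qs(t) = c + d*p(t-1)

p_star = (a - c) / (b + d)                 # price at which Qd == Qs
print(f"equilibrium price: {p_star:.2f}")  # 30.00

p = p_star + 0.5                           # start just above equilibrium
for t in range(1, 7):
    p = (a - c - d * p) / b                # price that clears this period's market
    print(f"period {t}: price {p:.2f}")
# Each period the price overshoots further from 30.00 (29, 32, 26, 38, 14, 62...):
# the equilibrium exists, but nothing in this adjustment process finds it.
```

Proving that a balancing set of prices exists, in other words, says nothing about whether any real-world process converges to it, which is exactly the point of the passage above.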
Schlefer concludes by accusing the Federal Reserve of assuming market equilibrium would avert something like the subprime mortgage crisis—and being so wrong about it—precisely because their models incorrectly assumed equilibrium. That’s a hell of an overlooked variable! Of course when the economy did implode, the Fed responded with a flood of money and Keynesian tactics aimed at propping up the system.
But the misconceptions about the free market don’t end there. Have you heard of the “Minsky Moment”? Hyman P. Minsky? Don’t worry, you’re not alone. Most orthodox economists have never heard of him, either.
Hyman Minsky was a once-obscure "contrarian" professor of macroeconomics who died in 1996. A "red diaper baby" born to socialist parents who had emigrated from Belarus, Minsky spent the 1950s and '60s studying the causes of poverty. Throughout his career, Minsky's theories carried almost no weight in the field of economics. When the global economic crisis shook faith in the Capitalist system, his star began to rise. Minsky came to be regarded as "perhaps the most prescient big-picture thinker about what, exactly, we are going through," as the Boston Globe's Stephen Mihm described him.
Minsky, Mihm wrote in 2009: "[P]redicted, decades ago, almost exactly the kind of meltdown that recently hammered the global economy." Sound in any way familiar?

Minsky drew his own, far darker, lessons from Keynes's landmark writings, which dealt not only with the problem of unemployment, but with money and banking. Although Keynes had never stated this explicitly, Minsky argued that Keynes's collective work amounted to a powerful argument that capitalism was by its very nature unstable and prone to collapse. Far from trending toward some magical state of equilibrium, capitalism would inevitably do the opposite. It would lurch over a cliff.
This insight bore the stamp of his advisor Joseph Schumpeter, the noted Austrian economist now famous for documenting capitalism’s ceaseless process of “creative destruction.” But Minsky spent more time thinking about destruction than creation. In doing so, he formulated an intriguing theory: not only was capitalism prone to collapse, he argued, it was precisely its periods of economic stability that would set the stage for monumental crises.
Minsky called his idea the “Financial Instability Hypothesis.” In the wake of a depression, he noted, financial institutions are extraordinarily conservative, as are businesses. With the borrowers and the lenders who fuel the economy all steering clear of high-risk deals, things go smoothly: loans are almost always paid on time, businesses generally succeed, and everyone does well. That success, however, inevitably encourages borrowers and lenders to take on more risk in the reasonable hope of making more money. As Minsky observed, “Success breeds a disregard of the possibility of failure.”
As people forget that failure is a possibility, a “euphoric economy” eventually develops, fueled by the rise of far riskier borrowers—what he called speculative borrowers, those whose income would cover interest payments but not the principal; and those he called “Ponzi borrowers,” those whose income could cover neither, and could only pay their bills by borrowing still further. As these latter categories grew, the overall economy would shift from a conservative but profitable environment to a much more freewheeling system dominated by players whose survival depended not on sound business plans, but on borrowed money and freely available credit.
Once that kind of economy had developed, any panic could wreck the market. The failure of a single firm, for example, or the revelation of a staggering fraud could trigger fear and a sudden, economy-wide attempt to shed debt. This watershed moment—what was later dubbed the “Minsky moment”—would create an environment deeply inhospitable to all borrowers. The speculators and Ponzi borrowers would collapse first, as they lost access to the credit they needed to survive. Even the more stable players might find themselves unable to pay their debt without selling off assets; their forced sales would send asset prices spiraling downward, and inevitably, the entire rickety financial edifice would start to collapse. Businesses would falter, and the crisis would spill over to the “real” economy that depended on the now-collapsing financial system.
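Minsky's taxonomy of borrowers is easy to state precisely. The sketch below is my own toy illustration of it (the "hedge" label for the conservative category is Minsky's standard term, though it doesn't appear in the excerpt above), with invented numbers:

```python
def classify_borrower(income, interest_due, principal_due):
    """Toy classification in the spirit of Minsky's taxonomy.

    hedge       -- income covers both interest and principal
    speculative -- income covers the interest but not the principal
    Ponzi       -- income covers neither; the borrower can only keep
                   going by borrowing still further
    """
    if income >= interest_due + principal_due:
        return "hedge"
    if income >= interest_due:
        return "speculative"
    return "Ponzi"

# Hypothetical borrowers (all figures are made up for illustration):
borrowers = {
    "conservative firm":   (120, 30, 60),
    "leveraged developer":  (50, 40, 60),
    "flipper on credit":    (10, 40, 60),
}

for name, (income, interest, principal) in borrowers.items():
    print(f"{name}: {classify_borrower(income, interest, principal)}")
# conservative firm: hedge / leveraged developer: speculative / flipper on credit: Ponzi
```

As the share of speculative and Ponzi borrowers grows, the system drifts toward exactly the fragile, credit-dependent state the excerpt describes.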
If the “invisible hand” is just hogwash and if Minsky is right, what’s to stop the economy from imploding again?
Saturday, April 21, 2012
The Liver Birds
from DangerousMinds
Originally named The Debutones, England's The Liverbirds (aka The Liver Birds) moved from Liverpool to Hamburg, Germany, in 1963, where they became a popular band on the Star-Club circuit. Although they never became big stars, their contribution to rock and roll is historically significant: they were the first serious all-girl rock band to play their own instruments and do it on the same turf as male rock 'n' rollers.
"Girls with guitars? That'll never work." - John Lennon
Well, it did work for The Liverbirds who managed to record two albums, achieve a Top 10 hit in Germany with their single, “Diddley Daddy,” and last four years before splitting up in 1967.
Pamela Birch - guitar/vocals, Valerie Gell - guitar/vocals, Mary McGlory - bass guitar/vocals, Sylvia Saunders - drums.
from Psychorizon
Have you ever wondered what The Beatles would have looked like if they were women? Well, here you go: The Liverbirds.
They were a female British rock 'n' roll band from Liverpool. The four band members, Pamela Birch (vocals/guitar), Valerie Gell (vocals/guitar), Mary McGlory (vocals/bass guitar) and Sylvia Saunders (drums), were active between 1962 and 1967. They were one of very few female bands on the Merseybeat scene and one of the first rock bands consisting entirely of female members.
Valerie Gell, Sylvia Saunders, singer Irene Green and guitarist Sheila McGlory founded the band in early 1962 under the name "The Debutones". Irene Green and Sheila McGlory left very early on to join other bands. They were replaced by Mary McGlory, the sister of Sheila McGlory, and Pamela Birch.
The band's name derives from the Liver Bird, the mythical bird perched atop the towers of the Liver Building, which is the symbol of their native Liverpool.
The Liverbirds achieved more commercial success in Germany than in their homeland. Early in their career, they followed in the footsteps of colleagues such as The Beatles and Rory Storm & the Hurricanes. In May 1964 the four girls first appeared in Hamburg's legendary "Star Club" as "the female Beatles" from Liverpool. There they were one of the top attractions and released two albums and several singles. One of these singles, a cover of Bo Diddley's "Diddley Daddy," climbed to number five on the German charts. The band broke up in 1967, but went on to inspire young rock musicians around the world for years to come.
Thursday, April 19, 2012
Dick Clark w/ RunDMC & Jam Master Jay (1985)
R.I.P.
The photo above was taken backstage, just after Run-DMC's 1st ever performance (and in fact the 1st Hip-Hop performance ever) on Dick Clark's American Bandstand.
Below is another photo I took, of L.L. Cool J, a couple of years later.
And here's a classic shot of Dick Clark, Run DMC, Jam Master Jay, Rick Rubin, Bill Adler and Russell Simmons taken that same 1st time in 1985:
* Thanks to Bill Adler (Hip-Hop's O.G. publicist and archivist) for digging these out of his own files quicker than I could find them in mine!
Wednesday, April 18, 2012
Doing Well for Others
"A look into the humanitarian side of today's most celebrated action sport athletes, musicians, artists and foundations. A raw and transparent conversation of today's shocking realities and potential solutions."Here's episode 3 from season 2 as told by our friend Henry:
In church you pass a basket to collect money to keep the building lights on.
In Alcoholics Anonymous you pass a can to collect money to keep the coffee fresh.
A musician passes a hat keeping his passion alive.
A homeless man holds a cup hoping for handouts to survive.
All of the containers are filled or empty based on the compassion and generosity of others. Today, more than ever, there is a need for people to become aware of the issues at hand and become proactive in fighting for change. The opportunities to be selfless and contribute are endless, and therefore the large bucket is passed.
Directed by Eliot Rausch, an award-winning director (Best Documentary at the Vimeo Awards).
Check out other episodes HERE.
Tuesday, April 17, 2012
Ian MacKaye turned 50 yesterday!
INCREDIBLE!
Here are a few of my favorite Ian pix:
And here are a few great videos of Ian over the years:
Here's to 50 more!
Monday, April 16, 2012
GEORGE WASHINGTON SIGNED THE FIRST HEALTH INSURANCE MANDATE!
from Richard Metzger at DangerousMinds:
Medical insurance mandates are nothing new, as Einer Elhauge, a professor at Harvard Law School, explains at The New Republic:

In making the legal case against Obamacare's individual mandate, challengers have argued that the framers of our Constitution would certainly have found such a measure to be unconstitutional. Never mind that nothing in the text or history of the Constitution's Commerce Clause indicates that Congress cannot mandate commercial purchases. The framers, challengers have claimed, thought a constitutional ban on purchase mandates was too "obvious" to mention. Their core basis for this claim is that purchase mandates are unprecedented, which they say would not be the case if it was understood this power existed.

But there's a major problem with this line of argument: It just isn't true. The founding fathers, it turns out, passed several mandates of their own. In 1790, the very first Congress—which incidentally included 20 framers—passed a law that included a mandate: namely, a requirement that ship owners buy medical insurance for their seamen. This law was then signed by another framer: President George Washington. That's right, the father of our country had no difficulty imposing a health insurance mandate.

Elhauge joined an amicus brief supporting the constitutionality of the Affordable Care Act's individual mandate. You can (and should) read his entire piece at TNR.
This brings to light some extraordinary “lost history” that the Reichwing needs to consider as they hone their threadbare, tissue-thin arguments to deny healthcare to their fellow man…
Sunday, April 15, 2012
WHAT COMES AFTER THE END OF CAPITALISM?
(Sunday Sermon)
From Richard Metzger at DangerousMinds:
Noam Chomsky has long advocated simply reading the Wall Street Journal if you wanted to understand the mindset of the ruling class. No special detective work is necessary to divine the attitudes and intentions of the rich and powerful. In the pages of their house organ you could find what you were looking for, often with unvarnished bluntness.
It's good advice, but today, the WSJ isn't the only place to look for hints of ruling class attitudes. In a column published today at Huffington Post, Dr. Klaus Schwab, the founder and executive chairman of the World Economic Forum, poses a salient question: If this is the end of Capitalism, then what's next?

One of the criticisms of capitalism centers on the widening gap between winners and losers due to the so-called turbocapitalism that is a result of global competition. In this context, the so-called Nordic model demonstrates that a high degree of labor market flexibility and social welfare systems do not have to be mutually exclusive—indeed, they can actually be combined to very good effect. This type of economic policy also enables countries to invest in innovation, childcare, education and training. The Scandinavian countries, which underwent a similar banking crisis in the 1990s to that which we are now experiencing in other Western economies, have shown that by reforming regulation and social welfare systems, flexible labor and capital markets really are compatible with social responsibility. So it is no coincidence that these countries are now among the most competitive economies in the world. [Emphasis added]

If this is the sort of intellectual currency that was circulating around Davos this year, I think this is a pretty strong indication that the Occupy backlash is having a big effect. You'd hope that by now the elites must know that the natives are restless!
Other aspects of the criticism of capitalism that are worthy of serious consideration are excessive bonuses, the burgeoning market in alternative financial instruments and the imbalance that has emerged between finance and the real economy. However, we do see some progress in these areas thanks to mounting pressure from the general public, governments and also the market.
So even though capitalism was not laid to rest in Davos, it is fair to say that capital is losing its status as the most important factor of production in our economic system. As I outlined in my opening address in Davos, capital is being superseded by creativity and the ability to innovate—and therefore by human talents—as the most important factors of production. If talent is becoming the decisive competitive factor, we can be confident in stating that capitalism is being replaced by “talentism.” Just as capital replaced manual trades during the process of industrialization, capital is now giving way to human talent. I am convinced that this process of transformation will also lead to new approaches within the field of economics. It is indisputable that an ideology founded on personal freedom and social responsibility gives both individuals and the economy the greatest possible scope to develop.
Obviously a worldwide group-mind consensus is demanding, if not exactly the end of Capitalism, certainly a major rethink/reformation of the way it is practiced in the 21st century. The world is a different place than it was before the Industrial Revolution; it's high time we updated the operating system to reflect those changes.
It’s just getting to be so fucking stupid, isn’t it?
Michael E. Porter, a Harvard University professor cited by Schwab in his essay, explains why business leaders must focus on “shared value creation.”
Saturday, April 14, 2012
Soviet anti-drunkenness posters
Here's a gallery of Soviet-era anti-drunkenness posters. Some of the illustrations are really fabulous, almost Boschean in their depiction of besotted debasement. Антиалкогольные плакаты из СССР ("Anti-alcohol posters from the USSR")
Friday, April 13, 2012
Something from Nothing: The Art of Rap
THIS IS GOING TO BE THE BIGGEST DOCUMENTARY EVER.
I SAW IT AT SUNDANCE, I KNOW.
A must-see for any hip-hop fan and any hip-hop hater; after this you'll have nothing but love.
Ice-T takes us on an intimate journey into the heart and soul of hip-hop with the legends of rap music. This performance documentary goes beyond the stardom and the bling to explore what goes on inside the minds, and erupts from the lips, of the grandmasters of rap. Recognized as the godfather of Gangsta rap, Ice-T is granted unparalleled access to the personal lives of the masters of this artform that he credits for saving his life. Interspersed with the performers' insightful, touching, and often funny revelations are classic raps, freestyle rhymes, and never before heard a cappellas straight from the mouths of the creators. What emerges is a better understanding of, and a tribute to, an original American art form that brought poetry to a new generation.
Thursday, April 12, 2012
The Assault on Public Education
By Noam Chomsky (from Nation of Change)
Public education is under attack around the world, and in response, student protests have recently been held in Britain, Canada, Chile, Taiwan and elsewhere.
California is also a battleground. The Los Angeles Times reports on another chapter in the campaign to destroy what had been the greatest public higher education system in the world: “California State University officials announced plans to freeze enrollment next spring at most campuses and to wait-list all applicants the following fall pending the outcome of a proposed tax initiative on the November ballot.”
Similar defunding is under way nationwide. “In most states,” The New York Times reports, “it is now tuition payments, not state appropriations, that cover most of the budget,” so that “the era of affordable four-year public universities, heavily subsidized by the state, may be over.”
Community colleges increasingly face similar prospects – and the shortfalls extend to grades K-12.
“There has been a shift from the belief that we as a nation benefit from higher education, to a belief that it’s the people receiving the education who primarily benefit and so they should foot the bill,” concludes Ronald G. Ehrenberg, a trustee of the State University system of New York and director of the Cornell Higher Education Research Institute.
A more accurate description, I think, is “Failure by Design,” the title of a recent study by the Economic Policy Institute, which has long been a major source of reliable information and analysis on the state of the economy.
The EPI study reviews the consequences of the transformation of the economy a generation ago from domestic production to financialization and offshoring. By design: there have always been alternatives.
One primary justification for the design is what Nobel laureate Joseph Stiglitz called the “religion” that “markets lead to efficient outcomes,” which was recently dealt yet another crushing blow by the collapse of the housing bubble that was ignored on doctrinal grounds, triggering the current financial crisis.
Claims are also made about the alleged benefits of the radical expansion of financial institutions since the 1970s. A more convincing description was provided by Martin Wolf, senior economic correspondent for The Financial Times: “An out-of-control financial sector is eating out the modern market economy from inside, just as the larva of the spider wasp eats out the host in which it has been laid.”
The EPI study observes that the “Failure of Design” is class-based. For the designers, it has been a stunning success, as revealed by the astonishing concentration of wealth in the top 1 percent, in fact the top 0.1 percent, while the majority has been reduced to virtual stagnation or decline.
In short, when they have the opportunity, "the Masters of Mankind" pursue their "vile maxim": "all for ourselves and nothing for other people," as Adam Smith explained long ago.
Mass public education is one of the great achievements of American society. It has had many dimensions. One purpose was to prepare independent farmers for life as wage laborers who would tolerate what they regarded as virtual slavery.
The coercive element did not pass without notice. Ralph Waldo Emerson observed that political leaders call for popular education because they fear that “This country is filling up with thousands and millions of voters, and you must educate them to keep them from our throats.” But educated the right way: Limit their perspectives and understanding, discourage free and independent thought, and train them for obedience.
The “vile maxim” and its implementation have regularly called forth resistance, which in turn evokes the same fears among the elite. Forty years ago there was deep concern that the population was breaking free of apathy and obedience.
At the liberal internationalist extreme, the Trilateral Commission – the nongovernmental policy group from which the Carter Administration was largely drawn – issued stern warnings in 1975 that there is too much democracy, in part due to the failures of the institutions responsible for “the indoctrination of the young.” On the right, an important 1971 memorandum by Lewis Powell, directed to the U.S. Chamber of Commerce, the main business lobby, wailed that radicals were taking over everything – universities, media, government, etc. – and called on the business community to use its economic power to reverse the attack on our prized way of life – which he knew well. As a lobbyist for the tobacco industry, he was quite familiar with the workings of the nanny state for the rich that he called “the free market.”
Since then, many measures have been taken to restore discipline. One is the crusade for privatization – placing control in reliable hands.
Another is sharp increases in tuition, up nearly 600 percent since 1980. These produce a higher education system with “far more economic stratification than is true of any other country,” according to Jane Wellman, former director of the Delta Cost Project, which monitors these issues. Tuition increases trap students into long-term debt and hence subordination to private power.
Justifications are offered on economic grounds, but are singularly unconvincing. In countries rich to poor, including Mexico next-door, tuition remains free or nominal. That was true as well in the United States itself when it was a much poorer country after World War II and huge numbers of students were able to enter college under the GI bill – a factor in uniquely high economic growth, even putting aside the significance in improving lives.
Another device is the corporatization of the universities. That has led to a dramatic increase in layers of administration, often professional instead of drawn from the faculty as before; and to imposition of a business culture of “efficiency” – an ideological notion, not just an economic one.
One illustration is the decision of state colleges to eliminate programs in nursing, engineering and computer science, because they are costly – and happen to be the professions where there is a labor shortage, as The New York Times reports. The decision harms the society but conforms to the business ideology of short-term gain without regard for human consequences, in accord with the vile maxim.
Some of the most insidious effects are on teaching and monitoring. The Enlightenment ideal of education was captured in the image of education as laying down a string that students follow in their own ways, developing their creativity and independence of mind.
The alternative, to be rejected, is the image of pouring water into a vessel – and a very leaky one, as all of us know from experience. The latter approach includes teaching to test and other mechanisms that destroy students’ interest and seek to fit them into a mold, easily controlled. All too familiar today.
Copyright 2012 Noam Chomsky
This article was published at NationofChange at: http://www.nationofchange.org/assault-public-education-1333634007. All rights are reserved.
Tuesday, April 10, 2012
Simpsons Street Art in Chernobyl
On a recent trip to Chernobyl, French street artist Combo left behind something memorable: a mural of The Simpsons. Combo is known for using iconic characters to make a statement, and this image continues the trend.
The nuclear power plant in the background looks a lot more menacing here, but I also think there’s something really interesting about such a happy, cheery image juxtaposed against the aftermath of a horrible tragedy. You can see more of Combo’s trip to Chernobyl in the video below.
Monday, April 9, 2012
10 Big Mistakes People Make in Thinking About the Future
In a time of huge change and uncertainty, we need to think about the future as clearly as possible. Here's where we most often get it wrong.
Being a working futurist means that I think a lot about how people think about the future. It also means spending a lot of time with people who are also thinking about their own futures.
Typically, this involves a dialogue between three distinct groups.
First, there's usually a small handful of very foresighted people, who are aware of their own blind spots and biases, and who are eager and open about the prospect of soaring into a wild blue sky to gather a lot of exciting new information.
Second, there's a larger group of people who don't usually think at 50,000 feet -- but are willing to go there if they're with people they trust. Their wings aren't sturdy, and they are prone to some very common mistakes in thinking, but they're often the most gratifying group to work with. What they want is permission to let go, encouragement to go big, and a watchful eye to keep them out of the rocks and ditches.
And then there's a third small group that's very resistant to the idea that anything could or should change. I've spent a lot of time over the years thinking and writing about that last group, because I'm fascinated by the question of what drives change resisters. What they want from me is safety -- the reassurance that if they overcome their natural reticence and try to embrace some constructive thinking about change, they won't end up all alone somewhere terrifyingly unfamiliar.
Of that second group, I've found that there's a fairly short list of common mistakes that they make over and over again -- little assumptions that create big obstacles in their ability to see all the potential alternatives clearly. These are the same mistakes most people out on the street and in the media make, too; on any given day, one can open a newspaper to the op-ed page and find three or four of these mistakes, right there in black and white. When pundits and prognosticators are wrong, these assumptions are usually somewhere near the root of why they're wrong.
To the end of helping progressives think more productively about the future we're trying to create, here are 10 of the most common mental hiccups that keep people from seeing the bigger picture and planning for it with a full measure of courage and intelligence.
1. The future won't be like the past. And the most likely future isn't. In any group, there's usually a tacit set of assumptions about where the world is headed, and what their future holds. Life has gone on for a certain way for a while -- and the longer that trend continues, the more invested they get in the assumption that things will just keep on going that same direction. I always ask people early on to describe their Most Likely Future -- the one they and their friends assume will happen if nothing else changes. And I've never met anybody who had to hesitate a minute to fill me in on what that future looks like.
But the gotcha is: research by academic futurists has found that this expected future really isn't the most likely outcome at all. In fact, it only actually comes to pass somewhat less than half the time. Which means that somewhat more than half the time, you're going to be facing something else entirely. And if you're too settled on that vision of what's most likely, odds are that you won't do nearly enough to prepare for those alternative futures in which you are even more likely to actually end up living.
It's good to know what your expected future is. Objectively understanding what that story is and how you arrived at it is half the work. The other half lies in being willing to let go of that cherished vision long enough to figure out and prepare for some of the other things that could -- and probably will -- happen instead.
2. Trends end. This is related to #1, in that the most typical reason the expected future doesn't come is that one or more of the trends supporting it fails. As noted above: the longer a trend has been going on, the more we tend to assume that it will never end.
But all trends do end -- and, in fact, the longer it's been going on, the more overdue you are for it to change. One of the things I do is map the trends that are creating the current conditions, and figure out which trends are overripe (there are telltale signs) and thus most at risk of disrupting the existing circumstances. Imagining how these trends could fail or reverse -- what would happen if the opposite thing happened instead? -- is a useful way of revealing viable and realistic alternative futures.
3. Avoid groupthink. Another reason the Most Likely Future tends to obscure everything else is groupthink. Every group has basic assumptions about how the world works -- what's realistic, what's plausible, what's nonsense, what can't be discussed. The longer the existing set of operating rules has been in place, the more pressure people feel not to question that -- and the crazier you look if you do suggest that other futures are possible.
In fact, it's arguable that when a group has reached a point where no other futures are even discussable, it's a clear red flag that they are vulnerable to being flattened by some new situation that comes out of the blue -- or any of the many other places they're no longer looking out for change. Which brings us to:
4. If it's taboo, it's probably important. The thing you are not discussing -- the elephant in the room -- has a very high probability of being the very thing that will put an end to the present era, and launch you into the next phase of your future. Worse: the longer you ignore or deny it, the more at its mercy you will ultimately be when the change does come down.
A big part of being a futurist is to gently get people to start thinking and talking about those taboo subjects, the ones that are too scary or painful to think about. The very act of bringing those hard issues out onto the table and beginning to grapple with them, all by itself, has tremendous power to make people more courageous and resilient. The elephant only has power as long as we refuse to talk about it. When we finally confront it, its power becomes ours.
5. Any useful idea about the future should sound ridiculous at first. This rule originated with Dr. Jim Dator, the founder of the graduate program in futures studies at the University of Hawaii. His point was: If you're not coming up with ideas that sound a little crazy on their surface, it's a sure sign that you're stuck in too many conventional assumptions -- and are therefore not thinking big enough about just how different the future could be. If things don't look a little weird, you're not reaching far enough.
An example: Six years ago, I wrote about the right-wing's up-and-coming assault on contraception. It sounded completely off-the-wall at the time, and I took a lot of grief for it. But look where we are now.
6. Ask: What stays the same? The world is big, and governed by huge interlocking chaotic systems whose behavior can be impossible to anticipate. But, at the same time, there are also constants -- the things that don't change from era to era, or that change so slowly that you can pretty much count on them staying the same even when everything else is going to hell.
Chief among these is human nature. Economies grow and shrink, nations rise and fall, the globe is getting hotter and the world is getting smaller, but through it all, we are still Homo sapiens -- which means we will always be hungry, greedy, horny, infuriatingly stubborn, astonishingly kind, quick to take up arms against each other, and equally quick to bind each others' wounds. It's just how we are.
Any serious survey of a future landscape includes questions like: What stays the same? What will we carry forward with us? What will follow us, whether we want it to or not? What can we count on? What will we still need to guard against?
7. The other side is not always wrong. Right now, America is assertively separating itself into two vivid, strongly held visions of the future. In these situations, groupthink necessarily runs especially high; we are committed not only to progressive values, but also to a specific set of solutions, policies and outcomes that we believe will best make those values manifest in a world shaped by our vision.
The other side, of course, is equally passionate about their values, vision, solutions, policies, and outcomes. In his new book, The Republican Brain: The Science of Why They Deny Science -- and Reality, Chris Mooney points out that one of the defining traits of liberals is that we're far more open to evaluating conservative ideas on their merits, and adopting good ideas wherever we find them. Conservatives, on the other hand, will usually consider the source first. If an idea came from a liberal, it's immediately rejected as unacceptable, regardless of its merits.
This is a gross generalization, though. It's plenty easy to find progressives rejecting potentially useful ideas out of hand because they came from sources we don't agree with. But when we're deep in the throes of transforming a country, an economy and a society -- long on problems and short on good ideas for fixing them -- that kind of narrowness of mind is a luxury we can't afford.
8. Be aware of different change theories. Everybody has their pet theories about how change happens. Almost all of these theories can be sorted into one of about 10 basic buckets, which are described here. Most of us have two or three of these buckets that we think explain just a whole lot about the world, and another one or two that make us very uneasy. And it's usually true that the people we disagree with most are working off of some fundamentally different assumptions about how change happens.
Understanding which theories you're drawn to and which you tend to reject can go a long way toward helping you notice your own blind spots, and also form strategies to deal with places that your pet theories collide with those of other groups.
9. Don't think in five or 10 years. Think in 100 or 500 years. A lot of Wall Streeters didn't see the 2008 meltdown coming because economic forecasters typically work with historical data that's anywhere from two to five years old. In 2008, that meant their models didn't include so much as the possibility of a financial disaster on the scale of 1989, let alone 1929. They were thinking on a scale of five years, not 50. And certainly not 100.
While we know abstractly that Bad Things happened to the generations before us, we seldom believe that anything that bad could actually happen to us. America is safe and stable, and has been for a long time (though this is no guarantee: see #1). We're comfortable thinking that there's no way we could face famine or drought, epidemic or genocide, or other catastrophes on that kind of culture-changing scale.
This is a common blind spot. Anything that's happened to humans before is quite likely to happen at some point again. And there's always the possibility that it could happen to us. Simply recognizing our true vulnerabilities -- however unlikely those catastrophes may seem -- and giving them their due in our future planning enables us to become stronger and more robust, even if the full catastrophe never arrives.
10. Don't assume it will be hard. Don't assume it will be easy. Better yet: don't assume anything, ever. Most of the above cautions are aimed at a final big one: Be acutely aware of every assumption you're making, and don't leave any of them unexamined.
Clear thinking about the future is all about carefully choosing which assumptions you're going to work off of. Often, futurists' main role is to point out and re-examine assumptions that everybody else just accepts as gospel. Someone will say, "Oh, we can't do that; it's too hard." And my job -- made easier, because I'm an outsider -- is to ask them why that's just assumed. As they explain the obstacles, we can unpack each piece of that assumption together -- about how this person won't go along, or how that agency's rules work, or why the time frame just won't allow for it.
This exercise is worth doing because, frequently enough, the assessment that something's just not possible (or, conversely, will be trivially easy -- one should question those, too) falls apart when you take a second, deeper look at things. Maybe that assessment was true when you last visited this question six months ago. But maybe things have changed since then -- and now, your options are different. Or maybe it turns out that that old assessment is still true, after all. In that case, revisiting it increases your confidence that you can still trust it -- though it doesn't exempt you from continually poking at your assumptions, every time you revisit them, to make sure they're still true.
Whether the assumption changes or not, the exercise is worth doing because basing your assumptions about the future on outdated information that already belongs to the past isn't useful. The future is ambiguous enough when you start working from the baseline of the present. You only increase that ambiguity when you put yourself several steps back of that line -- especially when you don't have to.
In a time when the stakes are so high, and the margin of error so thin, it's more important than ever that we make the right decisions about the future the first time. Being mindful of these 10 mistakes can go a long way toward increasing the quality of our strategy and planning, and improving the choices we make about our country's future.
Sara Robinson, MS, APF is a social futurist and the editor of AlterNet's Vision page. Follow her on Twitter, or subscribe to AlterNet's Vision newsletter for weekly updates.
© 2012 Independent Media Institute. All rights reserved.
Sunday, April 8, 2012
Yellow Submarine Restoration +
Review for Your Sunday
From Richard Metzger over at DangerousMinds:
“Once upon a time…or maybe twice…there was an unearthly paradise called Pepperland…”
The Beatles’ classic 1968 animated feature film, Yellow Submarine, has been restored in 4K digital resolution for the first time by Paul Rutan Jr. and his team at Triage Motion Picture Services. No automated software was used in the clean-up of the film’s restored photochemical elements. This was a job painstakingly done by hand, a single frame at a time. The absolutely stunning Yellow Submarine restoration premiered last weekend at the SXSW festival and will be coming out on Blu-ray and DVD at the end of May with a new 5.1 multi-channel audio soundtrack. Seeing the film unspool on the big screen of Austin’s historic Paramount Theatre was like watching a series of moving stained glass windows.
Directed by George Dunning, and written by Lee Minoff, Al Brodax, Jack Mendelsohn and future best-selling Love Story novelist Erich Segal, Yellow Submarine, based upon the song by John Lennon and Paul McCartney, is a basically incomprehensible series of musical vignettes, groan-worthy puns and lysergically-inspired kaleidoscopic eye-candy that sees John, Paul, George and Ringo saving the world from the evil Blue Meanies.
When Yellow Submarine originally premiered in 1968, the film was regarded as an artistic marvel. With its innovative animation techniques, it represented the most technologically advanced animation work since Disney’s masterpiece, Fantasia. Inspired by the Pop Art of Andy Warhol, Peter Max and Peter Blake, art director Heinz Edelmann’s work on Yellow Submarine is now considered among the classics of animated cinema. Yellow Submarine also showcases the creative work of animation directors Robert Balser and Jack Stokes along with a team of the best animators and technical artists that money could hire. The ground-breaking animation styles included 3-D sequences and the highly detailed “rotoscoping” (tracing film frame by frame) of the celebrated “Eleanor Rigby” sequence. The production process took nearly two years and employed 40 animators and 140 technical artists.
I must say, though, as happy as I was to be one of the first people to see the restored Yellow Submarine, I couldn’t help but think that—with all of its merits—the film is just a little bit boring. If you responded negatively to the news of the (now shelved) Yellow Submarine 3-D remake, consider that not only did the Fab Four have precious little to do with the actual making of the original film (it’s not even their own voices) but that today’s kids—your kids—won’t have the patience to sit through it. Nor will they even understand what’s being said onscreen. Yellow Submarine, I hate to say it, was ripe for a remake. Sacrilege, I know, but it’s not like I’m suggesting that they remake A Hard Day’s Night or anything!
Below, a decidedly low res version of Yellow Submarine in its entirety. This isn’t really the way to watch it, of course…
Now, the fact is, I disagree with Richard on the point that “today’s kids—your kids—won’t have the patience to sit through it,” because my four-and-a-half-year-old was fascinated by it! And of course he loves all the music. But Richard doesn't have kids yet, so you can't really blame him for assuming, or maybe my boy really is a genius ;-)
Now here's another angle from another Dangerous Minds contributor:
PREVIEW OF THE BEAUTIFULLY RESTORED ‘YELLOW SUBMARINE’
Here’s the trailer for the newly restored Yellow Submarine.
The digital clean-up of the film’s photochemical elements was lovingly done entirely by hand, frame by frame. Having seen the world premiere of the restored version at this year’s SXSW, I can attest to its eye-searing intensity and lysergic beauty. The story obviously remains the same, rather thin, with a script composed of surreal non sequiturs and bad puns, but the overall experience of watching the film in a pristine digital format overwhelms the narrative with colors and artwork so rich you can practically taste it. And the stereo soundtrack sounded wonderful.
Coming out on Blu-ray and DVD on May 29 with 5.1 surround sound. Expect to be astonished.
Mod Odyssey is a groovy short documentary on the creation of Yellow Submarine. Enjoy.
Saturday, April 7, 2012
Climate change isn't liberal or conservative: It's reality
Paul Douglas is a Minneapolis/St. Paul meteorologist. Meteorologists don't study the same things as climate scientists—remember, weather and climate are different things—but Douglas is a meteorologist who has taken the time to look at research published by climate scientists and listen to their expertise. Combined with the patterns he's seen in weather, that information has led Douglas to accept that climate change is real, and that it's something we need to be addressing.
Paul Douglas is also a conservative. In a recent guest blog post on Climate Progress, he explains why climate isn't (or, anyway, shouldn't be) a matter of political identity. We'll get back to that, but first I want to call attention to a really great analogy that Douglas uses to explain weather, climate, and the relationship between the two:
You can’t point to any one weather extreme and say “that’s climate change”. But a warmer atmosphere loads the dice, increasing the potential for historic spikes in temperature and more frequent and bizarre weather extremes. You can’t prove that any one of Barry Bonds’ 762 home runs was sparked by (alleged) steroid use. But it did increase his “base state,” raising the overall odds of hitting a home run.
Mr. Douglas, I'm going to be stealing that analogy. (Don't worry, I'll credit you!)
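A quick way to see what "loading the dice" means in practice is to shift the average of a simulated temperature distribution by a couple of degrees and count how often the extreme tail turns up. This is a minimal sketch of the idea only; the normal distribution, the 85°F baseline, and the 100°F cutoff are invented illustrative assumptions, not figures from Douglas' post or actual climate data.

```python
import random

random.seed(7)

def count_extremes(mean, threshold=100.0, trials=100_000, sigma=8.0):
    """Count simulated summer days hotter than `threshold` degrees F.

    The normal distribution, the 8-degree spread, and the 100 F cutoff
    are illustrative assumptions, not climate data.
    """
    return sum(random.gauss(mean, sigma) > threshold for _ in range(trials))

baseline = count_extremes(mean=85.0)  # the "unloaded" dice
warmer = count_extremes(mean=87.0)    # same dice, loaded by two degrees

print(f"Days over 100F, baseline climate:  {baseline}")
print(f"Days over 100F, 2 degrees warmer:  {warmer}")
print(f"Extreme days became about {warmer / baseline:.1f}x more common")
```

On a typical run the hot-tail days come up roughly one and a half to two times as often after a two-degree shift in the mean, which is the whole point of the analogy: no single hot day is "caused" by the shift, but the odds change for all of them.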
A few weeks ago, I linked you to the introduction from my new book, Before the Lights Go Out, where I argue that there are reasons for people to care about energy, even if they don't believe in climate change—and that we need to use those points of overlap to start making energy changes that everyone can agree on, even if we all don't agree on why we're changing.
But there's another, related, idea, which Paul Douglas' essay gets right to the heart of. Just like there's more than one reason to care about energy, there's also more than one way to care about climate. Concern for the environment—and for the impact changes to the environment could have on us—is not a concept that can only be expressed in the terms of lefty environmentalism.
You and I can think about the environment in very different ways. We can have very different identities, and disagree on lots of cultural and political issues. All of those things can be true—and, yet, we can still come to the same, basic conclusions about climate, risk, and what must be done. Here's Douglas' perspective:
I’m a Christian, and I can’t understand how people who profess to love and follow God roll their eyes when the subject of climate change comes up. Actions have consequences. Were we really put here to plunder the Earth, no questions asked? Isn’t that the definition of greed? In the Bible, Luke 16:2 says, “Man has been appointed as a steward for the management of God’s property, and ultimately he will give account for his stewardship.” Future generations will hold us responsible for today’s decisions.
This concept—Creation Care—is something that I've summed up as, "Your heavenly father wants you to clean up after yourself." It's not a message that is going to make sense to everybody. But it's an important message, nonetheless, because it has the potential to reach people who might not otherwise see a place for themselves at this table.
Too often, both liberals and conservatives approach climate change as something that is tangled up in a lot of lifestyle, political, and cultural choices it has nothing to do with. Those assumptions lead the right to feel like they can't accept the reality of climate change without rejecting every other part of their identities and belief systems. Those same assumptions lead the left to spend way too much time preaching to the choir—while being confused about why people outside the congregation aren't responding to their message.
That's why essays like Douglas' are so important. We look at the world in different ways. We come by our values for different reasons. But even though we might take different paths, we can come to some of the same places. Let's respect that. And let's have those conversations. Climate change is about facts, not ideologies. It's about risks that affect everyone. We need to do a better job of discussing climate change in a way that makes this clear. And that means reaching out to people with language and perspectives that they can identify with.
Read Paul Douglas' full post on Climate Progress.
Read more about energy, climate, and what we can do to make the message of climate science more universal in my book, Before the Lights Go Out.
Image: Weather, a Creative Commons Attribution (2.0) image from 66770481@N02's photostream