Wednesday, February 08, 2006

America's own West Bank

Everyone over the age of ten is aware of the Middle East conflict between Israel and the Palestinians. Individual levels of attention and interest vary, but exposure to the basic facts of the situation is nearly universal. And middle-aged Americans of my generation also remember a time when South Africa was ruled by its white minority, with its citizens legally classified by degree of color and black South Africans segregated into townships. And if you were paying attention to politics at the time, you can also remember the excuses and phony communist-baiting offered by the Reagan administration as justification for the US policy of supporting the white government. (I can remember the college protests and the calls for institutional divestiture at the University of Washington; those were good days to be a progressive.) Anyhow, this all climaxed with egg on America's face when white minority rule in South Africa gave way to one-man, one-vote democracy.

But what most people don't realize is that the United States has its own West Bank, its own townships: the Native Americans.

The average person is aware that many Indians died during the American colonization and expansion westward to the Pacific. But to be fair to history, many (if not the majority) of Native Americans died not from muskets but from disease. This is not to downplay the dishonesty of the American government in its dealings with Native Americans during the past three centuries, nor does it excuse the actions of so-called Christian Europeans. I note this merely to put things into proper perspective. In his masterpiece, Guns, Germs, and Steel: The Fates of Human Societies, Jared Diamond discusses why European societies grew up surrounded by such diseases (domesticated animals), how they survived them (resistance gained through natural selection), and the unintended yet devastating consequences of first contact between Europeans and the peoples of the New World. So when the English Pilgrims met the Native American delegation of Wampanoags headed by their chief, Massasoit, near modern-day Providence, the damage had already been done. The English had no way of knowing that Massasoit's tribe had recently been reduced by disease from 1,000 individuals to barely 60. The culprit? Most probably an outbreak of food-borne viral hepatitis.

The modern-day issue concerns not death, but intentional poverty. The US government has been complicit in a particularly heinous act that keeps Native Americans in a state of poverty; namely, the failure to properly manage and account for fees and rents on Indian land:
The accounts hold money that belongs to individual Indians who have earned it from such sources as oil and gas production, grazing leases, coal production and timber sales on their allotted lands. Most of the account holders are so poor that they need their money just for basic subsistence. Revenues are held in more than 387,000 Individual Indian Money (IIM) accounts managed - rather, mismanaged - by the BIA. "The BIA has spent more than 100 years mismanaging, diverting and losing money that belongs to Indians," Echohawk says. "They have no idea how much has been collected and are unable to provide even a basic, regular statement to Indian account holders."
The United States has recognized the special and sovereign relationship it has with Indian tribes since the time of the US Constitution. But in this particular situation, the United States decided not to allow the Indians to run their own affairs. The US government insisted on setting up these accounts on behalf of the Indians; the federal government did not have its arm twisted in any way, shape, or form. But this issue is not merely about the law. Collection and payment from these accounts especially impacts tribes in the Midwest and the West - which, predictably, means that it also runs afoul of the historical triumvirate of powerful Western political constituencies: timber companies, mining corporations, and ranchers. In addition, resolving this issue would require an admission by the federal government that it was corrupt and incompetent. And therein lies the resistance to doing the obvious and right thing: the federal government simply didn't realize how lucrative these accounts were going to become, and how "useful" that money could be if it could be creatively managed for other, non-Indian purposes.

The Quakers, a Christian denomination for which I have tremendous respect, have also taken up the cause of these Native Americans:
For the past century, the federal government has held hundreds of thousands of acres of Indian land (some with valuable mineral and other resources) “in trust” for the land’s Native American owners. The government was supposed to collect the land-use money for individual Indian property owners and turn it over to them. It appears that it usually did the first half okay but often failed to do the second. Congress scrutinized the situation and passed corrective legislation. When that still did not solve the problem, Native Americans went to court, and the Cobell v. Norton case (also known as the Indian Trust Fund scandal case) is still not resolved. Now members of Congress are threatening to take the matter into their own hands because billions of dollars are involved. Even though the trust funds were never taxpayer dollars, the replacement funds would probably have to come from taxpayers.
This is a basic issue of fairness: the government is simply not doing its job. What is worse, the poverty that shames many Native Americans could be relieved if the government would simply keep its word:
6. If we hadn’t taken their land in the first place, they would not need to rely on the federal government to manage their funds for them, and instead they would be getting paid directly by the companies involved. These Individual Indian Money accounts were created by U.S. governmental policy that was in place under the Dawes Act between 1887 and 1934. Known as the General Allotment Act, the policy compelled the breakup of a great deal of Indian territory into small parcels placed in the name of individual owners, and simultaneously made the “surplus land” available for settlement by non-Indian farmers and ranchers. That policy preserved for the federal government the right to manage the sale of the natural resources located on the land. Ironically, the government had put the land in trust to protect “ignorant savages.”
Lest anyone think I selected this issue in order to bash Republicans, I would like to point out -- as the Quakers also do -- that this is not a conservative vs. progressive issue. The lawsuit in question was originally filed against Clinton's own Secretary of the Interior, Bruce Babbitt.
7. This is not a partisan issue in any way. American Indian plaintiffs first filed suit during the Clinton administration. Interior Department officials at the time were just as recalcitrant in responding as officials in the Bush administration have been. In 1999 a U.S. district court judge held Secretary of the Interior Bruce Babbitt and Secretary of the Treasury Robert Rubin in contempt for failing to follow court orders. He entered similar findings against the current Secretary of the Interior Gale Norton in 2002 and again in 2004. The judge was a Reagan appointee.
But instead of admitting its error, the federal government filed motion after motion for dismissal and continuance, trying its best to wear down the plaintiffs. But Indians have been waiting for generations; in a waiting game, they have more practice.

Friday, January 27, 2006

Education in America - or the Tale of Two Pies

This blog entry is a reflection of some thinking that I have done over the years about Americans and their attitudes towards public education. I'm not a teacher; could never be one. Ask anyone who knows me, and they'll all tell you: I don't have the patience to deal with 35 of someone else's kids. But I'm also grateful that some people do take on this thankless job and go on to perform miracles on a daily basis.

Anyhow, international comparisons of student performance show an interesting trend. For the sake of simplicity, such comparisons usually focus on performance in two subjects, math and science. That seems appropriate: in today's increasingly technological world, growth areas such as computers, medicine, genetic engineering, networking, and business administration are all rooted in science, mathematics, or both. I'm not trying to downplay the role that the liberal arts play in education; I earned both of my bachelor's degrees in liberal arts fields. It's just that science and math allow most of us to make a living, while the liberal arts make our lives worth living.

On to the study:
  • The fourth grade comparison looked at performance among 26 countries. In the fourth grade, American students scored 3rd place in science and 12th place in math - roughly the top and the middle of the spread for the respective subjects.
  • The eighth grade comparison looked at performance among 41 countries. By the eighth grade, US students have started falling behind, scoring only 17th place in science and 28th place in math - approximately the middle and the lower half, respectively.
  • The twelfth grade comparison looked at performance among 21 countries. Here US students continue to fall behind, scoring 16th place in science and 19th place in math - the lower half in both subjects.
Not good. In trying to ascertain the causes of America's embarrassing scorecard -- compared to, say, the Czech Republic -- the same study ruled out the usual suspects: differences in classroom time, amount of homework, television, and after-school jobs. In the first three categories, American students were not atypical. In the final category, the behavioral differences were deemed not great enough to account for the academic differences. So what is left?

It seems to me that what we need here is a hypothesis that explains two things:
  1. The relative parity in performance that American students enjoy early in their public school careers; and
  2. The gradual loss of ground they experience as they continue through to the last year of school
I haven't done any in-depth research, but one substantial possible cause comes to mind: the organization of the school calendar year. Public schools in most other countries run on a year-round system. Students get a short break between quarters, but essentially they are full-time students. The United States, on the other hand, still clings to an extended summer vacation, a holdover from times past when a more rural America needed extra hands in summer to work the farm. This puts American children at a disadvantage vis-a-vis their international counterparts. As a recent article from the Chicago Sun-Times notes:
School is in session 180 days in Chicago. That compares to 250 days in parts of China, 225 days in India, 210 days in Australia over 42 weeks and about 200 days in parts of Europe.

Illinois requires a minimum of 176 days, just below the 180-day average for most states. No U.S. school districts have significantly extended their school year, say officials with Education Commission of the States and the National Association for Year-Round Education.
These graphs from the National Association for Year-Round Education show the difference in how academic years are organized:





It's not merely a question of the total number of days spent on education, although that clearly is a factor. It's also how those days are spent. In the American system, sometime around the month of May the students begin anticipating the long summer break and often lose interest in their classes. Then in the autumn, classroom time has to be spent on remedial learning, refreshing students on everything they forgot over the summer. I remember my own teachers complaining that September and half of October were essentially wasted time. So anywhere from 2 to 4 weeks in May are potentially lost, with another 4 to 6 weeks in the following autumn. Conservatively, that's 1 to 2 months out of the 9-month classroom year that are essentially lost time to students - or at least, time whose value is diminished.

So how does this explain the fact that, at least in the early years, American students are performing academically well, but then start sliding as the years go by? Let's do a thought experiment with two students, one American and the other international. Let's also say that they are equally matched in intelligence and aptitude. They start out at the same time, at the same level, in the first grade. Under the American system, our US student finishes the year and gets ready for the next one - but what do we find? At the start of second grade, he's already 1-2 months behind his international counterpart. At the start of third grade, he's an additional 1-2 months behind. The difference isn't noticeable at first, because the difference is small and expectations of students are not as great in the early years. With a little endurance, our American student can persevere. But by the eighth grade, the situation starts to grow serious: the American student will have accumulated 8-16 months of educational lag time. By the senior year, total lag time grows to 12-24 months.
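The arithmetic of this thought experiment is simple enough to sketch in a few lines of code. This is purely an illustration of the hypothesis - the 1-to-2-month annual loss is my own assumption for the model, not measured data:

```python
# Toy model of the "accumulated lag" hypothesis: each year on the
# American calendar costs between 1 and 2 months of effective
# instruction relative to a year-round calendar, and the losses add up.

def cumulative_lag(grades_completed, loss_per_year=(1, 2)):
    """Return (low, high) months of accumulated lag."""
    low, high = loss_per_year
    return (grades_completed * low, grades_completed * high)

for grade in (4, 8, 12):
    low, high = cumulative_lag(grade)
    print(f"After grade {grade:2d}: {low}-{high} months behind")
```

Run as written, this prints a range of 8 to 16 months by the eighth grade and 12 to 24 months by the senior year - a lag that is negligible in the early grades but substantial by the end.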

It's often commented anecdotally that Japanese 9th grade science students are performing at a level equal to American high school seniors. This is taken as proof that Japanese schools are harder and turn out more diligent students. There may be some truth to that. But overall, I believe that would be looking at the issue through the wrong end of the telescope. In fact, American high school seniors are so far behind that they can only perform at a Japanese 9th grade level. So this anecdotal observation is not a compliment to Japanese schools; it is a measurement of how much lag time that American students have accumulated by the time they are seniors.

I believe that slowly accumulated lag time from our academic calendar explains why American students start out well but then slowly lose ground over the years. Although I have no concrete data, I think it's a strong possibility. Suppose I am right - then what? Resistance to making the obvious change to a full-time public school year is amazingly strong; the summer vacation has almost become an entitlement program among American students and their families. Yet these same students (and parents) expect American public schools to keep pace in an increasingly competitive international environment. I'm not an educational warrior, and you won't see me at any rallies or committee meetings. But it never fails to amaze me how Americans claim to value their children's education - provided that it doesn't cost them anything in terms of time, money, or convenience.

Small Fry

Scientists announced today that they had discovered the world's smallest fish.

It is so small, in fact, that it has no skeleton protecting its head (I have no idea how that works, sorry).

Imagine a fish the size of a single grain of rice - and it lives in murky peat bogs that are far too acidic for other fish to tolerate.

Alas, like most of the amazing wonders in nature, it is endangered by human activity. In fact, it's a miracle that this one was discovered before it, too, was unknowingly wiped out:
Peat swamps are under threat in Indonesia from fires lit by plantation owners and farmers as well as unchecked development and farming. Several populations of Paedocypris have already been lost, researchers say, according to the Natural History Museum.

Thoughts on Vegetarianism


I'm taking a break from political analysis to discuss a topic that has casually been occupying my thoughts lately; namely, vegetarianism.

When I was growing up in rural West Virginia, there was always some kind of meat on the table - we raised hogs, chickens, and beef cattle, much like our neighbors did. And throughout the seasons of the year, we'd sometimes have the occasional deer, catfish, rabbit, or squirrel. Needless to say, vegetarianism wasn't much in vogue in the 1970s, and certainly not at our dinner table.

My father was an avid outdoorsman, so I was very much at home with hunting and fishing. But I was always reluctant to kill anything. I really didn't want to see it die - I had too many pets as a child, and I knew I wouldn't eat the game, so what was the point of killing it? I also saw how most of the hunters I grew up around killed far more than they would ever eat, and wound up throwing the surplus to their dogs. For most of them, the sport - the thrill of the hunt and the kill - seemed to be its own justification.

My first introduction to real, live vegetarians came through my father, who had been nursing a dream for years of trying his hand at self-sufficiency. If someone had asked him what he meant by that term, he would have replied: "living off the land, raising all our own food, and minimizing our reliance upon fossil fuels and technology." Of course, to me it meant another thing entirely - loss of creature comforts, no telephone, and just one more thing that made me stand out from my peers. In a small rural town, that is never a good idea. When Dad finally decided to launch the family into this enterprise, I met several kinds of "hippies", or homesteaders, as they preferred to be called. Among them were several vegetarians, and their "bible" was a magazine called The Mother Earth News.

I spent my college years at the University of Washington in the great Northwest - the epicenter of ecotopia and all things environmental. It was there that I met the political kind of vegetarians: people for whom vegetarianism was a statement of belief, with the nutritional aspects coming second. I also met the religious vegetarians: those for whom vegetarianism was a moral choice, or a question of faith. I learned that there are different kinds of vegetarianism: lacto-ovo vegetarians permit themselves dairy products and eggs, as opposed to vegans, who eat no animal products whatsoever. Then during my years at a Major Northwest Software Company (ahem), my co-workers from India and Sri Lanka introduced me to a kind of ordinary, daily vegetarianism that was totally removed from the political and counter-culture versions I saw in college.

I was beginning to see that vegetarianism was multi-faceted.

I've always had this nagging belief that the vegetarians may be right. I'm not a nutritionist, but if vegetarianism had any really substantial nutritional drawbacks, I suspect that several hundred million Hindus would have discovered them by now. And before I acquired a better understanding of evolution and evolutionary processes, I also used to think that the presence of canine teeth in humans somehow "proved" that we were meant to consume meat. It does not, actually; gorillas have far larger canine teeth than humans do, and yet they are almost entirely vegetarian. (They use their teeth in dominance displays and self-defense.)

What about the resource questions? In a world of scarce resources, can we really justify feeding many pounds of grain to a cow - commonly cited estimates run anywhere from 7 to 16 pounds - to create a single pound of beef? Wouldn't it be more efficient to consume that grain directly? Skip the middlemoo and go right to the grain? Not to mention the additional petroleum products it takes to raise the cow, or the destruction that cattle cause to the environment, especially to salmon streams in the Pacific Northwest. And how about the fact that most Americans (myself certainly included) are overweight, and could easily forego some red meat with no bad side-effects (and probably some good ones)?

The morality of vegetarianism is also compelling. Of course it's true that in some countries, eating meat may be a matter of life or death. But how many of us in North America can truthfully claim that situation? Or in Europe?

Using history to justify eating meat is a dangerous game. Does the fact that humans have always eaten meat somehow justify continuing the practice? The same argument could have been used to justify continuing human slavery.

If you expected to find a clear-cut answer at the end of this essay, you're going to be disappointed. I have not made up my mind on this. In truth, the best vegetarian I could ever become would be the lacto-ovo kind, but I doubt I could even pull that off. Instead, I have satisfied myself that I am probably always going to eat meat. Therefore, I should try to do so in the most responsible and humane fashion:

1. I will not kill anything for sport;
2. I will try to buy organic and free range products when I can;
3. I will not order endangered seafood when dining out at a restaurant.

I'm not sure if this is a compromise (middle ground) or a compromise (selling out). I guess I'll have to keep thinking about it - after my next hamburger.

Blowback, Kosher style

My grandmother was a simple country woman, raised in the hills of West Virginia, without any formal education. A farmer's wife, she helped my grandfather work a small farm on a sunny hillside facing an expanse of rolling blue mountains. Grandma possessed an old-fashioned sense of the inevitability of justice that had nothing to do with religion or fear of legal consequences. For her, there were simply Things You Do Not Do, because the consequences would invariably come back to haunt you later on. So whenever she saw someone Doing one of those Things, it would always be followed by a comment in her Appalachian mountain drawl, "You reap what ye sow."

In the analysis circles that military minds and political wonks travel in, there exists a useful concept known as "blowback", discussed at length in a recent book by Chalmers Johnson. It refers to the unintended, negative consequences of a country's covert operations which, after several years, boomerang against that country. In the latter half of the 20th century, the United States saw several such occurrences of blowback. The classic example is Iran, where we supported the Shah because (1) he provided us with cheap oil, (2) he told us he was anti-communist, (3) he provided a useful counterweight against some of the Baathist and Arab nationalist regimes, and (4) he gave the US military basing rights in the heart of the traditional Soviet sphere of influence. In exchange, the United States provided the Shah with military and political support, including arms, training, and intelligence. Suffice it to say that things did not go as planned. For decades, the Shah oppressed the Iranian people, amassed an unspeakable amount of personal wealth, and turned his CIA-trained security service, SAVAK, into a tool of the state to put down dissent and silence anyone who disagreed with his goals or tactics. The end result, as most Americans are aware, was the 1979 Iranian revolution, the ensuing hostage crisis, and the past 25 years of hair-trigger theocracy in Iran.

Just as the United States was reeling from that reversal in the Middle East, another such blowback opportunity was waiting in the wings. In 1979, the USSR invaded Afghanistan to prop up a friendly regime that had taken power the year before in a coup. As America watched the Carter administration leave and the Reagan/Bush administration arrive, a new policy approach to the Soviets was unveiled. Under Reagan, the United States would have a more aggressive, muscular policy toward the containment of communism - even if no one could articulate what the US strategic interests in Afghanistan happened to be. The practical problems of projecting American military power into Afghanistan, however, were formidable: Afghanistan was part of what both sides had always conceded was the Soviet sphere of influence; the US had just lost its most obvious ally (Iran) from which to operate; and the American public -- fresh off the Vietnam experience and with a hostage crisis in Tehran begun only weeks before -- was in no mood to see US ground troops engaged in a protracted land war on the other side of the globe. The alternative solution was to ignore the previous lessons about blowback and borrow from the Vietnam playbook: the Reagan administration chose to train, support, and supply local insurgencies opposed to the Soviet invasion. Its choice? The mujahideen, a loose coalition of Islamic fundamentalist guerrillas whose foreign volunteers included a young Osama bin Laden. The Reagan administration continued to fund and train the mujahideen throughout the 1980s, until the Soviets finally withdrew their military from Afghanistan in 1989. Having accomplished their joint goal with the United States of wearing down the Soviet military, the mujahideen factions were liberated to pursue their primary goal of creating an Islamic theocracy - a struggle from which the Taliban, and bin Laden's al-Qaeda, would eventually emerge. The new administration of George Bush, Sr., like many American administrations, simply had more important things to worry about now that the monochrome threat of communism had been dealt with.
Twelve years later the United States would watch as airplanes crashed into buildings and wonder who these people were, and where they came from. But even before 9/11, it was known that Osama bin Laden would be trouble.

Other examples of blowback exist: Vietnam, Saddam Hussein, the Philippines, and so on. But lest anyone think that Americans have a monopoly on blowback, consider recent events in the Middle East. As it turns out, the Hamas organization was funded, and most likely created, by Israel. Those are not the ravings of a conspiracy theorist; they reflect the comments of rather well-connected people who are in a position to know:
Israel "aided Hamas directly -- the Israelis wanted to use it as a counterbalance to the PLO (Palestinian Liberation Organization)," said Tony Cordesman, Middle East analyst for the Center for Strategic Studies.

Israel's support for Hamas "was a direct attempt to divide and dilute support for a strong, secular PLO by using a competing religious alternative," said a former senior CIA official.

I have often remarked on how useful the existence of Hamas has been to the Likud party and to the right wing in Israel. Hamas provides an external enemy to rally the public and excuse the sins of the government. Hamas gives the settlers - who ally themselves with the conservative political groups in Israel - something on which to focus their fanaticism. Israeli politics is tightly wrapped; without an external threat, the wrappings would relax and the internal divisions that have been smoothed over by that threat would surface. The Israeli public might have to stop and consider some of the contradictions between its declared intent to be (1) a uniquely Jewish state and (2) a democracy at the same time. Without Hamas, Israel might enter a period of soul-searching about the inequities of its society and the different ways it treats Jewish Israelis vs. Arab Israelis and non-Jews. Hard questions about the morality of occupying and trying to absorb towns with Arab populations of 90% or more would have to be answered. But fortunately, Hamas is there to prevent all that introspection; my, how convenient. Cui bono?

Now it appears that the Israelis are experiencing blowback of their own.
"The thinking on the part of some of the right-wing Israeli establishment was that Hamas and the others, if they gained control, would refuse to have any part of the peace process and would torpedo any agreements put in place," said a U.S. government official who asked not to be named.

"Israel would still be the only democracy in the region for the United States to deal with," he said.

Let's linger over that point for a moment. As cervantes states it quite clearly, the Israelis did not in fact want peace, but rather the permanent acquisition of Palestinian territory, and so they did not want to be placed in a position where international pressure to deal with the PLO became irresistible.
All of which disgusts some former U.S. intelligence officials.

"The thing wrong with so many Israeli operations is that they try to be too sexy," said former CIA official Vincent Cannistraro.

According to former State Department counter-terrorism official Larry Johnson, "the Israelis are their own worst enemies when it comes to fighting terrorism."

"The Israelis are like a guy who sets fire to his hair and then tries to put it out by hitting it with a hammer."

"They do more to incite and sustain terrorism than curb it," he said.

Apparently Israel is getting ready to learn the lesson my grandmother already knew: consequences - like the two dozen or so chickens she raised -- always come home to roost.

Who will likely be the beneficiary of this blowback, other than Hamas? The Likud party. Its former leader, Ariel Sharon - who left Likud to found the new Kadima party - lies in a coma, leaving Kadima rudderless and facing elections. Likud's current leader, Benjamin Netanyahu, has wasted no time positioning the Hamas victory as an indictment of Sharon's policies, the withdrawal from Gaza, and the removal of settlements on the West Bank.

Thursday, January 26, 2006

Hamas wins Palestinian election by landslide

Like many people who follow Mideast affairs, I am digesting the news today that Hamas has won a stunning landslide, 76 out of 132 seats, in the Palestinian elections. Hamas can now form a majority government, and not a weaker coalition government as had been widely expected. I know that I certainly did not expect this, and it appears clear that Israel did not expect it, either.

Over the coming days and weeks, there will be lots of hyper-analysis, handwringing, and attempts at prognostication. I'll contribute my own two cents here. This is not a commentary upon terrorism, or upon Hamas' methods. It is intended as a dispassionate abstract analysis of how these events transpired. What can we attribute the success of Hamas to?

1. The United States validated Hamas by injecting itself into the election process and trying to influence the outcome. It seems that the Bush administration feared that Hamas might become a real player in a political process that the United States itself had already endorsed - thus leaving the United States in the embarrassing position of having to accept the unacceptable outcome. To avert that, the Bush administration poured funds into the election, hoping to bolster Fatah's chances of winning. For a US administration that is used to skirting constitutional processes, this attempt to influence another country's election probably seemed like just another day at the office. But to the rest of the world in general, and the Palestinians in particular, it was yet more evidence of American imperialism and hypocrisy. And to many Palestinians, it confirmed their suspicions that Fatah had grown too complacent with the status quo and gotten too friendly with the United States. The beneficiary of this heavy-handed and not-very-well-concealed American intervention? Hamas, which emerged as the party that America clearly feared more than Fatah.

2. Israel - never one to learn from the mistakes of the United States - likewise validated Hamas. Israel declared that if Hamas were on the ballot, the election would not take place - thus reinforcing the impression that Hamas is the certified opposite number to the Israeli government. The Israelis followed up that threat with a similar ultimatum: if Hamas candidates were on the ballot, they would not be permitted to campaign. Every action that Israel has taken has backfired, to the benefit of Hamas.

3. The Palestinian Authority and the Fatah party were undeniably corrupt. After decades of power and nepotism, Fatah had grown into another Mideast regime, albeit without all the ordinary trappings. The average Palestinian burned inwardly with rage at seeing Fatah politicians and PA officials driving Mercedes, using their political connections to start businesses, mismanaging UN funds, and ignoring the poverty of the common man.

But that only describes the problem. Why was Hamas the solution?

At a time when Yasser Arafat and Fatah were mired in corruption or aloof neglect, the leaders of Hamas were devout and socially minded. They enforced strict discipline among their ranks, respected the Islamic traditions of the common man and woman, and avoided any show of ostentatious wealth. While the existing Palestinian authorities were busy lining their own pockets, Hamas focused on meeting the needs of the ordinary people by building schools and hospitals.

This did not happen overnight; Hamas spent decades laying the foundation of its grassroots support. Of course, it could not have enjoyed such spectacular success if the ground had not been so fertile in the first place. Had the United States and Israel honestly engaged the Palestinians in the 1980s, the 1990s, and the current decade, then the popular impression of Arafat and Fatah as ineffectual would not have lingered. Had the United States used its considerable influence with Israel - instead of simply sitting back and watching as Israeli tanks and helicopters destroyed entire Palestinian neighborhoods - then the seeds of rage might not have been planted for future generations to reap. Had the world community spoken more loudly as Israel continued its policy of appropriating Palestinian land and handing it over to right-wing settlers, then perhaps the outcome of this election would have been different. But as things stand now, the outcome was entirely predictable - even if few people actually predicted it.

Arafat has been dead for over a year now, and Sharon lies in a coma from which he will probably never recover. The old guard is gone; the new guard stands poised to take over. This will be interesting to watch.

Monday, June 06, 2005

The Myth of Economic Mobility

Today's New York Times had an interesting editorial that recaps some data from a reporting series the paper published earlier.

http://www.nytimes.com/2005/06/06/opinion/06herbert.html
"Consider, for example, two separate eras in the lifetime of the baby-boom generation. For every additional dollar earned by the bottom 90 percent of the population between 1950 and 1970, those in the top 0.01 percent earned an additional $162. That gap has since skyrocketed. For every additional dollar earned by the bottom 90 percent between 1990 and 2002, Mr. Johnston wrote, each taxpayer in that top bracket brought in an extra $18,000."

And the Bush tax cuts? The ones that were intended to create jobs?


"Under the Bush tax cuts, the 400 taxpayers with the highest incomes - a minimum of $87 million in 2000, the last year for which the government will release such data - now pay income, Medicare and Social Security taxes amounting to virtually the same percentage of their incomes as people making $50,000 to $75,000. Those earning more than $10 million a year now pay a lesser share of their income in these taxes than those making $100,000 to $200,000."

The social dislocations resulting from this war that nobody mentions have been under way for some time. But the Bush economic policies have accelerated the consequences and intensified the pain.


Economists will tell you that our economy is roughly two-thirds driven by consumer spending and one-third by investment. But what do the uber-rich do with tax cuts? They do not rush out on a consumer spending spree. The super-rich are far more likely to invest such a windfall, or squirrel it away - neither of which has the growth effect on the American economy that broad-based consumer spending does.

Am I opposed to tax cuts for lower-income and middle-class Americans? No. However, the revenue lost to the tax cuts for the upper class would have more than paid for shoring up the Social Security system. And we are currently spending $5 billion per month in Iraq, with nothing to show for it.

Because the neo-conservatives have taken the early lead in framing the debate, liberals have been scared away from defending the idea of progressive taxation. But progressive taxation is not a recent idea, nor the product of some liberal east coast think tank. It describes a sliding tax scale with increasingly higher brackets for higher incomes, and it has been part of American politics since Abraham Lincoln.
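For readers unfamiliar with how a sliding-scale bracket system actually works, here is a minimal sketch in Python. The brackets and rates below are purely illustrative (they are not any real tax code, past or present); the point is only that each slice of income is taxed at its own marginal rate, so higher earners pay a larger share of their income, not merely a larger dollar amount.

```python
# Illustrative brackets only - NOT any actual tax code.
# Each tuple is (lower bound of bracket, marginal rate on income above it).
BRACKETS = [
    (0, 0.10),        # first $10,000 taxed at 10%
    (10_000, 0.25),   # income from $10,000 to $100,000 taxed at 25%
    (100_000, 0.35),  # income above $100,000 taxed at 35%
]

def progressive_tax(income: float) -> float:
    """Tax each slice of income at the marginal rate of its bracket."""
    tax = 0.0
    for i, (lower, rate) in enumerate(BRACKETS):
        upper = BRACKETS[i + 1][0] if i + 1 < len(BRACKETS) else float("inf")
        if income > lower:
            tax += (min(income, upper) - lower) * rate
    return tax

# A $50,000 earner owes $11,000 (22% effective rate); a $500,000 earner
# owes $163,500 (about 33% effective rate) - the share rises with income.
print(progressive_tax(50_000))
print(progressive_tax(500_000))
```

Note that the top rate applies only to the portion of income above the top bracket's threshold - a common misconception is that crossing into a higher bracket raises the tax on all of one's income.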

Under the guise of opportunity, the wealthy and corporate class have sold the American public a phony bill of goods. And they used our own tax money to give themselves a big reward for their cleverness.