Iraq might be off the front pages, but our legacy is still unsettled
By Joe Bruns — The United Nations mission in Iraq reports that 979 Iraqis were killed and another 2,133 wounded in acts of terrorism and violence in the month of September alone, bringing the death toll to 5,740 so far this year.
In Baghdad, on September 30 alone, 15 car bombs were detonated as the violence continued to wreak devastation on what is left of the country. Car bombings have become a way of life, averaging more than two per day, with multiple coordinated attacks frequent.
Nor is the violence confined to Baghdad. Just this week, suicide bombers, probably Sunni, killed 19 people of the small Shabak sect in attacks near the city of Mosul. And on September 29, ten people were killed and 62 injured in bombings in the generally peaceful Kurdish autonomous region.
Violence in Iraq is pervasive and comes from multiple quarters. The traditional animosity between Sunni and Shia has existed for centuries. Regional differences have diced the country into ethnic enclaves, often under the control of local militias. The Kurdish autonomous area is becoming a virtual Kurdistan. Even within sectarian groups, sub-factions battle for control. In Sadr City, Moktada al-Sadr finds himself in a power struggle with Asaib Ahl al-Haq, which has ties to Iran and has sent support to Syria to aid President Bashar al-Assad. Added to this are the professional insurgents: Baathists (dead-enders, as they were once known), Iranian-backed Hezbollah, resurgent al-Qaeda-related groups, and clans seeking to settle old grievances. While the violence has not yet reached the level of the bloody years of 2006-2007, when there was a virtual civil war, it is increasing.
Iraq Ethnic and Religious Distribution — WorldPress
Meanwhile, the Nouri al-Maliki government celebrates oil production goals and buys Russian war materials. But al-Maliki’s regime is rife with corruption, is unable to regularly provide basic services and, importantly, lacks an independent judiciary. In an act the United Nations human rights chief Navi Pillay called “obscene and inhuman,” the al-Maliki regime this month carried out mass executions of 42 prisoners. Ms. Pillay went on to characterize Iraq’s justice system as too corrupt to warrant use of the death penalty at all, let alone on such a large scale.
An American Legacy
Ten years ago, the United States invaded Iraq for the purpose of overthrowing Saddam Hussein, securing his [non-existent] weapons of mass destruction, and establishing a democratic beachhead in the Middle East.
“It is not a neocon pipedream … that Israel, Palestine, Turkey, Afghanistan and Iraq can create [democratic] momentum that Egypt, Saudi Arabia, Lebanon, and … even Syria or Iran would find hard to resist,” said Victor Davis Hanson (Hoover Institution) in 2005.
This grand crusade began with a spectacular display of American military might: shock and awe.
March 19–20, 2003: Shock and Awe in Baghdad (Photo BBC)
General Tommy Franks commanded the invasion and executed it with precision, demonstrating the particular strength of the American-led coalition forces in striking hard and fast. The southern oil fields were quickly secured, and resistance to the invading forces, while at times fierce, was quickly overcome. By April, Baghdad had fallen.
By May 1, the invasion of Iraq had gone so well that President Bush declared on the deck of the USS Abraham Lincoln, “Major combat operations have ended. In the battle of Iraq, the United States and our allies have prevailed.”
USS Abraham Lincoln (5/1/2003)
But we quickly learned that defeating an opposing army and bringing peace, stability and democracy to a complex country were two quite different things. The problems started almost immediately.
The very nature of the invasion strategy left the coalition ill prepared to stabilize the country in the aftermath of military victory. We deliberately went in ‘light.’ Army Chief of Staff Gen. Eric K. Shinseki estimated in Congressional testimony that an occupying force of several hundred thousand troops would be necessary to secure Iraq post-invasion. Deputy Secretary of Defense Paul Wolfowitz ridiculed Gen. Shinseki’s estimate, sticking by an earlier figure of about 100,000 troops. We will never know if even Gen. Shinseki’s number would have stabilized Iraq. What we do know is that the troop level was insufficient to create civil order. Perhaps more important than sheer numbers, troops specialized in civil administration were insufficiently mobilized, and Arabic speakers were in short supply. This, combined with the purge of Baathist bureaucrats and skilled workers and the administrative ineptitude of the Coalition Provisional Authority under proconsul Paul Bremer, soon led to rioting, looting, and outright insurgency. By 2004, a full-scale civil war was breaking out.
“You break it, you own it” — Colin Powell on the danger of regime change in Iraq.
Second Battle of Fallujah, November 2004 (photo BBC)
By the time the United States withdrew in 2011, as many as 461,000 Iraqis had died, although this figure is disputed. What is not disputed is that American and allied forces suffered 4,804 deaths. More than 32,000 American troops were wounded. According to the Associated Press, an astounding 45% of the 1.6 million veterans of Iraq and Afghanistan have filed disability claims with the Department of Veterans Affairs.
Wars also leave impressions on the national psyche. In any war there are examples of jaw-dropping heroism as well as disgusting examples of atrocity. Unfortunately, it is the atrocities that we most easily remember. Lynndie England and Abu Ghraib come readily to mind, but many other, worse atrocities have been documented. Far less well known are the individual acts of heroism, including those of the four Americans – a Marine, two soldiers and a Navy SEAL – who were awarded the Medal of Honor, all posthumously. Wars also lead to new tactics and technology. World War I gave us air combat and poison gas; WWII gave us kamikaze attacks, the blitzkrieg and the atomic bomb; Vietnam offered large-scale irregular warfare, napalm bombing and Agent Orange. Iraq introduced improvised explosive devices, the surge and the widespread use of drones, which have now found new use for killing terrorists wherever they hide, along with too many innocent civilians. Like poison gas, the atomic bomb and napalm before it, the use of drones has become a matter of moral debate.
Wars have unforeseen consequences. World War I led inevitably to World War II, as well as the unfortunate configuration of the Middle East. WWII led to the division of Europe and the Cold War. Defeating Iraq’s army and overthrowing Saddam Hussein was the easy part. It was a mission for which our military was designed. Nation building is the hard part, and is one that our army is ill-suited for, even when supplemented by vast support of contractors and multiple civilian agencies. The overthrow of Saddam and subsequent instability in Iraq has clearly changed the balance of power in the Middle East, and particularly with regard to Iran. It may also have contributed to the “Arab Spring,” with outcomes still unsettled—just like our own legacy in the region.
We have spent trillions of dollars equipping and manning the most powerful armed force the world has ever known. Our machines can defeat their machines, our men and women can outfight any foe. What we continue to demonstrate, though, is our inability to win hearts and minds — to win the peace.
Worsening Health Outcomes for Poor Women (especially in states refusing to expand Medicaid under Obamacare)
By Joe Bruns — In the September issue of The American Prospect, Monica Potts memorably tells the story of Crystal Wilson, of Cave City, Arkansas, who died this past May two months shy of her thirty-ninth birthday. Ms. Potts paints an intimate portrait of Crystal to personalize a demographic mystery posed in the title of her piece: What’s Killing Poor White Women?
Demographers have long been intrigued by differences in mortality rates among segments of the population. Women live longer than men, and whites live longer than blacks, even as those gaps narrow. But ever since such statistics have been recorded, with very few exceptions such as the 1918 flu epidemic, the longevity trend among all major segments of the population has been upward.
That was the case until health statisticians began noticing a new trend in American longevity: a wide disparity between male and female longevity growth rates when examined county by county. In fact, according to a study in Health Affairs (Kindig and Cheng, 2013), female mortality rose in 42.8% of US counties between 1992 and 2006. By comparison, male mortality rates increased in only 3.4% of US counties.
Change In Female Mortality Rates from 1992-96 to 2002-06
Furthermore, many of the counties where female longevity fell lie in an arc from West Virginia, through the states of the old Confederacy, to Texas, along with the rural Midwest and counties with Native American reservations. While more recent data show that the downward county-by-county trend has leveled off, there is no dispute that female longevity is significantly disadvantaged in the poor, rural South.
“In fact, a woman living in Montgomery County, Maryland has a life expectancy twelve years longer than one living in McDowell, West Virginia.”
The twenty US counties with the worst female longevity are all in the states of the old Confederacy with the exception of two in West Virginia. Among the possible contributing causes are obesity, smoking, and poor nutrition leading to diabetes and heart conditions, as well as limited access to quality health care. What is even more striking, though, is that the loss of longevity is particularly seen among white women.
It also turns out that there is a significant correlation between high school education and longevity among all groups. In a paper published in August 2012, lead researcher S. Jay Olshansky found “that in 2008, US adult men and women with fewer than twelve years of education had life expectancies not much better than those of all adults in the 1950s and 1960s.” These effects appear to compound for white women: Olshansky found a five-year drop in life expectancy among white women who were high-school dropouts, compared to a three-year drop for white men.
Crystal Wilson fit the pattern. At the time of her death, Crystal weighed over 200 pounds and had been diagnosed with diabetes, but was ‘waiting to get medicine’ for it. Her primary source of income was her husband’s disability check. She met her husband, Possum, while she was in high school. He was 28; Crystal was 16. They married, and Crystal dropped out of school.
Forbes Magazine, among others, regularly runs articles claiming that poverty in the United States is not so bad, that it is more a matter of bad choices or even bad luck, and that, at any rate, America’s poor are not as poor as those in other countries.
But it is also a fact that while the United States spends more per capita on health care than any other industrialized nation, we are slipping badly in outcomes. According to a major study in the Journal of the American Medical Association, the United States ranks 26th out of 34 OECD countries in ‘healthy life expectancy.’ And we are losing ground to other industrialized countries.
“Despite a level of health expenditures that would have seemed unthinkable a generation ago, the health of the U.S. population has improved only gradually and has fallen behind the pace of progress in many other wealthy nations,” writes Harvey V. Fineberg, M.D., Ph.D., of the Institute of Medicine, Washington, D.C.
So, how is this interesting from a public policy perspective? It’s worth looking at two additional graphics. The first depicts health insurance coverage by state.
The second map shows where states are with regard to Medicaid expansion.
While obviously not identical, the patterns on these two maps strike me as remarkably similar to the first map showing where women’s life expectancies were shortening. America may have “the best healthcare system in the world,” as Sen. Jeff Sessions (R-AL) said, but if so, like wealth, it is unequally distributed.
If I were a politician from a county or state with worsening health outcomes for poor women I don’t know how I would explain to my constituents, or even myself, a failure to support the expansion of Medicaid under the Affordable Care Act, or the rejection of the goal of universal health insurance.
One of the Best Predictors of SAT Scores is Your Family’s Income
By Joe Bruns — The results of the 2013 SAT are out, and the annual ritual of handwringing, finger-pointing, calls for reform and occasional bragging is in full throat.
While the Orlando Sentinel reported that Florida scores were below the national average (again), and the Los Angeles Times lamented a 2% drop in California student scores, the Washington Post headlined score increases for Virginia and the District of Columbia, while also noting declines in Maryland. National Public Radio reported with the glass-three-fifths empty lead-in that “roughly 6 in 10 college-bound high school students who took the test were so lacking in their…skills, they were unprepared for college-level work.”
Getting the basics out of the way: 43% of the students who took the SAT attained scores indicative of being prepared for college work, according to The College Board, which administers the test. That figure has held steady for five years.
The College Board, in its press release, goes on to state some findings that seem fairly obvious: students who take core college-preparation curricula are more likely to do well, as are students who take Advanced Placement courses and, most obviously of all, students with high GPAs.
More African-American and Hispanic students took the SAT in 2013, and they did slightly better than last year. But, overall, performance lags compared to other groups.
College admissions officers are quick to point out that SAT scores are only one of many factors they take into consideration. Some colleges are dropping the requirement for the test entirely, or offering other, sometimes imaginative ways to show academic promise, as Leon Botstein, President of Bard College has done, while declaring ‘war on the whole rigmarole of college admissions.’
Still, the SAT is used as a tool by admissions officers, and while some disagree, it purports to be an indicator of future academic success. Otherwise, what value is it?
Looking at the College Board data, one item really caught my attention: the distribution of SAT scores by family income level. Perhaps not surprisingly, but with astonishing consistency, one of the best predictors of SAT scores is your family’s income.
Graphic by Joe Bruns; Data from The College Board
It’s often difficult to establish cause and effect. But it seems obvious that families with more money can afford better schools, whether private schools or the better public schools of more affluent communities. They can also afford tutors, summer enrichment programs and SAT preparation classes, and they enjoy better health care and even better nutrition for their children. So it is not at all surprising that the more wealth your family has, the better you are likely to do on the SAT.
But, if wealth is being concentrated more and more at the top, graphically displayed in the clip below, and SAT scores follow wealth, what does that mean for economic mobility?
But a ProPublica study, reported in The Chronicle of Higher Education, shows that since 1996 a smaller proportion of grants has been going to students in the lowest family-income brackets, even though the recession hit those students the hardest.
To state again, I am not suggesting that college admissions is inherently unfair. The wealthy kids are better prepared, and they get the best seats. The real problem though, is that the pool of college qualified high school students is tilted toward wealth. As this continues over time, generation by generation, the effect further concentrates elite education opportunity among the already high-wealth demographic.
“It should come as no surprise, then, that our higher-education system is becoming increasingly stratified by race and income.” — Awilda Rodriguez in The Chronicle of Higher Education
Affirmative action programs based on income rather than race might help in the short run, but if it means drawing from a pool that is not well prepared for college, it can also lead to drop-outs, failure and frustration, to say nothing of student debt. And, adding to the problem, elite colleges are actually reducing need-based financial aid.
We Need a National Commitment to Fix our Schools, Particularly in Economically Disadvantaged Neighborhoods
The problem is systemic, and it will not be permanently solved by affirmative action or similar programs. We need to start fixing the problem at an early stage if we are to affect outcomes in high school. Poor schooling in the basics leads to poor performance in high school; the root cause of high-school dropout rates and poor SAT scores is the same. Too often, poor children go to ‘poor’ schools, which lack the resources, leadership and teaching of the schools attended by rich kids. Compare a school in a poor urban district to one in an affluent suburban neighborhood; most often, the difference is striking. And that is just comparing public schools. The wealthy also have access to private schools to turbocharge their built-in advantage. Testing and slogans won’t change this dynamic. What is needed is money to attract and retain the best and brightest teachers and administrators, money to modernize and renovate facilities, and money for nutrition, after-school and summer learning programs. While dollars alone can never erase the real differences between the schools, they would certainly help. In return for higher salaries and better working conditions, teachers’ unions would need to jettison rules that impede the removal of bad teachers. And, of course, we need greater parental involvement.
Unless and until we begin to get serious about providing all of our children in all of our schools the advantages provided to the more fortunate in society, we will keep them at a disadvantage at every step, including depriving them of access to the American dream of working hard, getting a good education, and moving up the economic ladder.
Chances are, you don’t know who Chad Dixon is. He was sentenced to jail on Friday for teaching others how to trick lie detectors. Depending upon your perspective, Mr. Dixon is a petty criminal, an entrepreneur, a threat to our national security or, in the words of a federal prosecutor, an individual who has admitted a “career of criminal deceit.” Or maybe he’s just a con artist.
It seems Mr. Dixon was an unemployed electrical contractor with a pregnant fiancée in school and unpaid bills on his desk. He remembered seeing a television program about the flaws in polygraph examinations, and how trained individuals could easily ‘fool’ the polygraph into accepting lies as truthful answers. After some online research, Mr. Dixon purchased, for $49.95, a training manual from another polygraph countermeasure trainer. Supplementing that with manuals published by, among others, the Department of Defense, also openly available on the Internet, he soon set up shop as Polygraph Consultants of America, advertising on his Web site that, for a fee of $1,000 and up, he could teach people to “Always Pass your Polygraph Test – Nervous or Not – Lying or not – No Matter What.”
At this point, it is probably important to note, as Mr. Dixon’s defense attorney did, that providing polygraph countermeasure training is not a crime. And advertising that individuals can be taught to give apparently truthful polygraph results even if they are “flat out lying” is protected speech. Many of the techniques taught by countermeasure trainers are fairly obvious: controlled breathing, counting backward by threes, contracting your muscles at certain points, and so on.
The number of Mr. Dixon’s clients was a matter of dispute, but it was between 50 and 100. It was a mixed bag. His defense contended that most of his clientele were people preparing for marital infidelity screenings. Others were registered sex offenders, including one J.O. of Fairfax, VA, who was a peeping-tom. After having failed all but one of eight court-ordered polygraphs, J.O. availed himself of Mr. Dixon’s training and managed to pass three tests in a row.
Mr. Dixon also worked with individuals on pre-employment polygraph screening. He admitted to training seven clients applying to four federal agencies. He also trained several undercover investigators, which led to his undoing.
Interestingly, again as pointed out by his defense team, Mr. Dixon was not prosecuted for training “adulterers or convicted sex offenders.” His mistake was advising four of his clients to deny that they had received countermeasure training. By doing so, he interfered with federal agencies’ pre-hiring security screening; and by using the Internet to assist in this crime, he committed wire fraud.
Now, I’m not defending Mr. Dixon’s actions. He apparently did help some nasty people beat the system, and he seemed to have no scruples about helping clients, real and undercover, conceal deleterious information about themselves when applying for positions of trust with the government.
But then again, how is it that we have come to rely on a device that the National Academy of Sciences says detects lies only better than chance, and then only under favorable conditions, to protect our nation’s most important secrets? And is it a coincidence that the widespread use of the polygraph coincides with what appears to be a proliferation of secret-spilling?
In the annals of spies and traitors there is a whole gallery of individuals who passed polygraph exams repeatedly, perhaps most notoriously Aldrich H. Ames, who was passed by CIA polygraph interpreters. Only after he was caught spying did the FBI re-examine his charts and conclude that he had been lying.
Dr. Wen Ho Lee
Then there was Dr. Wen Ho Lee, who worked for the Department of Energy weapons program in Los Alamos, NM. Dr. Lee was accused of spying and subjected to three polygraph examinations. He passed all three, according to the DOE. The FBI, however, concluded from the same tests that he was lying. Dr. Lee was ultimately convicted only of mishandling classified documents, one of 59 original charges against him. The federal judge apologized to Dr. Lee for his treatment and ridiculed the government’s handling of the case.
How can any device that is so easily fooled, and so subject to arbitrary interpretation, have any value in safeguarding our national security? No reputable scientific body has found the polygraph to be effective in detecting lies. In using this flim-flam as a means of protecting our national security, we are only lying to ourselves.
“Before making arbitrary cuts to programs that feed the hungry, go where the money really is.”
By Joe Bruns — The U.S. House of Representatives is going after food stamps again. In July, the House voted to separate SNAP — as the food stamp program is now known — from the ten-year farm bill, upending a longstanding bipartisan compromise to link support for farmers with support for poor Americans.
The history of this arrangement dates back to the Great Depression, when farmers were overproducing commodity crops for which there was an insufficient market. By buying surplus commodities and distributing them to the poor, the government helped both farmers and the poor.
Surplus Food Commodities
With no chance of the House action being sustained in the Senate, House Majority Leader Eric Cantor took a different tack, voting instead to reduce the program by some $4 billion annually, a five percent reduction from its $80 billion base. With floor leadership from Rep. Steve Southerland (R-FL), the bill passed 217–210. No Democrat supported it.
Many of these individuals, the argument goes, could and should be able to meet their needs by joining the workforce. For Rep. Southerland, who was recently profiled in The Washington Post, it is a moral hazard: work was given to men by God in the Garden of Eden; it is an obligation; and keeping those who could work on federal assistance is corrosive both to the individual and to society. Democrats and other critics point out that many recipients of SNAP are already working but are still unable to provide a basic level of nutrition for themselves and their families. But beneath the wonkish details lies another, less polite question.
Recently, Fox News ran a piece about an unemployed California surfer who dines on lobster and sushi courtesy of you, the taxpayer. This story, and others like it, has provided fodder for talk radio and TV, reviving memories of Ronald Reagan’s infamous welfare queen.
The story told by then presidential candidate Reagan was a myth built around a set of assumptions, and they continue to this day, despite significant reforms of the welfare system.
President Bill Clinton Signs the Welfare Reform Act
But how much cheating takes place in the SNAP program?
It is actually difficult to find a straightforward answer to the question of how much cheating and fraud exist in the SNAP program. The USDA states that the amount of trafficking, or conversion of SNAP benefits to cash, is less than 1%. It also finds that the benefit amount is correctly calculated more than 96% of the time. But a basic premise of the program is that an able-bodied person who can find work should do so, or should enroll in a training program that will provide the necessary skills for employment.
The Great Recession Leads to Expansion of SNAP
The Great Recession increased the importance of the SNAP program and expanded its rolls. Two key elements contributed to this expansion. First, states expanded broad-based categorical eligibility to include some individuals with incomes greater than 130% of the poverty line, as long as they met other state poverty-program criteria. Second, single able-bodied individuals with no children had their eligibility extended beyond the existing three-month limit. Both changes required state waivers to take effect, and both are available only during periods of high unemployment.
Generally, it is people falling into the expanded coverage category who are targeted by the SNAP budget cuts.
It is important to note that these people constitute a relatively small portion of overall SNAP recipients. According to The Center on Budget and Policy Priorities, only 2% of those receiving SNAP had qualified disposable income above the poverty line, and the extension of eligibility for able individuals with no children added between 1 and 2 million to the monthly rolls.
Twenty-three million SNAP beneficiaries are children
But what about the Cheaters?
If the USDA inspector general finds an error-and-trafficking rate of about 3.6%, how many more individuals fall into the surfer-bum category? That is, how many people game the system to get a benefit that reasonable people would agree was not intended for them? The answer is hard to come by. But let’s make some assumptions.
Let’s say, for argument’s sake, that 25% of the 2 million able-bodied single individuals, such as our surfer friend, are gaming the system and should be out making a living. That seems high to me, but let’s make that call. That’s 500,000 ‘cheaters’ out of a program that provides benefits to 50 million people, or 1%. Let’s further say that fully half of the 2% of recipients who have disposable income over the poverty line are also cheaters. That amounts to another 1%. Adding the 3.6% error-and-trafficking rate and rounding up for good measure, let’s call the cheater rate for SNAP 6%, or $4.8 billion.
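For readers who want to check the arithmetic, the back-of-envelope estimate above can be written out in a few lines of Python. The 25% and 50% shares are the argument's own assumptions, not measured data:

```python
# Back-of-envelope estimate of the hypothesized SNAP "cheater rate".
# The 25% and 50% shares below are assumptions for argument's sake,
# not measured figures.

total_recipients = 50_000_000   # approximate SNAP beneficiaries
program_cost = 80e9             # annual program cost, in dollars

# Assumption: 25% of the ~2 million able-bodied, childless recipients
# are gaming the system.
able_single = 2_000_000
gamers = able_single * 0.25                   # 500,000
gamer_rate = gamers / total_recipients        # 1% of all recipients

# Assumption: half of the 2% of recipients with disposable income
# above the poverty line are also cheaters.
above_poverty_rate = 0.02 * 0.5               # another 1%

# USDA inspector general's error-and-trafficking rate.
error_trafficking_rate = 0.036

total_rate = gamer_rate + above_poverty_rate + error_trafficking_rate
rounded_rate = 0.06                           # rounded up "for good measure"

print(f"combined rate: {total_rate:.1%}, rounded: {rounded_rate:.0%}")
print(f"dollar estimate: ${rounded_rate * program_cost / 1e9:.1f} billion")
```

Running this confirms the figures in the paragraph above: the assumed components sum to 5.6%, which rounds up to 6% of the $80 billion program, or about $4.8 billion.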
Most Beneficiaries of SNAP are Children, the Disabled and the Elderly
$4.8 billion is a lot of money. But, before we start cutting every loose thread, consider also the following:
I’m not suggesting that fraud and waste be tolerated in the SNAP program, or any other government program. What I am suggesting, though, is before making arbitrary cuts to programs that feed the hungry, we might be better advised, in the words of Willie Sutton, to go where the money is.