Time To Rein In 21st Century Loan Sharks: Payday Lenders Prey On People In Need – Monday, December 10, 2007 – Concord Monitor
While it would be easy to think payday lending is some new phenomenon, nothing could be further from the truth. Payday lending is the modern form of usury. Usury and its regulation have been the subject of civil and religious debate for literally thousands of years.
Usury is not a word you hear used very often now. It has a musty, 19th-century quality. Usury can mean the price paid for the use of money. It can also simply mean excessive interest.
State usury laws refer to a body of law regulating the amount of interest charged by lenders. Most states have long had laws specifying the maximum legal interest rates at which loans can be made. For almost our entire history as a state, until 1999, New Hampshire had such laws protecting consumers.
In January, the Legislature will take up House Bill 267, a bill placing an interest rate cap of 36 percent APR on payday and auto title loans. The bill is modeled on legislation passed by Congress in 2006 to protect our military service members who were being victimized by payday lenders. It would restore an interest rate cap which has been our state norm.
The use of interest rate caps to protect against usury has a long history, dating back to before the American Revolution. In his excellent book, Taming the Sharks, law professor Christopher Peterson recounts this history.
Originally, the colonies imported English law, which included an interest rate cap statute called the Statute of Anne. The Statute of Anne imposed a maximum allowable interest rate of 5 percent per year. Most of the states initially imposed caps between 4 and 10 percent per year, although after independence, most states set their maximum rate at 6 percent per year.
Early American society featured a very strong thrift ethic. Reckless borrowing for personal consumption was deeply frowned upon. The public had little sympathy for debtors. State law rigorously enforced debts, and a sense of shame attached to personal debt. This was the era of debtors’ prisons. Even though low interest rates were the norm, imprisonment for debt was very common. In Massachusetts in 1830, there were three to five times as many persons imprisoned for debt as for crime.
After the Civil War, attitudes toward personal debt loosened. A new lending practice developed called salary lending. Salary lending is the historic precursor of payday lending. The principle was the same. A debtor would borrow $5 and repay $6 at the end of the week.
While that might not sound too bad to modern ears, it led to chain debt, an early version of the repeat borrowing trap characteristic of payday loans. Manipulative lender practices like the imposition of staggering late fees and shady calculation of interest trapped debtors into endless payments.
Salary lending was characterized by lenders collecting as much money as possible while reducing the overall debt as little as possible. If the debtor lost his job or suffered illness or could not pay for some other reason, interest compounded and the debt swelled.
The salary lenders targeted employed and married working class white men, seeing them as good credit risks and likely to repay because of their steady employment histories.
1,000 percent interest
The term “loan-sharking” did not originate with the 20th-century Mafia. It actually comes from the period after the Civil War. In the Eastern cities, the salary lenders were infamous for charging interest rates over 1,000 percent annually.
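That 1,000 percent figure follows directly from terms like the borrow-$5-repay-$6 weekly loan described earlier; a minimal sketch of the arithmetic (my calculation, not the column's):

```python
# Illustrative check: the salary-lending terms described above --
# borrow $5, repay $6 one week later -- imply a weekly rate that
# annualizes well past the 1,000 percent mark even without compounding.
principal = 5.00
repayment = 6.00

weekly_rate = (repayment - principal) / principal  # 0.20, i.e. 20% per week
simple_apr = weekly_rate * 52                      # 52 weeks, no compounding

print(f"weekly rate: {weekly_rate:.0%}")   # weekly rate: 20%
print(f"simple APR:  {simple_apr:.0%}")    # simple APR:  1040%
```

Compounding the 20 percent week over week would push the effective annual rate far higher still.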
The abuses and horror stories about this loan-sharking led to a series of governmental reforms and policy responses. Federal bankruptcy law reform allowing more discharge of debt, an increase in cooperative and charitable lending, and new small loan laws were all responses to the havoc created by usury.
The 20th century saw an explosion of credit and greatly expanded consumer debt. Old American stigmas about reckless borrowing disappeared and the credit card made its appearance.
In contrast to the New Deal period when government took a wide range of regulatory, protective steps, government since the Reagan era has promoted deregulation. Deregulation opened the door to exploitation of the poor.
When interest rate caps were scrapped in our state in 1999, the legislative history shows that the Legislature expected interest rates to go up to only 20 to 25 percent. There was no expectation that interest rates would soar to the stratospheric levels charged by payday lenders, for whom the sky is the limit.
Five hundred percent APR? No problem if you are a payday lender. It is a sign of our moral confusion that we do not readily see payday lending as a gross form of usury. There are reasons the Christian, Jewish and Islamic traditions have all rejected usury as pure, unchecked greed. I expect ethical atheists would too.
This is not some moderately priced consumer credit. While payday lenders and auto title lenders strive for acceptance and legitimacy, they are 21st-century loan sharks. This is the dark side of deregulation where the administrative state has failed and consumer protection has been junked.
The issues around payday lending are just the latest round in a longstanding American fight over predatory lending. No strategy will completely stop the payday lenders. They are foxy and will try to work around legislation. Still, restoring an interest rate cap is the time-tested reform which has proven most effective in curbing abusive money lending.
Bush’s Legacy: Disrespect For International Law – President Has Misread History, Constitution – Monday, September 17, 2007 – Concord Monitor
Disrespect for international law has been a hallmark of the Bush presidency. Whether the issue has been the Iraq war, the treatment of detainees or global warming, the Bush administration has looked at international law as a nuisance and an unacceptable constraint on American power.
This view connects to a longstanding tradition of American skepticism about international law. Many on the political right see international law as a threat to American sovereignty. They see our treaty obligations as political and not legally binding.
In practice, the Bush administration has picked and chosen which international rules it has wanted to follow. Free trade agreements protecting foreign investments have been supported. It has junked much else while thumbing its nose at the international community.
For a nation of law and lawyers, this approach must be considered an unmitigated disaster. It misunderstands international law, its history and the increasing need for a working set of international legal rules in an era of globalization.
The United States must follow international law because of our own Constitution. The Constitution decrees that treaties are “the supreme Law of the Land” and that the president “shall take care that the Laws be faithfully executed.”
When the United States became a party to the United Nations charter, it explicitly agreed to abide by its provisions. This is also true of other legal obligations like the Convention Against Torture, the International Covenant on Civil and Political Rights, and the Geneva Conventions.
Since World War II, the United States has actually played a leading role in setting up an international rules-based system. First with the Atlantic Charter and then the UN, the United States committed to three broad principles. These were: a general obligation on states to refrain from use of force in international relations except in self-defense or where authorized by the UN Security Council, protection of human rights, and economic liberalization through the adoption of free trade rules.
Much of the impetus toward the development of international law came from the Nazi experience. The appalling crimes of the Nazi war criminals led to increasing awareness of the need to address future Holocaust-like crimes.
The result was the Nuremberg Military Tribunal. The United States, Britain, France, the Soviet Union and 17 other states agreed at Nuremberg to codify the rules of international law on war crimes and crimes against humanity. Nuremberg defined aggressive war in violation of international treaties as the supreme international crime.
These days when we consider the war in Iraq, the focus seems to be on issues like whether or not the so-called surge is working. I would suggest that a more fundamental place to begin is with the question of the war’s legality. That issue has been glossed over for years.
United Nations Charter law did not permit the president to launch the Iraq war unless there had first been an armed attack by Iraq against the United States or unless the UN Security Council authorized the use of force. Neither condition was met.
The United States had no legal authority to intervene in the affairs of the Iraqi people. Arguably this is quite different from our response to al-Qaeda, where both conditions were met before force was used.
The lessons of the Nuremberg tribunal have continuing relevance today. Heads of state must know they will be held accountable for war crimes and massive violations of fundamental human rights law.
Bush and Cheney deserve to be placed on trial for war crimes. This could happen after Bush leaves office. They unleashed a fantastically destructive war of aggression based on false reasons. Their war-making violated the UN Charter, not to mention the Torture Convention and the Geneva Conventions. Being ultimately responsible, they must be held accountable for torture and for extraordinary renditions where suspects are sent to other countries to be tortured.
For those who believe that prosecution of Bush or Cheney is ridiculous or impossible, I would offer the example of General Augusto Pinochet, the former dictator of Chile. When Pinochet was arrested in Britain almost a decade ago, he argued immunity as a former head of state for all crimes committed after the military coup he led in 1973. Pinochet lost that argument before the British House of Lords although he was later released. The Pinochet example should worry Bush and Cheney.
In articulating the need for international law, I would certainly acknowledge its failures and shortcomings. International law has not been able to stop many oppressive regimes from brutalizing and oppressing their populations. It has not prevented torture, slavery or starvation. Still it is better than the alternative – a world of might makes right where states and non-state actors have no obligations to meet minimum standards of acceptable behavior.
While the U.S. Supreme Court is sharply divided, it is worth noting that a number of justices recognize international law as a legitimate form of legal authority in their decisions. Certainly, the Court’s decision last year in Hamdan v. Rumsfeld relied on the Geneva Conventions.
It is hard to believe that future administrations, either Democratic or Republican, will not reconsider the Bush Administration’s antagonistic stance toward international law. An increasingly interconnected, globalized world will necessitate a more pragmatic approach.
The influence of the religious right has reached its highest point under the presidency of George W. Bush. Take your pick of indicators. Billions of dollars directed to faith-based organizations. Science rejected, whether it is global warming, embryonic stem cell research or evolution. An administration stuffed full of fundamentalist graduates of Regent University, the law school founded by Pat Robertson which was formerly known as Christian Broadcasting Network (CBN) University.
No less than President Bush himself claimed during one of his debates with John Kerry that the United States was founded as a Christian nation. This claim has been repeatedly made by the religious right.
The claim is contrary to the facts. The founding fathers were broad-minded, cosmopolitan intellectuals – not pious Bible-believing men. They were products of the European Enlightenment. They looked skeptically at religious zeal as a form of irrationality.
They certainly did not found America to be a Christian nation or any kind of theocracy. The 18th century they inhabited was a time when science and skepticism held sway over faith. The absence of any mention of God in our federal Constitution was no accident.
The founders were largely Deists. Among them: George Washington, Thomas Jefferson, Benjamin Franklin, James Madison and Tom Paine.
Deism is a little-known philosophical system that was popular around the time of the American Revolution. Deists believed in a supreme deity who created the universe. They did not believe that deity exerted influence in the world.
Deists rejected supernatural beliefs and thought the exercise of human reason was the means to solve society’s problems.
The record is clear that Deists rejected Christian doctrine like the virgin birth, the divinity and resurrection of Jesus and miracles in the Bible. They saw Jesus as a great philosopher rather than a divine figure.
In her book Moral Minority, Brooke Allen explores the religious beliefs of the major founding fathers. It will surprise a lot of people how skeptical of religion and how even anti-clerical the founders actually were.
George Washington almost never spoke about religion. In the 37 volumes of his collected papers, there is little mention of religion. He was very cagey, avoiding any declaration of his views on religion.
Allen concluded that religion played a small role in his life. He was nominally a member of the Episcopal Church but he could certainly not be classified as an avid churchgoer. Unlike his wife Martha, he did not take communion. During the last three years of his life, he attended church only three times. No minister was present at his deathbed.
Hide your Bibles
Thomas Jefferson referred to Christianity as “our particular superstition.” He actually wrote his own version of the gospel that became known as the Jefferson Bible. It eliminated all miracles attributed to Jesus, contained no resurrection and it ended with Jesus’s burial.
Jefferson once famously wrote, “In every country and in every age, the priest has been hostile to liberty. He is always in alliance with the despot, abetting his abuses in return for protection to his own.”
Jefferson was hated and feared by the religious right of his day. When he was elected president, New Englanders actually hid their Bibles. He harbored a deep-seated hatred of the clergy of every brand of religion.
At the same time, he believed passionately in universal freedom of religion. He advocated full religious liberty so that believers of all stripes and atheists would all have rights that would be respected.
James Madison was, along with Jefferson, the strongest supporter of the separation of church and state. Madison had a negative assessment of the Christian church’s relation to civil power. He wrote, “During almost 15 centuries has the legal establishment of Christianity been on trial. What have been its fruits? More or less in all places, pride and indolence in the clergy, ignorance and servility in the laity, in both, superstition, bigotry, and persecution.”
The original Constitution, before the Bill of Rights, contained only one mention of religion in Article VI. That mention was a radical novelty at the time. It was the injunction “no religious test shall ever be required as a qualification to any office or public trust under the United States.” That was Madison’s view. The intent was promotion of merit over adherence to an ideology or faith.
If the founding fathers had wanted to designate America a Christian nation, there can be little doubt they would have included in the Constitution references to God, Jesus, Christianity and the Bible. They might have made it a requirement that only Christians could hold public office. They knew what they were doing. They did none of those things.
The founders protected religious pluralism by keeping religion out of the legal framework of the United States. Looking at the violence done in the name of religion since that time, it is hard to deny the founders’ wisdom and vision on the separation of church and state.
Pass This Law To Stop Foreclosure Predators: People Facing A Loss Of Their Homes Need Protection – Saturday, May 19, 2007 – Concord Monitor
Imagine this scenario: You are a homeowner who has fallen behind on your mortgage payments. You receive notice that your home will be subject to a foreclosure sale. You lack the money to catch up. As the date of foreclosure approaches, your desperation deepens. There seems to be no way out. You pray for some financial savior to appear at the 11th hour.
The growing frequency of this problem has spawned a new form of consumer fraud: foreclosure rescue scams. The rescuer promises to save desperate homeowners. Instead of a rescue, the typical result is that the homeowners lose their home and are cheated out of all their equity.
Between the first quarter of 2006 and the first quarter of 2007, foreclosures statewide increased 95 percent, from 639 to 1,247. More alarming are the projections, with many experts forecasting that foreclosures will peak this summer and fall. New Hampshire Legal Assistance is seeing an increase in clients facing foreclosure and related scams. The scams come in several variants.
Probably the most common fraud is a bait and switch where the homeowner is not aware that the rescue actually means loss of ownership of the house. The homeowner thinks he or she is signing new loan documents to pay off missed payments on the mortgage. In fact, the documents give ownership to the rescuer.
The rescuer turns the homeowner into a tenant, sets oppressive rental terms and proceeds to evict the former owner. In the process, even though the total of the missed payments is relatively small, the homeowner-turned-tenant can lose hundreds of thousands of dollars of equity that has been built up over the years.
A variant of this scam is where the homeowner agrees to a sale/leaseback. The homeowner consents to transferring ownership in the belief that ownership will be regained at a later time. The homeowner ultimately finds out the rental terms are impossible to afford and ends up evicted.
A third variant is the foreclosure consultant who promises help in exchange for payment of exorbitant fees. A common result is that the consultant takes the money and runs. Or the consultant does some inconsequential and ineffective calling while promising help that never materializes.
Mortgage rescuers locate financially distressed homeowners through public foreclosure notices, which are widely accessible in newspapers or online. They make contact with seductive promises of saving the home.
Our state law is inadequate to address the new fraud schemes. Nor is any state agency equipped to handle the volume of likely claims. Because the forms of fraud are new, there is a need for new statutory remedies to protect the public against these predators.
The anti-regulatory tendency in New Hampshire is extremely problematic in this context. It encourages the con artists. The scams are multiplying at a time when foreclosures essentially doubled within a year. More foreclosures probably mean more scams.
Fortunately, a bill sponsored by Claremont Rep. Tom Donovan, a Democrat, takes up the matter of foreclosure rescues. While the bill was originally drafted to form a study commission, there was strong bipartisan support on the House Commerce Committee to do something substantive now. Many legislators wanted action.
As a result, an amended version of House Bill 365 has been drafted. It contains strong rules prohibiting unfair transactions and requiring disclosures to homeowners. The bill allows remedies under the state Consumer Protection Act to deter the rescuers. It does its best to anticipate the growing variety of scams.
It is depressing enough that so many homeowners face foreclosure. It is even worse that there are people low enough to fleece homeowners when they are on the verge of losing their homes.
Fraud is as American as cherry pie. Foreclosure rescue scams have to be among the most insidious, heartless forms of fraud. While consumer protection law has not been an important part of the political agenda of either political party for a long time, it is encouraging to see legislators from different sides come together to tackle such an obvious wrong.
Minimum Wage Increase Is Overdue: Legislature Has Shown Disregard For Workers – Saturday, February 24, 2007 – Concord Monitor
The Legislature is considering raising the state’s minimum wage. This will be the fifth state effort to raise the minimum wage above $5.15 an hour. It has been 10 years since Congress last increased it.
New Hampshire remains the only state in New England that has failed to raise the minimum wage in recent years. Vermont is at $7.25 an hour, and its minimum wage is automatically adjusted upward for inflation. Maine is at $6.75 and will increase to $7 on Oct. 1. Massachusetts is at $7.50 and will go to $8 in 2008.
So why hasn’t New Hampshire raised its minimum wage? A callous disregard for the needs of low-wage workers.
Even when it became clear that surrounding states did not suffer the adverse consequences predicted by opponents, no raise was forthcoming. For policymakers, these workers have not counted. They have no political organization or clout. There was no apparent fallout from ignoring their needs.
It is worth considering why such an uncaring attitude toward poor working-class people has predominated for such a long time. And not just in New Hampshire. Neglect has followed from non-recognition. Being written off has been a species of out of sight, out of mind.
The fact that the minimum wage has returned as a live issue is significant beyond the number of workers directly affected by an increase. Beginning to take the needs of working people seriously is long overdue for both political parties. It remains to be seen whether either party will go beyond lip service.
We have lived through a period where the fantasy of the free market has reigned. The market will allegedly take care of all issues without any need for government regulation. The only problem is that 10 years have passed and the market did nothing for minimum-wage workers. In this instance, the market failed the public interest.
While the minimum wage stagnated, the pay of CEOs rose spectacularly. In 1978, the average CEO earned 78 times as much as a full-time year-round worker earning the minimum wage. By 2005, that same average CEO earned 821 times as much as a minimum wage worker.
The Economic Policy Institute, a Washington, D.C., think tank, has reported that an average CEO now earns more before lunchtime on the very first day of work in the year than a minimum wage worker earns all year.
Such extreme wage inequality has not always been the American way. An old tradition of the labor movement called for “a fair day’s work for a fair day’s pay.” More generally, it is an American value that not only should jobs be out there for people who want to work, but the jobs should be able to provide for families.
Rather than waiting and hoping Congress will act – which it might – our state legislators are moving forward. The good news is that Rep. Marjorie Smith, chairwoman of the House Finance Committee, is sponsoring House Bill 514, which would raise the minimum wage to $7.25 an hour by Sept. 1, 2008.
The fiscal note attached to the bill states that approximately 30,000 workers in New Hampshire would benefit if the bill passed, but I believe a far broader group of workers would benefit.
Being poor in New Hampshire is about not having money to pay for life’s necessities. New Hampshire is an expensive place to live. The cost of rent and utilities increased by 35 percent during the last six years. Child-care costs increased 88 percent over the last five years. Health insurance costs increased 103 percent over the last seven years.
A minimum-wage increase translates into more money to pay for necessary items like food or rent and less need for assistance from safety net programs. Only government action will protect these workers. The time to act is now.
Bush’s Torture Policies: Un-American – This Is Just What The Founders Warned About – Saturday, July 8, 2006 – Concord Monitor
It has been more than two years since the photographs from Abu Ghraib prison surfaced and the subject of torture re-emerged in our public life. Despite continuing human rights reports about detainee abuse in Iraq, Afghanistan and Guantanamo, public response has been muted and largely indifferent.
The lack of uproar resulted in no independent investigation by Congress. No ranking government official has been held accountable for any torture allegations. Instead, legions of government lawyers have been deployed to explain how abuse did not fit the definition of torture. Their mission has been to extend the legal limits of permissible physical and psychological pain.
The standard explanation from the Bush administration has been the few bad apples theory. A handful of inadequately trained and supervised soldiers went out of control. Higher-ups had nothing to do with it. They have also been arguing that the president has the constitutional power to permit torture.
My perception, and probably the reality, is that most Americans have wanted a president who would torture suspected terrorists. Interestingly, in the 2004 election, John Kerry did not raise Abu Ghraib. Torture policy did not become an issue in the campaign.
To the extent there has been a public debate, it has been obscure. With all due respect to international law, partisan disagreements in legalese about whether international conventions have been violated almost guarantee a limited audience of lawyers and legal observers.
There is a more basic reason people of all political stripes should oppose torture. The use of torture is completely contrary to traditional American values. The drafters of the Bill of Rights were concerned that government might be tempted to cruelty and created a constitutional amendment to safeguard against it.
The Eighth Amendment to the Constitution contains a prohibition against the infliction of cruel and unusual punishment. It protects against and bars the type of interrogation torture conducted at Abu Ghraib and Guantanamo.
Bush administration torture policies are examples of what the founding fathers warned against. They opposed unusual cruelty in the method of punishment as well as disproportionate or excessive punishment. As much as the administration wraps itself in the flag, its torture policies are un-American.
The phrase “cruel and unusual punishment” goes back to the English Bill of Rights of 1689. It shows up in our New Hampshire constitution and most of the other original state constitutions.
At the time of the drafting of the federal Constitution, Patrick Henry and George Mason insisted on the Bill of Rights, including the Eighth Amendment. Henry forcefully spoke out against the use of torture to elicit confessions.
The Eighth Amendment reflected Enlightenment thinking. It introduced broad and idealistic concepts of dignity, civilized standards, humanity and decency. During medieval times and up to the 18th century, torture had been widely used in civil and ecclesiastical courts as a way to obtain confessions. The evaluation of evidence on its own merits rather than forced confession was a legal vision of the founding fathers.
The U.S. Supreme Court has not allowed a frozen or static view of the Eighth Amendment. While all amendments are open to interpretation, the court has considered “evolving standards of decency” in its Eighth Amendment jurisprudence. Unfortunately, one disturbing note must be mentioned. In considering Eighth Amendment violations, the court seems to be paying more attention to the motive or intention of perpetrators than to the cruelty or inhumanity of the treatment itself.
The American way
Because much information has been released to the public, we know quite a bit about American torture practices. Our current use of torture has its roots in the Cold War. Over the last 50 years, the CIA comprehensively researched and studied the most effective means of torture.
It turns out that psychological torture is the best way to exploit vulnerabilities and break people. The history of American torture research is elegantly laid out in Alfred McCoy’s book, A Question of Torture. McCoy explains that from 1950 to 1962 the CIA ran a massive research project spending over $1 billion a year to perfect torture methods. The result was a breakthrough how-to book, the Kubark Counterintelligence Manual, published in 1963.
There is a direct line between the Kubark manual and the torture policies at Abu Ghraib prison and Guantanamo. The American torture paradigm focuses on sensory deprivation and self-inflicted pain. McCoy shows how the use of isolation, standing, exposure to hot and cold, light and dark, noise and silence, sleep deprivation, hooding and stress positions creates a total assault on all senses and sensibilities.
Abu Ghraib was the result of systemic policies, not a few bad apples. The famous hooded figure with arms outstretched and electrodes attached reflected policy – not its abuse.
Last year, when Sen. John McCain authored a bill to bar all inhumane and cruel treatment of detainees, the White House fought the bill all the way. When the bill passed anyway, the president issued a signing statement reserving the right to disregard its content.
In recent weeks, Bush’s civilian lawyers omitted from new detainee policies a ban on cruel and degrading treatment.
When all is said and done, this administration will have earned a place in history as the torture administration. From the top down, it has publicly mouthed opposition to torture while always reserving the right to conduct it.
Torture is not now and never has been an American value. The true American tradition is defense of human dignity.
A little-explored cost of the Iraq war is the extent of psychiatric injury to our troops. While the Vietnam War led to more awareness about the mental-health consequences of combat and a recognition of post-traumatic stress disorder, the likely psychiatric cost of our involvement in Iraq was never considered in the run-up to the war.
Probably this is like other wars. Architects of war have a way of rationalizing and minimizing the staggering level of violence and death wars unleash. If thousands of deaths and grievous bodily injuries are considered an acceptable sacrifice, unseen and private psychiatric injuries are not even part of the calculation.
Three years after the start of the Iraq war, all indications point to a heavy psychiatric toll on our troops. Dr. Jonathan Shay, a psychiatrist at the Boston VA and an expert on post-traumatic stress disorder, estimates that about one-third of our combat veterans will have significant psychiatric injury after the war. That is many thousands of suffering veterans.
The nature of the Iraq war seems particularly problematic. Americans are strangers in a strange land, neither speaking the native language nor comprehending an alien culture. Because it is a guerrilla conflict, random and unexpected death often lurks nearby. Soldiers are subject to the stresses of not knowing when danger will emerge and not even knowing who the enemy is. They must be mobilized to respond immediately if anything is out of place in their environment. Life may well depend on an instant response.
These stresses return home with the troops. Post-traumatic stress disorder is the persistence on the home front of behavior that was truly needed to cope with war. It is a misperception of a present situation as a past one.
In my legal career, I have represented many veterans seeking Social Security benefits for post-traumatic stress disorder. I’d like to tell you about one of them, whom I’ll call Jim. Although his problem was the result of an earlier conflict, I believe Jim is typical.
Jim lives in Claremont. He is a veteran of both the Vietnam era and the first Gulf War. He first started having symptoms of post-traumatic stress in the mid-1990s. He had flashbacks of buildings blowing up and dead bodies on the side of the road. He reported significant irritability and episodes of rage.
Jim experienced memory loss, mood swings and hand tremors. He lost track of time and could not remember things he did. He had great difficulty sleeping. There were instances when he struck his wife in his sleep. Jim’s wife said he had tried to strangle her once when he was asleep. Compounding these issues, Jim developed a serious alcohol problem.
After his return from the Gulf, Jim started many jobs but could hold none of them for long. For years he had minimal earnings. Because Jim had a back injury from his time in the Gulf, he received some compensation from the VA. However, his claim for disability based on post-traumatic stress disorder was denied. The VA ruled he had insufficient proof.
Jim did get important counseling and medicine from the VA. Still, he had trouble restraining his anger.
No time limit
At least Jim sought help – and received some. A Defense Department study of returning Iraq combat troops found that many soldiers who need counseling the most are least likely to seek it due to the stigma of mental health care in the military. It is common for soldiers to worry that disclosure of symptoms will adversely affect their careers.
Unfortunately, there appears to be no time limit on when post-traumatic stress symptoms surface. It may be weeks, months or years after combat. The military has been willfully blind in lengthening deployments and in allowing multiple combat tours.
There are no adequate studies on the relationship between multiple tours of duty and post-traumatic stress disorder. It seems likely, however, that the greater the exposure to combat, the higher the risk of mental-health injury.
Because of the overriding need to retain soldiers, the military has focused on patching them up and returning them to Iraq to finish their deployments. Is it wise to return veterans with post-traumatic stress disorder to combat? Is it ethically proper for mental-health workers to place the manpower needs of the military ahead of the mental-health needs of their patients? How much trauma is too much before a soldier is sent home?
Maybe someday war planners will consider likely psychiatric harm (along with death and physical injury) before they send our troops on a mission. Maybe they will actually keep track of all the mentally wounded and figure a way to quantify the pain and suffering. But I would not bet on it.