The War on Poverty and the War on Drugs



As an apparently war-minded people, Americans (or at least our political leaders) have been comfortable framing parts of the domestic policy agenda as wars for decades. Two of the most prominent have been the War on Poverty and the War on Drugs.

Despite the similarity in their names, there is an important difference between the two. The War on Poverty is not a real war. The War on Drugs is.

READ MORE

Muckraker or Special Pleader?



In “A Brief History of Media Muckraking”, the Wall Street Journal’s Amanda Foreman traces the contributions of “reform-minded journalists from Ida Tarbell to [Bob] Woodward” and a few others who spilled newspaper ink writing about abuses of power by the private and public sectors.

Obviously a fan of the progress made during the Progressive Era (“the golden age for crusading journalism”), Ms. Foreman, like virtually everyone who shares that political view, gets some key facts wrong and misses the big picture when it comes to the origins of the reformist spirit.

Foreman credits Ida Tarbell’s History of the Standard Oil Company (1904) and her earlier series of articles published in McClure’s magazine with helping push the federal government into initiating antitrust action against the Standard Oil “trust”, which ultimately led the U.S. Supreme Court to order the company’s dissolution in 1911. Here, Foreman mistakenly says that the dissolution order was issued under “the 1911 Sherman Anti-Trust Act” (the Sherman Act was passed and signed into law in 1890).

More seriously, Ms. Foreman does not mention that Ida Tarbell was far from a disinterested observer of John D. Rockefeller, Sr.’s allegedly anticompetitive business practices. Ida’s brother William was treasurer of the Pure Oil Company, a major rival of Standard Oil; he supplied possibly biased information to his sister and helped vet her articles for McClure’s. Ida also nursed a longstanding grudge against the company, blaming Rockefeller for ruining her father’s business as a maker of the wooden barrels used early on to transport crude oil from the field to refineries. Replacing those barrels with railroad tanker cars and underground pipelines was one of Rockefeller’s many cost-cutting innovations; it drove down the price of kerosene to final customers and ended the then-looming shortage of whale oil, but it also made wooden barrels obsolete.

Information about those and other personal axes Tarbell had to grind is readily accessible in Ron Chernow’s Titan, his monumental biography of John D. Rockefeller, Sr.—a volume I have relied on heavily in my own work, in collaboration with Michael Reksulak and others, on the origins and effects of the government’s case against Standard Oil (our most recent contribution to that literature is “Tarring the Trust”).

One interesting, still unexplained consequence of Tarbell’s and the Justice Department’s antitrust attack on Standard Oil is that Rockefeller’s wealth tripled (to almost $1 billion) soon after the company was broken up. I hesitate to call this “crony capitalism”. It nevertheless is another example of how progressive ideas backfire, achieving results that were perhaps “unintended”; then again, as George Stigler taught us long ago, the actual effects of the dissolution may well have been the intended effects.

Sweatshops: Misunderstood Paths Out of Poverty



The collapse of Bangladesh’s Rana Plaza garment factory complex last year killed more than 1,100 workers and reignited an international movement calling for the regulation of so-called sweatshops in the developing world. Unfortunately, the activists often try to promote better working conditions the wrong way because they overlook the harm that boycotts and costly regulations impose on factory workers. They also fail to recognize the positive role that low-wage factory jobs played in the West’s rise from poverty.

“Poor countries today would be better served if anti-sweatshop scholars and activists had a better understanding of how the historical process played out in wealthy countries,” Independent Institute Senior Fellow Benjamin Powell writes in the Summer 2014 issue of The Independent Review.

Before workplace safety regulations were enacted, textile and apparel factories with poor working conditions were economic springboards to prosperity in what is now the developed world, Powell explains. Sweatshops contributed to economic development for about 100 years in the United States (and 30 to 60 years longer in Great Britain), but they eventually closed down largely because the progress they helped foster made them obsolete: by contributing to capital accumulation in the West, the sweatshops helped shift the demand for labor toward higher-productivity jobs. In addition, the rising prosperity meant that fewer and fewer workers were willing to take lower-wage jobs with less-desirable workplace conditions.

Other countries, particularly in East Asia, followed the path out of poverty pioneered by the West—a trail paved with low-wage factory jobs, property-rights enforcement, a market price system, and economic freedom. One difference, however, is that they often attained in only two generations the same general living standards that it took the United States and Great Britain several generations to reach. Sadly, activists who fail to heed this history lesson inadvertently act to hold down workers in the developing world struggling to make ends meet.

* * *

Meet the Old Sweatshops: Same as the New, by Benjamin Powell (The Independent Review, Summer 2014)

Out of Poverty: Sweatshops in the Global Economy, by Benjamin Powell

Making Poor Nations Rich: Entrepreneurship and the Process of Economic Development, edited by Benjamin W. Powell

The Independent Review: Please be sure to take advantage of our special offer of your choice of a FREE book when you renew or order a new subscription online.

[This post first appeared in the July 29, 2014, issue of The Lighthouse. For a free subscription to this weekly newsletter of current affairs, public-policy analysis, and event announcements, enter your email address here.]

Ban Government—Not Sweets—in Schools to Combat Bureaucratic Obesity



In recent weeks states have been grappling with a host of unintended consequences stemming from new USDA regulations affecting food and beverages available in schools. Chocolate milk was a near casualty in Connecticut. Earlier this month one Washington state school district threw in the towel and banned birthday cupcakes in classrooms. Instead of baked treats, students can share gifts of pencils with their classmates, according to school officials.

Just weeks after the new food rules went into effect on July 1, schools in 12 states are working their way around them. As the National Journal reports:

Twelve states have established their own policies to circumvent regulations in the Healthy, Hunger-Free Kids Act of 2010 that apply to “competitive snacks,” or any foods and beverages sold to students on school grounds that are not part of the Agriculture Department’s school meal programs, according to the National Association of State Boards of Education. Competitive snacks appear in vending machines, school stores, and other food and beverage sales, including items sold at bake sales.

Georgia is the latest state to announce an exemption to the federal regulations, which became effective July 1 for thousands of public schools across the country. Its rule would allow 30 food-related fundraising days per school year that wouldn’t meet the new healthy nutritional standards. ...

Tennessee also plans to allow 30 food-fundraising days that don’t comply with federal standards per school year. Idaho will allow 10, while Illinois is slowly weaning schools off their bake sales, hoping to shrink them from an annual 36 days to nine days in the next three years. Florida and Alabama are considering creating their own exemption policies.

Under the new regulations, there are some exemptions for school fundraisers (p. 7), including allowing state education agencies to define what constitutes “a limited number” of school fundraisers (p. 39).

However, it’s worth considering why the USDA has any authority over foods offered outside of its school lunch and breakfast programs (p. 8), and why it has the power to ban fundraiser foods that compete with its meals from being sold during breakfast or lunch time (p. 41). As the school year approaches, expect more news reports about absurd policies resulting from this latest government intrusion into schools.

Maintaining a healthy weight is a goal we can all share, but burying schools, students, and parents in tons of red tape is no way to combat obesity. Perhaps the best way to shed some pounds at school is to shrink the federal government’s involvement back down to its constitutional size.

Are Lawsuits Ending or Mending Teacher Tenure?



Last month Los Angeles Superior Court Judge Rolf M. Treu handed down a landmark decision in Vergara v. California. A group of student plaintiffs supported by a Silicon Valley entrepreneur argued that state tenure laws violated the State Constitution, kept bad teachers on the job, and deprived students of a quality education.

A similar lawsuit is making its way through the State Supreme Court in New York, and others are emerging in state courts across the country, according to the New York Times:

Challenges to teacher tenure laws are moving to the courts since efforts in state legislatures have repeatedly been turned back. Critics of the existing rules say tenure essentially guarantees teachers a job for life. According to the New York suit, only 12 teachers in New York City were fired for poor performance from 1997 to 2007 because of a legally guaranteed hearing process that frequently consumes years and hundreds of thousands of dollars in legal fees. ...

In New York, teachers can earn tenure after a three-year probationary period, which city school officials can extend for another year, and often do. That represents one big difference with California, where teachers can win tenure after 18 months, and even before being certified.

Larry Sand, a retired teacher and president of the California Teachers Empowerment Network, explains that even if an anticipated Vergara appeal by the California Teachers Association fails, a new law will have to replace the stricken one. One may already be in the works, based on a pending Los Angeles legal settlement, Reed v. State of California. Seniority-based teacher layoffs, also referred to as last-in, first-out or LIFO, disproportionately affected teachers in 45 of LA’s poorest schools, since the newest teachers are often assigned to schools where more experienced teachers don’t want to work (a longstanding teacher union practice).

After years of wrangling between the local United Teachers of Los Angeles union and the ACLU, which sued on behalf of students, the two sides reached a settlement that awards about $25 million annually to affected schools for three years to pay for more administrators, teacher training, and mentor teachers; however, the LIFO issue was never addressed. Reps from both sides applauded the decision, but as Sand notes in his latest City Journal article:

...the agreement never mentions the words “seniority” or “last in/first out.”

What boosters of the Reed settlement can’t explain is how adding administrators to underperforming schools would help retain good teachers. In L.A. Unified, administrators are “at-will” employees, but they’re treated like unionized teachers, and they’re almost never fired for incompetence. ...

“What these 37 schools need urgently is stability in teacher staffs, and this settlement is tailored to achieve that result,” Mark Rosenbaum, the ACLU’s chief counsel in Southern California, told me in an e-mail. “And should budget-based layoffs have to take place in the future, then it will be a no-brainer under current law that the teachers who have been specially trained and taught on these campuses will keep their jobs, no matter their years of seniority.” Rosenbaum is alluding to a part of the education code stipulating that, if a teacher has “special training and experience,” seniority can be waived. This exemption prioritizes teacher credentials, elevating an “input” (training) over an “output” (effectiveness in the classroom). Should layoffs be necessary, schools need to hold on to their best teachers, regardless of whether they have “special training.”

Sand speculates that given the prevailing political climate, a new California seniority law will likely be “LIFO lite,” requiring additional teacher training as an alternative to dismissal. He’s right that hiring and firing of teachers should be based on outputs such as teachers’ demonstrated contribution to improved student learning, not more inputs such as time served, training, or additional credentials—none of which has a demonstrated positive impact on student achievement. Sand predicts:

Most likely, the legislature would craft a one-size-fits-all state law making small changes to the current system, satisfying the minimum requirements of Vergara and leaving the problem of seniority largely unsolved. Until California has a system that evaluates teachers on how well their students learn, the state’s public education will suffer.

And, until California parents start demanding the right to choose their children’s education providers no matter where they happen to live, don’t expect Sacramento politicians to enact any teacher quality requirements that would benefit students instead of teacher union members.

 

Obamacare Architect Warned That Tax Credits Would Be Available Only in States with Exchanges



Halbig v. Burwell is the famous lawsuit claiming that Obamacare’s federal health-insurance exchanges cannot pay tax credits to health insurers. The plain language of the law is that only state-based Obamacare health-insurance exchanges can channel these tax credits. The real champions of this argument are Michael Cannon of the Cato Institute and Jonathan Adler of Case Western Reserve University School of Law, who recently encapsulated their argument in the Wall Street Journal.

The question is still unsettled. Last week, two federal appeals court panels came to different conclusions: the DC Circuit agreed that the subsidies could go only to insurers in state exchanges, while the 4th Circuit ruled that they could go through federal exchanges too.

The Obama administration is horrified that the Supreme Court could decide that it is illegal to subsidize insurers in federal exchanges. Most states have declined to set up their own exchanges. Further, some of those that did are closing up shop.

So, imagine the surprise when a researcher at the Competitive Enterprise Institute dug up a 2012 video of Jonathan Gruber, who earned about $400,000 from taxpayers as the “architect” of Obamacare, stating the obvious:

“What’s important to remember politically about this is if you’re a state and you don’t set up an exchange, that means your citizens don’t get their tax credits...”

Of course, Mr. Gruber is now trying to wriggle out of his previous comments. Read the whole story at the CEI blog.

* * *

For the pivotal alternative to Obamacare, please see the Independent Institute’s widely acclaimed book: Priceless: Curing the Healthcare Crisis, by John C. Goodman.

 

A Hundred Years of War



One hundred years ago today, Austria-Hungary fired the first shots of World War I, opening its conflict with Serbia. Gavrilo Princip, a Bosnian Serb, had assassinated Archduke Franz Ferdinand. Mutual defense agreements ensured that the political clash did not remain regional. Austria-Hungary got support from Germany, the Ottoman Empire, and Bulgaria. Serbia found allies in Britain, France, Belgium, Greece, Romania, Italy, Russia, Portugal, Montenegro, Japan, Brazil, and the United States. The global war likely qualified as the worst bloodbath the world had yet seen, certainly in so short a span. Fifteen million or more died in less than five years. Tens of millions were wounded, displaced, or orphaned. Disease spread. The international trade and exchange that existed before the war never fully recovered.

World War I was a low point for liberties within the United States once America finally entered the fight. People went to prison for criticizing the military or opposing the draft. Surveillance of the citizenry and crackdowns on dissent became normal. Domestic regulation and taxation skyrocketed. Almost everything that happened during the New Deal had some precursor in Wilson’s wartime domestic governance. In the early 20th century there was hope that the United States would become freer and freer; World War I altered that picture dramatically. Almost everything the federal government has done in the last century has roots in the 1910s.

For the West and the rest of the world, World War I was in a sense the worst tragedy of all time. World War II was bigger and badder, but its causes trace directly to the First World War and its aftermath. During the war, both the Allies and the Central Powers committed horrific atrocities. The Germans conducted ghastly submarine warfare. The British blockade starved German civilians. Chemical weapons and trench fighting took the lives of millions in totally pointless battles over useless stretches of territory.

When the United States finally mobilized fully for war, the effort was sold as a way to save democracy and end war for all time. Out of the ashes of World War I emerged not peace and liberty, but reaction and totalitarianism. Communism took root in Eastern Europe. Fascism captured Italy and Germany. Militarists took over Japan. Managerial statism dominated in the West. The conditions for mass tyranny and another world war, even more atrocious and larger than its predecessor, were now in place. The instability of the Middle East also traces back to the work of Westerners in marking territorial boundaries according to their own priorities and bad assumptions.

The worst regimes and cataclysms of the first half of the twentieth century had roots in the international war that began a hundred years ago today. It was the beginning of three decades of unspeakable suffering, what some scholars collectively call the hemoclysm: World War I, the Soviet atrocities, and World War II with its atrocities from the Holocaust to the atomic bombings. These terrors of course gave way to the Cold War, the fears of MAD, and the mutually reinforcing cycle of violence between Islamic fundamentalism and Western imperialism.

The next time a war is recommended as a way to secure peace and freedom, I suggest we all say, no thank you. We’re still suffering from the first one.

More States Abandoning the Sinking Common Core Ship



“Barbarians at the gate.” That’s what Arizona Superintendent of Public Instruction John Huppenthal called opponents of Common Core national standards several weeks ago. His remarks are symptomatic of just how far elected officials within and outside Arizona have strayed from our Constitution, which doesn’t even contain the word “education.”

Supporters claim Common Core will provide a consistent, clear understanding of what students should know to be prepared for college and their future careers. On the contrary, many experts serving on Common Core review committees warn that academic rigor was compromised for the sake of political buy-in from the various political interest groups involved—including teachers unions.

Unsurprisingly, the curriculum is being used to advance a partisan political agenda, showcasing one-sided labor union, ObamaCare, and global warming materials, along with more graphic, adult-themed books under the auspices of promoting diversity and toleration. But the politicization doesn’t stop there.

Non-academic, personal information is being collected through federally funded Common Core testing consortia about students and their parents, including family income, parents’ political affiliations, their religion, and students’ disciplinary records—all without parental consent. That information, including Social Security numbers of students in at least one state, is being shared with third-party data collection firms, prompting a growing number of parents to opt their children out of Common Core.

But they’re not alone.

Originally, 45 states signed on to Common Core, but so far four states have formally pulled out. Indiana recently became the first one to reverse course and implement state standards instead. This decision earned a threatening letter from the U.S. Department of Education about withholding funds and revoking Indiana’s waiver from onerous federal No Child Left Behind Act mandates.

South Carolina, Missouri, and Oklahoma have also ditched Common Core standards. In fact, Oklahoma’s legislation is considered the strictest to date for expressly reinstating previous standards for a two-year review period and prohibiting any alignment between assessments and Common Core. Seven additional states have pulled out of their federally subsidized testing consortia, and four more are considering doing the same—although one testing consortium, Partnership for Assessment of Readiness for College and Careers (PARCC), still lists several withdrawn states as members.

Common Core is publicized as a state-led, voluntary initiative, but in reality it’s an offer states can’t refuse if they want their share of billions of federal dollars for education programs.

So much for Common Core being “voluntary” or “state-led.” So much, too, for the notion that federal education aid, which historically has averaged around just 10 percent of all education funding, is “free.”

It’s a sad state of affairs when Americans striving to rid their children’s schools of educational barbarism are vilified for wanting to end federal intrusion in education. Elected state officials like Superintendent Huppenthal should recall that for decades the feds have been effectively bribing them with additional cash (which actually comes from their own constituents’ pockets) and far-fetched promises, including these whoppers:

By 1984 illiteracy would be eliminated (p. 35). That didn’t work.

By 2000 high school graduation rates would reach 90 percent. Nope. Wrong again.

Also by 2000, American students were supposed to be global leaders in math and science. Well, not so much, based on recent results.

Finally, by 2014 all students would be proficient in reading and math. Not even close.

Over-promising and under-delivering seems to be the legacy of the federal government’s “leadership” in education. With virtually no exceptions, major programs of the Elementary and Secondary Education Act of 1965 (ESEA), currently dubbed No Child Left Behind (NCLB), have not worked after decades of tinkering.

One Senator from Arizona certainly saw this coming. Nearly 60 years ago U.S. Sen. Barry Goldwater opposed the National Defense Education Act of 1958, which included 12 federal mandates on the states—a regulatory pittance by 21st century standards. He rightly predicted that “federal aid to education invariably means federal control of education” (p. 76, emphasis original).

Children need to learn the basics, but there are better ways to accomplish that goal than embracing a national curriculum developed by politicians, special interest groups, and private companies that have a lot of financial skin in the game.

Parental choice programs educate students to high standards without limiting the diverse schooling options needed to meet their unique, individual needs. Importantly, unlike accountability initiatives involving rigid federal mandates, all parental choice schools face immediate rewards for success or consequences for failure, since parents are empowered to enroll their children in, or transfer them out of, schools as they see fit.

Ultimately, Common Core rests on the faulty premise that a single, centralized entity knows what’s best for all 55 million students nationwide. Raising the education bar starts with putting the real experts in charge: students’ parents.

A Hell of a Pinpoint Operation



Secretary of State John Kerry was right to call Israel’s Operation Protective Edge against Hamas “a hell of a pinpoint operation” in an apparently private comment that had the hallmark of a diplomatic move aimed at putting pressure on Tel Aviv. Except that he was referring ironically to the military aspect of the operation, and it is the political aspect that truly expresses the “pinpoint” nature of what Israel is doing—without the irony.

Israeli Prime Minister Bibi Netanyahu’s precise target is the alliance between Fatah—led by Mahmoud Abbas, the president of the Palestinian Authority—and Hamas, formed in April after seven years of conflict in the occupied territories. His strategy has always been to make unviable any Palestinian entity (let alone the possibility of sharing the same land with the Arabs under equality before the law). His tactics are at the service of that strategy. All he needs is to gain time until the “fait accompli” makes things irreversible. Operation Protective Edge serves that purpose.

Netanyahu knows three things work in his favor. The nature of Hamas, an organization that has engaged in terrorism, makes the atrocities arising from the land, air, and naval attacks easy to justify with the argument that the Palestinians use civilians as shields and that leaving their capability intact will expose Israelis to rockets. The tragic Jewish history confers impunity on Tel Aviv’s authorities: criticizing Israel can easily be construed as anti-Semitism. Finally, no U.S. administration can afford, domestically, to really distance itself from Tel Aviv.

Let’s remember how we got to Operation Protective Edge. In July 2013 the Obama administration launched a Middle East initiative and set a nine-month deadline for Israel and the Palestinians to reach an agreement. But the Israelis continued to expand the settlements (thousands of permits for new units were issued). When the deadline was near, Netanyahu reneged on his commitment to free hundreds of prisoners. He got the response he wanted from Abbas, who gave up and engaged in unilateral initiatives aimed at conveying the impression that the Palestinian Authority, which gained observer status at the UN in 2012, is a state in process. It was only a matter of time before an incident would trigger violence in Gaza, which houses not only Hamas but also a wing of the ruthless Islamic Jihad.

READ MORE

How to Pay for the Next Sovaldi?



Imagine a pill that could cure cancer with one course of therapy or reverse an inherited, deadly disease. If it cost $1 million, could you access it?

This was the question asked at a recent panel discussion held by the American Enterprise Institute. The panel discussed a couple of new proposals for financing medicines that come at a high price. Because these medicines address the needs of only a small number of patients, manufacturers contend that prices need to be high to make the investment worthwhile.

One proposal was put forward by Scott Gottlieb, MD (of the American Enterprise Institute), and Tanisha Carino (of Avalere Health). They propose redefining spending on specialty drugs as capital investment rather than consumption spending. In the case of Sovaldi, for example, the drug’s expensive upfront cost is more than paid for by the dramatic reduction in costs over the next twenty or thirty years for a patient who might otherwise require a liver transplant.

Gottlieb and Carino’s paper is not technical, and one way to envision the outcome would be a mechanism whereby the patient or insurer would pay the (estimated) $84,000 cost of Sovaldi over twenty years in smaller pieces, rather than all in three months. (Gottlieb and Carino do not actually give an illustration, but I believe my example is an accurate representation of a potential version of what they describe.)
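To make that arithmetic concrete, here is a back-of-the-envelope sketch of such a mechanism using the standard annuity formula. The $84,000 cost and twenty-year horizon come from the discussion above; everything else, including the 3 percent rate, is an illustrative assumption of mine, not anything drawn from the Gottlieb-Carino paper.

```python
# Illustrative sketch only: amortizing a specialty drug's upfront cost the way
# one amortizes a mortgage. The $84,000 figure and 20-year horizon come from
# the post above; the 3% annual rate is a hypothetical assumption.

def annual_payment(principal: float, rate: float, years: int) -> float:
    """Level annual payment that repays `principal` over `years` years
    at annual interest rate `rate` (the standard annuity formula)."""
    if rate == 0:
        return principal / years
    return principal * rate / (1 - (1 + rate) ** -years)

cost = 84_000   # estimated cost of a course of Sovaldi
rate = 0.03     # assumed annual rate, chosen purely for illustration
years = 20      # payment horizon suggested above

print(f"${annual_payment(cost, rate, years):,.0f} per year for {years} years")
# -> roughly $5,600 per year, versus $84,000 due within about three months
```

On these assumptions the cure prices out at a few thousand dollars a year, which is the intuition behind treating the purchase as a capital investment rather than current consumption.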

Another proposal was put forward by Professor Tomas J. Philipson and Andrew C. von Eschenbach, both of Precision Health Economics, LLC. Philipson and von Eschenbach are interested in using credit markets to reduce the immediate cost of paying for drugs. Their model suggests that the government should incur a significant fraction of this debt, given that these specialty drugs will benefit future patients (who should therefore bear a share of the burden through an increase in public debt).

READ MORE