By Randall Holcombe
Monday January 9, 2017 4:45 PM PST
The tragic shooting at the Fort Lauderdale airport on January 6 occurred in a “gun-free” zone. Florida is one of six states that make it illegal for individuals—even those who have concealed carry permits—to carry guns in any part of an airport terminal.
The killer’s motive is at this point undetermined, but he did fly to the Fort Lauderdale airport with his gun legally checked in his luggage, and after arriving took it out to shoot people in the terminal. Some people speculate that shooters deliberately choose “gun-free” zones for their attacks to minimize the probability that their attacks will be interrupted by legally armed citizens. Would this shooting have happened had the baggage claim area in the Fort Lauderdale airport not been a “gun-free” zone?
This is a policy-relevant question because prior to the shooting, Florida State Senator Greg Steube introduced SB 140, legislation that would allow concealed-weapon permit holders to carry firearms in airport terminals, on college campuses, and in other places the state now declares “gun-free.”
As Second Amendment advocates often say, declaring a place to be “gun-free” only keeps law-abiding people from carrying guns there. Someone who wants to engage in a mass shooting surely will not be deterred by a location being designated “gun-free,” and might be encouraged because it lessens the probability of armed opposition.
By Alvaro Vargas Llosa
Monday January 9, 2017 11:30 AM PST
Vladimir Putin’s intervention in the U.S. election is and will continue to be a matter of controversy because we don’t know all the facts and therefore the full extent of what his government did (nor do we know the extent to which the U.S. intelligence community’s reports and leaks are devoid of political intention). But we do have the full facts about Putin’s evildoing in Syria, and we should not forget them.
To put it very simply, Moscow is the reason why Bashar al-Assad, that blood-thirsty tyrant, has pretty much won Syria’s internal war. Two foreign interventions have been decisive in turning what was an unsustainable situation for Assad’s regime into its current condition, which, despite the ongoing fighting, virtually guarantees that he will remain in power in the foreseeable future.
One is Iran or, to be more precise, the myriad Shiite militias that are in Iran’s orbit, including Hezbollah, the Lebanese organization; Badr, the military wing of an Iraqi political party; and the Fatimid Brigade, an Afghan group. Tehran’s Revolutionary Guard helped Assad set up a structure of Syrian militias that runs parallel to the regular army and was decisive in Syria’s ability to take back significant swaths of land and key cities such as Aleppo.
By John R. Graham
Friday January 6, 2017 10:25 AM PST
The Centers for Disease Control and Prevention (CDC), a federal agency, has reported the remarkable news that U.S. life expectancy has dropped for the first time since 1993. According to Mortality in the United States, 2015 (NCHS Data Brief No. 267, December 2016):
- Life expectancy for the U.S. population in 2015 was 78.8 years, a decrease of 0.1 year from 2014.
- The age-adjusted death rate increased 1.2% from 724.6 deaths per 100,000 standard population in 2014 to 733.1 in 2015.
- The 10 leading causes of death in 2015 remained the same as in 2014. Age-adjusted death rates increased for eight leading causes and decreased for one.
The one death rate which improved was for cancer. So, we are “winning” that war, at least relatively speaking.
The entire decrease was for life expectancy at birth. Life expectancy at age 65 was unchanged from the previous year. In other words, children and working-age people are bearing the burden of this decline.
However, the worst contributor by far to the decline was an increase in deaths attributable to Alzheimer’s disease, which (although not described in the data brief) is concentrated in people 65 and older. These deaths accounted for almost half (47 percent) of the increase in age-adjusted mortality.
The only way this terrible increase in the burden of Alzheimer’s disease could not have reduced life expectancy at age 65 is if the elderly have enjoyed significant improvement in outcomes for cancer and other diseases.
The next worst contributor to the decline was “unintentional injuries,” which accounted for just under one-third of the increase in the death rate and are almost certainly concentrated among those under 65. Suicides also increased significantly, although they do not account for a large absolute share of deaths. Researchers often treat unintentional injuries and suicides as related outcomes among people suffering from mental illness and homelessness.
Given the extreme safety of our modern American environment, it would be remarkable if the increase in deaths due to unintentional injuries were concentrated among mentally healthy people. The data brief suggests the harmful behaviors that have been observed increasing among white men are also happening in the rest of the population, because the decline in life expectancy happened for both sexes and all races.
By Sheldon Richman
Thursday January 5, 2017 12:21 PM PST
This week, thanks to the Independent Institute, I was interviewed by NPR’s Marketplace for a piece on Donald Trump’s threat to impose tariffs on goods that come from China. (It’s the first story for the January 3 show here at 2:44.) The interviewer wanted to look back at the effects of the Reagan administration’s protectionist policies against Japan. (In 1988 I wrote a paper for the Cato Institute on Reagan’s appalling protectionism.)
I’ve done many media interviews, but this one really drove home the media’s lack of interest in informing listeners and viewers about important economic topics. Of course, the producers of the show would themselves have to understand economics in order to separate what’s important from what’s unimportant. This may be a case of the blind leading the blind. At any rate, what follows is a lightly edited transcript of the interview and what was aired from the interview. (The questions are paraphrased because my audio files contain only my answers.)
Would you say that Reagan’s trade restrictions against Japan worked?
You have to define the word worked, don’t you? If worked means that they raised prices to American consumers and also to American producers who needed to buy some inputs from Japan, yes, they worked. But that was a bad thing. Raising prices through the political system is not a good thing, and Americans should not support that. Did they work to restore the health of the American economy? I would say there were no grounds for thinking that.
Should we have trade agreements?
By John R. Graham
Thursday January 5, 2017 10:21 AM PST
Most people, other than anarcho-libertarians, agree that government has a role to play in preventing and suppressing epidemics, a classic public-health problem. Viral or bacterial infections are not passed from animal to person, or person to person, by voluntary exchange. Instead, proximity to another’s infection can lead to an individual’s becoming infected, notwithstanding any market interaction.
So, even the most freedom-oriented individuals accept government spending and restrictions on individual choice when the threat of epidemic increases. In 2014, the arrival at Dallas-Fort Worth International Airport of a man carrying the Ebola virus caused some lawmakers to seek a ban on air travel from countries where Ebola had broken out.
Indeed, the federal Centers for Disease Control and Prevention maintain twenty quarantine stations at ports of entry, where public-health officials have the power to detain arriving passengers suspected of carrying communicable diseases.
Fortunately, we do not have to worry too much about these risks today. People in the United States no longer worry about contracting malaria or polio when walking near or swimming in still water. So, it is remarkable that the American people are not outraged that the U.S. government has let mosquitoes carrying the Zika virus enter Florida, where they continue to infect people. This has happened while the federal government’s energy has focused on controlling people’s private health choices, such as forcing Catholic nuns to pay for artificial contraceptives.
By Abigail R. Hall Blanco
Wednesday January 4, 2017 12:10 PM PST
A few weeks ago, a friend of mine linked to an article on Facebook titled, “Sexism in Hollywood is Rampant, and Emma Watson Says her Career Proves It.” The article was published in late 2015, but similar pieces pop up from time to time with similar themes.
Despite my better judgment, I clicked on the article. (Why do I do this to myself?)
When I tell people about the policies I discuss in my economics principles classes, I often say it’s like the economic version of the movie Groundhog Day. Every semester we debunk popular economic fallacies. We learn that free trade creates jobs as opposed to killing them, and that the minimum wage harms low-skilled workers as opposed to helping them.
These “economic zombies” come back again time after time. While it’s sometimes depressing to see the same fallacious thinking over and over, I don’t consider our class discussions a Sisyphean exercise. As any teacher knows, students often need to be exposed to an idea several times before it “sticks.”
Alas, here we are again—more terrible arguments in need of serious correction. The article begins discussing actress Emma Watson’s career. She’s starred in the Harry Potter series as well as other films. She’s done work for the U.N. and is a college graduate.
Then comes what makes me want to bang my head against a wall. In an interview, Watson said,
I have experienced sexism in that I have been directed by male directors 17 times and only twice by women. Of the producers I’ve worked with 13 have been male and one has been a woman. I am lucky: I have always insisted on being treated equally and have generally won that equality.... I think my work with the UN has probably made me even more aware of the problems. I went out for a work dinner recently. It was seven men...and me.
By John R. Graham
Tuesday January 3, 2017 1:07 PM PST
If your Christmas dinner table had a cross-border contingent, different national characteristics almost certainly came up for discussion. I enjoyed Christmas in Naples, Florida, with a mixed group of Americans and Canadians. One couple consisted of a Canadian husband and an American wife. She insisted Canada’s single-payer health system was superior in every way (despite the couple’s living in Florida, not Canada).
I had sailed with her husband the day before, and he had invited me to play tennis and golf, too. I was exhausted. How did he have so much energy? “Ever since I was five years old, I was blind as a bat, wearing Coke-bottle thick glasses,” he told me. “I could never play any sports. About seven years ago I had surgery to replace my lenses, and since then I play every sport I can. It has been a liberation.”
Because my friend had the surgery in Toronto, his wife resumed her praise of single-payer health care. But the surgery happened before they had met, so he had to correct her: “No, I paid about $1,000 per eye.”
(The Ontario Health Plan covers cataract and intraocular lens surgery if medically necessary, but my friend’s surgery must not have been medically necessary because his vision was amenable to correction by spectacles.)
This clarification deflated his wife, who announced she had paid about the same for lens surgery at about the same time, but in Florida! Everybody at the table thought the price was worth it. When asked for my opinion, all I could say was: “We all agree a thousand bucks an eye is a fair price for such a miracle. It appears markets work for health care, whether in Canada or the United States. Can you imagine how much more accessible all health services would be if we allowed them to be available through markets, too?”
* * *
For the pivotal alternative to Obamacare, see Priceless: Curing the Healthcare Crisis and A Better Choice: Healthcare Solutions for America, by John C. Goodman, published by Independent Institute.
By Robert Higgs
Friday December 30, 2016 11:24 AM PST
Identity politics is hardly a new development. In one form or another, it has been around for millennia. But beginning in the 1960s in the United States of America, identity politics began to take on greater importance in the marshaling of support for political candidates and policies. The civil rights movement represented a revolt by blacks as such (along with their nonblack supporters) against the denial of political equality that had been their lot for centuries. The politics of black identity, however, quickly spilled over onto other groups, giving rise to a revitalized women’s movement, a Chicano movement, a homosexuals’ movement, and a variety of others based on an ascribed or avowed personal identity. In each case, the supporters of an identity movement made the identity as such the principal if not the only basis for the expression of their political interests and engagement. This narrow focus put them at odds with previous political interest groups such as the Democratic and Republican parties, each of which attempted to gather a variety of self-identified persons under a “big tent” that would seek political power and split the loot among all those in the tent.
As identity politics developed after the 1960s a parallel but related development gave rise to what would become known, especially among opponents, as political correctness, an attempt to control speech and conduct that would (or so it was alleged) demean or disadvantage the members of one or more identity-political groups. This development became most conspicuous on college campuses, where zealous leftist faculty members and weak-kneed administrators imposed increasingly stringent control of speech and action by students and faculty members. Kangaroo courts were created where those charged with violations of political correctness could be punished and before which they were generally presumed guilty and often denied the opportunity to confront or cross-examine those who had charged them with violations of the college code. In the wider society, political correctness became increasingly entrenched in workplaces, news media, government offices, and other public and private areas.
By the end of the twentieth century, white heterosexual men had become almost the only group not assigned a protected status, and indeed the one generally presumed to be guilty of the oppression of all the others, often by assumption rather than according to ordinary standards of proof. Needless to say, straight white men and traditional women did not appreciate having been turned into the presumptive guilty parties in a panoply of condemned speech and behavior.
Their resentment came especially to the fore in the presidential campaign of 2016, when Donald Trump made himself the declared champion of their grievances and the unashamed mocker-in-chief of political correctness in general. This stance, along with his egregious views on international trade and immigration, allowed him to marshal enough support to win the presidency. But the establishment, which had long ago embraced political correctness across the board, was not about to fade away gracefully, and its horror at Trump’s election is already being transformed into counter-revolutionary efforts to stymie or derail many of Trump’s initiatives even before he takes office.
The decisive role of identity politics in the recent election and in reactions to it, especially by progressives, seems to represent an instance of what James Buchanan, one of the principal creators (along with Gordon Tullock) of the modern discipline of public choice analysis, called romanticism. For Buchanan, romanticism was the presumption that voters can select office-holders and public policies (e.g., by means of referendums) that will benefit large sectors of the electorate, rather than benefit mainly the office-holders themselves and their principal financial supporters. The general assumption in public choice analysis is that political actors are self-interested fully as much as actors in the market or other areas of private life outside the governmental realm. This assumption, however, has been difficult to square with certain actions aimed at securing a large-scale collective good (e.g., voting itself when the electorate numbers many thousands or millions and hence the probability that anyone’s vote will be decisive in determining the election’s outcome is infinitesimal).
Public choice analysts and economists in general have tended to view such problematic actions as irrational. After all, it seems that actors are bearing positive personal costs in order to obtain a collective good even though the expected value of their action is effectively zero, owing to the tiny likelihood of their action’s having a decisive effect and hence creating a benefit.
In chapter 3 of my book Crisis and Leviathan, I argue that this standard view of the irrationality of individual actions aimed at securing a large-scale collective good is incorrect because it fails to take into account the way in which each actor actually benefits from his actions. The resolution of this puzzle requires that we recognize the importance of each actor’s self-perceived identity. Because the establishment and maintenance of one’s identity requires that one act publicly in solidarity with like-minded comrades, such action does confer a benefit even for a single individual. Those who free-ride on the cost-bearing of others do not obtain the solidarity benefits required to validate their identity. Hence, millions of people do act—for example, they go to the polls and mark a ballot notwithstanding the negligible probability of their vote’s being decisive.
Buchanan and his followers were too quick to ascribe irrationality—or romanticism—to those who follow the dictates of identity politics. This is not to say that the collective goods they seek to be seen as helping to obtain are indeed virtuous or valuable, or outcomes that will actually benefit them in any way other than fulfilling their psychic yearnings for comradeship. But there is nothing at all novel about people’s sacrificing for such indirect ends; indeed, history is replete with such sacrifices. People want what they want, and often they want simply to be seen as standing faithfully among the ranks of the “good guys,” however they conceive of such a group. It is worse than unfortunate that nowadays so many people are willing to go to the political barricades merely in solidarity with those who support political correctness, anti-immigration measures, and restrictions of international trade. But such actions do not warrant a characterization as irrational or romantic, as opposed to merely misguided.
By Randall Holcombe
Thursday December 29, 2016 9:31 AM PST
In the bitter aftermath of the 2016 presidential election, Progressives are lamenting Donald Trump’s victory over self-described Progressive Hillary Clinton. That victory puts Trump in a position to use the power of the presidency to impose his anti-Progressive agenda, displacing the more enlightened Progressive agenda that Clinton would have pursued.
The irony is that the power of the presidency that Trump will assume is the result of more than a century of Progressive reforms that have given the government increasing control over people’s lives. Progressivism began in the late 1800s with the explicit idea that a proper role of government is to favor some at the expense of others.
The earliest Progressive policies included the regulation of the railroads and other businesses along with antitrust laws that were explicitly designed to impose costs on some—who were often labeled “Robber Barons”—to benefit others. Redistribution programs have the same obvious orientation. Some people pay for benefits received by others. The implementation of these Progressive ideas required a government with greater scope and power.
By Abigail R. Hall Blanco
Wednesday December 28, 2016 3:30 PM PST
In one of my recent blog posts, I discussed the work of NYU economist William Easterly. In particular, I noted how his work on the pitfalls of modern economic development planning bears a striking resemblance to the work and ideas of F. A. Hayek and James M. Buchanan.
Easterly draws a distinction between “planners” and “searchers” in development activities. According to Easterly, a “planner” is someone who “thinks he already knows the answers.” A “searcher,” meanwhile, is someone who “admits he doesn’t know the answers in advance; he believes that poverty is a complicated triangle of political, social, historical, institutional and technological factors” (page 6 of The White Man’s Burden). For Easterly, the world of development is full of planners. I contend this argument relates directly to F. A. Hayek’s discussion of top-down economic planning. Just as central planners cannot possess the knowledge necessary to plan an economy, development planners face similar knowledge problems.
Easterly also argues that the world of development planning is plagued with incentive problems. The incentives faced by recipient governments, various government and non-government organizations, etc. often lead to perverse outcomes. This line of thinking draws heavily on the ideas of James Buchanan and the larger body of work on public choice economics.