By Sheldon Richman •
Thursday January 5, 2017 12:21 PM PST •
This week, thanks to the Independent Institute, I was interviewed by NPR’s Marketplace for a piece on Donald Trump’s threat to impose tariffs on goods that come from China. (It’s the first story for the January 3 show here at 2:44.) The interviewer wanted to look back at the effects of the Reagan administration’s protectionist policies against Japan. (In 1988 I wrote a paper for the Cato Institute on Reagan’s appalling protectionism.)
I’ve done many media interviews, but this one really drove home the media’s lack of interest in informing their listeners and viewers on important economic topics. Of course, the producers of the show would themselves have to understand economics in order to separate what’s important from what’s unimportant. This may be a case of the blind leading the blind. At any rate, what follows is a lightly edited transcript of the interview and of what was aired from it. (The questions are paraphrased, since my audio files contain only my answers.)
Would you say that Reagan’s trade restrictions against Japan worked?
You have to define the word “worked,” don’t you? If “worked” means that they raised prices to American consumers and also to American producers who needed to buy some inputs from Japan, yes, they worked. But that was a bad thing. Raising prices through the political system is not a good thing, and Americans should not support it. Did they work to restore the health of the American economy? I would say there were no grounds for thinking that.
Should we have trade agreements?
By John R. Graham •
Thursday January 5, 2017 10:21 AM PST •
Other than anarcho-libertarians, most people agree that government has a role to play in preventing and suppressing epidemics, a classic public-health problem. Viral or bacterial infections are not passed from animal to person, or person to person, by voluntary exchange. Instead, mere proximity to another’s infection can lead to an individual’s becoming infected, notwithstanding any market interaction.
So, even the most freedom-oriented individuals accept government spending and restrictions on individual choice when the threat of epidemic increases. In 2014, the arrival at Dallas-Fort Worth International Airport of a man carrying the Ebola virus caused some lawmakers to seek a ban on air travel from countries where Ebola had broken out.
Indeed, the federal Centers for Disease Control and Prevention maintain twenty quarantine stations at ports of entry, where public-health officials have the power to detain arriving passengers suspected of carrying communicable diseases.
Fortunately, we do not have to worry too much about these risks today. People in the United States no longer worry about contracting malaria or polio when walking near or swimming in still water. So, it is remarkable that the American people are not outraged that the U.S. government has let mosquitoes carrying the Zika virus enter Florida, where they continue to infect people. This has happened while the federal government’s energy has focused on controlling people’s private health choices, such as forcing Catholic nuns to pay for artificial contraceptives.
By Abigail R. Hall Blanco •
Wednesday January 4, 2017 12:10 PM PST •
A few weeks ago, a friend of mine linked to an article on Facebook titled, “Sexism in Hollywood is Rampant, and Emma Watson Says her Career Proves It.” The article was published in late 2015, but pieces with similar themes pop up from time to time.
Despite my better judgment, I clicked on the article. (Why do I do this to myself?)
When I tell people about the policies I discuss in my economics principles classes, I often say it’s like the economic version of the movie Groundhog Day. Every semester we debunk popular economic fallacies. We learn that free trade creates jobs as opposed to killing them, and that the minimum wage harms low-skilled workers as opposed to helping them.
These “economic zombies” come back again time after time. While it’s sometimes depressing to see the same fallacious thinking over and over, I don’t consider our class discussions a Sisyphean exercise. As any teacher knows, students often need to be exposed to an idea several times before it “sticks.”
Alas, here we are again—more terrible arguments in need of serious correction. The article begins by discussing actress Emma Watson’s career. She’s starred in the Harry Potter series as well as other films. She’s done work for the U.N. and is a college graduate.
Then comes what makes me want to bang my head against a wall. In an interview, Watson said,
I have experienced sexism in that I have been directed by male directors 17 times and only twice by women. Of the producers I’ve worked with 13 have been male and one has been a woman. I am lucky: I have always insisted on being treated equally and have generally won that equality.... I think my work with the UN has probably made me even more aware of the problems. I went out for a work dinner recently. It was seven men...and me.
By John R. Graham •
Tuesday January 3, 2017 1:07 PM PST •
If your Christmas dinner table had a cross-border contingent, different national characteristics almost certainly came up for discussion. I enjoyed Christmas in Naples, Florida, with a mixed group of Americans and Canadians. One couple consisted of a Canadian husband and an American wife. She insisted Canada’s single-payer health system was superior in every way (despite the couple’s living in Florida, not Canada).
I had sailed with her husband the day before, and he had invited me to play tennis and golf, too. I was exhausted. How did he have so much energy? “Ever since I was five years old, I was blind as a bat, wearing Coke-bottle thick glasses,” he told me. “I could never play any sports. About seven years ago I had surgery to replace my lenses, and since then I play every sport I can. It has been a liberation.”
Because my friend had undergone the surgery in Toronto, his wife took the occasion to resume her praise of single-payer health care. But the surgery had taken place before the two met, and he had to correct her: “No, I paid about $1,000 per eye.”
(The Ontario Health Plan covers cataract and intraocular lens surgery if medically necessary, but my friend’s surgery must not have been medically necessary because his vision was amenable to correction by spectacles.)
This clarification deflated his wife, who announced she had paid about the same for lens surgery at about the same time, but in Florida! Everybody at the table thought the price was worth it. When asked for my opinion, all I could say was: “We all agree a thousand bucks an eye is a fair price for such a miracle. It appears markets work for health care, whether in Canada or the United States. Can you imagine how much more accessible all health services would be if we allowed them to be available through markets, too?”
* * *
For the pivotal alternative to Obamacare, see Priceless: Curing the Healthcare Crisis and A Better Choice: Healthcare Solutions for America, by John C. Goodman, published by Independent Institute.
By Robert Higgs •
Friday December 30, 2016 11:24 AM PST •
Identity politics is hardly a new development. In one form or another, it has been around for millennia. But beginning in the 1960s in the United States of America, identity politics began to take on greater importance in the marshaling of support for political candidates and policies. The civil rights movement represented a revolt by blacks as such (along with their nonblack supporters) against the denial of political equality that had been their lot for centuries. The politics of black identity, however, quickly spilled over onto other groups, giving rise to a revitalized women’s movement, a Chicano movement, a homosexuals’ movement, and a variety of others based on an ascribed or avowed personal identity. In each case, the supporters of an identity movement made the identity as such the principal if not the only basis for the expression of their political interests and engagement. This narrow focus put them at odds with previous political interest groups such as the Democratic and Republican parties, each of which attempted to gather a variety of self-identified persons under a “big tent” that would seek political power and split the loot among all those in the tent.
As identity politics developed after the 1960s, a parallel and related trend gave rise to what would become known, especially among its opponents, as political correctness: an attempt to control speech and conduct that would (or so it was alleged) demean or disadvantage the members of one or more identity-political groups. This development became most conspicuous on college campuses, where zealous leftist faculty members and weak-kneed administrators imposed increasingly stringent controls on speech and action by students and faculty members. Kangaroo courts were created in which those charged with violations of political correctness could be punished; before these tribunals the accused were generally presumed guilty and often denied the opportunity to confront or cross-examine those who had charged them with violations of the college code. In the wider society, political correctness became increasingly entrenched in workplaces, news media, government offices, and other public and private arenas.
By the end of the twentieth century, white heterosexual men had become almost the only group not assigned a protected status, and indeed the one generally presumed to be guilty of the oppression of all the others, often by assumption rather than according to ordinary standards of proof. Needless to say, straight white men and traditional women did not appreciate having been turned into the presumptive guilty parties in a panoply of condemned speech and behavior.
Their resentment came especially to the fore in the presidential campaign of 2016, when Donald Trump made himself the declared champion of their grievances and the unashamed mocker-in-chief of political correctness in general. This stance, along with his egregious views on international trade and immigration, allowed him to marshal enough support to win the presidency. But the establishment, which had long ago embraced political correctness across the board, was not about to fade away gracefully, and its horror at Trump’s election is already being transformed into counter-revolutionary efforts to stymie or derail many of Trump’s initiatives even before he takes office.
The decisive role of identity politics in the recent election and in reactions to it, especially by progressives, seems to represent an instance of what James Buchanan, one of the principal creators (along with Gordon Tullock) of the modern discipline of public choice analysis, called romanticism. For Buchanan, romanticism was the presumption that voters can select office-holders and public policies (e.g., by means of referendums) that will benefit large sectors of the electorate, rather than benefit mainly the office-holders themselves and their principal financial supporters. The general assumption in public choice analysis is that political actors are self-interested fully as much as actors in the market or other areas of private life outside the governmental realm. This assumption, however, has been difficult to square with certain actions aimed at securing a large-scale collective good (e.g., voting itself when the electorate numbers many thousands or millions and hence the probability that anyone’s vote will be decisive in determining the election’s outcome is infinitesimal).
Public choice analysts and economists in general have tended to view such problematic actions as irrational. After all, it seems that actors are bearing positive personal costs in order to obtain a collective good even though the expected value of their action is effectively zero, owing to the tiny likelihood of their action’s having a decisive effect and hence creating a benefit.
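The standard cost-benefit argument can be made concrete with back-of-the-envelope arithmetic. The figures below are illustrative assumptions of my own, not numbers from the text:

```python
# Hypothetical figures for the expected value of casting one vote
# in a large electorate. All three numbers are assumptions.
p_decisive = 1e-7            # assumed probability a single vote decides the election
personal_benefit = 10_000.0  # assumed dollar value the voter places on the outcome
cost_of_voting = 20.0        # assumed cost in time, travel, and effort

expected_benefit = p_decisive * personal_benefit   # roughly a tenth of a cent
net_expected_value = expected_benefit - cost_of_voting

# On this narrow account, voting looks irrational: the expected payoff
# is swamped by even a modest cost of showing up at the polls.
```

With any remotely realistic probability of being decisive, the expected benefit is fractions of a cent, so the standard account predicts almost nobody would vote; that millions do is the puzzle the identity-based explanation addresses.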
In chapter 3 of my book Crisis and Leviathan, I argue that this standard view of the irrationality of individual actions aimed at securing a large-scale collective good is incorrect because it fails to take into account the way in which each actor actually benefits from his actions. The resolution of this puzzle requires that we recognize the importance of each actor’s self-perceived identity. Because the establishment and maintenance of one’s identity requires that one act publicly in solidarity with like-minded comrades, such action does confer a benefit even for a single individual. Those who free-ride on the cost-bearing of others do not obtain the solidarity benefits required to validate their identity. Hence, millions of people do act—for example, they go to the polls and mark a ballot notwithstanding the negligible probability of their vote’s being decisive.
Buchanan and his followers were too quick to ascribe irrationality—or romanticism—to those who follow the dictates of identity politics. This is not to say that the collective goods they seek to be seen as assisting in obtaining are indeed virtuous or valuable or outcomes that will actually benefit them in any way other than the fulfillment of their psychic yearnings for comradeship. But there is nothing at all novel about people’s sacrificing for such indirect ends; indeed, history is replete with such sacrifices. People want what they want, and often they want simply to be seen as standing faithfully among the ranks of the “good guys,” however they conceive of such a group. It is worse than unfortunate that nowadays so many people are willing to go to the political barricades merely in solidarity with those who support political correctness, anti-immigration measures, and restrictions of international trade. But such actions do not warrant a characterization as irrational or romantic, as opposed to merely misguided.
By Randall Holcombe •
Thursday December 29, 2016 9:31 AM PST •
In the bitter aftermath of the 2016 presidential election, Progressives are lamenting Donald Trump’s victory over self-described Progressive Hillary Clinton. Trump’s victory places him in a position of being able to use the power of the presidency to impose his anti-Progressive agenda, displacing the more enlightened Progressive agenda that Clinton would have pursued.
The irony is that the power of the presidency that Trump will assume is the result of more than a century of Progressive reforms that have given the government increasing control over people’s lives. Progressivism began in the late 1800s with the explicit idea that a proper role of government is to favor some at the expense of others.
The earliest Progressive policies included the regulation of the railroads and other businesses along with antitrust laws that were explicitly designed to impose costs on some—who were often labeled “Robber Barons”—to benefit others. Redistribution programs have the same obvious orientation. Some people pay for benefits received by others. The implementation of these Progressive ideas required a government with greater scope and power.
By Abigail R. Hall Blanco •
Wednesday December 28, 2016 3:30 PM PST •
In one of my recent blog posts, I discussed the work of NYU economist William Easterly. In particular, I noted how his work on the pitfalls of modern economic development planning bears a striking resemblance to the work and ideas of F. A. Hayek and James M. Buchanan.
Easterly makes a distinction between “planners” and “searchers” in development activities. According to Easterly, a “planner” is someone who “thinks he already knows the answers.” A “searcher,” meanwhile, is someone who “admits he doesn’t know the answers in advance; he believes that poverty is a complicated triangle of political, social, historical, institutional and technological factors” (page 6 of The White Man’s Burden). For Easterly, the world of development is full of planners. I contend this argument relates directly to F. A. Hayek’s discussion of top-down economic planning. Just as central planners cannot possess the knowledge necessary to plan an economy, development planners face similar knowledge problems.
Easterly also argues that the world of development planning is plagued with incentive problems. The incentives faced by recipient governments, various government and non-government organizations, etc. often lead to perverse outcomes. This line of thinking draws heavily on the ideas of James Buchanan and the larger body of work on public choice economics.
By Robert Higgs •
Tuesday December 27, 2016 9:43 AM PST •
Peace on earth, goodwill
Sounds ideal to you and me
MICC has other plan
By Sam Staley •
Friday December 23, 2016 9:02 PM PST •
Leaving the movie theater after watching Rogue One: A Star Wars Story, I was wondering if the “one-off”—a story set within the Star Wars universe but independent of the nine core films (Episodes I through IX)—might really be the recipe needed to reboot the franchise. The move to Disney Studios likely helped reinvigorate the movies (if not the story itself), and Rogue One, unlike the other episodes in the grander saga, embodies the feel and tightness of a complete movie. Solid performances by the cast, a storyline that keeps a brisk pace, and directing that maintains consistent forward momentum have produced a film that is an enjoyable and rousing action yarn. This is no small feat given that the vast majority of the audience knows, or at least suspects, the ending.
Rogue One centers on a missing link between Revenge of the Sith (Episode III), which sees the construction of the planet-destroying superweapon, the Death Star, and A New Hope (Episode IV), which chronicles the rise of Luke Skywalker as the Jedi Knight in training who harnesses the Force to destroy it. In Rogue One, we meet Galen Erso (Mads Mikkelsen), the scientist who reluctantly designs and oversees the construction of the Death Star, and discover how the Rebel Alliance secures the plans to destroy it. The story hinges on Galen’s daughter, Jyn Erso (Felicity Jones), who was hidden from the Galactic Empire, rescued by, and then ultimately abandoned by, rebel leader Saw Gerrera (Forest Whitaker). Jyn is a criminal en route to a work camp when she is “rescued” by the rebels, who want her to find her father. They know about the construction of the Death Star and her father’s role, and they see Jyn as the key to finding him and stopping its construction. Thrown into the mix is Cassian Andor (Diego Luna), a Rebel Alliance intelligence officer, who is secretly ordered to kill Galen Erso when they find him.
The film does an excellent job of taking the audience through a linear sequence of scenes and episodes, reintroducing well-known and little-known characters from the epic in ways that should please die-hards without weighing the film down for those less familiar with the details. Despite dozens of characters entering, reappearing, and leaving the story, many have clearly defined character arcs that allow audiences to connect with them. This gives the characters heart, if not a deeper spiritual soul. Even supporting players such as the stoic rebel veteran Saw Gerrera and the defecting Imperial pilot Bodhi Rook (Riz Ahmed) find their characters changed in meaningful ways by the events and the heroism elicited from the challenges they face. The more intriguing characters include Chirrut Imwe (Donnie Yen), a blind warrior who believes in the Force, and his friend and mercenary rebel Baze Malbus (Jiang Wen). The two characters play off each other well, and they embody the nascent and untapped potential of the rebellion along with the implications of applying a naive and untrained understanding of the Force.
In fact, Rogue One is an artful example of how multiple characters evolve along these different arcs in ways that serve the story and the film, and, ultimately, see their own destiny in working with and respecting others in service to a higher purpose (the defeat of the Galactic Empire). Through this focus, the story traces the development of individuals, their relationships, and the bonding that comes from understanding and evolving mutual respect. This is one of the few movies in the Star Wars saga where I left the theater with a slight sadness about not seeing these characters again. The tightness of the story, pace, and character development benefited from the discipline needed to constrain the story to a normal-length feature film (about two hours), and the experience is richer for it.
What is missing from Rogue One, in my view, is a new understanding or exploration of the soul of the saga. Perhaps the movie doesn’t need one to be good, or even excellent. We already know that this is a story of heroes versus villains, Jedi Knights versus an authoritarian empire, individual courage against collective repression. These stories play out on a grand scale in the original nine episodes of Star Wars. (For libertarian takes on these themes, see articles by software developer Russell Hassan, Ilya Somin’s podcast here, and here, among just a few.) This soul, however, is what captures the imaginations of many in the libertarian movement as well as triggers numerous debates over its content. A new look at how this soul manifests itself in the Rebellion would have deepened the film. Ironically, Rogue One could have provided this through the evolution of the relatively minor characters of Chirrut Imwe and Baze Malbus as well as Saw Gerrera’s decision to break from the Alliance.
My own views probably map closely with those of Amy Sturgis (see her excellent essay in Reason magazine here), who has discussed ways the sprawling Star Wars universe has facilitated broad, cultural myth building around an anti-authoritarian ethic, despite the creator’s ambivalence toward capitalism or a truly libertarian understanding of freedom. George Lucas may be a skeptic, but his story focuses on how individuals work together to fight totalitarianism. The emphasis on individual identity in service to a larger public good gives some libertarians pause, but the central thrust of the story is still focused on protecting individual freedom. Rogue One adds to this by highlighting and honoring the courage and sacrifices of individuals with smaller parts in the overall universe, but whose actions have profound implications for the outcome.
“In the end, Star Wars doesn’t say anything profoundly new, or even say it in a stunningly original way,” writes Sturgis on the saga in general.
“It is a saga with global underpinnings, one that echoes the classical epics that first gave shape to what mythologist Joseph Campbell dubbed the Hero’s Journey. But the true genius of Star Wars rests in how it distills recent popular culture down to its most potent symbols, such as the cowboy or the samurai or Flash Gordon himself. Furthermore, it’s a “working myth” that carries a sense of history and message of substance with which to challenge contemporary audiences. What we’ve seen thus far during the Disney era of the franchise suggests that the most interesting days of Star Wars storytelling are far from over.”
I agree, but would add that in the hands of good filmmakers it makes an excellent and satisfying ride. Rogue One hopefully foretells many more similarly engaging and entertaining films to come from Disney studios.
By Abigail R. Hall Blanco •
Friday December 23, 2016 9:19 AM PST •
The Colombian government recently “persuaded” food producers to agree to have the prices of some of their goods “frozen.” Soon the prices of products like red meat, fish, dairy, eggs, grains, and processed foods will be set by the Colombian government. Once the freeze is imposed, producers of these goods will be unable to raise their prices.
According to Colombia’s Agricultural Minister, Aurelio Iragorri, the measure is intended to achieve food security following price fluctuations over the past several months.
On this blog I’ve discussed how the price system is fundamentally important. Only via the mechanisms of private property rights, prices, and profit and loss can resources be allocated to their highest valued use.
But these facts of economics don’t always make people happy. As I explain to my principles students, sometimes everyday citizens and government officials look at the prices of particular items and don’t like what they see. They think some prices are either “too high” or “too low.” As a result, government will attempt to “fix” these “wrong” prices by imposing a price control. In the case that a price is deemed to be “too low,” governments will impose what’s called a price floor, setting the minimum price that consumers must pay for a particular good or service. The most common example of this type of price control is the minimum wage.
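As a minimal sketch of the mechanics, consider made-up linear demand and supply schedules (the numbers below are assumptions for illustration, not data from the Colombian case or any real market):

```python
# Hypothetical linear demand and supply: quantity as a function of price.
def quantity_demanded(price: float) -> float:
    return 100 - 10 * price   # buyers purchase less as the price rises

def quantity_supplied(price: float) -> float:
    return 10 * price         # sellers offer more as the price rises

# The market clears where demand equals supply: 100 - 10p = 10p, so p = 5.
equilibrium_price = 5.0
assert quantity_demanded(equilibrium_price) == quantity_supplied(equilibrium_price)

# A binding price floor set above equilibrium, e.g. a minimum price of 7.
floor_price = 7.0
surplus = quantity_supplied(floor_price) - quantity_demanded(floor_price)
# Sellers offer 70 units but buyers take only 30, so 40 units go unsold.
# In a labor market, that unsold surplus is unemployment among low-skilled workers.
```

The same arithmetic run in reverse illustrates a price ceiling, such as Colombia’s freeze: a maximum price held below equilibrium makes quantity demanded exceed quantity supplied, producing a shortage instead of a surplus.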