Anthony de Jasay: Political Philosopher Par Excellence

Anthony de Jasay isn’t a household name, but he should be. The former Parisian banker is one of the most original thinkers in political philosophy today, and his insights on the nature of liberty, justice, and the state have major implications for how we might improve our governments, communities, and culture.

The Summer 2015 issue of The Independent Review features a symposium on Jasay’s work, with contributions by G. Patrick Lynch, Hartmut Kliemt, Pierre Lemieux, André Azevedo Alves, Carlo Ludovico Cordasco and Sebastiano Bavetta, and David M. Hart. (Also in this issue, Michael Munger reviews Jasay’s latest book, Social Justice and the Indian Rope Trick.)

Jasay’s striking originality makes him hard to classify. His writings suggest an affinity for classical liberalism, but he has criticized that tradition for its “unrestricted wishful thinking.” He is admired by public-choice scholars, but he takes issue with the constitutionalism of James M. Buchanan. And although he advocates free markets, he has called Austrian School economist F. A. Hayek “startlingly naïve.”

Nevertheless, Jasay’s freshness and profundity have earned him high praise from serious, liberty-minded readers. About his 1985 treatise, The State, symposium editor G. Patrick Lynch writes: “In this work, Jasay provides as realistic and unromantic a vision of the foundations of government as one can imagine.”

To understand the state, Jasay says we must first view it as a single agent with self-interested goals. Then we must ask: What would you do if you were the state?

Jasay’s approach inspires our contributors to tackle a host of important questions: How might a government be designed to minimize any threats to liberty? Why does Jasay find fault with Buchanan’s and Rawls’s “contractarian” theories of government? And how might public goods be provided without the use of government coercion to deal with the free-rider problem?

Jasay made his reputation by illuminating timeless theoretical issues, but he has also written numerous popular columns on current affairs. The final article in our symposium compares this work to that of Frédéric Bastiat, the 19th-century French individualist whom Schumpeter called “the most brilliant economic journalist who ever lived.” The verdict? Jasay brille!

* * *

The Independent Review, a journal devoted to political economy, public policy, and intellectual history, is published quarterly by Independent Institute. SPECIAL OFFER: If you’re not already a subscriber, sign up for the print version and receive a FREE book. eSubscriptions are available via an app for Apple iOS and Amazon Kindle.

CalSTRS Boss Jack Ehnes Deceives Californians About Funding


Jack Ehnes, CEO of the massive California State Teachers’ Retirement System (CalSTRS), deceived the public in a recent blog post opposing public pension reform in California:

CalSTRS has not taken any “pension holidays,” which means contributions have been made continuously, thus reinforcing the sustainability of the fund.

Ehnes fails to indicate whether: (A) $1 was contributed to the pension fund each year; (B) the full “annual required contribution” (ARC) was contributed to CalSTRS each year, ensuring enough money to pay all promised benefits; or (C) something in between was contributed. Only (B) would be prudent financial management.

So which was it? Let’s check the facts.

Former Federal Reserve Board Chairman Paul Volcker and former New York Lieutenant Governor Richard Ravitch looked into the funding of several state public pension systems and found that over just a six-year period, CalSTRS’s ARC was underpaid by a staggering $11 billion (see p. 38 of the report).

Volcker and Ravitch reported that more than $27 billion should have been invested in CalSTRS from 2006 through 2011 to keep it on track, but only $16 billion was invested. In 2013, in fact, CalSTRS had the largest skipped ARC in the country, according to Stanford University researcher David Crane.

CalSTRS is the poster child for irresponsible and inefficient management of a public pension system, as evidenced by its $74 billion deficit (self-reported by CalSTRS). Because the ARC was massively underpaid, CalSTRS lost decades of compounded earnings, so now taxpayers are on the hook to pay ballooning “catch-up” contributions as mandated by Assembly Bill 1469.
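To see why the missed contributions matter so much, consider the compounding arithmetic. The sketch below is purely illustrative: the 7 percent return is a hypothetical assumption in the range pension funds commonly use, not CalSTRS's actual figure, and it simply shows how an $11 billion underpayment snowballs when the money is never invested.

```python
# Illustrative only: how an underpaid pension contribution compounds.
# The 7% return is a hypothetical assumption, not CalSTRS's actual rate.

def shortfall_growth(underpayment: float, annual_return: float, years: int) -> float:
    """Value the money would have reached had it been contributed
    and left to compound at the assumed return."""
    return underpayment * (1 + annual_return) ** years

missed = 11e9   # ~$11 billion underpaid in 2006-2011 (Volcker-Ravitch figure)
rate = 0.07     # hypothetical assumed annual return

for years in (10, 20, 30):
    gap = shortfall_growth(missed, rate, years)
    print(f"After {years} years: ${gap / 1e9:.0f} billion shortfall")

# Output: roughly $22B after 10 years, $43B after 20, and $84B after 30,
# which is why "catch-up" contributions balloon the longer the ARC is skipped.
```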

CalSTRS mismanagement makes the case for meaningful pension reform in California. Jack Ehnes gives everyone good reason to distrust government pension bosses.

My new book California Dreaming: Lessons on How to Resolve America’s Public Pension Crisis explains which pension reforms should be adopted.

Game Developers Face Final Boss: The FDA

[This piece was later published by Newsweek and the Foundation for Economic Education.]

“Absent the FDA, Americans would be healthier and happier.” —Robert Higgs

As I drove to work the other day, I heard a very interesting segment on NPR that featured a startup designing video games to improve cognitive skills and relieve symptoms associated with a myriad of mental health conditions. One game highlighted, Project Evo, has shown good preliminary results in training players to ignore distractions and stay focused on the task at hand:

“We’ve been through eight or nine completed clinical trials, in all cognitive disorders: ADHD, autism, depression,” says Matt Omernick, executive creative director at Akili, the Northern California startup that’s developing the game.

Omernick worked at LucasArts for years, making Star Wars games, where players attack their enemies with light sabers. Now, he’s working on Project Evo. It’s a total switch in mission, from dreaming up best-sellers for the commercial market to designing games to treat mental health conditions.

“The qualities of a good video game, things that hook you, what makes the brain — snap — engage and go, could be a perfect vessel for actually delivering medicine,” he says.

In fact, the creators believe their game will be so effective it might one day reduce or replace the drugs kids take for ADHD.

This all sounds very promising.

In recent years, many observers (myself included) have expressed deep concerns that we are living in the “medication generation,” defined by the rapidly increasing number of young people (a trend that seems to have extended even to toddlers and infants!) taking psychotropic drugs. As experts and laypersons continue to debate the long-term effects of these substances, the news of intrepid entrepreneurs creating non-pharmaceutical alternatives to treat mental health problems is definitely a welcome development.

But a formidable final boss stands in the way:

[B]efore they can deliver their game to players, they first have to go through the Food and Drug Administration — the FDA.

The NPR story goes on to detail how navigating the FDA’s bureaucratic labyrinth is akin to the long, grinding campaign required to clear the final dungeon in any Legend of Zelda game. Pharmaceutical companies are intimately familiar with the FDA’s slow and expensive approval process for new drugs, so it should come as no surprise that Silicon Valley companies do their best to avoid government regulation. One venture capitalist goes so far as to say, “If it says ‘FDA approval needed’ in the business plan, I myself scream in fear and run away.”

Dynamic, nimble startups are much more in tune with market conditions than the ever-growing regulatory behemoth that is defined by procedure, conformity, and irresponsibility. As a result, conflict between these two worlds is inevitable:

Most startups can bring a new video game to market in six months. Going through the FDA approval process for medical devices could take three or four years — and cost millions of dollars.

In the tech world, where app updates and software patches are part of every company’s daily routine just to keep up with consumer habits, technology can become outdated in the blink of an eye. A regulatory hold on a product can spell a death sentence for a startup trying to stay ahead of fierce market competition.

Akili is the latest victim to get caught in the tendrils of the administrative state, and worst of all, in the FDA, which distinguished political economist Robert Higgs has described as “one of the most powerful of federal regulatory agencies, if not the most powerful.” The agency’s awesome authority extends to over twenty-five percent of all consumer goods in the United States and thus “routinely makes decisions that seal the fates of millions.”

Despite its perceived image as the nation’s benevolent guardian of health and well-being, the FDA’s actual track record is anything but, and its failures have been extensively documented in a vast economic literature. The “knowledge problem” has foiled the whims of central planners and social engineers in every setting, and the FDA is not immune. By taking a one-size-fits-all approach to regulatory policy, it fails to take into account the individual preferences, social circumstances, and physiological attributes of the people who compose a diverse society. For example, people vary widely in their responses to drugs, depending on variables that range from dosage to genetic makeup. In a field as complex as human health, an institution forcing its way on a population is bound to cause problems (for a particularly egregious example, see what happened with the field of nutrition).

The thalidomide tragedy of the 1960s is usually cited as the reason we need a centralized regulatory agency staffed by altruistic public servants to keep the market from being flooded by toxins, snake oils, and other harmful substances. However, this benefit needs to be weighed against the costs of withholding beneficial products. For example, the FDA’s delay of beta blockers, which were widely available in Europe to reduce heart attacks, was estimated to have cost tens of thousands of lives. Despite this infamous episode and other repeated failures, the agency cannot overcome the institutional incentives it faces as a government bureaucracy. These factors strongly skew its officials toward avoiding any risk of being blamed for visible harm. Here’s how the late Milton Friedman summarized the dilemma with his usual wit and eloquence:

Put yourself in the position of a FDA bureaucrat considering whether to approve a new, proposed drug. There are two kinds of mistakes you can make from the point of view of the public interest. You can make the mistake of approving a drug that turns out to have very harmful side effects. That’s one mistake. That will harm the public. Or you can make the mistake of not approving a drug that would have very beneficial effects. That’s also harmful to the public.

If you’re such a bureaucrat, what’s going to be the effect on you of those two mistakes? If you make a mistake and approve a product that has harmful side effects, you are a devil incarnate. Your misdeed will be spread on the front page of every newspaper. Your name will be mud. You will get the blame. If you fail to approve a drug that might save lives, the people who would object to that are mostly going to be dead. You’re not going to hear from them.
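Friedman’s argument is, at bottom, about asymmetric payoffs. The toy model below is my own illustration, not Friedman’s; the numbers are invented purely to show the structure of the incentive: the regulator personally bears heavy blame for visible harm from a bad approval but almost none for the invisible harm of a wrongly rejected drug.

```python
# A toy model of the regulator's incentive problem described above.
# All numbers are invented for illustration; only the asymmetry matters.

# Personal cost to the regulator (career damage, headlines), not social cost.
BLAME_BAD_APPROVAL = 100   # front-page scandal if an approved drug harms people
BLAME_BAD_REJECTION = 1    # the would-be beneficiaries are silent, or dead

def expected_blame(approve: bool, p_harmful: float) -> float:
    """Expected personal blame for the regulator, given the decision."""
    if approve:
        return p_harmful * BLAME_BAD_APPROVAL
    return (1 - p_harmful) * BLAME_BAD_REJECTION

# Even a drug that is 95% likely to be beneficial is "safer" to reject:
print(expected_blame(approve=True, p_harmful=0.05))   # 5.0 units of blame
print(expected_blame(approve=False, p_harmful=0.05))  # 0.95 units of blame
```

Under these made-up payoffs, rejection minimizes the bureaucrat’s expected blame even when approval would maximize the public’s expected benefit, which is exactly the skew Friedman describes.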

Critics of America’s dysfunctional healthcare system have pointed out the significant role of third-party spending in driving up prices, and how federal and state regulations have created perverse incentives and suppressed the functioning of normal market forces. In regard to government restrictions on the supply of medical goods, the FDA deserves special blame for driving up the costs of drugs, slowing innovation, and denying treatment to the terminally ill while demonstrating no competency in product safety.

Going back to the NPR story, a Pfizer representative was quoted as saying that “game designers should go through the same FDA tests and trials as drug manufacturers.” Those familiar with the well-known phenomenon of regulatory capture and the basics of public choice theory should not be surprised by this attitude. Existing industries, with their legions of lobbyists, come to dominate the regulatory apparatus and learn to manipulate the system to their advantage, at the expense of new entrants.

Akili and other startups hoping to challenge the status quo would have to run the gauntlet set up by the “complex leviathan of interdependent cartels” that makes up the American healthcare system. I can only wish them the best, and hope Schumpeterian creative destruction eventually sweeps the whole field of medicine.

Abolishing the FDA and eliminating its too-often abused power to withhold innovative medical treatments from patients and providers would be one step toward genuine healthcare reform.

Bundled Payments, Barely Hatched, Go the Way of the Dodo

Last month, I wrote about Accountable Care Organizations (ACOs), medical groups accountable to the federal government for managing the health of patient populations. Even Zeke Emanuel recognizes that they are failing. Dr. Emanuel advised Medicare to “lump together” all the services associated with a procedure, such as a hip replacement, and pay one fee for the entire bundle.

As I noted, Medicare already does this via its Bundled Payments for Care Improvement (BPCI) initiative, which launched in 2013. At the time, hospitals and other providers were offered voluntary participation. Just a few weeks ago, Medicare decided to make bundled payments mandatory for some procedures in some areas. Now we know why: Providers are learning that the bundles don’t work.

Whether we call them “lumps” or “bundles,” the results of the voluntary initiative are coming in, and they are telling pretty much the same story as the ACO experience:

Medicare’s voluntary test of bundled payments added new contracts in July, but about two-thirds of the hospitals, medical groups, nursing homes and other providers that had initially enrolled instead dropped out.

The initiative, known as the Bundled Payments for Care Improvement initiative and launched under the Affordable Care Act, initially attracted nearly 7,000 providers that agreed to formally review how they could enter bundled-payment contracts with Medicare. The CMS announced on Thursday that 2,100 providers finished that review and entered contracts under which Medicare will bundle the costs of treating various conditions—heart failure, joint replacement, stroke, heart attacks—into a single payment.

The reason for the failure of both initiatives is the same. ACOs are accountable to the federal government instead of their patients. Similarly, the “bundles” are bundled by the federal government. The only way to figure out which services should be bundled together in one payment is to let entrepreneurs try different bundles and let patients decide which to choose.

* * *

For the pivotal alternative to Obamacare, please see A Better Choice: Healthcare Solutions for America by John C. Goodman (Independent Institute, 2015).

The Prime Importance of Private Property Rights

Imagine for a moment you decide to rent out a room in your home to another person. There are two parties in the contract—the landlord (you) and the tenant. You both agree to the lease and sign the contract.

Things are going fine, but then, your tenant stops paying their rent.

The solution to this situation is relatively straightforward. You serve the tenant notice that they will be evicted if they do not pay. As the landlord, you will incur the costs of evicting them, but are likely compensated for the forgone rent by your tenant’s security deposit. You have the option to sue your former tenant if you incurred greater losses.

In many places, however, this process isn’t so easy. Imagine that, instead of evicting your delinquent tenant, you must instead keep providing them living space because it is against the law to “make someone homeless.” Eviction requires producing reams of documentation, making multiple court appearances, and spending considerable additional time and money to remove the problem tenant. In some cases, the process takes years.

Although the illustration above may seem exaggerated, it is the reality in many places. Venezuela, for example, maintains a law similar to the one described above. A landlord cannot evict a tenant if the tenant does not have other arranged housing. The issue has become a serious problem. In 2014, multiple outlets reported some 3,000 squatters were living in a 45-story building in the capital city of Caracas.

This issue of eviction is illustrative of the broader importance of private property rights. Issues of tenants’ rights are often the subject of news. Everyone has heard stories of the “terrible landlord,” the tenant who was wrongfully evicted, the security deposit that was never returned. Perhaps this is why there are frequent proposals in the U.S. and elsewhere to make it more difficult for landlords to evict tenants, limit the prices they charge, and so on. Certainly, tenants’ rights are important. Renters do, after all, pay for their right to live in another person’s property.

It’s this point that many often forget, and it’s important. Private property means that an individual has exclusive rights to use a particular asset. He doesn’t have to worry about someone else using his assets without his permission. As a result, the owner internalizes the consequences of whatever action he takes with regard to his property. If he takes good care of his house and makes improvements, for example, he benefits when it comes time to sell. If, by contrast, he allows the home to fall into disrepair, he will face the negative consequences of his actions in the form of a lower selling price.

This dynamic benefits not only the individual, but society as a whole. Private property rights provide incentives for individuals to take care of their property and to consider both the present and future value of their assets. In the context of housing, these rights induce owners to care for their property and increase its future value.

Violating private property rights can sometimes sound like a noble idea. After all, most people do not like the idea of people living on the street, or spending most of their monthly income on housing. But the broader implications of denying or limiting private property rights are disastrous. Without private property rights, the above incentives to care for and enhance the value of property are weak or altogether absent. If a landlord knows he cannot reap the full benefits from his property, what incentive does he have to make repairs? If individuals know landlords cannot evict problem tenants, they are much less likely to rent out their property. This is exactly what has occurred in Venezuela, where a housing shortage has resulted in not only the confiscation of homes, but also the use of metal from old automobiles in a desperate attempt to erect more housing.

Though not as extreme as the Venezuelan case, attempts to undermine private property in the U.S. occur regularly. Rent controls are a prime example. As recently as this spring, groups in San Francisco urged the city to further restrict apartment prices. The use of eminent domain laws that allow the government to take individual assets is another illustration. It is important to remember that even though such policies may sound appealing, they have serious consequences. For those of us concerned about the wealth and well-being of all individuals, protecting and strengthening private property rights is of the utmost importance.

Victory for Free Speech in Medicine

Judges are chipping away at government censorship of communications about prescription drugs. The Food and Drug Administration exerts great power over a medicine’s label, which describes the medicine’s therapeutic claims. Drug makers and the FDA sometimes spend years negotiating a label.

The FDA regulates both safety and “efficacy.” So, a drug maker has to prove to the FDA that its medicine works before marketing it to doctors. However, the cost of the clinical trials required to prove claims is monumentally high, so drug makers will not always invest in clinical trials for every indication. Once a drug is in use, doctors often find that it is effective for more indications than those listed on the label. These new indications are often supported by peer-reviewed, published research. However, the drug makers have not yet invested the time and money to negotiate with the FDA to get the new claims onto the label.

Oncology is a specialty where so-called off-label prescribing is common. Indeed, off-label prescribing is so common that some states mandate insurers pay for coverage of prescriptions written for off-label use! Clearly, the regulatory bureaucracy is behind the curve on this issue. Nevertheless, the FDA has asserted power to stop pharmaceutical reps from even distributing reprints of peer-reviewed studies supporting off-label uses to doctors.

We are not talking about the cure-all medicine man stopping his covered wagon in town and putting on a show to separate the yokels from their wages. We’re talking about high-level discussions with relevant specialists about new evidence-based medicine.

Fortunately, a judge recently found—on First Amendment grounds—that representatives of Amarin Pharma can distribute such information to doctors despite the FDA’s disapproval.

Established in 1906, the FDA has consistently increased its power. Not until 1962 did it win the power to adjudicate efficacy. Removing that power, and limiting the FDA to regulating safety, would return authority to doctors and patients. The 21st Century Cures Act, which passed the U.S. House of Representatives in June, does not go that far. Nevertheless, it allows more “real world” evidence to be added to a drug’s label, which is a step in the right direction.

* * *

Is the FDA safe and effective? See FDAReview.org.

In Memoriam: Robert A. Conquest (1917–2015)

One of the great ironies of modern history is that the person most responsible for bringing to light the magnitude of Stalin’s terror is a man whose last name is synonymous with occupation and subjugation: Robert Conquest. In word and in deed, the world-renowned historian, who passed away on August 3 at the age of 98, was, of course, nothing like the monster he wrote about in books such as Stalin: Breaker of Nations, Stalin and the Kirov Murder, Kolyma: The Arctic Death Camps, The Harvest of Sorrow: Soviet Collectivization and the Terror-Famine, and The Great Terror: A Reassessment.

Independent Institute regards Robert Conquest (who also served as a founding member of the Board of Advisors of our quarterly, The Independent Review) with reverence and gratitude. In 1992, we hosted a national dinner in his honor, featuring presentations by Preston Martin, Czeslaw Milosz, Aaron Wildavsky, John O’Sullivan, Elena Bonner, Harry Wu, and the honoree himself (video, audio, transcript). Conquest also penned a brilliant op-ed for the occasion, “Learning to Unlearn the Leninist Mindset”—still instructive a quarter century after the fall of the evil empire. In 2000, he published Reflections on a Ravaged Century, and he graced us once again by speaking at our Oakland headquarters, at an event titled “Freedom, Terror, and Falsehoods: Lessons from the Twentieth Century” (video, audio, transcript).

For too long, the Western intelligentsia ignored Robert Conquest (although he had legions of fans behind the Iron Curtain, where his works circulated clandestinely). Happily, several obituaries and remembrances will help preserve his legacy. (For a sampling, see the New York Times, the Wall Street Journal, George Will, National Review, John O’Sullivan, The Economist, the Daily Beast, and the BBC magazine.) For readers of The Beacon, however, I thought it most fitting to close not with a eulogy, but with a passage from Conquest’s op-ed referenced above:

It has been wisely said that the two great causes of human troubles are impatience and laziness. Intellectually, these are precisely the phenomena that produce such destructive fantasies. Ideological quick fixes for all intellectual and social problems are sought, rather than an understanding of their real complexities. The Soviet Union was a proving ground for such approaches. We in the West still have much to learn, and to unlearn, from the events in the former communist countries.

[This post is adapted from the August 18, 2015, issue of The Lighthouse. To subscribe to this weekly email newsletter and other bulletins from Independent Institute, enter your email address here.]

Safe, Legal, and Rare, Part 1: Safe?

In the historic debate over abortion, the “pro-choice” mantra was “Safe, Legal, and Rare”: the argument being that if abortion were legalized, it would be safer than the oft-cited “back alley” abortion and, coupled with an expansion of sex education and access to contraception, would become increasingly rare.

In light of the continuing release of videos exposing the actual practice of abortion by Planned Parenthood, it’s now fair to assess: 40 years following Roe v. Wade, is abortion in America “Safe, Legal and Rare”?

First, Safe:

The first thing to be aware of is that there is literally no way of knowing how safe abortion is in America, because—unbelievably—even where they exist, reporting requirements are not enforced, and no source has established any meaningful method of tracking abortion. While the Centers for Disease Control and Prevention (CDC) maintains an “Abortion Surveillance Unit,” it in fact has no systematic means of collecting data on abortion-related statistics—including deaths.

Price Transparency Laws Don’t Work

In a functioning market, you know what you owe before you buy a good or service. That is not the case in health care, as we know. Because of increasing deductibles, the failure of price transparency is becoming increasingly irritating to patients.

Some believe a solution can be legislated. This has occurred in New York and Massachusetts, and one of my favorite state legislators, Senator Nancy Barto, has tried to legislate it in Arizona.

Effective January 2014, Massachusetts law requires health providers to provide a maximum price for a procedure within 48 hours of a prospective patient asking. Well, it has not worked, according to a “secret shopper” survey of professionals conducted by the Pioneer Institute:

Dermatology practices were asked the price of a routine exam and removal of a wart.  Office staff were not well informed about the law and didn’t have systems in place to provide prospective patients with price information.

When price information was obtained, it often came in wide bands, such as from $85 to nearly $400.

Gastroenterology practices were asked for the price of a “routine screening” colonoscopy with no removal or biopsy of polyps.  This proved to be the most complex request because the procedure requires at least three fees: the gastroenterologist’s, the anesthesiologist’s and the hospital or clinic facility fee.

Many doctors, facilities and anesthesiology services required the consumer to provide a “current procedural terminology code” to get a price estimate despite its not being required under state law. When all three fees were included, the overall routine colonoscopy fee ranged from around $1,300 to $10,000.
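The colonoscopy example shows why quotes arrive in such wide bands: the total is the sum of three separately priced components, each with its own range. Below is a minimal sketch of that arithmetic, using hypothetical component ranges; the survey reported only the combined total, not this breakdown.

```python
# Hypothetical component fee ranges chosen for illustration; the Pioneer
# survey published only the combined total (~$1,300 to ~$10,000).
fee_ranges = {
    "gastroenterologist": (500, 3000),
    "anesthesiologist":   (300, 2500),
    "facility":           (500, 4500),
}

low = sum(lo for lo, _ in fee_ranges.values())
high = sum(hi for _, hi in fee_ranges.values())
print(f"Total quoted range: ${low:,} to ${high:,}")  # $1,300 to $10,000
```

Each component range is modest on its own, but summing three of them widens the spread a patient faces, which helps explain how a single "routine screening" can be quoted anywhere from $1,300 to $10,000.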

I have always suspected that laws which simply command that prices shall be transparent would fail, and it looks like I am being proved correct. They simply cannot be reasonably enforced. A better solution is what I call the common law approach.

A Call to Order in the Hobbesian Jungle

“Why do we have a government at all?”

Occasionally, I have the chance to ask students this question. After examining the unintended consequences of government policies and discussing the economics of politics (i.e. public choice economics in the tradition of James Buchanan and Gordon Tullock), the rosy picture of government from their high school civics classes has been, with any luck, irreparably damaged. We see how many times government policies actually make problems worse, not better. Given that’s the case, would it be better to not have a central government at all?

With very few exceptions, most of the time students look at me like I have two heads. What do you mean, do we need a government? Of course we need a government. They stare at me, questioning my sanity.

I push them to elaborate. “Ok. That’s an acceptable answer, but one-word answers have absolutely zero persuasive power. Can you tell me why?”

At this point, someone usually mentions the idea that government is the fundamental tool for preserving order in society. Without it, there would be absolute chaos—rape, pillage, and plunder lurking around every corner.

While many of the students I encounter are quite bright, they aren’t the first to make this argument—not by a long shot.

In his 1651 book, Leviathan, Thomas Hobbes (1588-1679) made a similar argument. To Hobbes, the default state of mankind is...less than kind. Instead of trading with your neighbor to increase your wealth, Hobbes argued, you’d be more inclined to club him over the head and steal his stuff. It follows that government exists to provide order where there would otherwise be chaos. Without government, life would be “nasty, brutish, and short.”

There is great debate on the fundamental state of human nature. Whether people are inherently “good” or “bad” is not really of primary concern in this discussion. In fact, we can assume that people are generally inclined to bludgeon their neighbor. Does this make government necessary? Stated differently, is there a solution to the “Hobbesian problem” without the “Hobbesian solution” of government?

There are a variety of reasons to think that self-governance or anarchy (properly defined as the absence of a centralized government) would work better than people tend to think. First, we can observe that for most of human history, there was no such thing as a central government. Even today, the world is still anarchic at the international level—there is no “world government” (unless you count the U.N. Seriously, let’s not kid ourselves). If government were in fact necessary, centralized governments should have arisen quickly in human history.

We can also see how a variety of groups have worked throughout the course of history to solve highly complex social problems outside of or absent a state. Take, for example, the creation of language, currency, and even the origins of English common law. These were all developed through a system of private interaction. A variety of people used both formal rules (e.g. contracts) and informal rules (e.g. customs, social norms, etc.) to create enduring institutions that continue to benefit mankind.

In fact, a growing body of literature illustrates that private forms of governance have been, and continue to be, an important way of organizing human behavior. In his book, The Invisible Hook, economist Peter Leeson demonstrates how some of the most brutal and untrustworthy members of society—pirates—were able to privately create and enforce rules that made individuals better off and decreased conflict. In their book, The Not So Wild, Wild West, Terry Anderson and Peter Hill explore how people on the American frontier created their own systems of governance, and they illustrate that the “Wild West” was actually much more tame than we tend to believe from watching John Wayne movies. Most recently, economist David Skarbek examined how yet another unsavory group—hardened convicts—are able to govern themselves both in and outside of prison via a system of gangs in The Social Order of the Underworld: How Prison Gangs Govern the American Penal System.

Certainly, a blog post cannot do justice to the theoretical and empirical arguments for self-governance. But as I tell my students, self-governance or anarchy may give us more than we think. Too often, when we encounter problems in our society and throughout the world, our default solution is to suggest more government, more regulation, and more oversight. But the answer may not be so simple. In fact, we’d do well to remember that, throughout our history, individuals acting in their own self-interest have developed some pretty amazing solutions to very serious problems!
