Wednesday, July 26, 2017


Stephen W Gordon The US has spent somewhere around $5 trillion on wars (still going up) to avenge/punish/protect after 4,000 Americans were murdered on 9/11. But somehow we don't have any money to help the many times that number who will die from lack of health care. Strange.
John Coffey I meant to respond a few days ago, because I think that your point is valid. War is a terrible waste of money.

When it comes to war we have sort of a paradox, because I don't think that we should do nothing, but we usually end up doing the wrong thing or too much.

There is also a paradox when it comes to healthcare. The national sentiment right now is that nobody should be without healthcare, which means that some people are going to need public charity. It doesn't necessarily mean that everyone should have their healthcare paid for by the government, but I have read that we are 64% of the way there already, so some might say let's just go to single payer. This is where I think the paradox comes in: when someone else is paying for your services, you have no incentive to care about the price or to avoid overusing the system. One thing that keeps prices in check is people's willingness to do without when something gets too expensive, but we think that nobody should do without. Ironically, the current supply of healthcare services is not enough to cover everybody anyway.

I am a firm believer that when government funds something, it automatically becomes more expensive. Both healthcare and higher education have risen in price much faster than inflation. It has everything to do with incentives: when people spend their own money, they are much more careful about how they spend it, or they do without if something is too expensive.

I would not necessarily be against single payer if there were incentives to control costs. People need to pay for a portion of their healthcare. One possibility is to have single payer cover catastrophic expenses while people pay for routine expenses themselves.

My preferred solution is medical savings accounts, which would be subsidized for those who can't afford them. Hypothetically, both you and your employer contribute 5% (or some other percentage) of your income in pre-tax dollars into an account that can only be used for medical expenses or to buy health insurance. You would have the option to contribute more, and the money could accumulate until retirement. If you had an excess amount in the account at retirement, then you could take some of it as taxable income. Under this system, current retirees would still fall under Medicare, but future retirees would be required to draw on their medical savings accounts first. Also, people should be allowed to invest the account money in something that will earn a decent return.
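As a rough illustration of how such an account could grow, here is a minimal sketch. All of the numbers are purely hypothetical: a $50,000 salary, the 5% + 5% contributions described above, a 5% annual investment return, and 40 working years.

```python
def msa_balance(salary, employee_rate, employer_rate, annual_return, years):
    """Accumulate yearly contributions in a medical savings account.

    Each year the existing balance grows by the investment return,
    then the combined employee + employer contribution is deposited.
    """
    balance = 0.0
    for _ in range(years):
        balance = balance * (1 + annual_return) + salary * (employee_rate + employer_rate)
    return balance

# 5% from the worker and 5% from the employer, invested at 5% for 40 years.
print(round(msa_balance(50_000, 0.05, 0.05, 0.05, 40)))
```

With those assumptions the account reaches roughly $600,000 by retirement. The point is only that matched pre-tax contributions compound into a meaningful balance, not that these particular figures are the right ones.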

Sunday, July 16, 2017

The moral hazard of climate change.

I write about this because it is one of the most important issues of our time.

I agree with the basic premise of the article that completely unregulated capitalism could lead to some bad consequences.  However, some things not covered by the article are:

1.  The government's major role in creating the financial crisis of 2008.

2.  The moral hazard of publicly funding science to the tune of tens of billions of dollars per year to tell us that there is a problem.

3.  The positive benefits of increased CO2 levels in the atmosphere.

4.  How nuclear fusion will allow us to start replacing fossil fuels (if necessary) by mid-century.

5.  How predictions of large climate sensitivity (how much the atmospheric temperature goes up each time the level of atmospheric CO2 doubles) are not supported by current temperature data, which show the climate sensitivity to be about 1.1 degrees Celsius.  Many climate scientists have noted this.  Climate alarmists are worried about a climate sensitivity of around 5 degrees Celsius.

6.  How there will be ways to remove CO2 from the atmosphere (if necessary).  The most cost-effective method is iron fertilization.
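The sensitivity figures in item 5 can be turned into projected warming using the standard simplified assumption that temperature responds logarithmically to CO2, i.e. warming = S * log2(C / C0). A minimal sketch follows; the formula is a common textbook approximation, not something from the article, and the sensitivities and CO2 levels are the ones quoted in this post.

```python
import math

def warming(sensitivity_per_doubling, c0_ppm, c_ppm):
    """Warming implied by logarithmic CO2 forcing: S * log2(C / C0)."""
    return sensitivity_per_doubling * math.log2(c_ppm / c0_ppm)

# Going from ~400 ppm today to the ~800 ppm expected around 2100 is one doubling,
# so the projected warming equals the sensitivity itself.
print(warming(1.1, 400, 800))  # data-derived sensitivity cited in this post: 1.1
print(warming(5.0, 400, 800))  # the "alarmist" sensitivity cited here: 5.0
```

Because 800 ppm is exactly one doubling of 400 ppm, the two sensitivity estimates translate directly into about 1.1 or 5 degrees of additional warming by 2100 under this simple model.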

Yesterday, I watched several videos about climate sensitivity in general.  The biggest concern is positive feedback, such as increased water vapor acting as a greenhouse gas.  However, we have seen no evidence of positive feedback over the last 137 years, during which CO2 levels have gone up about 75%.  I have no doubt that some positive feedback does exist, but if the feedbacks forced more warming than the CO2 itself, then we would have seen a runaway greenhouse by now.  For that to happen, the feedbacks would need a multiplication factor greater than 1.  Yesterday, I saw one lecturer claim that the feedback factor is only about 0.6, although we currently see no evidence of even this level.  If he is correct, then we could expect a climate sensitivity of about 2 degrees.  Even the climate alarmists say that this doesn't lead to disaster, although they do claim it would be inconvenient.  However, some skeptics note that the temperature in a typical day can vary by as much as 30 degrees Celsius, so a 2 degree change in the average is nothing to get our underwear in a knot about.
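The runaway argument above has a simple arithmetic core: if each degree of direct warming induces f degrees of further warming, the total is the geometric series base * (1 + f + f^2 + ...), which converges to base / (1 - f) only when f < 1. Here is a minimal sketch, using the lecturer's factor of 0.6 and an illustrative no-feedback warming of 1 degree (the baseline is an assumption, not a figure from the videos):

```python
def amplified_warming(base_warming, feedback_factor, rounds=1000):
    """Sum the feedback series base * (1 + f + f^2 + ...) for f < 1."""
    if feedback_factor >= 1:
        # A multiplication factor of 1 or more means the series diverges:
        # each round of feedback adds as much warming as the last, or more.
        raise ValueError("f >= 1: the series diverges (runaway greenhouse)")
    total = 0.0
    term = base_warming
    for _ in range(rounds):
        total += term
        term *= feedback_factor
    return total  # approaches base_warming / (1 - feedback_factor)

# With f = 0.6, the feedbacks amplify the direct warming by 1 / (1 - 0.6) = 2.5x.
print(amplified_warming(1.0, 0.6))
```

With f = 0.6 the feedbacks amplify the direct warming by a factor of 2.5; the resulting sensitivity then depends on what no-feedback value you start from, which is why estimates in this range vary.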

I have put a lot of faith in the current temperature data when estimating climate sensitivity.  One of the arguments against this is the claim that the oceans have been absorbing much of the recent warming, acting as a buffer against climate change.

CO2, which is necessary for all life on earth, is a trace gas that we measure in parts per million.  It is a weak greenhouse gas compared to other greenhouse gases.  It has also been in a long decline over the entire history of the earth because it gets sequestered by natural processes.  During the last period of glaciation it was dangerously low, almost to the point where terrestrial plant life would start dying.  We have been running out of CO2.  Only in recent history have humans reversed the decline.

We have technically been in an ice age for 3 million years.  All of human civilization arose during a brief 10,000 year period between glaciations.  The next period of glaciation is expected in another 10,000 years, although some have speculated that global warming may delay this.  Others have said that we can't prevent it.

I tried really hard to find data on what past temperatures were when the CO2 level was last about 800 parts per million, but this information is hard to find.  That is the level of CO2 we expect around the year 2100, which is double what we have now.  If historical temperatures at that level were much higher than today's, then I would be more concerned.

What past temperature and CO2 data do show is that there is not a clear correlation.  Sometimes they even move opposite each other.  I saw multiple people claim that orbital variation (of the earth) is by far the biggest driver of temperature.  If anything, CO2 increases usually follow temperature increases, instead of the other way around, because higher temperatures cause the oceans to release more CO2.  What is likely happening is that first the temperature goes up, which causes CO2 to increase, which then in turn adds to the temperature.

Plants benefit from increased levels of CO2.  Crop yields are up and are continuing to increase.  As a result of man's activity, there has been a greening of the earth.  However, there should be some limit to how far we want to go with this.  The earth has not seen a CO2 level of 1,200 parts per million in 60 million years.  I believe that technology at some point in the 21st century will allow us to stabilize the level.  We will move to nuclear fusion and sequester CO2 as necessary.


Monday, July 3, 2017

Fwd: Do the math

Here is some simple "old school" math. First, Obamacare is bankrupting every Obamacare exchange. Minuteman Health announced its failure last week. That means 19 of 23 Obamacare co-ops are bankrupt and out of business. Billions of taxpayer dollars up in smoke.

That's a failure rate of 83 percent. This has nothing to do with conservative vs liberal. These are just facts.

Here in Nevada, our last two insurance carriers just pulled out. Fourteen of 17 counties in Nevada will not have any insurance option for the Nevada Obamacare exchange in 2018. That's a failure rate of … you guessed it, 83 percent. Hey, at least Obamacare is consistent!

By the way, the Obamacare exchanges failed in both Vermont and Hawaii. Two tiny, liberal states couldn't make it work. Hundreds of millions of taxpayers' dollars lost.

In California, the Democrat Assembly leader is getting death threats because he tabled the idea of universal health care. Democrats are threatening to kill a Democrat because he realized there isn't enough money in the world to pay for free health care. The cost in California for universal health care? $400 billion. That's more than twice as much as the entire California budget.

Eroding common identity

Historically, a nation-state stipulated the primacy of a nation brought together by a common culture, which in turn went on to generate an overarching national identity strong enough to attenuate regional, ethnic or religious differences. In both their American and European systemic varieties, democratic institutions have preserved and protected the rights of the people, while the culturally grounded dominant national identity has given the nation-state its requisite resilience and imbued it with the power to make demands of its citizens. So long as this shared national identity remained strong—call it patriotism, love of country, or belonging beyond one's immediate family and local community—the nation-state retained its cohesion, resting on a sense of reciprocity between the government and the citizen.

Today, after decades of espousing multiculturalism and group rights buttressed by the politics of grievance, the foundations of a larger shared national identity have eroded such that governance—or better yet, governability—has become an increasingly scarce commodity across the West. We are at an inflection point, where a growing systemic disorder is stoked not just by shifts in the global power distribution, but by the progressive decline in governability. The dismantling of the core principle that the national homeland should be under the sovereign control of its people lies at the root of this problem.

The hypothesis that institutions ultimately trump culture has over the past quarter century morphed into an article of faith, alongside the fervently held belief that nationalism and democratic politics are at their core fundamentally incompatible. The decades-long assault on the very idea of national identity steeped in a shared culture and defined by a commitment to the preservation of the nation has left Western leadership frequently unable to articulate the fundamentals that bind us and that we thus must be prepared to defend. The deepening fight over the right of the central government to control the national border—which is at the core of the Western idea of the nation-state—is emblematic of this situation.