Friday, May 19, 2023

Debt Ceiling - Here we go again

The Debt Ceiling negotiations have been a bumpy ride so far.  That was to be expected.  As such, it is something the press should have been telling us to expect all along.  But, as is all too typical of them, the press coverage of the current negotiations has been terrible.

This is bad for all of us because public opinion has a lot to do with how things turn out.  And, as is unfortunately the norm, the Republicans are winning the messaging war.  This is only partly due to ineptitude on the part of Democrats in general and the Biden Administration in particular.  The rest is on the press, which should know better by now.

It's not like we haven't gone through all of this before.  I was blogging in 2011 when we last went down this road and I posted extensively about this subject.  Here's a good starting point:  Sigma 5: Debt Ceiling Negotiations.  Then as now bad press coverage contributed to a bad outcome.

Rather than pointing out from the get-go that the whole thing was a crass political ploy on the part of Republicans, the press engaged in a lot of both-sides-ism.  Toward the end they did come around to some extent and start assigning most of the blame to the GOP.  But by then a lot of damage had already been done.

In 2011 we were slowly coming out of the recession caused by the crash of '08.  High government spending would have sped up the recovery.  But the deal the Obama Administration was forced into by Republicans resulted in a contraction in Federal spending.  That slowed the rate of recovery unnecessarily.

But let's go back to the beginning.  Where did the Debt Ceiling come from?  I only recently learned the answer to that question.  It was put in place for the first time in 1917 in the run-up to U.S. entry into World War I.  Before that, we got along just fine without one.

At that time, however, the Federal Government was looking at the need to issue a large number of bonds.  They would be used to finance our part in the War effort.  A big increase in the Debt was a concern, and not just to fiscal conservatives.  In an effort to mollify the opposition, legislation was put through requiring Congress to approve future increases in the national debt.  The legislation was nicknamed the "Debt Ceiling", or equivalently the "Debt Limit".

It turned out that nobody cared.  No votes were changed by the institution of a Debt Ceiling.  Fortunately, the government had no trouble authorizing and selling the bonds.  And the U.S. participation in World War I ended up being relatively modest, so funding our participation never became a big political issue.  But the Debt Ceiling remained a part of U.S. law.

And it's still there.  Efforts to get rid of it have so far fallen short.  Voting to repeal it makes one an easy target for accusations of being a spendthrift or worse.  On the other hand, for a long time no one found a reason to screw things up by holding up the routine increasing of the Ceiling.  So, for decades it was quietly increased as a matter of routine housekeeping.  That all changed in 2011.

Republicans have long labeled Democrats as the "irresponsible tax and spend" party.  I have looked at history and done the math.  It is the Republicans who are the spendthrifts and the Democrats who are the fiscally responsible party.  But GOP messaging has been very effective, so people believe the opposite.

And in 2010 Republican operatives ginned up the "TEA Party".  TEA stood for "Taxed Enough Already".  A number of well-funded stealth operations created a completely phony "ground swell of support from the grass roots" for the TEA Party movement.  This was a good investment because it provided the political cover necessary for Republicans to be able to get away with opposing a routine increase in the Debt Ceiling when it came up in 2011.

Republican refusal to go along was supposedly based on the opposition arising "spontaneously" from the grass roots.  And by "grass roots" they meant the very TEA Party organizations they had carefully constructed.  The press covered the TEA Party as if it was a legitimate grass roots movement and not a construct supported and nurtured by rich and powerful interests closely aligned with GOP leadership.

The very first TEA Party organizers actually were ordinary people who were genuinely concerned about the size of the Federal Debt.  But the GOP establishment quickly recognized the opportunity these people represented.  They poured large amounts of money and professional organizational skill into turning a few scattered groups into a large organization with a national reach.

So, the TEA Party was the opposite of a spontaneous grass-roots movement.  But the press didn't get around to noticing this until several years later.  That's when the money spigot got turned off.  As soon as that happened the whole movement quickly fell apart.  

The press also ignored the history, dating all the way back to the Reagan Administration, of the GOP blowing up the deficit and then leaving it to Democrats to clean up the mess.  By 2011 that left Democrats and the Obama Administration in a weak position.  They felt they had an obligation to not blow the economy up, something the other side seemed perfectly willing to do.  The result was a deal that hurt the U.S. economy for a decade.

And now we're back at it in 2023.  The press coverage is still bad.  Republicans have been able to hold on to their completely unjustified reputation for fiscal probity.  And that has put Democrats and the Biden Administration back into a weak bargaining position.

And then there is this.  The negotiations are going exactly as I (and anyone who was paying attention in 2011) expected.  But that hasn't stopped the press, who never seem to learn, from engaging in their usual penchant for breathlessly covering any small twist or turn as if it actually meant anything.  It is too soon for anything meaningful to be happening.

By now they should understand how negotiations work.  But, if they do, this understanding is not affecting their coverage.  Negotiations are also a subject I know a thing or two about.  Here's a primer on the subject that I put together all the way back in 2010:  Sigma 5: Negotiation 101.  For those who don't feel like reading the whole thing, let me cut to the chase.  These things always go down to the last possible second.  Why?

Imagine two parties who are in opposition engaging in a negotiation.  More for one side means less for the other side.  Now let's say that the negotiators put together a deal well before the deadline.  What happens?  One or both sides become angry.  Whether it is true or not, people on one or both sides who were not directly involved in the negotiation come to believe that if their side had held out longer, it would have gotten a better deal.

Deals have to be sold to all parties, not just the ones present at the negotiations.  Both sets of negotiators have to be able to go back to their people and credibly be able to say, "we got the best deal possible".  Going into a negotiation both sides usually ask for the moon.  Inevitably they will have to make concessions in order to craft the final deal.

The idea is to make the fewest concessions possible.  If negotiations go to the last minute, or even past the "deadline", then it bolsters the case that each side got the best deal possible.  So, in contentious situations, negotiations ALWAYS go to the last minute.  The template for this is the 2011 Debt Ceiling negotiations.  They went down to the very last minute.

In this context, "deadlines" are extremely useful.  This often results in artificial deadlines being manufactured.  The question is not whether the deadline is manufactured or real.  The question is whether the deadline is credible.

We see this all the time in labor contracts.  The "deadline" is often the expiration of the previous contract.  But I have seen union members work without a contract for many years.  In these cases, the end of the old contract "deadline" was both artificial and not credible.  Things often change when workers go out on strike.  But if employers can keep operating while workers are striking then this too can turn into an artificial deadline.

Strikes often take time before they start to inflict real pain on management.  The current Writers' Strike in the movie/TV business is a good example.  A finished script may be produced well before the project (movie or TV show) is ready to be aired.  So, not much gets held up for those first few days or weeks.  But over time more gets held up and the cost of those delays starts piling up.  The last Writers' Strike took 100 days to settle.  This one is likely to last that long or longer.

And that brings us back to the Debt Ceiling negotiations.  We passed one artificial deadline a couple of months ago.  That's when we actually reached the Debt Ceiling.  It was not a credible deadline because we have been there before.  We reached the Debt Ceiling well before the impasse was broken in 2011.  But the Treasury Department employed "Extraordinary Measures" to keep things working, at least for a while.

It was not until the Treasury announced that they had reached the end of their Extraordinary Measures in 2011 that a deal was made.  In 2011 this "Extraordinary Measures" business was new because no one had played fast and loose with the Debt Ceiling before.  People had made symbolic gestures, but only because they knew the symbolic gestures would fail.

In 2011 the important players in Congress and the Administration knew that something like what was eventually labeled "Extraordinary Measures" was possible.  And after 2011, so did anyone who was paying attention.  This time around the press did take Extraordinary Measures into account.

There was little coverage when we hit the actual Debt Ceiling.  When they did start covering the issue in May, however, they went with their usual "why isn't a deal being made right now" approach.  It is as if they had never seen a contentious negotiation before.

Treasury Secretary Yellen announced weeks ago that her department would likely run out of Extraordinary Measures on or about June 1.  Since then, June 1 has become more and more firm.  The process is complicated so the Department will not know it has hit the limit until the last minute.  But it is going to be very close to the end of the business day on June 1.

And the whole argument about whether or not to raise the Debt Ceiling has always been ridiculous.  Congress passes legislation that creates revenue, mostly in the form of taxes.  All of this revenue is strictly controlled by laws which are subject to review by the courts.  All expenses are strictly controlled by "Appropriations" bills passed by Congress and also subject to review by the courts.  So, the size of the deficit is completely determined by the details of the legislation Congress passes.

If Congress is unhappy with the size of the deficit, then all it has to do is revise the tax code or modify the contents of various appropriations bills.  Do that and the problem will be solved.  This too the press could and should note but doesn't bother to.  So, the only real purpose the Debt Ceiling law fulfils is to provide one side with a lever it can use to try to squeeze concessions out of the other side.

But to what end?  There are other, more direct means of achieving the result they claim to be aiming for.  As I noted above, if the deficit is too large either raise taxes or cut spending.  But Republicans don't actually want to do either.  They want Democrats to do it for them.

That way they can blame Democrats for the tax increase or spending cut.  When they are in control, they cut taxes and increase spending.  Both are popular.  Both are also irresponsible when done together.  The combination blows the deficit up, something they pretend to care deeply about.  That is, when a Democrat is in the White House.  Just not enough to raise taxes, close loopholes, or reduce spending on popular programs.

And certainly not when a Republican is in the White House.  Every instance of Debt Ceiling hostage taking that I know of involves Republicans doing it to a Democratic President.  Democrats do not reciprocate.  As a recent example, the Debt Ceiling was raised three times during the Trump Administration without any problems.  The deficit and the National Debt also ballooned wildly under Republican President Trump.

And there are at least two ways to legally get around the Debt Ceiling law.  There was a lot of discussion about one of them in 2011.  If the U.S. Mint makes a coin, then the value of the coin counts as revenue.  So, every time the Mint makes a quarter the national debt goes down by twenty-five cents.  That's not much.

But it turns out that the Mint doesn't have to consult Congress when it decides which denominations it produces, nor how many of each it makes.  So, at the request of the Administration, the Mint could decide to make a Trillion Dollar coin, presumably in Platinum.  President Obama said he wouldn't do it.  But, if the Biden Administration were to mint thirty-two or more Trillion Dollar coins, the entire National Debt would be wiped out instantly.

The other option was a new one on me when I recently heard about it for the first time.  Section Four of the Fourteenth Amendment to the U.S. Constitution reads:

The validity of the public debt of the United States, authorized by law, including debts incurred for payment of pensions and bounties for services in suppressing insurrection or rebellion, shall not be questioned.  But neither the United States nor any State shall assume or pay any debt or obligation incurred in aid of insurrection or rebellion against the United States, or any claim for the loss or emancipation of any slave; but all such debts, obligations and claims shall be held illegal and void.

There is a lot of verbiage in this section dealing with issues arising out of the Civil War.  It validated the Civil War debt run up by the North and repudiated the similar debt run up by the South (and denied compensation to slave owners for the value of freed slaves).

If we slice away all the Civil War stuff we end up with, "The validity of the public debt of the United States, authorized by law, ... shall not be questioned".  The Debt Ceiling law has the effect of questioning the validity of the public debt.  That is not allowed by the plain language of Section Four of the Fourteenth Amendment.

Any law which contradicts the Constitution is unconstitutional, so the Debt Ceiling law is unconstitutional.  So, President Biden has the option of minting Trillion Dollar coins and/or of declaring that the Debt Ceiling law can be ignored because it is unconstitutional.  So far, he has said he will not exercise either option.

Joe Biden is an institutionalist.  In part, that is a result of him having served in the U.S. Senate for 36 years.  He respects the way things are supposed to be done and the role each of the various branches of government is supposed to play.  As such, he sees it as the job of Congress to deal responsibly with the Debt Ceiling.  He keeps hoping that they will do so.  But the current batch of House Republicans seem disinclined to do so.

But at the time I am writing this we still have a few days.  The various parties have time to come to a deal and implement it.  If I were President Biden, I would have left both options on the table.  Both can be used as bargaining chips in trying to get a decent deal.  He has chosen not to do this, at least not publicly.  Again, that is in his nature.  He wants to preserve things, not blow them up.

But he is free to exercise the 14th Amendment option at any time.  All he would need to do is to get the Office of Legal Counsel (a part of the Justice Department) to issue an opinion.  He can have them do that at any time.  Then he can wait until the stroke of midnight to publish it.  Attached to it would be an order to the Treasury Department to resume business-as-usual.  Crisis averted.

Until, of course, someone sues.  But it would take time for the Supreme Court to render a final decision on the case.  And if that decision went against the President (the current composition of the U.S. Supreme Court makes anything possible), the consequences would be truly catastrophic.

Even if the courts moved expeditiously, by the time final judgment came down the Federal Government would have blown a long way past the current Debt Ceiling.  At that point the only remedy that would be effective and could be quickly implemented would be to raise the Ceiling immediately.  And, of course, the Court could instead hold that the Debt Ceiling law is unconstitutional.  If it did, then the problem would be eliminated permanently.

Exercising the 14th Amendment option puts us on a path that is fraught with danger.  That's one reason it is not the scenario that most people expect to see.  And selecting the "minting coins" option is even less likely to happen.  Most people, including myself, expect Biden to cave to Republican demands.

But that assumes that Republicans have a coherent set of demands, and that caving to them will be enough to get the Debt Ceiling raised.  The problem is that it is unclear what those demands are.  (This is another area where the press has fallen down on the job.)

There may never be a single "deal" for the President to agree to.  Given that talks have been paused while Republicans figure out what they want to do next, it is possible that the Republican position has already fractured beyond repair.

Unfortunately, it is now critical that top level politicians and bureaucrats be great poker players.  They need to be able to hold their cards close to their vests.  They need to be able to pull off outrageous bluffs.  And they need to be willing to go all-in.  In a saner political environment, none of this would be necessary.

Given all this, I do not expect a resolution before June 1.  There is a real chance that things will end up stretching a day or two past June 1.  Some groundwork can be laid before then.  But if that is happening, it is happening behind the scenes.  So, I expect the drama to continue, but action to be lacking right up until the very last minute.  Unless, of course, the Republicans implode.  If that happens, I have no idea how all this will play out.  Buckle up.  From here on out the ride is going to get bumpier and bumpier.

Tuesday, March 21, 2023

Bonds, Banks, and the Fed

A bank nobody had heard of called Silicon Valley Bank (SVB) crashed recently.  Since then, all hell has been breaking loose.  If you look in the right nooks and crannies of the press the story is actually being covered reasonably well.  But the usual oversimplifications and lack of appropriate context and background dominate the coverage provided by the media that most people actually follow.  This gives me the opportunity to fill in some blanks and provide some of the appropriate context and background.

Starting at the beginning, stocks are supposed to be risky, and bonds are supposed to be safe.  The logic is simple.  Many organs of the press report where the Dow closes at the end of each day.  That number has been bouncing all over the place.  One day it's up.  The next day it's down.  The volatility of the Dow, and other indexes like the S&P 500 and the NASDAQ, reinforces the idea that it is easy to both make and lose money when investing in stocks.

Bonds, on the other hand, are supposed to be as safe as houses.  You know what you're getting from the get-go, and you almost never lose money on the deal.  Instead, you almost always get what you were promised when you signed up.  And, in fact, bonds rarely default (fail to pay investors what was promised on time and in full).  But it is still possible to lose money on bonds even if they don't default.  To understand why, it is necessary to dive into how bonds work.

Bonds are characterized by an amount (often called the "face value" or the "denomination"), a "term" or duration, and an interest rate.  So, for the purposes of explanation, let's talk about a $10,000 bond (the amount), whose duration is 10 years, and that has an interest rate of 1%.  These numbers have been picked to make the math simple, but bonds of this exact type have actually been issued in the recent past.

The way things are supposed to go is that an investor hands over $10,000 in exchange for the bond.  Some bonds pay interest quarterly.  Some pay semiannually.  Either way, our investor receives $100 (1% of $10,000) in the first year, another $100 in the second year, and so on.  At the end of the tenth year the $10,000 is returned.  In all, the investor puts in $10,000 and gets back $11,000.
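To make that concrete, here is a minimal sketch, in Python and purely for illustration, of the cash flows on that hypothetical $10,000, 10-year, 1% bond.  It assumes one interest payment per year to keep the arithmetic simple.

# Hypothetical bond from the example above: $10,000 face value, 10-year term, 1% interest rate.
# Real bonds often pay quarterly or semiannually; annual payments keep the math simple.
face_value = 10_000
rate = 0.01
years = 10

total_interest = 0
for year in range(1, years + 1):
    payment = face_value * rate              # $100 each year
    total_interest += payment
    print(f"Year {year:2d}: interest payment of ${payment:,.2f}")

print(f"Principal returned at maturity: ${face_value:,.2f}")
print(f"Total received over the life of the bond: ${face_value + total_interest:,.2f}")  # $11,000.00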

And, as I said, bonds rarely default.  Even "Junk" bonds, bonds issued by institutions who have problems of one kind or another, usually pay off.  So, what could possibly go wrong?  It turns out that if the investor holds the bond to maturity, 10 years in our example, usually nothing.  The interest gets paid on time and in full.  The investor gets his initial investment back, on time and in full.

But what happens if events similar to what has happened in the past several months happen?  It turns out that the "Fed", the Federal Reserve, the central bank of the United States, jacked up interest rates.  I am going to skip the details of how the Fed manipulates interest rates because it's complicated.  Just trust me, it can, and it does.

The Fed has twin responsibilities.  It is supposed to manipulate things so that inflation stays low, and the economy grows steadily at a relatively even rate.  The standard tool for doing this is to manipulate interest rates.

If the Fed moves interest rates lower, it is supposed to make it easier for businesses to grow and expand.  That lowers unemployment and increases economic growth, perhaps increasing inflation.  If it moves interest rates higher, then that is supposed to decrease business activity with the likely side effect that unemployment goes up, hopefully decreasing inflation.

There is a lot wrong with this simplistic scenario.  If you troll through my past blog posts, you will see me diving into all this in more detail and pointing out how poorly this works.  Nevertheless, this is the standard tool the Fed has traditionally used.  And, for various reasons, which I am again going to skip, the Fed has been goosing the economy by keeping interest rates near zero for most of the last decade.

But supply chain problems and other issues that coincided with the end of the critical phase of the COVID epidemic led to a consensus that the economy had gotten overheated.  This resulted in rampant inflation.  Unemployment was extremely low, so there would be little or no harm if it went up a bit.  This was the analysis pushed by the business community.  There was, however, little dissent from this view coming from other sectors.  Not surprisingly, the Fed adopted this view and started raising rates very rapidly.

Typically, the Fed's rate-setting committee meets about every six weeks to decide whether or not an interest rate adjustment is warranted.  Often it decides to do nothing.  But for several meetings in a row the Fed decided to raise interest rates by 75 basis points.  A basis point is 1/100th of a percent, so that amounts to a three quarters of a percent change.  That's a big change.  To increase the interest rate by that much meeting after meeting is unprecedented.

Large increases in the interest rate coming meeting after meeting were supposed to quickly cool the economy off.  A cool economy was supposed to decrease inflation to the Fed's target of 2%.  Increasing interest rates quickly and dramatically was supposed to slam the brakes on spending by both businesses and consumers.  That, in turn, was supposed to drive inflation down.

But there was little change in spending behavior even after several months of drastic action.  As a result, inflation, while declining, stayed high.  So, the Fed kept jacking rates up.  Recently, some signs of a slowdown finally started appearing, so the Fed only increased rates by 25 basis points the last time around.

But the consensus among Fed watchers was that the Fed was going to continue with its "higher, ever higher" strategy.  The only item that lacked consensus was how fast the Fed would raise rates.  Would it go back to 75 basis points per meeting, or would it stick with a slower pace of 25 basis points?  So, what's all this got to do with SVB?  Interesting question.

It has to do with how you can lose money on bonds that don't default.  Let's say that one year into the life of our 10-year bond we decide to sell it.  Well, if interest rates are still at about 1% then the sale price will be $10,000, more or less.  No harm.  No foul.  But what happens if instead in the interim the Fed has jacked rates up by a lot, and done it very quickly?  Which, by the way, is exactly what they have done recently.

Then why should someone buy our bond for $10,000 more or less?  It only pays a paltry 1% and they can buy a new 10-year bond that pays 4%.  Instead of getting $100 per year in interest payments they can get $400.  In those circumstances they wouldn't touch our bond with a ten-foot pole.

But what if we were to "discount" our bond, say by asking only $8,000 for it?  That way they would eventually pick up an extra $2,000 when the bond got redeemed for the full $10,000 amount.  $8,000 might not be the right price.  But there is a price that would attract a buyer.

The business of figuring out exactly what that price would be is complicated.  Fortunately, there is a "secondary" market in bonds.  You can just look there to find out how much a particular bond needs to be discounted in order to sell.  The secondary market provides a "current market price" for our bond.
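Here is a minimal sketch of how that pricing works, again in Python and purely for illustration.  It assumes annual coupon payments and uses the standard present-value idea: each remaining cash flow is discounted at the prevailing market interest rate.  (The real secondary market is more sophisticated, but the principle is the same.)

# Price our hypothetical bond after holding it for one year: $10,000 face value,
# 1% coupon, 9 years of payments remaining.
def bond_price(face_value, coupon_rate, years_left, market_rate):
    """Discount each remaining cash flow at the prevailing market interest rate."""
    coupon = face_value * coupon_rate
    price = sum(coupon / (1 + market_rate) ** year for year in range(1, years_left + 1))
    price += face_value / (1 + market_rate) ** years_left   # discounted principal repayment
    return price

print(round(bond_price(10_000, 0.01, 9, 0.01)))   # rates still at about 1%: roughly $10,000
print(round(bond_price(10_000, 0.01, 9, 0.04)))   # rates jumped to 4%: roughly $7,770

So a bond that never misses a payment can still be worth thousands of dollars less than what was paid for it, simply because interest rates went up in the meantime.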

SVB did not get into trouble by selling a bunch of bonds and taking a bath on them.  The situation was slightly more complicated.

As a result of the crash of 2008 a lot of institutions were required to periodically "mark to market" all of their assets.  This stopped them from carrying "Zombie" assets on their books.  These were assets that used to be worth a lot but were now worth far less.  Zombie assets could make everything look fine when, in fact, it was not.

I don't know whether SVB was required to mark their assets to market, but investors and large customers became aware that SVB had a bunch of problematic assets on their books.  They forced SVB to mark them to market and report a big loss.

At this point there still wasn't an unmanageable problem.  SVB could have spread the actual losses over a period of years by not selling the bonds right away.  So the problem could have been managed.  They also did what they were supposed to do in a situation like this.  They immediately set out to increase their capital.

But when it comes to finance, it often resembles a game of musical chairs.  In this case the loser of the game would end up having to eat the losses.  So, a bunch of investors on a group chat decided it wasn't going to be them.  It was going to be someone else.  They all decided to immediately pull their money out of SVB.  That way none of them would be stuck with the check.

Banking and finance are now just files on computers.  A "bond" used to be a piece of paper that someone kept in their safe.  Now it is information in a computer file.  So are account balances.  And now it only takes a matter of seconds to move money around.  The time it takes is not affected by whether the amount in question is $1 thousand, $1 million, or $1 billion.  The money moves equally fast in every instance.

A bunch of depositors pulling a few thousand apiece out of SVB wouldn't have made any difference.  But a bunch of investors pulling out millions and billions made the difference between solvency and insolvency.  SVB went from "just fine" to "dead man walking" in less than 48 hours.  And SVB had another problem.

SVB specialized in funding for Venture Capitalists (VCs) and for other high-risk sectors of the tech industry.  Tech has been hit hard in the last few months.  It turns out that the economy was under enough pressure that tech companies stopped seeing the growth rates that investors and VCs expected.  Stock prices of even solid, well-established tech companies have declined substantially in the last six months.

In far too many cases, the companies SVB was funding were not big and well-established.  They were start-ups.  As a result, they were likely hit even harder than the big, well-established firms.  So, it was likely that the non-bond part of SVB's portfolio was also in a lot of trouble.

There is currently no hard evidence of this.  But the fact that the FDIC, the part of the government responsible for cleaning up the mess that the failure of SVB left in its wake, has not been able to sell SVB off, either as a whole or in pieces, is a good indication of problems here too.

An over-reliance on a single market segment, or a number of closely related market segments, has been a red flag forever.  SVB should not have been allowed to do that.  But the public does not care about the minutiae of bank regulation, while bankers and Wall Street do.  So, regulators were pressured to look the other way.  And they were not allowed to put regulations in place to outlaw this sort of behavior.

In that sense, SVB was unique, or nearly so.  But as soon as SVB went under people started looking for other similarly situated banks.  Signature Bank immediately went to the top of that particular list.  It was over-invested in Cryptocurrency, another high-risk market segment.  And it didn't take long for it too to go down.

When things are going well, high-risk market segments can create high profits.  But when pressure is applied, they tend to go down farther and faster than more boring market segments do.  It should come as no surprise to learn that both SVB and Signature Bank were Wall Street darlings.

Thank goodness, an over-concentration on high-risk market segments is not as common in the banking business as it once was.  But the problem SVB had with its bond portfolio is much more common.  That's because of how banks actually work.

When you put money into a bank the bank doesn't sock it away in a vault.  Instead, it loans it out to other people.  The interest on their loan portfolio is how banks cover operating costs and the cost of whatever interest they pay on CDs, for instance.  It is also how they generate a profit for their shareholders.  This sounds risky, but if done properly it usually isn't.

Banks are aggregators.  As part of how they do business they mix your money in with everybody else's.  If on a given day more money comes in as deposits than goes out as withdrawals, loans, etc., then everything is fine.  There is more than enough money to go around.  However, if more goes out than comes in, this is a potential problem.  But if the net is small, and over time it gets balanced out by more money coming in on other days, then it is an easily managed problem.

As a safety measure, banks are required to set a certain percentage of deposits aside as a "reserve".  Is this money stacks of bills in a vault?  Again, no.  They are, however, required to invest this reserve money in "safe" securities.  The regulators determine what securities qualify as safe.  But they always include U.S. Treasury bonds in the list of securities that qualify.  So, banks own a bunch of them as part of their reserve requirement.  Well, guess what.  These are the very bonds whose market value got hammered when the Fed jacked rates up.

So, it's not just SVB that was subject to this problem.  And it's not just around SVB that musical chairs got played.  The collapse of SVB caused people in the know to look at banks near and far.  They soon zeroed in on a bank called First Republic Bank.  It's another of those banks that I had previously never heard of.  But the jackals started circling within a day or so of the collapse of SVB.

First Republic is still in trouble.  But apparently it wasn't in as bad a shape as SVB, because the Feds swooped in and started propping it up rather than shutting it down.  As of this writing First Republic is still in business.  If the pressure lightens up it will survive.  If the pressure intensifies, it likely won't.  The irony is that in a normal environment it would have sailed along nicely, and people like me would still have never heard of it.

And it's now not just U.S. banks.  A giant Swiss bank called Credit Suisse was soon in trouble.  They were old enough and big enough that their name was familiar to me from long before the present crisis.  But first a digression because the story of Swiss banks is an interesting one.  And it turns out to be relevant.

Up until the '30s they were the kinds of banks you would expect to find in a country the size of Switzerland.  But Swiss banks smartly leveraged Swiss neutrality laws to their advantage in the run-up to World War II.  They took in deposits from persecuted people, such as Jews.  They were also happy to deal with the persecutors.

As more and more countries cut ties with the Nazis, Germany started doing business through Swiss banks.  Swiss banks grew enormously during this period.  In the aftermath of World War II lots of money was stranded in Swiss banks because its owners, Nazis, Jews, and others, were no longer around to collect it.

Swiss banks quickly moved on to playing the same "collect money from both sides" game during the Cold War.  Only the players changed.  As opportunities there eventually diminished with the end of the cold war, they shifted to doing business with drug lords, authoritarian dictators, and the like.

But all good things eventually come to an end.  In the last ten or twenty years the Swiss have been forced to open up their banking system to outside scrutiny.  And other countries like Panama have also gotten into the business of hiding and laundering money, so the Swiss lost their near monopoly.

This diminished the advantage of Swiss bankers.  But by then they were hooked on the power and prestige of being major players.  They have since resorted to playing the Wall Street game, getting involved in risky ventures in order to juice their balance sheets.  When the game of musical chairs went international, a development that only took a few days, Credit Suisse was the fattest target.

The Swiss government brokered a deal where Union Bank of Switzerland (UBS), the other Swiss behemoth bank, took them over.  The Swiss government was so concerned that it didn't let a little thing like the law get in the way.  It simply changed the law that made such mergers illegal, overnight, to permit this particular merger to go through.  UBS has its own problems.  But the Swiss government decided that putting all the problems into one basket made them easier to manage.

Is the banking system now back in good order?  It's too soon to tell.  If we can go a couple of weeks without anything else popping up, then we are likely out of the woods.  But if another big bank starts making headlines for being in trouble, or if First Republic goes under, or if the Swiss merger goes south, then we are in for more trouble.

There is a lot more I could get into.  How we got into this mess.  What should be done to fix it.  What will be done to fix it.  But I am going to confine myself to one additional subject.

There is talk of raising the limit on how much of an account balance the FDIC insures.  The current limit is $250,000.  Almost all of the money on deposit at SVB was uninsured because it was in accounts that held far more than $250,000.  Most banks have a far lower percentage of their total deposits sitting outside of the insurance umbrella in this way.

Theoretically, the FDIC could have left the owners of those accounts hanging out to dry.  The law permitted the FDIC to say, "here's your $250,000 - sorry about the rest".  The whole reason there was a run on SVB was because a bunch of big depositors did not want to take a haircut.  And who can blame them?

With that in mind, it looks like it would make sense to increase the amount.  An argument against increasing the limit to infinity, as some are proposing, goes under the rubric of "moral hazard".

Let's say that all deposits are insured.  Then what's to stop someone from putting their money into a bank that they know has problems?  The incentivizing of bad behavior like this leads to a moral hazard.  Or so the argument goes.  But the "how high should the insurance limit be" ship sailed a long time ago.

At this point I would not characterize our current banking crisis as a big one.  Compared to the size of the economy (trillions) even a few billion is small beer.  But in the last few decades we have had two banking crises that do qualify as big banking crises.  There is the one that people still remember, the crash of 2008, and the one they don't.

The one that everyone now conveniently forgets about is the Savings and Loan Crisis of the late '80s and early '90s.  It turns out that there are lots of different types of "banks".  The easiest way to organize them is by looking at who regulates them.

What most people think of as a "bank" is actually a "commercial" bank.  They often have the phrase "National Bank" in their name.  These operate under a "charter" issued by the Federal Government.  They are regulated by Federal agencies and insured by the FDIC.  As a group they are the most heavily regulated and have the strictest operating requirements.

But then there are the state-chartered banks.  These used to be regulated and insured by the state equivalent of the Feds and the FDIC.  I don't know if that is still true.  SVB was a state-chartered bank, but the Feds, including the FDIC, have been all over them.

Then there are Savings and Loans (S&Ls).  Back in the day they couldn't make loans to businesses and couldn't offer checking accounts.  The biggest "bank" failure in U.S. history is Washington Mutual.  Technically, it was a "mutual savings bank", but it was regulated by the same people that regulated S&Ls.

The original idea was that S&Ls couldn't cause much trouble, so regulation was much lighter on them.  And at one time that was true.  But in one of the waves of deregulation that swept the U.S. they were deregulated to the point that they could offer checking accounts and make a much wider variety of loans.

What could possibly go wrong?  Lots.  S&Ls started doing all kinds of things.  Besides doing stupid things some were run by outright crooks.  The whole thing came crashing down during a ten-year period running from roughly 1986 to 1996.  S&Ls went under by the scores and the Federal Government ended up picking up the pieces.

Some laws were changed, but S&Ls were not forced to conform to the rules national banks had to operate under.  Standards and regulations remained much looser.  And throughout the S&L Crisis, in almost all cases depositors were made whole even if they had far more on deposit than the amount covered by insurance.  At this point it became de facto policy to cover all deposits no matter the amount.

The demise of Washington Mutual (or WaMu, as it was commonly called) was not part of the S&L Crisis.  It managed to make it through the S&L Crisis unscathed, because at that time it was well run.  Instead, it belonged to the events connected to the crash of 2008.  By that time good management had been replaced by bad.

And one contributing factor to WaMu's demise was the light regulatory environment it operated under.  In the case of WaMu, and all the other "banks" that went under in this event, depositors were made whole regardless of the size of their account balance.

And the crash of 2008 introduced a whole new group of players into the public consciousness, "investment" banks.  Again, the actual game was to find an excuse for diminished regulation.

A law called Glass-Steagall, passed in 1933, carved out investment banks as a group subject to a separate, more permissive, regulatory regime.  They couldn't do "retail" banking:  offering checking accounts, or making loans to individuals and businesses.  They were restricted to only doing business with Wall Street.

The argument was that the customers of investment banks were sophisticated people who were savvy about money and risk.  As such, a heavy regulatory hand was not required.  Unfortunately, Glass-Steagall was repealed in 1999 as part of yet another wave of deregulation.  As a result, most of the names mentioned in the headlines surrounding the crash of 2008 were investment banks.  Not surprisingly, they had done stupid things in pursuit of ever higher profits.

All depositors caught up in the S&L Crisis and in the crash of 2008 were made whole regardless of how much of their balance was or was not supposed to be covered by insurance.  In fact, I can't think of a time in the last half century when a depositor has taken a haircut as a result of a "bank" failing.

So, the limit on how much of a deposit is insured by the Federal Government is a fiction.  Changing it will make literally no difference.  Except, perhaps, to fool some people into believing that something is being done when nothing is being done.

And finally, to return to my original subject, bonds.  SVB was a publicly traded company.  As such it issued stock and was nominally owned by its shareholders.  They are in for a haircut.  But SVB also issued SVB bonds.  The holders of these bonds are likely to be reimbursed 100%.  It may take a few months, but they will probably get all of their money back.

It is common to see stockholders losing a lot and bondholders not losing anything when a company goes bust.  And that is part of the reasoning behind the idea that bonds are safe while stocks are not.

Wednesday, February 15, 2023

Gas Stoves

The right likes to make something out of nothing and nothing out of something.  (The left does it too, but to a far lesser extent.)  The latest example of this concerns gas stoves.  They, meaning the libs, are coming to wrench our gas stoves out of our cold dead hands, or something to that effect.  BTW, the Biden Administration immediately disavowed any interest in banning or further regulating gas stoves.  But that didn't stop, or even slow down, the outrage from the right.

The facts can be related briefly, so I'll do that.  Then I will take a deeper dive into the "controversy" to see if there is any "there" there.

A couple of weeks ago a research group announced their findings.  They reported that gas stoves are a significant source of air pollution in the home.  This air pollution is bad for you, and it is especially bad for children, they continued.  It leads to an increase in the prevalence of childhood asthma.

I know little about asthma and its causes.  Still, it seems reasonable that if gas stoves are a significant source of indoor air pollution, then that could easily lead to an increase in childhood asthma.  As to the main air pollution claim, I am going to dive a bit deeper into that next.  Then I am going to take a serious look at gas stoves, their history, and whether they are worth all the fuss.  Here goes.

Gas stoves work by burning something.  Usually, the "something" is Natural Gas.  Natural Gas is mostly Methane.  Methane consists of one Carbon atom and four Hydrogen atoms.  If you add four Oxygen atoms and rearrange appropriately, you get two molecules of water (each consisting of 2 Hydrogen atoms plus one Oxygen atom) and one molecule of Carbon Dioxide (2 Oxygen atoms and one Carbon atom).  That's the Cliff's Notes version.  The reality is a lot more complicated.

First of all, the four Oxygen atoms come from two Oxygen molecules.  The Oxygen found in air consists of molecules containing two Oxygen atoms.  So, the Methane molecule must be broken up into its constituent parts.  And the Oxygen molecules must be broken up into their constituent parts.  Then the constituent parts must be reassembled into the final result, two molecules of water and a molecule of Carbon Dioxide.
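For reference, the idealized overall reaction (ignoring, for the moment, the messier byproducts discussed below) can be written as:

\[ \mathrm{CH_4 + 2\,O_2 \rightarrow CO_2 + 2\,H_2O} \]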

This process is quick, but it is not instantaneous.  And what is actually going on is that multiple processes are taking place simultaneously.  Things are getting knocked apart.  Things are getting glued together.  It is literally a free for all.  And that means that all possible processes are going on at the same time.  What determines the final outcome is what is called the "rate of reaction" of each of the various competing processes.

Some processes have high rates.  Some processes have low rates.  The high-rate processes tend to predominate in the end.  And the rate depends on various factors.  An important one is temperature.  As temperature increases some rates speed up while others slow down.

It takes a deep understanding of Quantum Mechanics and related disciplines to predict how all this is going to shake out.  Fortunately, we can cut to the chase by running the experiment.  We can turn the stove on and see what happens.

And what happens is more complicated than "Methane plus Oxygen in, water plus Carbon Dioxide out."  You see, the rate of reaction of the various processes is never zero.  So, we will always get some unburned Methane.  We will also get some Carbon Monoxide (one Carbon plus one Oxygen).  And we will get some soot (pure Carbon).  This is not the end of the list, but it gives you the idea.

But wait, there's more.  Air is not pure Oxygen.  In fact, air is composed of just over 20% Oxygen, just under 80% Nitrogen, and a percent or so of other stuff.  I'm going to focus on the Nitrogen.  Like Oxygen, what's in air is a molecule consisting of two Nitrogen atoms.

But what's important to our discussion is that Nitrogen is capable of combining with both Hydrogen and Carbon to form various molecules.  These are still more processes and like the other processes, their rate of reaction is never zero.  So, as we burn Methane in a stove, we will get some of those too.

So, we have two methods of attack when it comes to determining what the result of operating a gas stove is.  We can perform difficult and complex Quantum Mechanics computations, or we can just fire up the stove and measure the results.  The report is the result of doing the latter.

The tests they performed measured a certain amount of air pollution.  They concluded that the amount of pollution caused by the routine operation of a gas stove was enough to cause problems.  The magnitude of those problems was in line with the problems caused by being subjected to secondhand smoke.  That seems reasonable to me, but it can't hurt to do further research.

Next, let's take a look at the history of gas stoves.  They seem like they are the sort of thing that has been around forever, but that is wrong.  Their first appearance in their present form is actually quite recent.  Fire goes back a long way.  Gas stoves don't.

The original fire was the campfire, or something similar.  A pile of wood was burned in a relatively open space.  Highly flammable material like tinder was initially set alight using a flint and steel, or a friction contraption.  Small dry sticks were added to make it bigger.  Larger pieces of wood were then added to make it still bigger.  Once it had reached the appropriate size, more wood could periodically be added to keep it going relatively indefinitely.

This open wood fire is very inefficient.  Most of the heat it generates is wasted.  To understand why, it is important to dive very shallowly into thermodynamics.  Heat over there is useless.  It needs to be transported over here to the place where it is needed.  The two methods of heat transfer that matter here are conduction and radiation.  Let's start with the former.

Heat is like electricity.  It moves easily through some materials - conductors, and poorly through other materials - insulators.  If two conductive materials are in contact with each other, then heat quickly moves from the warmer one to the cooler one.  Something warm like the flame of a fire can quickly transfer heat to something cooler, like a pan on a stove, thus warming the pan up.  The process requires direct contact.  But, if the two materials are both good heat conductors, then heat transfer happens quickly.

It took longer for scientists to understand radiation.  Here, contact is not involved.  But if you put your bare hands out toward a campfire, they soon feel warm.  The mystery of what was going on was only solved when infrared waves were discovered.

They are a form of light.  Their frequency is below the "visible" part of the spectrum, so our eyes cannot see them.  What's happening in my example is that the campfire is emitting infrared waves.  These waves travel across the gap between the fire and our hands.  When they strike our hands they transfer energy, the energy that warms our hands.

In assessing the efficiency of a system, it is important to focus on how much heat goes where we want.  It is also important to take both conduction and radiation into account.  Usually, one is dominant and the other plays little or no part in the process.

In the case of our open campfire, there is no conduction going on.  It is pretty much all radiation.  The fire is throwing infrared light out in all directions.  But most of this infrared light never hits anything we are interested in.  Instead, it is wasted.

This waste led to a great innovation, the longhouse.  A longhouse is a relatively large building that is mostly open on the inside.  A campfire is maintained in the center of the floor.  There is a small hole in the roof above the fire that lets the smoke eventually get out.

But by design, pretty much whatever direction they travel in, the walls of the longhouse are there to trap the infrared rays coming from the fire.  Much more of the energy of the fire ends up warming up something useful.

Of course, this arrangement tends to trap a lot of the smoke and soot from the fire inside the longhouse.  So, the air is often pretty nasty.  And this problem led to the next development, the stove.  Instead of a large building the fire is contained in a much smaller ceramic vessel.  Since the vessel surrounds the fire most of the heat ends up warming the vessel.

The vessel, in turn, warms up the room it is in.  (One or two stoves per room were required for the whole thing to work.)  But heating a building with stoves kept the building warm and smoke free at the same time.  Most of the heat the fire in the stove produced went into heating the room, so fuel costs were reasonable.

But ceramic stoves are expensive to build.  And they are slow.  Fire one up and it is likely the better part of a day before you get much warmth out of it.  In Benjamin Franklin's famous stove, ceramic was replaced with iron and the overall size was reduced.  He was able to retain most of the efficiency and all of the smoke reduction, so his design quickly went into widespread use.

All the stoves I described so far are optimized for heating.  But with a little tinkering a stove can be modified to make it a good device for cooking.  For instance, add a separate box next to the firebox.  This becomes what we now call an oven.  Make the top of the stove flat.  Pots and pans can now be placed there and used for food preparation.  Both iron and ceramic stoves were easily modified for use in cooking rather than heating.

And everything I have talked about so far used wood as its fuel.  But with the advent of the industrial revolution, it quickly became apparent that both ceramic and iron stoves could also be adapted to use coal, so they were.  Stoves have since been adapted to use compressed sawdust, wood pellets, and a number of other materials for fuel.  In all cases, the modifications required were modest.

There are also designs that are halfway between an open campfire and a stove.  They are used in wood fired pizza ovens, for instance.  In these halfway designs the fire is mostly but not entirely enclosed.  This design is more efficient than an open campfire but far less efficient than a fully enclosed stove.  And that's about where things stood until about 1850.

Spirit stoves and lamps had been around for millennia.  A solid or liquid that vaporizes at a low temperature was used as a fuel.  The device vaporized the fuel which was then burned.  The problem was that fuel for these devices was hard to come by.  So, although the designs for these devices were well known, they were rarely used.

That changed with the discovery of oil and the industry that grew up around it.  It turned out to be relatively easy to "refine" oil into Kerosene, and other similar liquids.  These liquids make excellent fuels for spirit stoves and lamps.

Once the oil industry got going, Kerosene and its ilk became available in large quantities.  And they were cheap.  And that meant that devices that used these fuels quickly became popular.  The revenue stream produced by selling fuel, mostly for lamps, is what powered the explosive growth of the oil industry in the second half of the nineteenth century.

Once these devices came into widespread use, for the first time in history it was practical to conduct business and pleasure after the sun had gone down.  It wasn't until the beginning of the twentieth century that demand for transportation fuels (gasoline and diesel) became great enough to overtake the market for fuels to power spirit lamps and stoves.

Another product of the oil refining business was Propane gas.  It is a more complex molecule than Methane.  It consists of three Carbon and eight Hydrogen atoms.  When it is burnt even more processes are involved, which means that even more byproducts beyond the usual water and Carbon Dioxide are produced.  But under ideal conditions the combustion products of Propane consist mostly of water and Carbon Dioxide.  Only small amounts of other stuff are produced.
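As with Methane, the idealized overall reaction for Propane (again setting aside the minor byproducts) can be written as:

\[ \mathrm{C_3H_8 + 5\,O_2 \rightarrow 3\,CO_2 + 4\,H_2O} \]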

And Propane is an ideal fuel for a spirit stove or lamp.  It eliminates the need to turn the fuel from a solid or a liquid into a gas.  Propane starts out as a gas.  The widespread availability of Propane is the event that led to the development of the modern gas stove.  A gas stove is simply an evolution of the spirit stove.

In the early days Propane was not widely available.  What helped increase its popularity was the observation that it liquifies if subjected to moderate pressure.  Liquid Propane takes up lots less space than the gaseous form.  That makes it easier to transport Propane in bulk.  Even so, economics did not justify its transportation over long distances.  So, availability improved but still remained spotty.

But once the idea had been introduced, people started looking for alternatives to Propane for the many areas where it was not available.  And it turned out that there was a process that could be applied to coal to produce a gas that could be used like Propane.  Coal can be found in a surprisingly large number of places.  As a result, "gasworks" plants that turned coal into a Propane-like gas were soon popping up all over the place.

Seattle is one of those places.  A gasworks plant operated in the heart of Seattle for many decades.  That allowed people in Seattle to use "gas" (not gasoline but literally a gas) for cooking, heating, and lighting.  The gas was piped into homes and buildings all over the downtown area.  The same thing took place in many cities and towns scattered across the country.  This substantially expanded the area where gas stoves were practical.

Another method used to expand the area where gas stoves could be used involved setting up a company that used trucks to fill a Propane tank that the customer owned.  Whatever Propane appliances the customer owned could then be fed from the tank.  This service only made sense where conditions were right.  Propane could only be transported relatively short distances economically.

One thing that held the Natural Gas market back for a long time was the stupidity of the oil industry.  Oil and Natural Gas tend to be found in the same places.  But rather than seeing Natural Gas as another source of profit, the industry treated it as an annoyance that needed to be gotten rid of as cheaply as possible.   So, they just "flared" it off.  They set up large, cheap torches and let it burn.  It took them a long time to figure out that Natural Gas was actually valuable.

The thing they missed was that Natural Gas is cheap and easy to transport.  And it does not require a complex and expensive "refinery" to convert it from raw material to salable product.  With Natural Gas, only a few simple steps are necessary to remove impurities.  The "refined" gas can then be sent long distances via a "gas" pipeline.  The pipes in the pipeline can be relatively small and still move large quantities of product.  So, gas pipelines are relatively cheap to construct and very cheap to operate.

Once the industry wised up, they built Natural Gas pipelines everywhere.  As the gas pipeline network was built out, more and more of the country had access to Natural Gas.  A Natural Gas pipeline eventually made it to Seattle.  That obsoleted the gasworks.  The site of Seattle's gasworks was eventually turned into "Gasworks Park".  The park has become very popular.  It has great views and is a prime spot for flying kites.

Over time, more and more of the country could cook with a gas stove, heat water with a gas water heater, heat a home or building with a gas furnace, and generally buy a lot of Natural Gas from the industry.  This was helped along by various marketing efforts that claimed that gas was superior to electricity for pretty much everything, but especially for cooking.

So, why should someone put in a gas stove?  According to the industry it was because it was the best tool for the job of cooking food well.  This conveniently hid a lot of extremely relevant history.  What did great cooks do before gas stoves were widely available?  They invented Haute Cuisine.  High-end cooking is pretty much a French invention.  That's why so much modern cooking terminology is in French.

People have been throwing feasts since time immemorial.  But they were infrequent special events.  And the emphasis was on quantity and variety.  A feast might go on for several days.  It would consist of course after course, each different.  That took the focus away from the quality of any specific course.  And feasts typically involved drinking, a lot of drinking.  This too detracted from a focus on quality.

French royalty in the eighteenth century, and particularly in the nineteenth century, slowly started changing the focus.  Seen one feast - seen them all.  So, they started focusing on presenting courses that were unique and special.  "Come to my event because the food will be amazing and memorable."  Of course, it soon became a competition.  Who could throw the feast with the most amazing food?  Winning this competition took both skill and money.

This is when the celebrity chef was invented.  Someone who could source rare ingredients and then use them to produce something unique and delicious became highly sought after.  And, of course, at some point, one of these chefs said, "I am tired of working for some ignorant noble who treats me badly.  I am going to go out on my own and create a restaurant.  That way I get the glory and respect instead of some fool who was lucky enough to choose his parents well."  And so, the destination restaurant was born.

And as more and more of them opened it became harder and harder to stand out from the crowd.  And what was the market they were catering to?  Snooty people who had a lot of money and wanted to show off.  So, the dishes got more and more elaborate.

And their preparation got more and more labor intensive.  That drove costs up, but that was the point.  That enabled customers to be able to say, "I went to a more expensive restaurant than you did.  Why?  Because I could afford it and you couldn't."

This trend peaked in about 1900, a time before gas stoves were in widespread use.  Cooking of the Haute Cuisine era, considered by many to be a peak never equaled since, was all done without gas stoves.  Haute Cuisine was then gradually displaced by what became known as Nouvelle Cuisine.

It was a move away from exotic ingredients and labor-intensive preparation techniques.  The idea was to focus on putting out a quality product that was based on top quality ingredients and simple preparation techniques that highlighted the flavors inherent in the ingredients.

Haute Cuisine often involved changing the characteristics of the ingredients by hiding them under sauces and the like.  Nouvelle Cuisine has since gone through a couple of generations of evolution.  But its goals continue to drive much of the thinking about how "good" food should be prepared and judged.

This history makes it obvious that there is nothing inherent in a gas stove that makes it a superior tool for food preparation.  So why do so many highly respected chefs claim to prefer them?  One reason is money.

Not surprisingly, the fossil fuel industry has been providing kickbacks to marquee chefs who tout the supposed superiority of gas stoves.  Cooking schools get subsidies if they teach their students to cook on gas stoves.  This sort of thing tends to create an echo chamber.

But there is more to it.  Consider the modern restaurant.  Starting with Nouvelle Cuisine the industry has been moving away from techniques that require elaborate preparation.  So, what does make a successful modern restaurant?

You expect a menu that features many options.  Most modern restaurants specialize in a specific type of cuisine, say Italian.  But at an Italian restaurant customers expect to be allowed to select from among a large number of different Italian dishes.

Regardless of the type of cuisine, how long are customers willing to wait between when they order and when the food arrives?  Customers used to have more patience.  One trick restaurants used to use was to serve the meal a course at a time.

The appetizer would come out.  Then a little later the soup would arrive.  And after more delay the entree would arrive.  And so it went through the entire meal.  This allowed the restaurant to stretch things out, leaving it more time to prepare the various dishes.

But even this kind of distraction only allowed the restaurant to stretch things out so far.  Even customers at high-end restaurants were only willing to wait so long.  Then there is the whole "fast-food" segment.  It is now, and has been for some time, far larger than the high-end segment of the business.  Much of its success comes from reducing wait times to at most a few minutes.

The fine dining segment of the market argues that it provides a far superior dining experience.  But still, it feels some pressure as a result of the existence of the fast-food segment.  High-end restaurants know that they have at most twenty to thirty minutes to get the entree in front of the customer.  And the high-end segment now has something new to worry about.  There is now a "fast casual" segment.

This market segment pitches the idea that it delivers a far superior product to what the fast-food segment offers.  It may not be as good as what a high-end restaurant is capable of, but it is darn good.  And fast casual restaurants get the food in front of the customer far quicker than their high-end brethren.  Fast casual has seen considerable success in the past decade or so.  And this has put even more pressure on high-end restaurants to speed things up.

The TV show "Chopped" showcases what goes on in the kitchen of a modern high-end restaurant.  Contestants are tasked with cooking a complete course in 15-30 minutes, depending on the segment.  For each course they are presented with four "mandatory" ingredients.  The mandatory ingredients are chosen to clash with each other.  That has to result in a lot of bad food.  But it also results in good TV.  The show is very popular.

To succeed a contestant has to have a bag of tricks and hacks for getting things done in a hurry.  And a substantial component of a contestant's score depends on making the dish look pretty.  Everything but the clashing ingredients accurately mirrors the operation of the kitchen of a modern restaurant.  Chefs must be able to quickly prepare a wide variety of dishes.

Done well, these dishes may require a wide assortment of techniques.  But the kitchen is too small and too short staffed to do all those different things well.  And they have no time.  Each chef is required to be working on several dishes at once.  He can't put much attention and care into any one dish.  Thus, tricks and hacks, particularly time-saving hacks, are critical.

Is this the best way to prepare great food?  No!  But if the restaurant can't get the food on the table fast enough, it won't matter how great it is.  People will go to the "good enough" restaurant that features faster service.

The modern restaurant kitchen is where the gas stove shines.  Different dishes require different cooking temperatures.  The temperature of each burner of a gas stove can be changed independently of the other burners.  And it can be changed instantly.

It can be instantly cranked up to provide a lot of heat under this dish.  Then it can be cranked down instantly to more slowly and gently warm that dish.  That's a lifesaver in a modern restaurant kitchen.  But does it produce the best end result?  Let's see.

But first let's take a look at the history of gas stove's modern competition, the electric stove.  Each dates back to the second half of the nineteenth century.  The spirit stove evolved into the gas stove pretty quickly.  It took longer for the design of the electric stove to mature.

The science behind the electric stove is extremely simple.  I mentioned conductors and insulators above.  It is not a binary situation.  Materials exist at every point on the scale between fully conductive and fully insulating.

Consider a material that is down toward the fully conducting end of the scale, but not quite at the end.  It conducts pretty well but puts up some resistance.  To get a little more technical, a certain amount of power goes in one end.  A lesser amount comes out the other end.

The amount of resistance measures how much the power gets diminished.  So, what happens to the power that has disappeared?  The "conservation of energy" law tells us that it must go somewhere.  The answer is that it gets converted into heat.  And, in fact, the conversion ratio is 100%.  All of the power that has disappeared gets converted into heat.  That is great news if what you are trying to do is produce heat.  And that's exactly what a stove does.

So, to make an electric stove all you need to do is push electric current through a wire that has some resistance.  Easy-peasy.  Every bit of the power the wire soaks up comes back out as heat.  For a wire connected across a fixed household voltage, the amount of resistance determines how much power it draws and therefore how hot it gets.  Selecting wire with the right amount of resistance allows the top temperature the wire reaches to be dialed in.  You don't want the wire to get so hot that it melts.
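As a rough back-of-the-envelope illustration (the voltage and resistance numbers here are hypothetical values chosen for the example, not taken from any particular stove), the arithmetic looks like this:

# Heat output of a resistive heating element connected across a fixed voltage.
# Illustrative numbers only; real elements vary.
volts = 240.0            # a typical North American stove circuit
ohms = 26.0              # hypothetical element resistance
amps = volts / ohms      # Ohm's law: I = V / R
watts = volts * amps     # P = V * I = V^2 / R, all of it released as heat
print(round(amps, 1), "amps,", round(watts), "watts of heat")

A higher resistance element on the same circuit would draw less current and run cooler; a lower resistance one would draw more and run hotter, right up to the point where it melts.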

And that is literally the design used for early electric stoves.  A wire with the appropriate amount of resistance, usually coiled so that it would put out more heat per foot without melting, was placed in a ceramic tray.  Ceramics can withstand high temperatures and are electrical insulators.  So, they supported the wire and kept it from touching things it wasn't supposed to.  An electric stove is not supposed to electrocute its operator.

This worked pretty well.  But it was still too easy for the wire to break or a person to get shocked.  So, a sealed heating element, sold under the name "Calrod", was developed.  The hot wire was surrounded by a material that conducted heat but was an electrical insulator, all wrapped in a metal sheath.  You could still burn yourself, but the shock hazard was eliminated.  Inexpensive stoves, like the one I own, still use Calrod elements.

But some thought that Calrod was not stylish enough.  That led to the third generation of electric stove design.  In these stoves the heating element is cleverly embedded beneath a flat surface so that some parts function as burners while the rest of the surface stays cool.  This led to a more stylish electric stove but one that was less functional.

The original "wire in a ceramic tray" design depended completely on convection.  Putting a metal pan in contact with a wire carrying electric current is only good for electrocuting people.  So, by design the hot wire threw off a lot of infrared radiation.  The radiation striking the pan caused it to heat up.

Calrod enabled conduction to be the primary method of heat transfer as it eliminated the electrocution problem.  For this to work the pan needed to be in contact with the Calrod.  Neither the pan nor the Calrod was completely flat.  But both were close enough to flat to make everything work.

Modern flat-top stoves also rely on conduction.  But by design the surface is super-flat.  This means that if the pan is not extremely flat, then there is insufficient contact between the pan and the hot part of the flat-top.

So, people who get these modern flat-top stoves have to buy special pots and pans that have an extremely flat bottom.  Even with the right pots and pans flat-top stoves don't work as well as an old fashioned Calrod stove.  But they look nicer, and that is enough for a lot of people.

This discussion of the various pros and cons of electric stoves provides exactly the right context to discuss the pros and cons of gas stoves.  Gas stoves rely heavily on conduction.  The gas burner consists of a large number of tiny flames.  Why?  To put a large part of the bottom of the pan in contact with an open flame.  The hot gasses in the flame use conduction to transfer heat to the pan.

Seems like it should work well, right?  Actually, it doesn't work nearly as well as people think.  Those little flames guarantee that there will be hotter spots where the flame is and cooler spots where it isn't.  But wait!  It gets worse.  The gas flame is blue, right?  It is.  But notice that it is not all one color of blue.  Part of the flame is a lighter blue and part is a darker blue.

The color tells you how much energy is involved.  Different colors indicate different temperatures.  So, even the parts of the pan that are in direct contact with flame are in contact with flames of different temperatures.  But wait!  It gets worse.

As a kid I learned the hard way that you can have an invisible flame.  It may be invisible, but it can still burn you.  That's because the invisible part of the flame is putting out infrared light, light your eyes can't see.

A large part of the gas flame is invisible.  It is putting out a still different amount of energy because it's at a still different temperature.  All this means is that gas stoves do not provide even heat.  And that's why chefs like copper pans.

Copper is an excellent conductor of both electricity and heat.  These pans contain a layer of copper across their bottoms.  The copper quickly distributes heat from the hotter parts to the cooler ones.  This, in effect, evens out the uneven heat the gas stove provides.

Electric stoves do a much better job of providing an even heat.  Even heat means that all the food in the pan cooks at the same rate, a necessity if you want the dish to turn out well.

And there are lots of foods that require convection rather than conduction to cook properly.  Most meats fall into this category.  A bar-b-que or smoker is generally considered to be the best way to cook meat.  Both are convection devices.  One cut of meat that can be cooked quickly is a steak.  And the best way to cook a steak is to grill it.  Restaurants go to great lengths to prepare a steak so that it appears to have been grilled on a bar-b-que.

"Grill marks" are part of the bar-b-que experience.  Ignoring them for the moment, the meat is being cooked by a heat source that is not in direct contact with the steak.  Instead, it is several inches below the meat.  It is possible to do that with a gas stove.  But what about those critical grill marks?

In a bar-b-que the heat is provided by a fire that is a few inches below the meat.  The resulting convection does the bulk of the cooking.  But a bar-b-que has a "grill" consisting of narrow metal bars that hold up the meat.  They get heated up for the same reason the meat does.  But they are in direct contact with the meat, and they are hot.  So, they burn small stripes in the meat.  These stripes are called grill marks.

Chefs aspire to use a gas stove to replicate the total bar-b-que experience including the grill marks.  A  standard gas stove needs help, so they use a trick.  They place a large slab of iron on top of the stove.  It typically covers two burners.  The slab has ridges and troughs in it.

The gas stove heats the iron slab.  The iron slab gets hot enough to radiate a lot of infrared light.  That replicates the bar-b-que experience when it comes to cooking most of the meat.  The ridges reproduce the grill marks.

But it's a cheat.  It produces a good steak but not a great steak.  To understand what's still missing, let's consider the smoker.  This is the best device for cooking cuts of meat that need to be cooked slowly.  Smoking meat takes between hours and weeks, depending on the effect desired.  And a wood fire is critical to proper smoking.  No other fuel will do.

Why wood?  It's not just the slow cooking that is important.  It's the smoke.  Lots of kinds of wood are aromatic when they burn.  These aromatic components get absorbed into the meat and contribute to the final flavor.

Most smoking processes depend on using a fire composed of the right kind of wood.  Choosing the best kind of wood to use in smoking a specific cut of a certain kind of meat has been raised to an art form.  A wide variety of woods can be used to good effect, and even an imperfect match is still better than adding no aromatic component at all.  So, not only does the fire need to be a wood fire, but it ideally needs to be a wood fire composed of the right kind of wood.

Bar-b-que often takes a page from the wood smoker playbook.  A fire that uses the right kind of wood adds just the right subtle additional flavor to the finished product.  As a substitute, wood chips of the right kind can be mixed in with the regular fuel.  This does not work quite as well, but it can transfer some of the aromatic flavor to the meat.  And some is better than none.

But there is really no place to put some burning aromatic wood chips in the gas stove - iron slab setup.  Other processes for injecting this aromatic component are possible.  But none of them do as good a job as doing it right in the first place.  And this is what separates a good steak cooked on a gas stove from a great steak cooked over a wood fired bar-b-que.  And it is impossible to properly "low and slow" smoke meat on a gas stove.

Another technique for cooking meat that achieves superior results is the rotisserie.  A thin skewer is run through the meat.  The skewer is used to hold the meat well above a fire.  The skewer-meat combination is rotated continuously so that the meat is cooked evenly on all sides.  This process is another one that depends on convection.

Theoretically, a gas stove could be used as the heat source for a rotisserie.  A contraption could be set up to hold the skewer of meat well above the burners and to slowly rotate it.  But the process would be wildly inefficient.  Most of the heat produced by the stove would go to where the meat isn't.  

Electric rotisseries, on the other hand, work just fine.  That's because everything can be enclosed.  Like the fire in the longhouse, the heat put off by the electric "burner" can be redirected into the meat by the enclosure.  And, since rotisserie cooking is another "low and slow" technique, not much heat is needed.

Rotisseries slowly cooking whole chickens used to be a standard feature of supermarkets.  They were "powered" by electric light bulbs.  Most of the light put out by an old style "incandescent" light bulb was infrared light.  So, they made an excellent heat source for this situation.  Both supermarket rotisserie chicken and incandescent light bulbs have mostly become a thing of the past.

Electric ovens work better than gas ovens for the same reason.  They depend heavily on radiant heat, and heated electric wires naturally give off large amounts of infrared light.  Gas ovens require tricks and workarounds to achieve a similar result.

And the "convection oven" is mostly a marketing gimmick.  A fan is added to push the air around, and to make it seem like it is something special.  But a standard, unmodified electric oven makes a fine convection oven.

People have relatively short memories.  It doesn't take them long to think that the way things are now is the way they have always been.  They go to a modern restaurant and eat food that tastes good to them.  They think, "the chef here uses a gas stove.  Therefore, gas stoves must be the best way to cook food."  This is reinforced by the many cooking shows on TV and cable.  Without fail they use gas stoves.

Modern restaurants, of necessity, serve food that can be prepared quickly.  Gas stoves facilitate their ability to move from order to food-on-the-table quickly.  But part of what's going on is that restaurants don't put dishes on the menu if they can't be prepared quickly using a gas stove.

Or, worse yet, people forget that a dish could have been prepared better if the chef had enough time to do so.  I have eaten a lot of baked potatoes in restaurants.  Some of them have been very good restaurants.  I have yet to be served a decent baked potato by any of them.  It's not the chef's fault.  It is just that it is literally impossible to prepare a good baked potato in a restaurant environment.

My mother was, at best, an adequate cook.  But she could cook a baked potato that was far superior to the best one I have ever eaten in a restaurant.  The reason was simple:  time.  My mother knew how many people she would be feeding, and she knew when dinner would be served.  And we ate what she put in front of us, so she only had to prepare a few dishes.

She knew all these things more than a day in advance so she could plan and prepare accordingly.  In spite of the fact that she wasn't in a class with professional chefs she was able to put food in front of us that was often superior to the best restaurant food.  She could out cook professionals because she could do things that they couldn't.

Back to the baked potato, because it provides a good example of what I am talking about.  I shudder when I see foil wrapping a baked potato.  Foil is never used in its proper preparation.  Instead, the potatoes are washed, poked with a fork a few times, and placed naked on the bottom shelf of an oven that has been preheated to a low temperature.  That is the extent of the preparation that takes place before they go into the oven.

There they bake for an hour or so.  Towards the end they can be "forked" to determine how long it will be before they are done.  They go straight from oven to plate.  Their skin should be hard and crusty.  Their insides should be warm and fluffy.  Add a little butter, and maybe a bit of salt and pepper, and you have a baked potato that is superior to anything prepared under standard restaurant conditions.

Notice that no fancy techniques or special equipment is required.  And the only "skill" my mother needed to master was that of being able to determine how close to "done" the potato was simply by sticking a fork in it.  That was a skill my mother was easily able to master.  Any chef would be able to too.  Contrast that to the much more difficult skills a restaurant chef must master in order to turn out a far inferior baked potato.

Most baked potatoes that I encounter in a restaurant have skin that is thin and damp and not anywhere close to being completely cooked.  The insides are also underdone.  They are not fluffy.  The potato has obviously been cooked.  But it is still closer to hard than soft.

In theory a restaurant could properly prepare baked potatoes.  But they don't know in advance how many they will need nor when they will need them.  So, they would have to throw out 80-90% of them in order to have enough properly cooked baked potatoes on hand at all times.  That is a cost a restaurant can not afford to absorb.  So, they do what they must.

And it's not just baked potatoes.  Meat from old animals used to constitute a large proportion of the meat we consumed.  But meat from old animals starts out tough.  There are ways to render it tender but they are time consuming.  Meat from old animals is often more flavorful than meat from young animals.  So, we are missing out on that too.  But only a few old people now remember what properly prepared meat from old animals tastes like.

Restaurants don't serve tough meat to customers because customers don't like it.  So, dishes that depend on properly prepared meat from old animals are off the menu.  Instead, we get young and tender but bland meat.  And restaurants and cooking shows tell us what "good" food is, so we don't even get these things at home, where it is still practical to employ the necessary techniques.

Gas stoves are better at a few things:  corn, peas, and other small vegetables that can be cooked quickly.  But they are a poor option when it comes to the proper preparation of lots of dishes.  The economic environment modern restaurants operate under forces them to err on the side of speed.  And, if it's speed you need, then a gas stove is your best bet.  But if it's the widest variety of great food you want, then look elsewhere.

Friday, February 3, 2023

Nukeelor Power

People used to mispronounce the word "nuclear" all the time.  It's an easy word to pronounce correctly because it is pronounced exactly the way its spelling indicates that it should be.  But a lot of people used to muck it up.  For reasons that I never understood the "cle" part would throw them.  They acted as if it was actually spelled "cel".  Many of those people were public figures who should have known better.  And many of them continued to mispronounce the word for years.  Where were their aides and assistants?

If you are in favor of nuclear power, as I am, things have definitely improved.  At a minimum, the rate at which the word "nuclear" is mispronounced has diminished considerably.  But pronouncing the word incorrectly is of minor importance in the grand scheme of things.  The good news is that there have been improvements in far more important areas too.  But the press has continued to focus what coverage they provide on the less important areas while almost completely ignoring the more important areas.

I dug into this subject in 2020.  I put up two good posts, "Sigma 5: A Brief History of Nuclear Power", and "Sigma 5: Nuclear Waste", in that year.  I recommend both of them.  This post will build on the foundation they lay.  As I noted in those posts, there are two kinds of nuclear processes that can be used to produce power, fusion and fission.  Power generation using nuclear fission has been a commercial reality since the '50s.  It continues in use to this day.  Fusion has been "the future of nuclear power" for almost as long.

In practice, each depends on a single fuel.  With fission it's Uranium.  With fusion it's Hydrogen.  Fission based power is an outgrowth of research done to create the Atomic Bomb.  One main path to fusion-based power generation is based on research done to create the Hydrogen bomb.  The other main path uses a more esoteric approach that is less closely tied to bomb research.

Let's start with the latter.  It takes extreme conditions to make Hydrogen nuclei fuse together to form Helium.  Those extreme conditions exist in the center of all stars, including our Sun.  Most stars are like our Sun in that the fuel that powers them is Hydrogen.  Stars eventually exhaust their supplies of Hydrogen.  Our Sun will do so in about 5 billion years.  If the star is large enough, and our Sun is, when that happens the star just moves on to using other elements to power the fusion process.

The Sun is gigantic both in terms of its size and in terms of its mass.  All that mass is crushed toward the center by gravity.  As a result, the center of the Sun becomes a location subjected to extreme heat and pressure.  The conditions are extreme enough to cause Hydrogen to fuse into Helium at a substantial rate.  That process releases tremendous amounts of energy which, among other things, pushes back against gravity keeping things in balance.

The trick has always been to reproduce those extreme conditions on earth at a much smaller scale and without the need for a star.  For a long time, scientists thought there were three "states" of matter:  solid, liquid, and gas.  Early in the twentieth century a fourth state was discovered, plasma.  At first plasma just appears to be gas.  But it doesn't behave like a normal gas.  That's because the particles of a plasma are electrically charged.

Half of them have a positive charge.  Half of them have a negative charge.  All the positively charged particles repel each other.  All the negatively charged particles repel each other.  That should cause the plasma to immediately fly apart.  It would if it were a normal gas.

But all the positively charged particles are also simultaneously attracted to all the negatively charged particles and vice versa.  That should cause the plasma to smash together, perhaps forming a solid.  But under the right conditions the two effects exactly offset each other and achieve a balance.  When that happens, a plasma is created.

Creating a plasma takes extremely high temperatures.  And various other things must be just right.  But if the right conditions can be created and maintained, then a stable plasma becomes possible.  Needless to say, a stable plasma is an environment of extremes.  And in this extreme environment high energy collisions are a distinct possibility.  And high energy collisions are just what we need to cause fusion.

It didn't take long for scientists to see plasmas as a possible path to a controlled fusion reaction that could be used to create power.  One thing that helped is the fact that electricity and magnetism are inextricably intertwined.  A moving electric charge creates a magnetic field.  A magnetic field can be used to steer the path of an electrically charged particle.
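For completeness, the underlying relationship is the Lorentz force law.  A particle with charge q moving with velocity v through an electric field E and a magnetic field B feels a force

F = q (E + v x B)

The "x" is a vector cross product, which is why a magnetic field pushes a moving charge sideways rather than speeding it up or slowing it down.  That sideways push is what lets carefully shaped magnetic fields hold a plasma in place.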

So, the game became finding just the right set of magnetic and electric fields to get a plasma to do what we wanted it to do:  create conditions that caused Hydrogen to fuse into Helium at a rate fast enough to be useful, but slow enough that it didn't just blow everything up.

A lot of designs were tried.  They all failed.  The one that came the closest was a Russian design called a Tokamak.  To the untutored eye the part that contains the plasma looks like a donut.  All kinds of powerful magnets are wrapped around the outside.  The idea is for the plasma to occupy the central area.  This is surrounded by a vacuum.  Particles can then zoom around in a rough circle while never touching the walls.

The positive plasma particles consist of the nuclei of various isotopes of Hydrogen.  The negative plasma particles consist of the electrons that have been stripped from the Hydrogen nuclei.    All of these particles are moving at extremely high speed.  It is hoped that a few of the Hydrogen nuclei will smash into each other and fuse to create Helium nuclei.  The problem of how to collect all of the energy generated by this process and turn it into electric power is being left for a future generation of scientists and engineers to solve.

Over the past few decades, a bunch of Tokamaks have been built.  None of them have worked.  The plasma can only be maintained in a stable configuration at low density for short periods of time.  The amount of fusion, and thus the amount of energy produced, is tiny.

But scientists have seen progress in moving to higher densities and in maintaining the plasma for longer periods of time.  Both kinds of progress contribute to more fusion activity and, therefore, more energy production.  That has led them to believe that they are making steady progress toward a design that works.  One thing that seems to help is size.  They hope that a big enough Tokamak can be made to work.  The end result of this is ITER, the largest Tokamak built so far.

The ITER project is being run by the Europeans.  (The U.S. has, so far, made only modest contributions.)  The project has consumed billions of dollars and many years so far.  It will consume billions more before it is completed several years from now.

If, that is, it is ever completed.  (Another delay of two or more years was recently announced.)  And, if it works as well as its backers hope it will, it will not be a practical device.  It will only be a "proof of concept", a device that paves the way for one that actually works.

If going from the ITER to an actual working device sounds like a long shot, it's because it is.  A lot of things have to go well.  And, if they do, it will be at least 20 years, and likely considerably longer, before a Tokamak will be used to fuse Hydrogen into Helium in a generating facility that is feeding commercial quantities of electric power to the grid.  Let's move on to the next longshot.

I am older than the laser.  I remember when the first working one was built.  Back then, its possible uses seemed limitless.  A few years later when I was in college (roughly 1970) I remember bumping into a guy who was talking about using lasers to zap Hydrogen hard enough to cause it to fuse.

Back then such a trick seemed like it would be relatively easy to pull off.  A laser would be focused onto a tiny spot.  If the laser was powerful enough, and if the spot was small enough, both of which sounded possible, then it should be able to feed enough energy into the Hydrogen to initiate fusion.  And fusing a tiny amount of Hydrogen into Helium would be all that was needed to produce a tremendous amount of energy.

As with creating and maintaining a suitable plasma, the problem turned out to be way harder than anyone expected.  The early experiments were a bust.  But technology kept getting better.  More powerful lasers.  Advances in focusing.  For a while it looked like the goal was within reach.  But it gradually became apparent that it was not.  At least not without access to a giant test facility costing billions of dollars.  And the funding for such a facility was just not there.

Until it was.  To its credit, the ITER was built from the ground up for the express purpose of using a plasma to make Hydrogen fuse into Helium.  The giant laser test facility that eventually got built was built to address an entirely different need, a military one.  Whereas it is almost impossible to get billion dollar sized chunks of money approved for civilian projects, the military has long since figured out how to pull that off.  And they have done it multiple times.  They've even done it for projects that are complete boondoggles.

The project, called Nuclear Stockpile Stewardship, was not the first expensive boondoggle the military has sold the White House and Congress on.  Nor will it be the last.  Let me outline the specifics.  The U.S. signed a treaty outlawing the testing of nuclear weapons.  That was a good thing.  But billions of dollars had been flowing annually into the design, construction, and testing of nuclear weapons.  Not surprisingly, defense contractors (and others) wanted all that money to keep flowing.

So, they started talking up the idea that our stockpile of nuclear weapons would fall apart and stop working if nothing was done.  They do need maintenance.  But their actual needs are modest.  But that's not the story the military, defense contractors, and their buddies in congress pushed.  All kinds of extraordinary (and expensive) measures were desperately needed or terrible, just terrible, things would happen to our nuclear stockpile.

So, a project called Nuclear Stockpile Stewardship was added to the Defense budget and billions of dollars started flowing its way every year.  One of the projects funded by this largess was the National Ignition Facility (NIF).  Ginormous lasers would be built and used in clever ways to simulate nuclear explosions.  The facility was situated at the Lawrence Livermore National Laboratory, often facetiously referred to as Los Alamos West.

A vast quantity of money was spent, and the facility was built.  It brought together 192 gigantic lasers, individually among the most powerful lasers ever built.  They could all be focused on a tiny target.  Most "shots" would be used to test various aspects of nuclear weapon development and maintenance.  But it is a unique facility, one that has by far the most powerful (and expensive) set of lasers available anywhere.  They could be used to do laser fusion research, so they occasionally were.

The possibility of using the occasional NIF "shot" to do laser fusion research was lost on no one.  So, pretty much from the start it has periodically been used to run various laser fusion experiments.  One of those tests recently made a big splash in the press.  "Scientific breakeven" had been achieved.  It was big news only because the field has had little positive news to report for many years now.

Mostly, what we have heard about has been yet another instance of a project getting delayed (ITER) or going further over budget (pretty much everything in the field including ITER).  Scientific breakeven was a positive achievement for a change, but a modest one.

The fact that they had to add "scientific" in front of the word "breakeven" kind of gives the game away.  Breakeven is easy to understand in this context.  You put a certain amount of energy in, and you get at least as much, and hopefully a lot more, out.  In this case 2.05 something (it doesn't matter what) units was put in and 3.15 of the same units came out.  They achieved a gain of a little more than 50%.

That's not very impressive, but it beats the alternative.  A similar experiment run a year earlier had put 1.8 of those same units in and gotten only 1.3 units out.  That process went backwards to the tune of about 30%.  Roughly doubling the output (going from getting back about 70% of the input energy to getting back about 150% of it) required several tweaks to the setup and about a year of work to pull off.

To get from "scientific" breakeven to actual breakeven will take a lot, because truly impressive accounting tricks had to be employed in order to allow the word "breakeven" to be used at all.  The facility as a whole is less than 1% efficient.  For every one unit of laser energy that hits the target, more than a hundred units of energy is used just to fire the lasers.

But wait.  It's worse.  No energy conversion system is 100% efficient.  Less than a third of the energy in the gas a car burns ends up being used to move the car down the road.  So only a fraction of the fusion energy will eventually end up as electrical energy.  All told, the laser fusion process needs to be made about a thousand times better in order to put the process into the range of practicality.  The current result needs to be doubled roughly ten times over (two to the tenth power is 1,024) to get us to where we need to be.
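Here is a quick back-of-the-envelope sketch of that accounting.  The 2.05 and 3.15 figures come from the experiment described above; the hundred-to-one wall-plug overhead and the one-third heat-to-electricity conversion factor are the rough assumptions used above, not measured values.

# Rough NIF energy accounting, using the figures discussed above.
laser_energy_on_target = 2.05                       # units delivered to the target
fusion_energy_out = 3.15                            # units produced ("scientific breakeven")
wall_plug_energy = 100 * laser_energy_on_target     # >100 units burned just to fire the lasers
electricity_out = fusion_energy_out / 3             # assume roughly 1/3 heat-to-electricity conversion
print(round(electricity_out / wall_plug_energy, 3)) # about 0.005, i.e. half of one percent

Turning half of one percent into a comfortable surplus is where the factor of about a thousand comes from.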

Here's another problem.  If a NIF shot had been able to produce the desired amount of output energy it would have destroyed the chamber containing the Hohlraum.  So, NIF can't even be used to get to true breakeven.   Most likely a whole new facility using different and better technology will need to be built.  Such a facility is likely to cost many billions of dollars.  That's bad but let me give you a tiny bit of good news.

The NIF is not designed and optimized for laser fusion purposes, so it is not very good at it.  In laser fusion mode it is a multi-stage process.  The lasers don't actually hit the ultimate target, a tiny bead of frozen Hydrogen.  The bead is contained within a small complex package called a Hohlraum.   It has a hollow, cylindrical shape.  The ends are partially but not completely closed. But wait.  There's more.

The 192 laser beams enter the Hohlraum through holes in the ends and strike its inner wall.  The inner wall material is chosen to produce copious amounts of X-Rays when struck by the NIF's laser beams.  These X-Rays strike the bead, which actually consists of several different layers.  The Hydrogen at the center of the bead is compressed and flooded with X-Rays.  In this scheme it is the X-Rays that deliver the energy needed to initiate the fusion process.  The NIF lasers themselves are not X-Ray lasers.

It is possible that a facility that was designed from the get-go to do laser fusion would not need so many layers and so much indirection.  That's the good news.  The bad news is that the NIF is a "one shot at a time" facility.  And the turn-around time between shots is measured in days.  To be practical as a wholesale source of electric power, many shots per second will be necessary.  Finally, like the current ITER, NIF includes no means for gathering the energy produced and turning it into electricity.  As a result, multiple generations of new facilities will likely be needed.

The reason all this harkens back to Hydrogen bombs is that's how they work too.  An Atomic bomb is exploded.   Its design has been optimized to cause it to produce copious amounts of X-Rays.  The X-Rays are directed at a reservoir of Hydrogen.  Flood a Hydrogen reservoir with enough X-Rays and fusion ensues.

As with ITER, don't expect anything practical to emerge from laser fusion research in less than twenty years.  As the old saying goes, "fusion is the energy source of the future, and always will be".  I hope fusion power production eventually makes the transition from Science Fiction to reality, but I'm not holding my breath.  Fortunately, there is an atomic energy source of the present.  All we have to do is find the will to take better advantage of it.

Let me start my tour of the current state of nuclear fission as a source of electric power with a recap of the big-three accidents.  The Three Mile Island accident happened in 1979.  No one was killed.  The public was never put into danger because the radioactivity that was released was confined to the containment building.

As I noted previously, other than the accident itself, everything worked exactly as it was supposed to.  And over the subsequent years the containment building has been cleaned up and all the highly radioactive components hauled off to "disposal" sites like the Hanford Nuclear Reservation.  A little more about the accident itself.

The trouble started with a relief valve that stuck open.  Cooling water slowly drained out through it, and the instrumentation misled the operators about what was happening.  Eventually the water level dropped below the top of the Uranium/Zirconium rods.  They overheated, and the hot Zirconium reacted with steam to generate Hydrogen gas.  Hydrogen is light, so a bubble formed at the top of the reactor vessel, which further complicated the recovery.  Things went south from there.

Three Mile Island sparked a change in instrumentation.  The '50s-style "diagram on the wall" system was supplemented by computer assist.  That should eliminate the possibility of a repeat.  Similar reactors have all been upgraded to include computer assist.  They have operated safely in the decades since.  So, as I noted previously, this was only a financial disaster.

The second of the big-three is Chernobyl.  It happened in 1986.  The atomic "pile" in a squash court at the University of Chicago that played an important role in the development of the original Atomic Bomb was the basis for the design.  The reactor vessel was a large cylinder.  It had a strong floor and was covered by a lid that weighed thousands of tons.  Blocks of a couple of different types of material were stacked inside in a carefully designed pattern.

One type of block used was made of graphite, a form of carbon that, like coal, will burn.  When the idiot operators performed their experiment things heated up and some of the graphite blocks caught fire.  Soon there was a giant bonfire going on in the reactor vessel.  At various points the Uranium blocks got rearranged in patterns that caused the chain reaction to speed up.

It is unclear how much was caused by the burning graphite versus the chain-reacting Uranium.  But early on the lid was blown clean off.  This gave the graphite access to lots of oxygen, and it burned furiously.  Eventually, things cooled down, likely after the graphite had all burned off.

But while the fire was going, its updraft threw large amounts of highly radioactive material into the air.  Large amounts of radioactive material settled on the ground close to the reactor.  Tiny amounts of radioactive material eventually spread as far as Sweden.  This is not surprising because radioactive material is detectable at extremely low concentrations.  Sweden and its population were put in no danger by this tiny amount of radioactivity.

A containment structure was hastily built.  It proved to be no match for the weather. Several years later a larger, more elaborate, and more expensive structure was put in place.  It secured the reactor building and all the radioactive material it still contained.  That was most of it.  But far too much radioactive material had drifted away.  The material that had settled in the immediate vicinity had done so in a high enough concentration to be actively dangerous.  The new containment building did nothing to mitigate that danger.

At the time of the accident a large "exclusion zone" was put into place to deal with the areas of high radiation.  Everybody was evacuated.  It is still there.  Its boundaries have changed little since 1986.  There are still no people living there.  But this has let plant and animal life thrive in every part of the exclusion zone.  It turns out that people are more of a threat to plants and animals than even high levels of radioactivity.

I am going to skip over the modern history of Chernobyl other than to note that it is now in Ukraine, an active war zone, and move on to the third big disaster, Fukushima.  It took place in 2011.  There the reactor design was similar to Three Mile Island, but for various reasons it did not include a super-strong Three Mile Island style containment vessel.  And in some ways Fukushima was a repeat of Three Mile Island.

In both cases Hydrogen built up.  In the case of Fukushima things went on long enough for far more Hydrogen to build up.  Eventually, this caused explosions.  Without the super-strong containment vessel, the explosions were strong enough to blow the roof off of reactor buildings.  There was no Hydrogen explosion that large at Three Mile Island.  There the roof remained intact.

The Fukushima facility included several nuclear reactors. The operators of that facility were well aware of the possibility of Hydrogen explosions and what the likely result would be.  The plan covering such a possibility was to vent the Hydrogen off well before it reached dangerous levels.

It's just that the damage caused by the earthquake and Tsunami was so extensive that they couldn't do that.  Had Hydrogen venting been possible, then little or no radioactivity would have spread to the civilian areas that surrounded the facility.

At Three Mile Island the reactor SCRAMmed automatically, but the decay heat left in the core, combined with the loss of cooling water, was enough to do the damage.  All the reactors at Fukushima were also successfully SCRAMmed.  (This happened after the earthquake hit but before the Tsunami struck the facility.)  But there was no power to circulate cooling water in the days that followed.

Eventually the Hydrogen built up (there was no power to run the valve that controlled venting) and things heated up.  Explosions ensued.  They were insignificant by nuclear standards.  But they were powerful enough to further damage the plant and to throw a considerable amount of radioactive material into the air.

The Japanese immediately implemented a large exclusion zone.  As the people on the receiving end of the Hiroshima and Nagasaki Atomic bombs, the Japanese were hypersensitive to any possible exposure of the general population to heightened levels of radioactivity.  As a result, there were no civilian casualties associated with Fukushima.

It is possible that one or more plant employees were exposed to enough radiation to kill them or damage their health.  But I know of none.  It is safe to say that radiation fatalities associated with Fukushima were likely confined to single digits.  And it is possible that the single digit was zero.

Japan is a capable nation.  But they were hampered by the widespread death and destruction that was caused by the earthquake and tsunami and which had nothing to do with Fukushima.  20,000 people were killed by these twin disasters.  Many billions of dollars' worth of damage was inflicted.  The damage included critical infrastructure like power lines.

All things considered it is remarkable that they were able to get the site under control within only a few months.  But by that time, it was in terrible shape.  For instance, they were forced to resort to flooding some areas of the plant with water.  That was the only way to cool the reactors and keep things under control.

As a result, they ended up with a tremendous amount of contaminated water.  Their short-term solution was to store this water in tanks on site.  But as time has passed, they have literally run out of space.  They are solving this latest problem by resorting to "solution by dilution".

They plan to slowly pour the contaminated water into the ocean.  Various people have objected to this.  But they tend to be the types that believe in the fantasy that there is a zero risk/cost option out there.  There isn't.  The naysayers also have no idea just how big the ocean is.

There are 1.3 million tons of contaminated water currently being stored on site.  That sounds like a lot.  But if it is poured into the ocean at the rate of only one cubic meter per minute it will take less than three years to dispose of all of it.  And ocean water is never completely still.  It is always moving.
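The arithmetic behind that estimate, as a quick check (a metric ton of water occupies roughly one cubic meter, and the one-cubic-meter-per-minute figure is just the slow, illustrative release rate assumed above):

# How long would it take to release the stored water at a slow, steady rate?
stored_water_m3 = 1_300_000     # about 1.3 million tons, roughly 1.3 million cubic meters
rate_m3_per_minute = 1          # illustrative release rate assumed above
minutes = stored_water_m3 / rate_m3_per_minute
years = minutes / (60 * 24 * 365)
print(round(years, 1), "years") # about 2.5 years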

 Let's say it is poured into a part of the ocean with a current traveling at a walking pace.  That's three kilometers per hour, not very fast.  But even at that slow rate the radioactive water will travel about 500 KM per week.  After only a week it should have been diluted to a ratio of a billion to one.  The ocean is large.  Much of it is miles deep.  500 KM is only a short distance when it comes to traversing the ocean.

The farther the radioactive water travels, the more dilute it will become.  And that's why I am confident that no harm will come from depositing that amount of radioactive material in the ocean.  Experience with the current methods used to store nuclear waste tells us that something needs to be done, and sooner rather than later.

Back to Chernobyl for a moment.  It is impossible to say how many casualties there were there.  Then as now the Russians are not a reliable source of this kind of information.  Plant personnel were killed.  A small team of experts purposely risked their lives to explore and monitor what was going on inside the building.  That was critical information that could be gotten in no other way.  Some of them died.  Others suffered serious health effects.

Soldiers were brought in during the first few days and deliberately put into extreme danger as part of the effort to get things under control.  It is likely that some of them died, and others suffered serious health effects.  And the evacuation was slow.  So, it is possible that some civilians suffered serious health effects.

The highest estimate I have seen that comes from a credible source puts likely Chernobyl related deaths at a few thousand.  Other estimates are lower.  These estimates include both short term and long-term fatalities.  Of course, many times the number who die will suffer mild to severe health effects.  But even for people who lived in the immediate vicinity of Chernobyl at the time of the disaster, a list of the top 100 health hazards they should be concerned with would not include the disaster. 

All three of these disasters, but particularly Fukushima, had a large impact on society as a whole.  At the time of the Fukushima disaster, Japan had roughly 50 nuclear power reactors.  Japan is resource constrained.  Those nuclear plants allowed Japan to greatly reduce the quantity of fossil fuels they needed to import and consume.  But Japan decided to shutter all its nuclear plants after Fukushima anyhow.

And it wasn't just Japan.  France and Germany, two other countries that had decided for reasons similar to Japan's to depend heavily on nuclear for power generation, announced plans to also drastically reduce or shutter their nuclear power plants.

Soon, the only place where new nuclear power plants were being built was China.  By this time China had terrible air pollution problems.  A major contributor was the many coal-fired electric power plants they had built.  China is still building nuclear power plants.  Unfortunately, they are still building coal plants too.

I was pretty depressed by the general situation when I wrote the posts I linked to above.  Fortunately, things have since changed for the better.  Why?  War and pestilence.  But before moving on, a final observation.  As noted above the Chernobyl design was abandoned as a result of the disaster.  Fukushima highlighted the fact that SCRAMming a reactor of that type was not enough.  It was important that the reactor cooled down completely come hell or high water.

The need for a "passive cooldown" capability was well known.  It's just that before Fukushima the need didn't seem that great and the expected cost, once legal wrangling was factored in, seemed too high.  Fukushima might have driven the industry in the direction of producing new nuclear power plants that included passive cooldown.  Instead, things went in another direction.  They built no new plants and started shutting down the old ones.

Incorporating passive cooldown into the design of a nuclear power plant is simply an engineering problem.  It doesn't matter whether the design is an old one or a new one.  Either way, there are no great technical challenges.  It is simply a matter of deciding to do so.

On the other hand, retrofitting the feature into an already built facility would be fantastically expensive, if it was even possible to do at all.  But for a new plant the design and increase in construction costs are relatively modest.  In spite of this no commercial reactor that incorporated this feature was built.  Why?  The anti-nuclear movement.

A new design, or a significant modification to a current design, automatically triggers a review.  And a review opens the process up to litigation.  The anti-nuclear people are past masters at engineering long, drawn out, and expensive cycles of litigation whenever they get a chance.  The certainty of being tied up in a decade of expensive litigation had to be balanced against the perceived benefit by the industry.

The industry perceived that the benefit was small.  Neither Three Mile Island nor Chernobyl had had any cooldown problems.  In both cases the infrastructure surrounding these plants had remained intact and in good operating condition.  The power necessary to complete the cooldown process had been readily available.  At Fukushima it was a different story.  But remember, Fukushima would not have happened absent a record-breaking earthquake coupled with a record-breaking Tsunami.

Back when I wrote the two posts I referenced above, the situation was tightly locked in.  The anti-nuclear forces were strong and well organized.  The opposition was weak and disorganized.  Under their relentless barrage of attacks nuclear power plant designs were frozen.  Construction ground slowly to a halt everywhere but in China, a country where the government was powerful enough to suppress the anti-nuclear movement.

But things were changing, even if it wasn't apparent at first.  Global Warming started out as a concern limited to certain circles of the scientific community.  Word slowly spread from there.  Then Al Gore hit the lecture circuit with an excellent presentation on the subject.  He turned his presentation into a compelling movie in 2006.  The movie garnered enough buzz to attract the interest of the general public.  They went to see it in droves.

The public interest the movie created soon led to a backlash.  The backlash was initially led by various groups of science deniers.  Then the fossil fuel industry, most notably Exxon Mobil, started secretly funding various disinformation initiatives.  Conservatives started thinking "if liberals like Gore are for it, then we are against it".

But the evidence kept piling up.  The impacts caused by Global Warming kept getting larger and more noticeable.  More and more people were impacted.  Severe weather events got not only more severe but more frequent.  Glaciers, some of which were near built-up areas in Europe and elsewhere, shrank noticeably or even disappeared completely.  There was push back from doubters and deniers.  But it soon became nearly impossible to find a glacier that was growing.

Large population areas began routinely suffering from severe floods, hurricanes, tornadoes, extreme snowfalls, fires, etc.  Bad weather caused power blackouts, massive disruptions to transportation systems, and other problems that added up to far more than just the occasional inconvenience.  "Hundred year" extreme weather events became an annual occurrence.  All the stuff that Gore had warned about started happening.

Eventually, a turning point was reached.  It became nearly impossible to deny that Global Warming was real and that it was having a large negative impact on people.  People still didn't want to do anything because they rightly believed that "the fix" would be uncomfortable, inconvenient, and expensive.

People imagined that "the fix" would a larger and more intrusive version of what happened the two times in the past century when OPEC cut the U.S. off from their oil wells.  People had to suffer through blocks long gas lines.  They were expected to dump their big, cheap, gas guzzler car that was fun to drive for a small, more expensive model that was supposedly more practical, but was also not nearly as fun to drive.  And when things returned to normal, somehow gas was a lot more expensive.

But the Global Warming problem could no longer be ignored.  That led to a search for mitigations that were cheap and pleasant.  Elon Musk came out with an electric car that was cool and fun to drive.  It was too expensive for most people, but it introduced the idea of electric cars as a positive experience rather than a negative one.

Solar Panels and Wind Turbines kept getting cheaper.  They have been the cheapest way to generate electricity for several years now.  They have made it possible to shut down dirty coal-fired power plants while saving money.  Switching our electricity supply from burning fossil fuels to green sources might actually save money rather than cost it.

That started giving people hope.  Hope that it was possible to fix the problem.  Hope that the problem could be fixed at reasonable cost.  Hope is not the same thing as reality.  But having a reason for hope that is based in reality and not fantasy took away a lot of the negative pressure.

And the cost of doing nothing kept getting higher and higher.  Floods, Hurricanes, and other weather extremes were literally wiping out people's homes, their livelihoods, their whole way of life.  There were real, large, and highly visible costs associated with doing nothing.  That has led to an increase in positive pressure, pressure to do something about the problem.

COVID put everything on hold for a couple of years.  To put it mildly, it was a major disruptor.  After COVID the amount of change people were comfortable with increased tremendously.  COVID was not caused by Global Warming.  COVID didn't even make Global Warming worse or more likely.  But it was a sharp reminder of how interconnected everything is and how change is sometimes forced on us whether we like it or not.

And then Russia invaded Ukraine.  More accurately, they resumed the invasion they had begun in 2014.  It's been a long time since the world has seen a major war.  Ukraine is not a World War, at least not yet.  But it is also not an Iraq- or Afghanistan-sized war.  In those wars the primary weapons were the AK-47 and the IED.

Ukraine is a war involving real armies using state-of-the-art weapons with tremendously greater range and destructive power.  A single weapon can take out a whole building, not just a few people or a single vehicle.  The amount of havoc being wrought, and the swiftness with which it is being dealt out, have been shocking to many.

And the Ukraine war is not being fought in some less developed corner of the world.  It is being fought in a modern country on the edge of Europe.  And it is a "good guys (Ukraine) versus bad guys (Russia)" type of war.  People often forget how frequently in the postwar period Americans and Europeans have supported some corrupt autocracy against a group of "freedom fighters".

Whether they actually were or weren't freedom fighters was often unclear.  But they were almost always the indigenous population of the area in dispute.  In the case of the war in Ukraine it is the Ukrainian people who are the indigenous population in the area under dispute.  And they are opposing the Russians, who are indisputably the foreign invaders.

In 2014 the Russians tried to make a case that there was significant support for Russia's actions among the local population in the areas they took control of.  But there was no local faction that had risen up and invited them in.  On the other hand, a lot of people living in the areas Russia occupied in 2014 had strong cultural ties to Russia and saw the Ukrainian government of the time as corrupt and suspect.  So, the case the Russians were trying to make at that time was dubious but not completely lacking in merit.

The extent to which the people living in the areas Russia occupied in 2014 still feel positively toward Russia is now an open question.  The Russian occupation makes it impossible to learn the true feelings of those people.  But there is no dispute that when Russia resumed military operations in 2022, they were trying to gain control of areas where they had little or no local support.  It was a land grab, pure and simple.

Wars take place in a geopolitical context.  Europe saw Russia's invasion of Ukraine as a serious threat.  There wasn't much they could do in 2014 due to the precarious nature of the Ukrainian government at the time.  But by 2022 Ukraine had a different government, one that was willing and able to effectively oppose Russia.  This gave the Europeans actual options.  Not everything became possible, but a lot did.  For instance, the Europeans did not want to go to war with Russia.  But they were happy to supply Ukraine with all kinds of assistance, including military assistance.

One of the geopolitical considerations was that in early 2022, when the war started, Europe was heavily dependent on Russia for oil and gas.  Theoretically, Russia could close the valve on either or both at any time.  Russia, of course, depended heavily on the money these sales brought in.  So, an important question became "how much damage was Russia willing to inflict on its own economy?"  In any case, it now seemed to be in Europe's interest to move away from Russian oil and gas.

But the whole reason Europe had gotten into bed with the Russians in the first place was that there were few alternatives to Russia given the amount of fossil fuels that Europe wanted to consume.  As soon as the war started, Europe began scrambling to find alternative sources.  That effort has been only partially successful.  That made it obvious that what Europe really needed to do was substantially reduce its consumption of fossil fuels.  It needed to go green.

Not that long ago there seemed to be little or no reason to stick with nuclear power plants.  But nuclear plants produce no greenhouse gases.  And they don't depend on what Russia is up to.  As the Ukraine war ground on, European countries started shelving their plans for shutting down nuclear power plants.  In fact, it seemed like a good idea to get some of the mothballed plants back online.

A similar thing happened in Japan.  The environmental cost of burning fossil fuels was becoming more apparent.  And Japan was not spared from extreme weather events.  So, the economic case for going green kept getting stronger and stronger.  Plans to mothball Japanese nuclear power plants are now on hold.  Whether Japan will restart any mothballed units, or build new ones, remains an open question.  But neither option is off the table anymore.

And then there's the U.S.  We are energy independent.  But we don't want to see the Russians succeed in Ukraine.  We have poured tens of billions of dollars into Ukraine's war effort.  We, and the Europeans, have now gone through several cycles of "we can't provide Ukraine with 'X' because it will cross a red line", only to reverse ourselves as the war drags on and start providing 'X'.

At the same time, extreme weather events in the U.S. have become routine.  So, here too the anti-nuclear side of the argument is no longer seen as the zero-cost one.  That has drastically changed the calculus that surrounds the construction of nuclear power plants.  The calculus hasn't shifted completely yet, but it is moving the U.S. away from an anti-nuclear position.

For instance, for the first time in decades there are two new nuclear power plants under construction.  They are Georgia Power's Plant Vogtle units #3 and #4.  Both units are scheduled to come online this year (2023).  One (#3) should be online by the spring.  It only has a couple of hoops left to jump through, so that is likely to happen.  The other (#4) has more hoops left to jump through, so it is still several months (and several possible delays) away from coming online.

These plants are the large, multibillion-dollar projects we are used to when it comes to nuclear power plant construction.  There have been the usual delays and cost overruns.  Assuming lessons have been learned, similar plants should be cheaper and quicker to build.

But the result will still be large and very expensive projects similar to what we have seen in the past.  They are not game changers, except in the sense that they are actually getting built.  They managed to defeat the anti-nuclear forces in the courts.

A more interesting project is NuScale.  It is the furthest along of several projects that are taking similar approaches.  It has managed to jump through some but not all of the regulatory hoops necessary to actually construct a nuclear power plant.  It is currently scheduled to go online in 2029.  I expect that schedule to slip, possibly substantially.

NuScale is not business as usual in the nuclear power industry.  It is one of several efforts to build small modular nuclear plants that differ substantially from the traditional design.  The new designs all aim to have modest siting requirements.  The idea is to eliminate the customization inherent in the current process.  That should save money.  A NuScale power plant, like the others, would be modular.  A plant would consist of several small, standardized modules that could be produced assembly-line fashion.  That too is supposed to save money.

Each effort uses a different, more efficient process to convert the energy released by nuclear fission into electric power.  Several approaches are being put forward, all of them quite different from the current one.  All are also supposed to produce less nuclear waste.

If successful, the NuScale approach would be a game changer.  If it fails, then maybe one of the alternatives will succeed.  It will be several years and several billion dollars before we know if NuScale will succeed in delivering on its many promises.  It will be even longer before we know how the others will fare.

But the need for green electric power becomes more urgent every year.  And, for a change, nuclear power is looking better and better every year rather than worse and worse.