Wednesday, February 15, 2023

Gas Stoves

The right likes to make something out of nothing and nothing out of something.  (The left does it too, but to a far lesser extent.)  The latest example of this concerns gas stoves.  They, meaning the libs, are coming to wrench our gas stoves out of our cold dead hands, or something to that effect.  BTW, the Biden Administration immediately disavowed any interest in banning or further regulating gas stoves.  But that didn't stop, or even slow down, the outrage from the right.

The facts can be related briefly, so I'll do that.  Then I will take a deeper dive into the "controversy" to see if there is any "there" there.

A couple of weeks ago a research group announced their findings.  They reported that gas stoves are a significant source of air pollution in the home.  This air pollution is bad for you, they continued, and it is especially bad for children.  It leads to an increase in the prevalence of childhood asthma.

I know little about asthma and its causes.  Still, it seems reasonable that if gas stoves are a significant source of indoor air pollution, then that could easily lead to an increase in childhood asthma.  As to the main air pollution claim, I am going to dive a bit deeper into that next.  Then I am going to take a serious look at gas stoves, their history, and whether they are worth all the fuss.  Here goes.

Gas stoves work by burning something.  Usually, the "something" is Natural Gas.  Natural Gas is mostly Methane.  Methane consists of one Carbon atom and four Hydrogen atoms.  If you add four Oxygen atoms and rearrange appropriately, you get two molecules of water (2 Hydrogen atoms plus one Oxygen atom each) and one molecule of Carbon Dioxide (2 Oxygen atoms and one Carbon atom).  That's the Cliff's Notes version.  The reality is a lot more complicated.

First of all, the four Oxygen atoms come from two Oxygen molecules.  The Oxygen found in air consists of molecules containing two Oxygen atoms.  So, the Methane molecule must be broken up into its constituent parts.  And the Oxygen molecules must be broken up into their constituent parts.  Then the constituent parts must be reassembled into the final result, two molecules of water and a molecule of Carbon Dioxide.
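For those who like to check the arithmetic, here is a quick sketch (in Python, my own illustration, not part of any study) confirming that the atom bookkeeping of complete Methane combustion balances:

```python
from collections import Counter

def atoms(formula, coefficient):
    """Scale a molecule's atom counts by its coefficient in the reaction."""
    return Counter({atom: n * coefficient for atom, n in formula.items()})

# CH4 + 2 O2 -> CO2 + 2 H2O
reactants = atoms({"C": 1, "H": 4}, 1) + atoms({"O": 2}, 2)
products = atoms({"C": 1, "O": 2}, 1) + atoms({"H": 2, "O": 1}, 2)

assert reactants == products  # every atom on the left shows up on the right
print(dict(reactants))  # {'C': 1, 'H': 4, 'O': 4}
```

One Carbon, four Hydrogens, and four Oxygens on each side; nothing is created or destroyed, just rearranged.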

This process is quick, but it is not instantaneous.  And what is actually going on is that multiple processes are taking place simultaneously.  Things are getting knocked apart.  Things are getting glued together.  It is literally a free for all.  And that means that all possible processes are going on at the same time.  What determines the final outcome is what is called the "rate of reaction" of each of the various competing processes.

Some processes have high rates.  Some processes have low rates.  The high-rate processes tend to predominate in the end.  And the rate depends on various factors.  An important one is temperature.  As temperature increases, some rates speed up much faster than others, so the balance between the competing processes shifts.
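A standard way chemists capture this temperature dependence is the Arrhenius equation.  Here is a toy sketch (the pre-factors and activation energies are made up for illustration, not measured values for any real reaction) showing how a high-barrier process can go from negligible to competitive as the temperature climbs:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def arrhenius_rate(A, Ea, T):
    """Arrhenius rate constant: k = A * exp(-Ea / (R * T))."""
    return A * math.exp(-Ea / (R * T))

# Two hypothetical competing processes: one with a low activation
# energy (Ea), one with a high Ea but a larger pre-factor (A).
for T in (500.0, 1500.0):  # temperatures in kelvin
    k_low = arrhenius_rate(A=1e6, Ea=50_000, T=T)
    k_high = arrhenius_rate(A=1e9, Ea=150_000, T=T)
    print(f"T={T:.0f} K: low-barrier k={k_low:.3g}, high-barrier k={k_high:.3g}")
```

At 500 K the high-barrier process is essentially shut off; at 1500 K it has closed most of the gap.  Multiply this by dozens of competing processes and you can see why predicting the final mix is hard.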

It takes a deep understanding of Quantum Mechanics and related disciplines to predict how all this is going to shake out.  Fortunately, we can cut to the chase by running the experiment.  We can turn the stove on and see what happens.

And what happens is more complicated than Methane plus Oxygen in yields water plus Carbon Dioxide out.  You see, the rate of reaction of the various processes is never zero.  So, we will always get some unburned Methane.  We will also get some Carbon Monoxide (one Carbon plus one Oxygen).  And we will get some soot (pure Carbon).  This is not the end of the list, but it gives you the idea.

But wait, there's more.  Air is not pure Oxygen.  In fact, air is composed of just over 20% Oxygen, just under 80% Nitrogen, and a percent or so of other stuff.  I'm going to focus on the Nitrogen.  Like Oxygen, what's in air is a molecule consisting of two Nitrogen atoms.

But what's important to our discussion is that Nitrogen is capable of combining with both Hydrogen and Carbon to form various molecules.  These are still more processes and like the other processes, their rate of reaction is never zero.  So, as we burn Methane in a stove, we will get some of those too.

So, we have two methods of attack when it comes to determining what the result of operating a gas stove is.  We can perform difficult and complex Quantum Mechanics computations, or we can just fire up the stove and measure the results.  The report is the result of doing the latter.

The tests they performed measured a certain amount of air pollution.  They concluded that the amount of pollution caused by the routine operation of a gas stove was enough to cause problems.  The magnitude of those problems was in line with the problems caused by being subjected to secondhand smoke.  That seems reasonable to me, but it can't hurt to do further research.

Next, let's take a look at the history of gas stoves.  They seem like they are the sort of thing that has been around forever, but that is wrong.  Their first appearance in their present form is actually quite recent.  Fire goes back a long way.  Gas stoves don't.

The original fire was the campfire, or something similar.  A pile of wood was burned in a relatively open space.  Highly flammable material like tinder was initially set alight using a flint and steel, or a friction contraption.  Small dry sticks were added to make it bigger.  Larger pieces of wood were then added to make it still bigger.  Once it had reached the appropriate size, more wood could periodically be added to keep it going relatively indefinitely.

This open wood fire is very inefficient.  Most of the heat it generates is wasted.  To understand why, it is important to dive very shallowly into thermodynamics.  Heat over there is useless.  It needs to be transported over here to the place where it is needed.  The two methods of heat transference that matter here are conduction and radiation.  Let's start with the former.

Heat is like electricity.  It moves easily through some materials - conductors, and poorly through other materials - insulators.  If two conductive materials are in contact with each other, then heat quickly moves from the warmer one to the cooler one.  Something warm like the flame of a fire can quickly transfer heat to something cooler, like a pan on a stove, thus warming the pan up.  The process requires direct contact.  But, if the two materials are both good heat conductors, then heat transfer happens quickly.

It took longer for scientists to understand radiation.  Here, contact is not involved.  But if you put your bare hands out toward a campfire, they soon feel warm.  The mystery of what was going on was only solved when infrared waves were discovered.

They are a form of light.  Their frequency is below the "visible" part of the spectrum, so our eyes cannot see them.  What's happening in my example is that the campfire is emitting infrared waves.  These waves travel across the gap between the fire and our hands.  When they strike our hands they transfer energy, the energy that warms our hands.
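There is even a simple formula, Wien's displacement law, that predicts where a hot object's glow peaks.  A quick sketch (the 1200 K flame temperature is my rough assumption for a wood fire) shows why a campfire's output lands squarely in the infrared:

```python
# Wien's displacement law: the wavelength at which a hot object's
# thermal radiation peaks is lambda_peak = b / T.
WIEN_B = 2.898e-3  # Wien displacement constant, meter-kelvins

def peak_wavelength_nm(T_kelvin):
    """Peak emission wavelength in nanometers for a body at temperature T."""
    return WIEN_B / T_kelvin * 1e9

campfire = peak_wavelength_nm(1200)  # rough wood-flame temperature (assumed)
sun = peak_wavelength_nm(5800)       # the Sun's surface, for comparison

print(f"campfire peak: {campfire:.0f} nm (visible light ends around 700 nm)")
print(f"sun peak:      {sun:.0f} nm (middle of the visible range)")
```

The campfire's peak comes out around 2400 nm, well past what our eyes can register, which is why we feel the warmth long before we could "see" it.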

In assessing the efficiency of a system, it is important to focus on how much heat goes where we want.  It is also important to take both conduction and radiation into account.  Usually, one is dominant and the other plays little or no part in the process.

In the case of our open campfire, there is no conduction going on.  It is pretty much all radiation.  The fire is throwing infrared light out in all directions.  But most of this infrared light never hits anything we are interested in.  Instead, it is wasted.

This waste led to a great innovation, the longhouse.  A longhouse is a relatively large building that is mostly open on the inside.  A campfire is maintained in the center of the floor.  There is a small hole in the roof above the fire that lets the smoke eventually get out.

But by design, pretty much whatever the direction, the walls of the longhouse are there to trap the infrared rays coming from the fire.  Much more of the energy of the fire ends up warming up something useful.

Of course, this arrangement tends to trap a lot of the smoke and soot from the fire inside the longhouse.  So, the air is often pretty nasty.  And this problem led to the next development, the stove.  Instead of a large building the fire is contained in a much smaller ceramic vessel.  Since the vessel surrounds the fire most of the heat ends up warming the vessel.

The vessel, in turn, warms up the room it is in.  (One or two stoves per room were required for the whole thing to work.)  But heating a building with stoves kept the building warm and smoke free at the same time.  Most of the heat the fire in the stove produced went into heating the room, so fuel costs were reasonable.

But ceramic stoves are expensive to build.  And they are slow.  Fire one up and it is likely the better part of a day before you get much warmth out of it.  In Franklin's famous stove, he replaced ceramic with iron and reduced the overall size.  He was able to retain most of the efficiency and all of the smoke reduction, so his design quickly went into widespread use.

All the stoves I described so far are optimized for heating.  But with a little tinkering a stove can be modified to make it a good device for cooking.  For instance, add a separate box next to the firebox.  This becomes what we now call an oven.  Make the top of the stove flat.  Pots and pans can now be placed there and used for food preparation.  Both iron and ceramic stoves were easily modified for use in cooking rather than heating.

And everything I have talked about so far used wood as its fuel.  But with the advent of the industrial revolution, it quickly became apparent that both ceramic and iron stoves could also be adapted to use coal, so they were.  Stoves have since been adapted to use compressed sawdust, wood pellets, and a number of other materials for fuel.  In all cases, the modifications required were modest.

There are also designs that are halfway between an open campfire and a stove.  They are used in wood fired pizza ovens, for instance.  In these halfway designs the fire is mostly but not entirely enclosed.  This design is more efficient than an open campfire but far less efficient than a fully enclosed stove.  And that's about where things stood until about 1850.

Spirit stoves and lamps had been around for millennia.  A solid or liquid that vaporizes at a low temperature was used as a fuel.  The device vaporized the fuel which was then burned.  The problem was that fuel for these devices was hard to come by.  So, although the designs for these devices were well known, they were rarely used.

That changed with the discovery of oil and the industry that grew up around it.  It turned out to be relatively easy to "refine" oil into Kerosene, and other similar liquids.  These liquids make excellent fuels for spirit stoves and lamps.

Once the oil industry got going, Kerosene and its ilk became available in large quantities.  And they were cheap.  And that meant that devices that used these fuels quickly became popular.  The revenue stream produced by selling fuel, mostly for lamps, is what powered the explosive growth of the oil industry in the second half of the nineteenth century.

Once these devices came into widespread use, for the first time in history it was practical to conduct business and pleasure after the sun had gone down.  It wasn't until the beginning of the twentieth century that demand for transportation fuels (gasoline and diesel) became great enough to overtake the market for fuels to power spirit lamps and stoves.

Another product of the oil refining business was Propane gas.  It is a more complex molecule than Methane.  It consists of three Carbon and eight Hydrogen atoms.  When it is burnt even more processes are involved, which means that even more byproducts beyond the usual water and Carbon Dioxide are produced.  But under ideal conditions the combustion products of Propane consist mostly of water and Carbon Dioxide.  Only small amounts of other stuff are produced.
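The ideal reaction balances the same way Methane's does, just with bigger numbers.  Another quick Python sketch of the bookkeeping (again, my own illustration):

```python
from collections import Counter

def atoms(formula, coefficient):
    """Scale a molecule's atom counts by its coefficient in the reaction."""
    return Counter({atom: n * coefficient for atom, n in formula.items()})

# Complete Propane combustion: C3H8 + 5 O2 -> 3 CO2 + 4 H2O
reactants = atoms({"C": 3, "H": 8}, 1) + atoms({"O": 2}, 5)
products = atoms({"C": 1, "O": 2}, 3) + atoms({"H": 2, "O": 1}, 4)

assert reactants == products  # 3 Carbons, 8 Hydrogens, 10 Oxygens on each side
```

More atoms to shuffle means more intermediate steps, and more intermediate steps means more opportunities for byproducts.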

And Propane is an ideal fuel for a spirit stove or lamp.  It eliminates the need to turn the fuel from a solid or a liquid into a gas.  Propane starts out as a gas.  The widespread availability of Propane is the event that led to the development of the modern gas stove.  A gas stove is simply an evolution of the spirit stove.

In the early days Propane was not widely available.  What helped increase its popularity was the observation that it liquifies if subjected to moderate pressure.  Liquid Propane takes up lots less space than the gaseous form.  That makes it easier to transport Propane in bulk.  Even so, economics did not justify its transportation over long distances.  So, availability improved but still remained spotty.

But once the idea had been introduced, people started investigating alternatives to Propane that could be introduced into the many areas where it was not available.  And it turned out that there was a process that could be applied to coal that would produce a gas that could be used like Propane.  Coal can be found in a surprisingly large number of places.  As a result, "gasworks" plants that turned coal into a Propane-like gas were soon popping up all over the place.

Seattle is one of those places.  A gasworks plant operated in the heart of Seattle for many decades.  That allowed people in Seattle to use "gas" (not gasoline but literally a gas) for cooking, heating, and lighting.  The gas was piped into homes and buildings all over the downtown area.  The same thing took place in many cities and towns scattered across the country.  This substantially expanded the area where gas stoves were practical.

Another method used to expand the area where gas stoves could be used involved setting up a company that used trucks to fill a Propane tank that the customer owned.  Whatever Propane appliances the customer owned could then be fed from the tank.  This service only made sense where conditions were right.  Propane could only be transported relatively short distances economically.

One thing that held the Natural Gas market back for a long time was the stupidity of the oil industry.  Oil and Natural Gas tend to be found in the same places.  But rather than seeing Natural Gas as another source of profit, the industry treated it as an annoyance that needed to be gotten rid of as cheaply as possible.   So, they just "flared" it off.  They set up large, cheap torches and let it burn.  It took them a long time to figure out that Natural Gas was actually valuable.

The thing they missed was that Natural Gas is cheap and easy to transport.  And it does not require a complex and expensive "refinery" to convert it from raw material to salable product.  With Natural Gas, only a few simple steps are necessary to remove impurities.  The "refined" gas can then be sent long distances via a "gas" pipeline.  The pipes in the pipeline can be relatively small and still move large quantities of product.  So, gas pipelines are relatively cheap to construct and very cheap to operate.

Once the industry wised up, they built Natural Gas pipelines everywhere.  As the gas pipeline network was built out, more and more of the country had access to Natural Gas.  A Natural Gas pipeline eventually made it to Seattle.  That obsoleted the gasworks.  The site of Seattle's gasworks was eventually turned into "Gasworks Park".  The park has become very popular.  It has great views and is a prime spot for flying kites.

Over time, more and more of the country could cook with a gas stove, heat water with a gas water heater, heat a home or building with a gas furnace, and generally buy a lot of Natural Gas from the industry.  This was helped along by various marketing efforts that claimed that gas was superior to electricity for pretty much everything, but especially for cooking.

So, why should someone put in a gas stove?  According to the industry it was because it was the best tool for the job of cooking food well.  This conveniently hid a lot of extremely relevant history.  What did great cooks do before gas stoves were widely available?  They invented Haute Cuisine.  High-end cooking is pretty much a French invention.  That's why so much modern cooking terminology is in French.

People have been throwing feasts since time immemorial.  But they were infrequent special events.  And the emphasis was on quantity and variety.  A feast might go on for several days.  It would consist of course after course, each different.  That took the focus away from the quality of any specific course.  And feasts typically involved drinking, a lot of drinking.  This too detracted from a focus on quality.

French royalty in the eighteenth century, and particularly in the nineteenth century, slowly started changing the focus.  Seen one feast - seen them all.  So, they started focusing on presenting courses that were unique and special.  "Come to my event because the food will be amazing and memorable."  Of course, it soon became a competition.  Who could throw the feast with the most amazing food?  Winning this competition took both skill and money.

This is when the celebrity chef was invented.  Someone who could source rare ingredients and then use them to produce something unique and delicious became highly sought after.  And, of course, at some point, one of these chefs said, "I am tired of working for some ignorant noble who treats me badly.  I am going to go out on my own and create a restaurant.  That way I get the glory and respect instead of some fool who was lucky enough to choose his parents well."  And so, the destination restaurant was born.

And as more and more of them opened it became harder and harder to stand out from the crowd.  And what was the market they were catering to?  Snooty people who had a lot of money and wanted to show off.  So, the dishes got more and more elaborate.

And their preparation got more and more labor intensive.  That drove costs up, but that was the point.  That enabled customers to be able to say, "I went to a more expensive restaurant than you did.  Why?  Because I could afford it and you couldn't."

This trend peaked in about 1900, a time before gas stoves were in widespread use.  Haute era cooking, considered by many to be a peak never equaled since, was all done using non-gas stoves.  Starting in about 1900, Haute Cuisine began to be simplified, a shift that culminated decades later in what became known as Nouvelle Cuisine.

It was a move away from exotic ingredients and labor-intensive preparation techniques.  The idea was to focus on putting out a quality product that was based on top quality ingredients and simple preparation techniques that highlighted the flavors inherent in the ingredients.

Haute Cuisine often involved changing the characteristics of the ingredients by hiding them under sauces and the like.  Nouvelle Cuisine has gone through a couple of generations of evolution.  But its goals continue to drive much of the thinking that defines how people think about how "good" food should be prepared and judged.

This history makes it obvious that there is nothing inherent in a gas stove that makes it a superior tool for food preparation.  So why do so many highly respected chefs claim to prefer them?  One reason is money.

Not surprisingly, the fossil fuel industry has been providing kickbacks to marquee chefs who tout the supposed superiority of gas stoves.  Cooking schools get subsidies if they teach their students to cook on gas stoves.  This sort of thing tends to create an echo chamber.

But there is more to it.  Consider the modern restaurant.  Starting with Nouvelle Cuisine the industry has been moving away from techniques that require elaborate preparation.  So, what does make a successful modern restaurant?

You expect a menu that features many options.  Most modern restaurants specialize in a specific type of cuisine, say Italian.  But at an Italian restaurant customers expect to be allowed to select from among a large number of different Italian dishes.

Regardless of the type of cuisine, how long are customers willing to wait between when they order and when the food arrives?  Customers used to have more patience.  One trick restaurants used to use was to serve the meal a course at a time.

The appetizer would come out.  Then a little later the soup would arrive.  And after more delay the entree would arrive.  And so it went through the entire meal.  This allowed the restaurant to stretch things out, leaving it more time to prepare the various dishes.

But even this kind of pacing only allowed the restaurant to stretch things out so far.  Even customers at high-end restaurants were only willing to wait so long.  Then there is the whole "fast-food" segment.  It is now, and has been for some time, far larger than the high-end segment of the business.  Its success owes to the fact that it reduced wait times to at most a few minutes.

The fine dining segment of the market argues that it provides a far superior dining experience.  But still, they feel some pressure as a result of the existence of the fast-food segment.  They know that they have at most twenty to thirty minutes to get the entree in front of the customer.  And the high-end segment now has something new to worry about.  There is now a "fast casual" segment.

This market segment pitches the idea that it delivers a far better product than the fast-food segment.  It may not be as good as what a high-end restaurant is capable of, but it is darn good.  And they get the food in front of the customer far quicker than their high-end brethren.  Fast casual has seen considerable success in the past decade or so.  And this has put even more pressure on even high-end restaurants to speed things up.

The TV show "Chopped" showcases what goes on in the kitchen of a modern high-end restaurant.  Contestants are tasked with cooking a complete course in 20-30 minutes, depending on the round.  For each course they are presented with four "mandatory" ingredients.  The mandatory ingredients are chosen to clash with each other.  That has to result in a lot of bad food.  But it also results in good TV.  The show is very popular.

To succeed a contestant has to have a bag of tricks and hacks for getting things done in a hurry.  And a substantial component of a contestant's score depends on making the dish look pretty.  Everything but the clashing ingredients accurately mirrors the operation of the kitchen of a modern restaurant.  Chefs must be able to quickly prepare a wide variety of dishes.

Done well, these dishes may require a wide assortment of techniques.  But the kitchen is too small and too short staffed to do all those different things well.  And they have no time.  Each chef is required to be working on several dishes at once.  He can't put much attention and care into any one dish.  Thus, tricks and hacks, particularly time-saving hacks, are critical.

Is this the best way to prepare great food?  No!  But if the restaurant can't get the food on the table fast enough, it won't matter how great it is.  People will go to the "good enough" restaurant that features faster service.

The modern restaurant kitchen is where the gas stove shines.  Different dishes require different cooking temperatures.  The temperature of each burner of a gas stove can be changed independently of the other burners.  And it can be changed instantly.

It can be instantly cranked up to provide a lot of heat under this dish.  Then it can be cranked down instantly to more slowly and gently warm that dish.  That's a lifesaver in a modern restaurant kitchen.  But does it produce the best end result?  Let's see.

But first let's take a look at the history of the gas stove's modern competition, the electric stove.  Each dates back to the second half of the nineteenth century.  The spirit stove evolved into the gas stove pretty quickly.  It took longer for the design of the electric stove to mature.

The science behind the electric stove is extremely simple.  I mentioned conductors and insulators above.  It is not a binary situation.  Materials exist at every point in the scale between fully conductive and fully insulating.

Consider a material that is down toward the fully conducting end of the scale, but not quite at the end.  It conducts pretty well but puts up some resistance.  To get a little more technical, a certain amount of power goes in one end.  A lesser amount comes out the other end.

The amount of resistance measures how much the power gets diminished.  So, what happens to the power that has disappeared?  The "conservation of energy" law tells us that it must go somewhere.  The answer is that it gets converted into heat.  And, in fact, the conversion ratio is 100%.  All of the power that has disappeared gets converted into heat.  That is great news if what you are trying to do is produce heat.  And that's exactly what a stove does.

So, to make an electric stove all you need to do is to push some electric power through a wire that has some resistance.  Easy - peasy.  At a given supply voltage, the wire's resistance determines how much power it draws, and all of that power gets converted into heat.  Selecting wire that has the right amount of resistance for use in the stove allows the top temperature the wire reaches to be dialed in.  You don't want the wire to get so hot that it melts.
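A quick sketch of the arithmetic: at a fixed voltage, the power a resistive element dissipates is P = V²/R, and every watt of it becomes heat.  (The 240 volts is typical of a North American stove circuit; the resistance values are made up for illustration.)

```python
def element_power_watts(volts, ohms):
    """Power dissipated by a resistive element at a fixed voltage: P = V^2 / R."""
    return volts ** 2 / ohms

V = 240.0  # typical North American stove circuit voltage
# Note: at fixed voltage, LOWER resistance draws MORE power.
for ohms in (20.0, 40.0, 80.0):  # illustrative element resistances
    print(f"R = {ohms:4.0f} ohm -> P = {element_power_watts(V, ohms):5.0f} W")
```

The designer's job is to pick a resistance (and wire gauge) whose steady-state temperature glows nicely without melting.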

And that is literally the design used for early electric stoves.  A wire with the appropriate amount of resistance, usually coiled so that it would put out more heat per foot without melting, was placed in a ceramic tray.  Ceramics can withstand high temperatures and are electrical insulators.  So, they supported the wire and kept it from touching things it wasn't supposed to.  An electric stove is not supposed to electrocute its operator.

This worked pretty well.  But it was still too easy for the wire to break or a person to get shocked.  So, a material called "Calrod" was developed.  The hot wire was surrounded by material that conducted heat but was an electrical insulator.  You could still burn yourself, but the shock hazard was eliminated.  Inexpensive stoves, like the one I own, still use Calrod.

But some thought that Calrod was not stylish enough.  That led to the third generation of electric stove design.  In these stoves the electric wire is cleverly embedded in a flat surface so that some parts function as burners, but the rest of the surface stays cool.  This led to a more stylish electric stove but one that was less functional.

The original "wire in a ceramic tray" design depended completely on radiation.  Putting a metal pan in contact with a wire carrying electric current is only good for electrocuting people.  So, by design the hot wire threw off a lot of infrared radiation.  The radiation striking the pan caused it to heat up.

Calrod enabled conduction to be the primary method of heat transfer as it eliminated the electrocution problem.  For this to work the pan needed to be in contact with the Calrod.  Neither the pan nor the Calrod was completely flat.  But both were close enough to flat to make everything work.

Modern flat-top stoves also rely on conduction.  But by design the surface is super-flat.  This means that if the pan is not extremely flat, then there is insufficient contact between the pan and the hot part of the flat-top.

So, people who get these modern flat-top stoves have to buy special pots and pans that have an extremely flat bottom.  Even with the right pots and pans flat-top stoves don't work as well as an old fashioned Calrod stove.  But they look nicer, and that is enough for a lot of people.

This discussion of the various pros and cons of electric stoves provides exactly the right context to discuss the pros and cons of gas stoves.  Gas stoves rely heavily on conduction.  The gas burner consists of a large number of tiny flames.  Why?  To put a large part of the bottom of the pan in contact with an open flame.  The hot gasses in the flame use conduction to transfer heat to the pan.

Seems like it should work well, right?  Actually, it doesn't work nearly as well as people think.  Those little flames guarantee that there will be hotter spots where the flame is and cooler spots where it isn't.  But wait!  It gets worse.  The gas flame is blue, right?  It is.  But notice that it is not all one color of blue.  Part of the flame is a lighter blue and part is a darker blue.

The color tells you how much energy is involved.  Different colors indicate different temperatures.  So, even the parts of the pan that are in direct contact with flame are in direct contact with flames of different temperatures.  But wait!  It gets worse.

As a kid I learned the hard way that you can have an invisible flame.  It may be invisible, but it can still burn you.  That's because the invisible part of the flame is putting out infrared light, light your eyes can't see.

A large part of the gas flame is invisible.  It is putting out a still different amount of energy because it's at a still different temperature.  All this means that gas stoves do not provide even heat.  And that's why chefs like copper pans.

Copper is an excellent conductor of both electricity and heat.  These pans contain a layer of copper across their bottoms.  The copper quickly distributes heat from the hotter parts to the cooler ones.  This, in effect, evens out the uneven heat the gas stove provides.
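How much faster does copper spread heat?  Fourier's law of conduction gives a rough answer.  The conductivities below are standard handbook values; the pan geometry is my own made-up example:

```python
# Fourier's law of heat conduction: q = k * A * dT / d, where k is the
# material's thermal conductivity, A the cross-section, dT the temperature
# difference, and d the distance the heat must travel.
CONDUCTIVITY = {"copper": 401.0, "aluminum": 237.0, "stainless steel": 16.0}  # W/(m K)

def heat_flow_watts(k, area_m2, dT_kelvin, thickness_m):
    """Steady-state heat flow through a slab of material."""
    return k * area_m2 * dT_kelvin / thickness_m

# Illustrative pan-bottom geometry: 1 cm^2 path, 50 K hot-spot difference, 3 mm thick.
area, dT, thickness = 1e-4, 50.0, 3e-3
for metal, k in CONDUCTIVITY.items():
    print(f"{metal:15s}: {heat_flow_watts(k, area, dT, thickness):6.1f} W")
```

Copper moves heat roughly twenty-five times faster than stainless steel under the same conditions, which is why a thin copper layer is enough to iron out the flame's hot spots.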

Electric stoves do a much better job of providing an even heat.  Even heat means that all the food in the pan cooks at the same rate, a necessity if you want the dish to turn out well.

And there are lots of foods that require radiation rather than conduction to cook properly.  Most meats fall into this category.  A bar-b-que or smoker is generally considered to be the best way to cook meat.  Both are radiant devices.  One cut of meat that can be cooked quickly is a steak.  And the best way to cook a steak is to grill it.  Restaurants go to great lengths to prepare a steak so that it appears to have been grilled on a bar-b-que.

"Grill marks" are part of the bar-b-que experience.  Ignoring them for the moment, the meat is being cooked by a heat source that is not in direct contact with the steak.  Instead, it is several inches below the meat.  It is possible to do that with a gas stove.  But what about those critical grill marks?

In a bar-b-que the heat is provided by a fire that is a few inches below the meat.  The resulting radiation does the bulk of the cooking.  But a bar-b-que has a "grill" consisting of narrow metal bars that hold up the meat.  They get heated up for the same reason the meat does.  But they are in direct contact with the meat, and they are hot.  So, they burn small stripes in the meat.  These stripes are called grill marks.

Chefs aspire to use a gas stove to replicate the total bar-b-que experience including the grill marks.  A  standard gas stove needs help, so they use a trick.  They place a large slab of iron on top of the stove.  It typically covers two burners.  The slab has ridges and troughs in it.

The gas stove heats the iron slab.  The iron slab gets hot enough to radiate a lot of infrared light.  That replicates the bar-b-que experience when it comes to cooking most of the meat.  The ridges reproduce the grill marks.

But it's a cheat.  It produces a good steak but not a great steak.  To understand what's still missing, let's consider the smoker.  This is the best device for cooking cuts of meat that need to be cooked slowly.  Smoking meat takes between hours and weeks, depending on the effect desired.  And a wood fire is critical to proper smoking.  No other fuel will do.

Why wood?  It's not just the slow cooking that is important.  It's the smoke.  Lots of kinds of wood are aromatic when they burn.  These aromatic components get absorbed into the meat and contribute to the final flavor.

Most smoking processes depend on using a fire composed of the right kind of wood.  Choosing the best kind of wood to use in smoking a specific cut of a certain kind of meat has been raised to an art form.  But a wide variety of woods can be used to good effect.  Even an imperfect match is still better than adding no aromatic component at all.  So, not only does the fire need to be a wood fire, but it needs to be a wood fire composed of the right kind of wood.

Bar-b-que often takes a page from the wood smoker playbook.  A fire that uses the right kind of wood adds just the right subtle additional flavor to the finished product.  As a substitute, wood chips of the right kind can be mixed in with the regular fuel.  This does not work quite as well, but it can transfer some of the aromatic flavor to the meat.  And some is better than none.

But there is really no place to put some burning aromatic wood chips in the gas stove - iron slab setup.  Other processes for injecting this aromatic component are possible.  But none of them do as good a job as doing it right in the first place.  And this is what separates a good steak cooked on a gas stove from a great steak cooked over a wood fired bar-b-que.  And it is impossible to properly "low and slow" smoke meat on a gas stove.

Another technique for cooking meat that achieves superior results is the rotisserie.  A thin skewer is run through the meat.  The skewer is used to hold the meat well above a fire.  The skewer-meat combination is rotated continuously so that the meat is cooked evenly on all sides.  This process is another one that depends on convection.

Theoretically, a gas stove could be used as the heat source for a rotisserie.  A contraption could be set up to hold the skewer of meat well above the burners and to slowly rotate it.  But the process would be wildly inefficient.  Most of the heat produced by the stove would go to where the meat isn't.  

Electric rotisseries, on the other hand, work just fine.  That's because everything can be enclosed.  Like the fire in the longhouse, the heat put off by the electric "burner" can be redirected into the meat by the enclosure.  And, since rotisserie cooking is another "low and slow" technique, not much heat is needed.

Rotisseries slowly cooking whole chickens used to be a standard feature of supermarkets.  They were "powered" by electric light bulbs.  Most of the light put out by an old style "incandescent" light bulb was infrared light.  So, they made an excellent heat source for this situation.  Both supermarket rotisserie chicken and incandescent light bulbs have mostly become a thing of the past.

Electric ovens work better than gas ovens for the same reason.  They are convection devices, and they also depend on infrared.  Heated electric wires naturally give off large amounts of infrared light.  Gas ovens require tricks and workarounds to achieve a similar result.

And the "convection oven" is mostly a marketing gimmick.  A fan is added to push the air around, and to make it seem like it is something special.  But a standard, unmodified electric oven makes a fine convection oven.

People have relatively short memories.  It doesn't take them long to think that the way things are now is the way they have always been.  They go to a modern restaurant and eat food that tastes good to them.  They think, "the chef here uses a gas stove.  Therefore, gas stoves must be the best way to cook food."  This is reinforced by the many cooking shows on TV and cable.  Without fail they use gas stoves.

Modern restaurants, of necessity, serve food that can be prepared quickly.  Gas stoves facilitate their ability to move from order to food-on-the-table quickly.  But part of what's going on is that restaurants don't put dishes on the menu that can't be prepared quickly using a gas stove.

Or, worse yet, people forget that a dish could have been prepared better if the chef had enough time to do so.  I have eaten a lot of baked potatoes in restaurants.  Some of them have been very good restaurants.  I have yet to be served a decent baked potato by any of them.  It's not the chef's fault.  It is just that it is literally impossible to prepare a good baked potato in a restaurant environment.

My mother was, at best, an adequate cook.  But she could cook a baked potato that was far superior to the best one I have ever eaten in a restaurant.  The reason was simple:  time.  My mother knew how many people she would be feeding, and she knew when dinner would be served.  And we ate what she put in front of us, so she only had to prepare a few dishes.

She knew all these things more than a day in advance so she could plan and prepare accordingly.  In spite of the fact that she wasn't in a class with professional chefs she was able to put food in front of us that was often superior to the best restaurant food.  She could out cook professionals because she could do things that they couldn't.

Back to the baked potato because it provides a good example of what I am talking about.  I shudder when I see foil wrapping a baked potato.  Foil is never used in its proper preparation.  Instead, the potatoes are washed and poked with a fork a few times.  That is the extent of the preparation that takes place.  Then they are placed naked on the bottom shelf of an oven that has been preheated to a low temperature.

There they bake for an hour or so.  Towards the end they can be "forked" to determine how long it will be before they are done.  They go straight from oven to plate.  Their skin should be hard and crusty.  Their insides should be warm and fluffy.  Add a little butter, and maybe a bit of salt and pepper, and you have a baked potato that is superior to anything prepared under standard restaurant conditions.

Notice that no fancy techniques or special equipment are required.  And the only "skill" my mother needed to master was that of being able to determine how close to "done" the potato was simply by sticking a fork in it.  It was a skill she mastered easily, and any chef could too.  Contrast that with the much more difficult skills a restaurant chef must master in order to turn out a far inferior baked potato.

Most baked potatoes that I encounter in a restaurant have skin that is thin and damp and not anywhere close to being completely cooked.  The insides are also underdone.  They are not fluffy.  The potato has obviously been cooked.  But it is still closer to hard than soft.

In theory a restaurant could properly prepare baked potatoes.  But they don't know in advance how many they will need nor when they will need them.  So, they would have to throw out 80-90% of them in order to have enough properly cooked baked potatoes on hand at all times.  That is a cost a restaurant cannot afford to absorb.  So, they do what they must.

And it's not just baked potatoes.  Meat from old animals used to constitute a large proportion of the meat we consumed.  But meat from old animals starts out tough.  There are ways to render it tender but they are time consuming.  Meat from old animals is often more flavorful than meat from young animals.  So, we are missing out on that too.  But only a few old people now remember what properly prepared meat from old animals tastes like.

Restaurants don't serve tough meat to customers because customers don't like it.  So, dishes that depend on properly prepared meat from old animals are off the menu.  Instead, we get young and tender but bland meat.  And restaurants and cooking shows tell us what "good" food is, so we don't even get these things at home where it is still practical to employ the necessary techniques.

Gas stoves are better at a few things:  corn, peas, and other small vegetables that can be cooked quickly.  But they are a poor option when it comes to the proper preparation of lots of dishes.  And the economic environment modern restaurants operate under forces them to err on the side of speed.  So, if it's speed you need, then a gas stove is your best bet.  But if it's the widest variety of great food you want, then look elsewhere.

Friday, February 3, 2023

Nukeelor Power

People used to mispronounce the word "nuclear" all the time.  It's an easy word to pronounce correctly because it is pronounced exactly the way its spelling indicates that it should be.  But a lot of people used to muck it up.  For reasons that I never understood the "cle" part would throw them.  They acted as if it was actually spelled "cel".  Many of those people were public figures who should have known better.  And many of them continued to mispronounce the word for years.  Where were their aides and assistants?

If you are in favor of nuclear power, as I am, things have definitely improved.  At a minimum, the rate at which the word "nuclear" is mispronounced has diminished considerably.  But pronouncing the word incorrectly is of minor importance in the grand scheme of things.  The good news is that there have been improvements in far more important areas too.  But the press has continued to focus what coverage it provides on the less important areas while almost completely ignoring the more important ones.

I dug into this subject in 2020.  I put up two good posts, "Sigma 5: A Brief History of Nuclear Power" and "Sigma 5: Nuclear Waste", in that year.  I recommend both of them.  This post will build on the foundation they laid.  As I noted in those posts, there are two kinds of nuclear processes that can be used to produce power, fusion and fission.  Power generation using nuclear fission has been a commercial reality since the '50s.  It continues in use to this day.  Fusion has been "the future of nuclear power" for almost as long.

In practice, each depends on a single fuel.  With fission it's Uranium.  With fusion it's Hydrogen.  Fission based power is an outgrowth of research done to create the Atomic Bomb.  One main path to fusion-based power generation is based on research done to create the Hydrogen bomb.  The other main path uses a more esoteric approach that is less closely tied to bomb research.

Let's start with the latter.  It takes extreme conditions to make two Hydrogen atoms fuse together to form a single Helium atom.  Those extreme conditions exist in the center of all stars including our Sun.  Most stars are like our Sun in that the fuel that powers them is Hydrogen.  Stars eventually exhaust their supplies of Hydrogen.  Our Sun will do so in about 5 billion years.  If the star is large enough, and our Sun is, it then just moves on to using other elements to power the fusion process.

The Sun is gigantic both in terms of its size and in terms of its mass.  All that mass is crushed toward the center by gravity.  As a result, the center of the Sun becomes a location subjected to extreme heat and pressure.  The conditions are extreme enough to cause Hydrogen to fuse into Helium at a substantial rate.  That process releases tremendous amounts of energy which, among other things, pushes back against gravity keeping things in balance.

The trick has always been to reproduce those extreme conditions on earth at a much smaller scale and without the need for a star.  For a long time, scientists thought there were three "states" of matter:  solid, liquid, and gas.  Early in the twentieth century a fourth state was discovered, plasma.  At first glance, a plasma just appears to be a gas.  But it doesn't behave like a normal gas.  That's because the particles of a plasma are electrically charged.

Half of them have a positive charge.  Half of them have a negative charge.  All the positively charged particles repel each other.  All the negatively charged particles repel each other.  That should cause the plasma to immediately fly apart.  It would if it were a normal gas.

But all the positively charged particles are also simultaneously attracted to all the negatively charged particles and vice versa.  That should cause the plasma to smash together, perhaps forming a solid.  But under the right conditions the two effects exactly offset each other and achieve a balance.  When that happens, a plasma is created.

Creating a plasma takes extremely high temperatures.  And various other things must be just right.  But if the right conditions can be created and maintained, then a stable plasma becomes possible.  Needless to say, a stable plasma is fraught with extremes.  And in this extreme environment high energy collisions are a distinct possibility.  And high energy collisions are just what we need to cause fusion.

It didn't take long for scientists to see plasmas as a possible path to a controlled fusion reaction that could be used to create power.  One thing that helped is the fact that electricity and magnetism are inextricably intertwined.  A moving electric charge creates a magnetic field.  A magnetic field can be used to steer the path of an electrically charged particle.

So, the game became finding just the right set of magnetic and electric fields to get a plasma to do what we wanted it to do:  create conditions that cause Hydrogen to fuse into Helium at a rate fast enough to be useful, but slow enough that it doesn't just blow everything up.

A lot of designs were tried.  They all failed.  The one that came the closest was a Russian design called a Tokamak.  To the untutored eye the part that contains the plasma looks like a donut.  All kinds of powerful magnets are wrapped around the outside.  The idea is for the plasma to occupy the central area.  This is surrounded by a vacuum.  Particles can then zoom around in a rough circle while never touching the walls.

The positive plasma particles consist of the nuclei of various isotopes of Hydrogen.  The negative plasma particles consist of the electrons that have been stripped from the Hydrogen nuclei.  All of these particles are moving at extremely high speed.  It is hoped that a few of the Hydrogen nuclei will smash into each other and fuse to create Helium nuclei.  The problem of how to collect all of the energy generated by this process and turn it into electric power is being left for a future generation of scientists and engineers to solve.

Over the past few decades, a bunch of Tokamaks have been built.  None of them have worked.  The plasma can only be maintained in a stable configuration at low density for short periods of time.  The amount of fusion, and thus the amount of energy produced, is tiny.

But scientists have seen progress in moving to higher densities and in maintaining the plasma for longer periods of time.  Both kinds of progress contribute to more fusion activity and, therefore, more energy production.  That has led them to believe that they are making steady progress toward a design that works.  One thing that seems to help is size.  They hope that a big enough Tokamak can be made to work.  The end result of this is ITER, the largest Tokamak built so far.

The ITER project is being run by the Europeans.  (The U.S. has, so far, made only modest contributions.)  The project has consumed billions of dollars and many years so far.  It will consume billions more before it is completed several years from now.

If, that is, it is ever completed.  (Another delay of two or more years was recently announced.)  And, if it works as well as its backers hope it will, it will not be a practical device.  It will only be a "proof of concept", a device that paves the way for one that actually works.

If going from the ITER to an actual working device sounds like a long shot, it's because it is.  A lot of things have to go well.  And, if they do, it will be at least 20 years, and likely considerably longer, before a Tokamak will be used to fuse Hydrogen into Helium in a generating facility that is feeding commercial quantities of electric power to the grid.  Let's move on to the next longshot.

I am older than the laser.  I remember when the first working one was built.  Back then, its possible uses seemed limitless.  A few years later when I was in college (roughly 1970) I remember bumping into a guy who was talking about using lasers to zap Hydrogen hard enough to cause it to fuse.

Back then such a trick seemed like it would be relatively easy to pull off.  A laser would be focused onto a tiny spot.  If the laser was powerful enough, and if the spot was small enough, both of which sounded possible, then it should be able to feed enough energy into the Hydrogen to initiate fusion.  And fusing a tiny amount of Hydrogen into Helium would be all that was needed to produce a tremendous amount of energy.

As with creating and maintaining a suitable plasma, the problem turned out to be way harder than anyone expected.  The early experiments were a bust.  But technology kept getting better.  More powerful lasers.  Advances in focusing.  For a while it looked like the goal was within reach.  But it gradually became apparent that it was not.  At least not without access to a giant test facility costing billions of dollars.  And the funding for such a facility was just not there.

Until it was.  To its credit, the ITER was built from the ground up for the express purpose of using a plasma to make Hydrogen fuse into Helium.  The giant laser test facility that eventually got built was built to address an entirely different need, a military one.  Whereas it is almost impossible to get billion dollar sized chunks of money approved for civilian projects, the military has long since figured out how to pull that off.  And they have done it multiple times.  They've even done it for projects that are complete boondoggles.

The project, called Nuclear Stockpile Stewardship, was not the first expensive boondoggle the military has sold the White House and Congress on.  Nor will it be the last.  Let me outline the specifics.  The U.S. signed a treaty outlawing the testing of nuclear weapons.  That was a good thing.  But billions of dollars had been flowing annually into the design, construction, and testing of nuclear weapons.  Not surprisingly, defense contractors (and others) wanted all that money to keep flowing.

So, they started talking up the idea that our stockpile of nuclear weapons would fall apart and stop working if nothing was done.  They do need maintenance.  But their actual needs are modest.  But that's not the story the military, defense contractors, and their buddies in congress pushed.  All kinds of extraordinary (and expensive) measures were desperately needed or terrible, just terrible, things would happen to our nuclear stockpile.

So, a project called Nuclear Stockpile Stewardship was added to the Defense budget and billions of dollars started flowing its way every year.  One of the projects funded by this largess was the National Ignition Facility (NIF).  Ginormous lasers would be built and used in clever ways to simulate nuclear explosions.  The facility was situated at the Lawrence Livermore National Laboratory, often facetiously referred to as Los Alamos West.

A vast quantity of money was spent, and the facility was built.  It brought together 192 gigantic lasers, individually among the most powerful lasers ever built.  They could all be focused on a tiny target.  Most "shots" would be used to test various aspects of nuclear weapon development and maintenance.  But it is a unique facility, one that has by far the most powerful (and expensive) set of lasers available anywhere.  They could be used to do laser fusion research, so they occasionally were.

The possibility of using the occasional NIF "shot" to do laser fusion research was lost on no one.  So, pretty much from the start it has periodically been used to run various laser fusion experiments.  One of those tests recently made a big splash in the press.  "Scientific breakeven" had been achieved.  It was big news only because the field has had little positive news to report for many years now.

Mostly, what we have heard about has been yet another instance of a project getting delayed (ITER) or going further over budget (pretty much everything in the field including ITER).  Scientific breakeven was a positive achievement for a change, but a modest one.

The fact that they had to add "scientific" in front of the word "breakeven" kind of gives the game away.  Breakeven is easy to understand in this context.  You put a certain amount of energy in, and you get at least as much, and hopefully a lot more, out.  In this case 2.05 units of something (it doesn't matter what) were put in and 3.15 of the same units came out.  They achieved a gain of a little more than 50%.

That's not very impressive, but it beats the alternative.  A similar experiment run a year earlier had put 1.8 of the same units in and gotten only 1.3 units out.  The process went backwards to the tune of about 30%.  To roughly double the output ratio (going from about 70% of the input energy to about 150%) required several tweaks to the setup and about a year of work to pull off.

To get from "scientific" breakeven to actual breakeven will take a lot, because truly impressive accounting tricks had to be employed in order to allow the word "breakeven" to be used at all.  The facility as a whole is less than 1% efficient.  For every one unit of laser energy that hits the target, more than a hundred units of energy is used just to fire the lasers.

But wait.  It's worse.  No energy conversion system is 100% efficient.  Less than a third of the energy in the gas a car burns ends up being used to move the car down the road.  So only a fraction of the fusion energy will eventually end up as electrical energy.  All told, the laser fusion process needs to be made about a thousand times better in order to put it into the range of practicality.  The current result needs to be doubled about ten times over to get us to where we need to be.
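For what it's worth, the arithmetic behind these figures is easy to check.  Here is a quick sketch in Python; the energy numbers are the ones reported above, and nothing else is assumed:

```python
import math

# The 2022 "scientific breakeven" shot: laser energy delivered to
# the target vs. fusion energy released (same units for both).
gain_2022 = 3.15 / 2.05        # about 1.54 -- a gain of a bit over 50%

# The earlier shot, which came out behind.
gain_2021 = 1.3 / 1.8          # about 0.72 -- roughly 30% short

# "About a thousand times better" corresponds to roughly ten
# doublings, since 2 raised to the 10th power is 1024.
doublings = math.log2(1000)    # about 9.97

print(round(gain_2022, 2), round(gain_2021, 2), round(doublings, 1))
```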

Here's another problem.  If a NIF shot had been able to produce the desired amount of output energy, it would have destroyed the target chamber.  So, NIF can't even be used to get to true breakeven.  Most likely a whole new facility using different and better technology will need to be built.  Such a facility is likely to cost many billions of dollars.  That's bad but let me give you a tiny bit of good news.

The NIF is not designed and optimized for laser fusion purposes, so it is not very good at it.  In laser fusion mode, the NIF uses a multi-stage process.  The lasers don't actually hit the ultimate target, a tiny bead of frozen Hydrogen.  The bead is contained within a small, complex package called a Hohlraum.  It has a hollow, cylindrical shape.  The ends are partially but not completely closed.  But wait.  There's more.

The 192 laser beams enter the Hohlraum through holes in the ends and strike its inner wall.  The inner wall material is chosen to produce copious amounts of X-Rays when struck by the NIF's laser beams.  These strike the bead, which actually consists of several different layers.  The Hydrogen at the center of the bead is compressed and flooded with X-Rays.  Only X-Rays have the energy necessary to initiate the fusion process.  And the NIF lasers are not X-Ray lasers.

It is possible that a facility that was designed from the get-go to do laser fusion would not need so many layers and so much indirection.  That's the good news.  The bad news is that the NIF is a "one shot at a time" facility.  And the turn-around time between shots is measured in days.  To be practical as a wholesale source of electric power, many shots per second will be necessary.  Finally, like the current ITER, NIF includes no means for gathering the energy produced and turning it into electricity.  As a result, multiple generations of new facilities will likely be needed.

The reason all this harkens back to Hydrogen bombs is that's how they work too.  An Atomic bomb is exploded.   Its design has been optimized to cause it to produce copious amounts of X-Rays.  The X-Rays are directed at a reservoir of Hydrogen.  Flood a Hydrogen reservoir with enough X-Rays and fusion ensues.

As with ITER, don't expect anything practical to emerge from laser fusion research in less than twenty years.  As the old saying goes, "fusion is the energy source of the future, and always will be".  I hope fusion power production eventually makes the transition from Science Fiction to reality, but I'm not holding my breath.  Fortunately, there is an atomic energy source of the present.  All we have to do is find the will to take better advantage of it.

Let me start my tour of the current state of nuclear fission as a source of electric power with a recap of the big-three accidents.  The Three Mile Island accident happened in 1979.  No one was killed.  The public was never put into danger because the radioactivity that was released was confined to the containment building.

As I noted previously, other than the accident itself, everything worked exactly as it was supposed to.  And over the subsequent years the containment building has been cleaned up and all the highly radioactive components hauled off to "disposal" sites like the Hanford Nuclear Reservation.  A little more about the accident itself.

The design used for the reactor generates Hydrogen gas.  Normally, this is easily and safely vented off.  But the valve that malfunctioned and failed to open was the one that was supposed to vent the gas.  This failure trapped the Hydrogen gas in the reactor vessel.  Hydrogen is light so a bubble formed at the top.  The bubble eventually grew big enough to push the cooling water down to below the top of the Uranium/Zirconium rods.  They overheated and things went south from there.

Three Mile Island sparked a change in instrumentation.  The '50s-style "diagram on the wall" system was supplemented by computer assist.  That should eliminate the possibility of a repeat.  Similar reactors have all been upgraded to include computer assist.  They have operated safely in the decades since.  So, as I noted previously, this was only a financial disaster.

The second of the big-three is Chernobyl.  It happened in 1986.  The atomic "pile" in a squash court at the University of Chicago that played an important role in the development of the original Atomic Bomb was the basis for the design.  The reactor vessel was a large cylinder.  It had a strong floor and was covered by a lid that weighed thousands of tons.  Blocks of a couple of different types of material were stacked inside in a carefully designed pattern.

One type of block used was made of graphite, a form of carbon that burns much like coal.  When the idiot operators performed their experiment things heated up and some of the graphite blocks caught fire.  Soon there was a giant bonfire going on in the reactor vessel.  At various points the Uranium blocks got rearranged in patterns that caused the chain reaction to speed up.

It is unclear how much was caused by the burning graphite versus the chain-reacting Uranium.  But early on the lid was blown clean off.  This gave the graphite access to lots of oxygen, and it burned furiously.  Eventually, things cooled down, likely after the graphite had all burned off.

But while the fire was going, the powerful updraft it created had thrown large amounts of highly radioactive material into the air.  Large amounts of radioactive material settled on the ground close to the reactor.  Tiny amounts of radioactive material eventually spread as far as Sweden.  This is not surprising because radioactive material is detectable at extremely low concentrations.  Sweden and its population were put in no danger by this tiny amount of radioactivity.

A containment structure was hastily built.  It proved to be no match for the weather. Several years later a larger, more elaborate, and more expensive structure was put in place.  It secured the reactor building and all the radioactive material it still contained.  That was most of it.  But far too much radioactive material had drifted away.  The material that had settled in the immediate vicinity had done so in a high enough concentration to be actively dangerous.  The new containment building did nothing to mitigate that danger.

At the time of the accident a large "exclusion zone" was put into place to deal with the areas of high radiation.  Everybody was evacuated.  It is still there.  Its boundaries have changed little since 1986.  There are still no people living there.  But this has let plant and animal life thrive in every part of the exclusion zone.  It turns out that people are more of a threat to plants and animals than even high levels of radioactivity.

I am going to skip over the modern history of Chernobyl other than to note that it is now in Ukraine, an active war zone, and move on to the third big disaster, Fukushima.  It took place in 2011.  There the reactor design was similar to Three Mile Island, but for various reasons it did not include a super-strong Three Mile Island style containment vessel.  And in some ways Fukushima was a repeat of Three Mile Island.

In both cases Hydrogen built up.  In the case of Fukushima things went on long enough for far more Hydrogen to build up.  Eventually, this caused explosions.  Without the super-strong containment vessel, the explosions were strong enough to blow the roof off of reactor buildings.  There was no Hydrogen explosion that large at Three Mile Island.  There the roof remained intact.

The Fukushima facility included several nuclear reactors. The operators of that facility were well aware of the possibility of Hydrogen explosions and what the likely result would be.  The plan covering such a possibility was to vent the Hydrogen off well before it reached dangerous levels.

It's just that the damage caused by the earthquake and Tsunami was so extensive that they couldn't do that.  Had Hydrogen venting been possible, then little or no radioactivity would have spread to the civilian areas that surrounded the facility.

At Three Mile Island the reactor was never successfully SCRAMmed.  If it had been, then nothing bad would have happened.  All the reactors at Fukushima were successfully SCRAMmed.  (This happened after the earthquake hit but before the Tsunami struck the facility.)  But there was no power to circulate cooling water in the days that followed.

Eventually the Hydrogen built up (there was no power to run the valve that controlled venting) and things heated up.  Explosions ensued.  They were insignificant by nuclear standards.  But they were powerful enough to further damage the plant and to throw a considerable amount of radioactive material into the air.

The Japanese immediately implemented a large exclusion zone.  As the people on the receiving end of the Hiroshima and Nagasaki Atomic bombs, the Japanese were hypersensitive to any possible exposure of the general population to heightened levels of radioactivity.  As a result, there were no civilian casualties associated with Fukushima.

It is possible that one or more plant employees were exposed to enough radiation to kill them or damage their health.  But I know of none.  It is safe to say that radiation fatalities associated with Fukushima were likely confined to single digits.  And it is possible that the single digit was zero.

Japan is a capable nation.  But they were hampered by the widespread death and destruction that was caused by the earthquake and tsunami and which had nothing to do with Fukushima.  20,000 people were killed by these twin disasters.  Many billions of dollars' worth of damage was inflicted.  The damage included critical infrastructure like power lines.

All things considered it is remarkable that they were able to get the site under control within only a few months.  But by that time, it was in terrible shape.  For instance, they were forced to resort to flooding some areas of the plant with water.  That was the only way to cool the reactors and keep things under control.

As a result, they ended up with a tremendous amount of contaminated water.  Their short-term solution was to store this water in tanks on site.  But as time has passed, they have literally run out of space.  They are solving this latest problem by resorting to "solution by dilution".

They plan to slowly pour the contaminated water into the ocean.  Various people have objected to this.  But they tend to be the types that believe in the fantasy that a zero risk/cost option exists.  It doesn't.  The naysayers also have no idea just how big the ocean is.

There are 1.3 million tons of contaminated water currently being stored on site.  That sounds like a lot.  But even if it is poured into the ocean at the rate of only one cubic meter per minute, it will take less than three years to dispose of all of it.  And ocean water is never completely still.  It is always moving.

Let's say it is poured into a part of the ocean with a current traveling at a walking pace.  That's three kilometers per hour, not very fast.  But even at that slow rate the radioactive water will travel about 500 KM per week.  After only a week it should have been diluted to a ratio of a billion to one.  The ocean is large.  Much of it is miles deep.  500 KM is only a short distance when it comes to traversing the ocean.
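The arithmetic above is easy to check.  Here is a rough back-of-the-envelope sketch, assuming one ton of water occupies about one cubic meter and a discharge rate of one cubic meter per minute (the rate and current speed are illustrative assumptions, not the actual Japanese discharge plan):

```python
# Back-of-the-envelope check of the water-disposal numbers.
# Assumption: 1 ton of water ~= 1 cubic meter, so 1.3 million tons
# of stored water is roughly 1.3 million cubic meters.

stored_m3 = 1.3e6            # contaminated water stored on site
rate_m3_per_min = 1.0        # assumed discharge rate

minutes_to_empty = stored_m3 / rate_m3_per_min
years_to_empty = minutes_to_empty / (60 * 24 * 365)
print(f"Time to empty the tanks: {years_to_empty:.1f} years")  # about 2.5 years

# How far the discharged water drifts in a 3 km/h current:
km_per_week = 3 * 24 * 7
print(f"Drift in one week: {km_per_week} km")  # about 500 KM
```

At one cubic meter per minute the tanks empty in roughly two and a half years, and a 3 km/h current carries the water about 504 kilometers in a week, consistent with the figures in the text.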

The farther the radioactive water travels, the more dilute it will become.  And that's why I am confident that no harm will come from depositing that amount of radioactive material in the ocean.  Experience with the current methods used to store nuclear waste tells us that something needs to be done, and sooner rather than later.

Back to Chernobyl for a moment.  It is impossible to say how many casualties there were.  Then as now the Russians are not a reliable source of this kind of information.  Plant personnel were killed.  A small team of experts purposely risked their lives to explore and monitor what was going on inside the building.  That was critical information that could be gotten in no other way.  Some of them died.  Others suffered serious health effects.

Soldiers were brought in during the first few days and deliberately put into extreme danger as part of the effort to get things under control.  It is likely that some of them died, and others suffered serious health effects.  And the evacuation was slow.  So, it is possible that some civilians suffered serious health effects.

The highest estimate I have seen that comes from a credible source puts likely Chernobyl related deaths at a few thousand.  Other estimates are lower.  These estimates include both short term and long-term fatalities.  Of course, many times the number who die will suffer mild to severe health effects.  But even for people who lived in the immediate vicinity of Chernobyl at the time of the disaster, a list of the top 100 health hazards they should be concerned with would not include the disaster. 

All three of these disasters, but particularly Fukushima, had a large impact on society as a whole.  At the time of the Fukushima disaster, Japan had about 50 operable nuclear reactors.  Japan is resource constrained.  Those reactors allowed Japan to reduce by a large amount the quantity of fossil fuels it needed to import and consume.  But Japan decided to shutter all its nuclear plants after Fukushima anyhow.

And it wasn't just Japan.  France and Germany, two other countries that had decided for reasons similar to Japan's to depend heavily on nuclear for power generation, announced plans to also drastically reduce or shutter their nuclear power plants.

Soon, the only place where new nuclear power plants were being built was China.  By this time China had terrible air pollution problems.  Major contributors were the many coal-fired electric power plants they had built.  China is still building nuclear power plants.  Unfortunately, they are still building coal plants too.

I was pretty depressed by the general situation when I wrote the posts I linked to above.  Fortunately, things have since changed for the better.  Why?  War and pestilence.  But before moving on, a final observation.  As noted above the Chernobyl design was abandoned as a result of the disaster.  Fukushima highlighted the fact that SCRAMming a reactor of that type was not enough.  It was important that the reactor cooled down completely come hell or high water.

The need for a "passive cooldown" capability was well known.  It's just that before Fukushima the need didn't seem that great and the expected cost, once legal wrangling was factored in, seemed too high.  Fukushima might have driven the industry in the direction of producing new nuclear power plants that included passive cooldown.  Instead, things went in another direction.  They built no new plants and started shutting down the old ones.

Incorporating passive cooldown into the design of a nuclear power plant is simply an engineering problem.  It doesn't matter whether the design is an old one or a new one.  Either way, there are no great technical challenges.  It is simply a matter of deciding to do so.

On the other hand, retrofitting the feature into an already built facility would be fantastically expensive, if it was even possible to do at all.  But for a new plant the design and increase in construction costs are relatively modest.  In spite of this no commercial reactor that incorporated this feature was built.  Why?  The anti-nuclear movement.

A new design, or a significant modification to a current design, automatically triggers a review.  And a review opens the process up to litigation.  The anti-nuclear people are past masters at engineering long, drawn out, and expensive cycles of litigation whenever they get a chance.  The certainty of being tied up in a decade of expensive litigation had to be balanced against the perceived benefit by the industry.

The industry perceived that the benefit was small.  Neither Three Mile Island nor Chernobyl had had any cooldown problems.  In both cases the infrastructure surrounding these plants had remained intact and in good operating condition.  The power necessary to complete the cooldown process had been readily available.  At Fukushima it was a different story.  But remember, Fukushima would not have happened absent a record-breaking earthquake coupled with a record-breaking Tsunami.

Back when I wrote the two posts I referenced above, the situation was tightly locked in.  The anti-nuclear forces were strong and well organized.  The opposition was weak and disorganized.  Under their relentless barrage of attacks nuclear power plant designs were frozen.  Construction ground slowly to a halt everywhere but in China, a country where the government was powerful enough to suppress the anti-nuclear movement.

But things were changing, even if it wasn't apparent at first.  Global Warming started out as a concern limited to certain circles of the scientific community.  Word slowly spread from there.  Then Al Gore hit the lecture circuit with an excellent presentation on the subject.  He turned his presentation into a compelling movie in 2006.  The movie garnered enough buzz to attract the interest of the general public.  They went to see it in droves.

The public interest the movie created soon led to a backlash.  The backlash was initially led by various groups of science deniers.  Then the fossil fuel industry, most notably ExxonMobil, started secretly funding various disinformation initiatives.  Conservatives started thinking "if liberals like Gore are for it, then we are against it".

But the evidence kept piling up.  The impacts caused by Global Warming kept getting larger and more noticeable.  More and more people were impacted.  Severe weather events got not only more severe but more frequent.  Glaciers, some of which were near built-up areas in Europe and elsewhere, shrank noticeably or even disappeared completely.  There was push back from doubters and deniers.  But it soon became nearly impossible to find a glacier that was growing.

Large population areas began routinely suffering from severe floods, hurricanes, tornadoes, extreme snowfalls, fires, etc.  Bad weather caused power blackouts, massive disruptions to transportation systems, and other problems that added up to far more than just the occasional inconvenience.  "Hundred year" extreme weather events became an annual occurrence.  All the stuff that Gore had warned about started happening.

Eventually, a turning point was reached.  It became nearly impossible to deny that Global Warming was real and that it was having a large negative impact on people.  People still didn't want to do anything because they rightly believed that "the fix" would be uncomfortable, inconvenient, and expensive.

People imagined that "the fix" would be a larger and more intrusive version of what happened the two times in the past century when OPEC cut the U.S. off from their oil wells.  People had to suffer through blocks-long gas lines.  They were expected to dump their big, cheap, gas guzzler car that was fun to drive for a small, more expensive model that was supposedly more practical, but was also not nearly as fun to drive.  And when things returned to normal, somehow gas was a lot more expensive.

But the Global Warming problem could no longer be ignored.  That led to a search for mitigations that were cheap and pleasant.  Elon Musk came out with an electric car that was cool and fun to drive.  It was too expensive for most people, but it introduced the idea of electric cars as a positive experience rather than a negative one.

Solar Panels and Wind Turbines kept getting cheaper.  They have been the cheapest way to generate electricity for several years now.  They have made it possible to shut down dirty coal fired power plants while saving money.  Switching from getting our electricity from burning fossil fuels to green solutions might actually save money rather than costing it.

That started giving people hope.  Hope that it was possible to fix the problem.  Hope that the problem could be fixed at reasonable cost.  Hope is not the same thing as reality.  But having a reason for hope that is based in reality and not fantasy took away a lot of the negative pressure.

And the cost of doing nothing keeps getting higher and higher.  Floods, Hurricanes, and other weather extremes were literally wiping out people's homes, livelihoods, their whole way of life.  There were real, large, and highly visible costs associated with doing nothing.  That has led to an increase in positive pressure, pressure to do something about the problem.

COVID put everything on hold for a couple of years.  To put it mildly, it was a major disruptor.  After COVID the amount of change people were comfortable with increased tremendously.  COVID was not caused by Global Warming.  COVID didn't even make Global Warming worse or more likely.  But it was a sharp reminder of how interconnected everything is and how change is sometimes forced on us whether we like it or not.

And then Russia invaded Ukraine.  More accurately, they resumed the invasion they had begun in 2014.  It's been a long time since the world has seen a major war.  Ukraine is not a World War, at least not yet.  But it is also not an Iraq or Afghanistan sized war.  In those wars the primary weapons were the AK-47 and the IED.

Ukraine is a war involving real armies using state-of-the-art weapons with tremendously greater range and destructive power.  One of them can take out a whole building, not just a few people or a single vehicle.  The amount of havoc being wrought, and the swiftness with which it is being dealt out, have been shocking to many.

And the Ukraine war is not being fought in some less developed corner of the world.  It is being fought in a modern country on the edge of Europe.  And it is a "good guys (Ukraine) versus bad guys (Russia)" type of war.  People often lose track of how often in the postwar period Americans and Europeans have supported some corrupt autocracy against a group of "freedom fighters". 

Whether they actually were or weren't freedom fighters was often unclear.  But they were almost always the indigenous population of the area in dispute.  In the case of the war in Ukraine it is the Ukrainian people who are the indigenous population in the area under dispute.  And they are opposing the Russians, who are indisputably the foreign invaders.

In 2014 the Russians tried to make a case that there was significant support for Russia's actions among the local population in the areas they took control of.  But there was no local faction that had risen up and invited them in.  On the other hand, a lot of people living in the areas Russia occupied in 2014 had strong cultural ties to Russia and saw the Ukrainian government of the time as corrupt and suspect.  So, the case the Russians were trying to make at that time was dubious but not completely lacking in merit.

The extent to which the people living in the areas Russia occupied in 2014 still feel positively toward Russia is now an open question.  The Russian occupation makes it impossible to learn the true feelings of those people.  But there is no dispute that when Russia resumed military operations in 2022, they were trying to gain control of areas where they had little or no local support.  It was a land grab, pure and simple.

Wars take place in a geopolitical context.  Europe saw Russia's invasion of Ukraine as a serious threat.  There wasn't much they could do in 2014 due to the precarious nature of the Ukrainian government at the time.  But by 2022 Ukraine had a different government, one that was willing and able to effectively oppose Russia.  This gave the Europeans actual options.  Not everything became possible, but a lot did.  For instance, the Europeans did not want to go to war with Russia.  But they were happy to supply Ukraine with all kinds of assistance, including military assistance.

One of the geopolitical considerations was that in early 2022 when the war started Europe was heavily dependent on Russia for oil and gas.  Theoretically, Russia could close the valve on either or both at any time.  Russia, of course, depended heavily on the money these sales brought in.  So, an important question became "how much damage was Russia willing to inflict on its own economy?"  In any case, it now seemed to be in Europe's interest to move away from Russian oil and gas.

But the whole reason Europe had gotten into bed with the Russians in the first place was because there were few alternatives to Russia given the amount of fossil fuels that Europe wanted to consume.  As soon as the war started Europe started scrambling to find alternative sources.  That effort has only been partially successful.  That made it obvious that what they really needed to do was to substantially reduce their consumption of fossil fuels.  They needed to go green.

Not that long ago there seemed to be little or no reason to stick with nuclear power plants.  But nuclear plants produce no greenhouse gases.  And they don't depend on what Russia is up to.  As the Ukraine war ground on European countries started shelving their plans for shutting down nuclear power plants.  In fact, it seemed like a good idea to get some of the mothballed plants back online.

A similar thing happened in Japan.  The environmental cost of burning fossil fuels was becoming more apparent.  And Japan was not spared from extreme weather events.  So, the economic case for going green kept getting stronger and stronger.  Plans to mothball Japanese nuclear power plants are now on hold.  Whether they will restart any mothballed units, or build new ones, are still open questions.  But neither option is off the table any longer.

And then there's the U.S.  We are energy independent.  But we don't want to see the Russians succeed in Ukraine.  We have poured tens of billions of dollars into Ukraine's war effort.  We, and the Europeans, have now gone through several cycles of "we can't provide Ukraine with 'X' because it will cross a red line", only to reverse ourselves as the war drags on and start providing 'X'.

At the same time extreme weather events in the U.S. have become routine.  So, here too the anti-nuclear side of the argument is no longer seen as being the zero cost one.  That has drastically changed the calculus that surrounds the construction of nuclear power plants.  It hasn't changed completely yet, but it is moving the U.S. away from an anti-nuclear position.

For instance, for the first time in decades there are two new nuclear reactors under construction: units #3 and #4 at Georgia Power's Plant Vogtle.  Both units are scheduled to come online this year (2023).  One (#3) should be online by the spring.  It only has a couple of hoops left to jump through so that is likely to happen.  The other (#4) has more hoops left to jump through, so it is still several months (and several possible delays) away from coming online.

These plants are the large, multibillion dollar projects we are used to when it comes to nuclear power plant construction.  There have been the usual delays and cost overruns.  Assuming lessons have been learned, similar plants should be cheaper and quicker to build.

But the result will still be large and very expensive projects similar to what we have seen in the past.  They are not game changers, except in the sense that they are actually getting built.  They managed to defeat the anti-nuclear forces in the courts.

A more interesting project is NuScale.  It is the furthest along of several projects that are taking similar approaches.  It has managed to jump through some but not all of the regulatory hoops necessary to actually construct a nuclear power plant.  It is currently scheduled to go online in 2029.  I expect that schedule to slip, possibly substantially.

NuScale is not business as usual in the nuclear power business.  It is one of several efforts to build small modular nuclear plants that differ substantially from the traditional design.  The new designs all aim to have modest siting requirements.  The idea is to eliminate the customization inherent in the current process.  That should save money.  A NuScale power plant, like the others, would be modular: a plant would consist of several small, standardized modules that could be produced assembly-line fashion.  That too is supposed to save money.

Each effort uses a different, more efficient process to convert the energy released by nuclear fission into electric power.  Several approaches are being put forward.  All are quite different from the current approach.  All are also supposed to produce less nuclear waste.

If successful, the NuScale approach would be a game changer.  If it fails, then maybe one of the alternatives will succeed.  It will be several years and several billion dollars before we know if NuScale will succeed in delivering on its many promises.  It will be even longer before we know how the others will fare.

But the need for green electric power becomes more urgent every year.  And, for a change, nuclear power is looking better and better every year rather than worse and worse.