Mostly I write long blog posts and publish two per month. I am going to try to get out of that rut and do things in a more blog-ish way for a while. To be honest, the subject I want to attack, how computer networking works, would result in either a single ultra-long post or several long ones. So while I am posting on the subject I am going to make a virtue out of necessity and break it up into a number of small posts. We'll see how it goes; once I am done I'll move on to something else.
I am not going to do an in-depth treatment of computer networking. Instead I will stick to an overview that includes enough information that home users will have a broad general understanding of what is going on. If I am successful they will also be knowledgeable enough to deal with most home networking issues. Here goes.
The first subject I am going to cover is really boring but really important. My original plan was to sprinkle a little in here and a little in there. But it made even my first post run long. So I am just going to break it out as its own subject and start with it. Fortunately it is short. And for those of you who have already dived into networking it may clear some things up. I can at least hope that is true.
Computers are at bottom all about numbers. Most of the time a number is handled within the computer in binary form. Computers do off/on very well. It is possible to get them to handle more complex things but off/on is the quickest and easiest. And off/on gives us a choice between two values, a binary (two possibilities) choice. Everyday human numbers are called decimal because each digit offers a choice among ten possibilities. Two other alternatives that are popular in the computer business are octal (8 possible choices per digit) and hexadecimal (16 possible choices per digit).
Anyhow, if we go with binary numbers, and that's what we are going to do, we end up with 0 and 1. And if we group binary digits (a contradiction as binary = 2 and we have 10 digits on our hands, but that's the phrase in common use) we can up our number of choices. With two bits ("bit" is a contraction of binary and digit) we have 4 choices. With three bits we have 8 choices (commonly referred to as "octal"). With 4 bits we have 16 choices (commonly referred to as "hexadecimal"). I am going to stop there.
But 2 and 4 and 8 and 16 are powers of 2 and that's not by chance. And if we combine x bits into a single combined thing then the maximum number of possible combinations is exactly two raised to the xth power. And that's one of the most basic of computer tricks. By combining a number of bits and treating it as a single entity we can make something that can support lots of unique values. If we need only a few unique values we can use a small number of bits. If we need a whole lot of unique values we can use a large number of bits.
You stumble across powers of two all over the place when dealing with computers. So let me include the value for some that arise frequently. 8 bits gives you 256 possible values. 10 bits gives you a little over a thousand (1024 to be exact) possible values. 16 bits gets you a little more than 64,000 possible values. 20 bits gives you a little over a million possible values. 24 bits gives you over 16 million possible values. And finally, 32 bits gives you a little over 4 billion possible values. That should give you an exact (for the small ones) or an approximate (for the larger ones) feel for various commonly encountered powers of 2.
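Those round numbers are easy to verify for yourself. Here is a quick sketch in Python (my choice of language here; nothing about networking requires it):

```python
# Each extra bit doubles the number of distinct values: n bits -> 2**n values.
for bits in (8, 10, 16, 20, 24, 32):
    print(f"{bits:2d} bits -> {2**bits:,} possible values")
```

Run it and you will see the exact values behind the "little over a thousand" and "little over 4 billion" approximations above.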
Let me get ahead of myself for a moment and tell you that in IPv4 (I'll get to what IPv4 is in a later post), a computer is assigned a 32 bit number. That means that IPv4 can handle 4 billion computers. We are going to have to talk about specific IPv4 computer numbers and understand how to do tricks with them. But, for the moment let's forget the what and the how of this 32 bit number and just focus on the number itself. Specifically, let's focus on how we represent it.
There are two obvious choices. First we can just treat it like a standard decimal number. It would have about the same number of digits as a ZIP+4 number (ten versus nine). And network people could have gone down this path. But they didn't. The other obvious choice would be to list a string of 32 characters consisting of zeroes and ones. That would have been an unwieldy approach so it too was discarded. Then there are some other approaches popular within the computer community. Three binary digits can be grouped together to form an octal digit and represented by the digits 0-7. That is a common trick. The resulting number would shrink to 11 characters. That's doable. Then there's the more modern variant on octal, hexadecimal. Here 0-9 are augmented with a-f so that there are sixteen characters available. That would result in a pretty manageable 8 character number. It could have been used, but it wasn't either.
Instead a hybrid scheme was adopted. The 32 bits were broken into four 8-bit subgroups. For obscure reasons each subgroup is technically called an octet. The more common term for a tight grouping of 8 bits is a byte. But for essentially political reasons "octet" is frequently used in the literature. I will use byte and octet interchangeably. A decimal number in the range of 0-255 is used to represent the value of each octet. So an IPv4 address is commonly expressed in the form a.b.c.d where a, b, c, and d are each replaced with a number between 0 and 255. This would seem to be almost as clumsy as just using a number between zero and four billion but it isn't. One reason is that all modern computers use memory that is divided into bytes. So each number represents its own byte, and computer people find themselves translating between the 8-bit binary representation of a byte and its decimal equivalent all the time. So they've gotten good at it. And here's one trick they commonly use.
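As a sketch of that hybrid scheme, here is how you could split a 32-bit number into the familiar a.b.c.d form in Python (the function name and the sample address are just for illustration):

```python
def to_dotted_quad(n):
    """Peel the four octets off a 32-bit number, highest byte first."""
    return ".".join(str((n >> shift) & 0xFF) for shift in (24, 16, 8, 0))

print(to_dotted_quad(3232235777))  # 192.168.1.1
print(to_dotted_quad(0))           # 0.0.0.0
print(to_dotted_quad(2**32 - 1))   # 255.255.255.255
```

Shifting and masking with `0xFF` (binary 11111111) is exactly the "each number represents its own byte" idea from the text.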
You need to become familiar with the following sequence: 1,2,4,8,16,32,64,128. We have seen that the "2, 4, 8" part of this sequence are powers of two. And 1 is also a power of two. It is two raised to the zero-th power. In fact all the numbers in the list are powers of two. 2 is two raised to the first power, 4 is two raised to the second power, and so on. The list ends with 128, which is two raised to the seventh power.
This "powers of two" trick is used to move between the binary representation and the decimal representation of the same thing. Let's say we want a binary pattern of alternating 0s and 1s. First, line up our list in reverse order: 128, 64, 32, 16, 8, 4, 2, 1. Count our bit positions from left to right. It turns out we want the odd numbered positions to be a zero and the even numbered ones to be a one. So we take the even entries in the list, the ones that correspond to the locations we want to be a 1. We end up with 64, 16, 4, and 1. Add them up. We ignore the entries in the list that correspond to the locations we want to be a 0. The result is 85. A decimal 85 converted to binary will result in 01010101, just the pattern we want. We can use a similar trick to go backwards.
Start with the number 204. Now subtract the biggest number in the list that is smaller - 128. That leaves 76. Subtract 64, leaving 12; then 8, leaving 4; then 4, leaving 0. Now use our reverse order list. Traverse the list, putting down a 1 for each number that showed up while breaking 204 into pieces and a 0 for each number that didn't. We get 11001100 and that is the binary equivalent of 204. This seems complex but it works. And it gets pretty easy with some practice. Finally, here are two special cases that come up often enough to be worth memorizing: the decimal number 0 = 00000000 and the decimal number 255 = 11111111.
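The subtract-the-biggest-weight procedure above translates directly into code. Here is a sketch in Python; the names are mine, not anything standard:

```python
WEIGHTS = [128, 64, 32, 16, 8, 4, 2, 1]  # the reverse-order list from the text

def to_binary(n):
    """Convert 0-255 to an 8-bit pattern by subtracting weights as we go."""
    bits = ""
    for w in WEIGHTS:
        if n >= w:
            bits += "1"
            n -= w
        else:
            bits += "0"
    return bits

def to_decimal(bits):
    """Add up the weights wherever the 8-bit pattern has a 1."""
    return sum(w for w, b in zip(WEIGHTS, bits) if b == "1")

print(to_binary(204))                # 11001100
print(to_decimal("01010101"))        # 85
print(to_binary(0), to_binary(255))  # 00000000 11111111
```

Note that the two special cases at the end of the paragraph fall out automatically.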
Looking ahead again, IPv4 addresses are divided into nets and subnets. The way this is done is with something called a subnet mask. The first part of the mask consists of all 1s - that's the net part. The rest of the mask consists of 0s - that's the subnet part. With what we now know we can translate some popular subnet masks. 255.255.255.0 turns the first 24 bits on and leaves the last 8 bits off. The net part is a number between 0 and 16 million (roughly). The subnet part is a number between 0 and 255.
Two other common subnet masks are 255.0.0.0 (8 bits of net - 0-255 and 24 bits of subnet - 0-16 million) and 255.255.0.0 (16 bits of net - 0-64,000, and 16 bits of subnet - 0-64,000). Does that mean that the net/subnet break must happen between bytes? No. That's just a common way of doing it.
The following is a legal mask: 255.255.240.0. Let's take a close look to see why. First, we have a number of octets that have a value of 255. That means that all the bits in those octets are 1s. Then we have one octet with a funny value. That means some of its bits are 1s and some are 0s. We'll take a closer look at it shortly. Finally, all the octets after the funny valued one are 0. That means their bits are all 0s. If you see this pattern then you know the subnet mask is valid if the funny octet is valid. So let's decompose our funny value. 240 - 128 = 112. 112 - 64 = 48. 48 - 32 = 16. 16 - 16 = 0. Walking our reverse list yields a pattern of 11110000. Putting it all together you will find the net part of the mask is 20 bits long (0 to roughly 1 million) and the subnet part of the mask is 12 bits long (0-4095).
Are there any invalid subnet masks? Yes! In fact, most subnet masks are invalid. A valid subnet mask always consists of a number of 1s followed by all 0s. If you check all the above examples you will find that this is true in each case. But a subnet mask of 255.255.0.255 is invalid. Why? Because it consists of 16 1s followed by 8 0s followed by 8 1s. Once you hit a 0 all the remaining bits must also be 0.
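Both checks - decomposing the funny octet and making sure no 1 ever follows a 0 - are easy to automate. A sketch in Python (`mask_info` is a made-up name for illustration):

```python
def mask_info(mask):
    """Return (is_valid, net_bits) for a dotted-quad subnet mask.

    A valid mask is some number of 1s followed by nothing but 0s, so the
    full 32-bit string must never contain a 0 followed by a 1."""
    octets = [int(part) for part in mask.split(".")]
    bits = "".join(format(o, "08b") for o in octets)
    return "01" not in bits, bits.count("1")

print(mask_info("255.255.240.0"))  # (True, 20) - our funny-octet example
print(mask_info("255.255.255.0"))  # (True, 24)
print(mask_info("255.255.0.255"))  # (False, 24) - 1s after a 0, invalid
```

The second number returned is the length of the net part, which matches the 20-bit result we worked out for 255.255.240.0 by hand.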
You have now survived the really boring part. The rest of the posts in this series have some actual meat in them.
Thursday, September 24, 2015
Wednesday, September 16, 2015
Hard Disks
I recently ran into a problem with the hard disk on one of my computers. I'm not going to bore you with the details. The only thing I want to extract from my disk problem is that I found out that one aspect of hard disks is evolving. At this point you are probably thinking that this post is going to be all about technical computer stuff. You're right! If that's okay, keep reading. If it isn't, this is a good place to quit. I'll leave the details about what is evolving to the end of this post. Instead what I am going to do is one of those "how did we get here" things.
I have been around for a significant part of the creation and evolution of hard disks. I routinely find that the younger generation is completely unfamiliar with what I consider to be significant milestones in the story of how things got to be the way they now are. That's fine. There are lots of people who came before me and are up on milestones that happened before I was paying attention. So I am just as guilty of that sort of thing as the next person. But I am interested in history and the evolution of technology. And I am in a good position to lay it out, at least with respect to disk drives. So I want to do that for whoever might be interested while I am in a position to do so. Where to start?
I am going to start with the invention of the phonograph in 1877 by Thomas Edison. Edison was interested in being able to save and reproduce sound. The key components of the device he came up with were a cylinder and a needle. The needle was connected to a horn assembly which funneled the energy of the sound to where the needle was. This amplified the energy available to move the needle. And all this caused the needle to move by an amount proportional to the intensity of the sound. The needle ran along a spiral path on a foil-covered cylinder, with the sound causing wiggles in the foil that mirrored its pattern.
The point I want to make, and will be continually making, is that there were choices available to Edison. He made certain selections among those choices but he could have made other selections and ended up with a working device. In Edison's case he made two selections I want to highlight. He selected a cylindrical shape and a spiral path. Focusing on the rest of the machine for a moment, these choices allowed him to keep the needle in a fixed position. This allowed him to connect an elaborate funnel mechanism to the needle. And this allowed him to focus enough sound energy that the needle moved a relatively large amount under the influence of relatively quiet sounds. Now back to the cylinder and the spiral.
In Edison's time there existed a machine called a lathe. A lathe rotates a piece of material like wood. If you move a fixed blade along the axis of rotation you can grind complex shapes into long skinny pieces of wood. A typical example of this is a baseball bat. But the profile of the finished piece can be pretty much anything you want. Lathes turned out to be very useful woodworking devices and they were common in any well equipped shop of Edison's day. Other lathes were designed to handle other materials like metal.
Putting a foil covered cylinder on a lathe-like device allowed the cylinder to move under the fixed needle along a long path. Adding something called a worm screw allowed the cylinder to shift evenly. So instead of the needle inscribing a circle as the cylinder rotated it inscribed a spiral as the cylinder simultaneously rotated and moved slowly sideways. This was the simplest mechanism available to Edison to allow a needle to move along a long groove on a relatively small device (the recording cylinder).
But it turned out that there was a better, although more complex solution available. By using a more complex mechanism the same long spiral could be inscribed on a flat circular platter. After some years competitors to Edison came out with "platter based" phonographs. The platters were more compact and quickly came to dominate the market. We will be returning to all these choices as the story continues. I am now going to fast forward to roughly the end of World War II.
About this time a magnetic technique evolved that was capable of recording sound. In its initial implementation a wire was coated with a material containing small magnetizable particles. A "recording head" could write a magnetic signal into the particles. A "reading head" could read it back. The result was something mostly lost to history called a wire recorder. But a lot of tinkering quickly morphed this into something else. Instead of glopping the magnetic material onto a wire it got glopped onto a thin strip of strong material something like movie film. In a popular incarnation up to 4 "tracks" could be recorded as paths along a long 1/4 inch wide strip. By using the first and third tracks to record a "left" and "right" signal a stereo recording could be made.
Flipping the tape over turned tracks two and four into tracks one and three and the tape could be played back in the other direction. This gave the tape a "B" side, analogous to the "B" side of a record. This all came together as the "reel to reel" audio tape recorder/player that was popular for several decades. After a time the tape was packaged into a "cassette" housing that made it easier to use. The concept remained the same but the repackaging gave us the audio cassette recorder/player that extended the life of audio tape machines for another couple of decades.
And fairly early people figured out that you could record computer data instead of sound using the same or similar equipment. So by the '50s you had tape recording systems that were specially adapted for computer use. Instead of using a ribbon that was a quarter of an inch across they used one that was a half an inch across. And the technology evolved over the years to increase the amount of data that could be recorded on a standard reel of computer tape. But no matter what the engineers did the result was pretty inflexible.
Suppose the tape was positioned near the front of the reel and you wanted to access data toward the back. The only thing to do was to spin through all the intervening tape to get to where you wanted to go. You had the same problem if the reel was positioned toward the back and you wanted to access data near the front. A reel of tape was a "sequential access" system. But "random access" is incredibly convenient in lots of situations. And that led, also in the '50s, to the development of the disk drive.
The disk drive was a throwback to the flat phonograph record. A rigid metal platter was smeared with the same kind of magnetic goo as that used for reels of magnetic tape. The needle was replaced by a "read head" (and a "write head") that was kept in a relatively fixed position. The platter was allowed to revolve beneath the head. There were no longer physical grooves as in a phonograph record but there was a magnetic path that performed the same function. With the phonograph record the spiral groove automatically carried the needle slowly from the outside of the record to the center. You just needed to make sure that the arm the needle was on could pivot. (That was one of the complexities that had to be added to move from the Edison "cylinder" phonograph design to the later "platter" design.) But in the disk drive case there was no groove. What to do?
This question caused early disk drive designers to make a different choice than their phonograph brethren had. They dumped the spiral design and opted for a series of concentric circles. This design choice was an attempt to emulate Edison's "keep it simple stupid" thinking. They kept the "arm that pivots" idea for holding their needle-equivalent, the read/write head. But now they could use a system that pivoted the arm to a specific angle then held it there. What would now pass under the "head" would be a circular path on the disk surface.
The head movement mechanism would be more complicated than the equivalent mechanism on a phonograph but the additional complexity was a manageable amount. It would be designed to accurately pivot the head to any one of many specific angles then hold the head steady at that position. Reproducible results (the head could be reliably positioned to the correct circle or "track") were achievable. The circular path could be treated as if it was a short piece of magnetic tape. Within a specific "track", as they came to be called, you had the standard "sequential access" problem. But you could move to a distant piece of data by pivoting the head arm to the correct setting and waiting for the part of the track you were interested in to move under the head. It wasn't perfect but it was a big improvement.
And once you had one surface spinning away under a head it quickly became apparent that putting another head and another layer of goo on the bottom side of the platter was a good idea. This would double the capacity of your device. And since we have come this far, how about stacking multiple platters on the same spindle? Each platter would add two sides resulting in that much more capacity. And that's what happened. Devices were quickly developed consisting of several platters, all rotating around the same spindle, and with each surface having its own arm and its own read/write head. But this opened up an opportunity for another decision. Would all the heads be operated independently or would they all be "ganged" together? I don't know what thinking went into the decision but all disk drives I know of use the "ganged together" design. A single mechanism moves all the arms simultaneously to the same angle thus positioning all heads over the same track on their respective surfaces.
And implicit in all this is another design decision. Phonograph records rotate at a constant RPM. You have "33", "45", or "78" records. With phonographs this is a sensible decision. Again a rotating mechanism that runs at a constant speed is simpler. But there is something lost. The number of inches traversed along an outside groove of the record in one revolution is much larger than it is for an inside groove. But the amount of information that can be stored in an inch of groove is constant. So you are consuming a lot more inches of groove to store the same fixed amount of information (all rotations take the same amount of time) on an outside groove as you are on an inside groove. Engineers knew this. They took a look at how much information they could reliably store on an inside groove and used that to set the specifications for the entire phonograph record. They just lived with the loss caused by spinning the record too fast to make the best use of the outside grooves. But it did not have to be that way.
I have a "laserdisc" player. It is a technology that didn't really catch on for reasons I am not going to go into now. The technology was an early version of the CD/DVD/Blu-ray technology that is now ubiquitous. But time had marched on between the debut of the various generations of phonograph technology represented by "33", "45", and "78" records and the debut of the laserdisc. It was now possible to vary the rotation rate of a laserdisc depending on whether you were reading from the inside or the outside of the disk. I have a fancy laserdisc player that is capable of playing "CLV" or "CAV" disks. CAV stands for Constant Angular Velocity. This is like a phonograph record playing at a fixed RPM. A CLV disk uses a higher RPM when processing inner tracks and a lower RPM when processing outer tracks. This speed variability results in a Constant Linear Velocity. Every inch of path contains the same amount of data. And since the outer tracks are longer they can store more data than the inner ones, so a CLV disk holds more data. CAV supposedly gives you a sharper picture but I was never able to tell the difference.
So it is possible to go with a CLV design for disk drives but as far as I know, no one has decided to do that. All disk drives are CAV devices. And let me revisit another design decision. Early disk drives opted for a series of concentric circular paths. To this day that is still true, as far as I know. But my laserdisc did not. It opted for the phonograph style spiral path. And so do CDs, even digital "data" ones, and Blu-rays. Again the cost of dealing with complexity has dropped precipitously. So instead of just having a mechanism that swings the arm the head is on to a specific angle a much more complex method is used.
The arm is swung to approximately the correct setting. Then the track is found using some kind of complex method. I don't know the details but we can all testify that it works. Then data is read from the track. Embedded in the data is a track number. If it's the right track, fine. Otherwise the arm is repositioned and the process repeated until the correct track is found. And remember the track is a spiral. So complex but highly reliable techniques cause the head to jump back a turn of the spiral automatically once per rotation if it is important to stay fixed on the same track. The end result is to make a spiral track behave as if it were a series of concentric circular tracks. Magic!
Next I want to take a look at what is on the track. Here again choices have been made. Let me first discuss the format used on disk drives designed for use with IBM mainframe computers back in the stone age of computers (before PCs). The acronym (there is always an acronym) is CKD. It stands for Count Key Data. Disk drives were very expensive back then and this was an attempt to squeeze out all the performance the equipment was capable of. As I indicated above, a track can be thought of as a short piece of magnetic tape. I don't know all the details but at the lowest level you have to know where the data starts and where it ends. What I do know is that the engineers solved this problem and the solution involved something called "inter-record gaps". All we need to know about these beasts is that they are necessary and that they take up space. So on a track we have inter-record gaps and we have the parts we care about.
Early disk drives were unreliable so the last part of each block of "care about" stuff was a checksum, some data that could be used to confirm that the rest of the usable stuff had not gotten garbled. That leaves us with the C, K, and D parts. The Count was just a relative record number on the track. This allowed you to confirm that you were getting the block of data that you intended to get. The Key part was something that sounded like a good idea but never worked well in practice. The idea was you could give the disk drive a command that said "get me the record that has this specific key value". This would offload activity from the very expensive computer to the hopefully far less expensive disk drive. But nobody ever came up with a way to make good use of this capability. But it stayed in the spec. The rest of the "care about" stuff, the Data part, was exactly what you would expect. It was whatever data you said you wanted the record to contain. This is yet another example of the actual process being way more complex than you would think it would be.
And you can see this is quite a sophisticated approach. And have I mentioned that the amount of data could be anything you wanted as long as it fit on the track (or the unused remaining part of the track, if the track already contained some records)? IBM came out with a series of disk drives over a period of several decades. Not surprisingly, the size of a disk track increased as time went by. Two track lengths that characterized devices late in this sequence were 13030 bytes and 19069 bytes. Those were not exactly obvious choices. I assume they were dictated by what the technology of the time could be made to deliver. And these "capacity" numbers represented a best case. You could write a single record of the specified size on a single track. And, of course, it could not contain a key. But doing this allowed you to squeeze absolutely the most bytes of data possible onto a specific device. If you chose to write two or more records on the same track those pesky inter-record gaps got in the way and the amount of data the track would hold went down. Fortunately, IBM provided handy tables for figuring out how many records of a specific size would fit.
And, oh by the way, this CKD approach has completely fallen by the wayside. It has been replaced by a method called FBA. FBA stands for Fixed Block Architecture. The idea here is that all the data written on the drive consists of blocks of data that are all the same fixed size. Things are simplified. There are now no complex calculations in deciding if a block of data will fit on a track. But that simplification involves a trade off. IBM used a 4K (specifically 4096 bytes) block size on their FBA devices (actually the same physical device as its CKD sibling but with different microcode).
Consider the drive with a track size of 19069 bytes. We only have room for four 4K blocks totaling 16,384 bytes. (In case you are wondering, four 4K records fit even after you factor in the inter-record gaps.) We give up roughly 14% of the raw capacity of the track. Back when disk drives were really expensive that was too much of a penalty to pay in most situations and people were willing to put up with the complexity of the CKD architecture to get the additional capacity. But now disk drives are so cheap that people prefer the simple but less efficient FBA approach, so that's what they use.
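The arithmetic behind that trade-off is simple enough to sketch. The track and block sizes below come from the text; inter-record gap overhead is ignored:

```python
track_size = 19069   # bytes on one track of the late-model IBM drive
block_size = 4096    # IBM's FBA block size (4K)

blocks_per_track = track_size // block_size     # whole blocks that fit
used = blocks_per_track * block_size            # bytes actually usable
wasted = track_size - used                      # leftover the blocks can't reach
print(blocks_per_track, "blocks,", used, "bytes used,", wasted, "bytes idle")
print(f"{wasted / track_size:.1%} of the raw track capacity given up")
```

Four blocks fit, 2685 bytes per track sit idle, and you can see why that was painful when every byte of disk was expensive.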
Now "let's return to those thrilling days of yesteryear" (a quote from the old Lone Ranger radio and later TV show, and most recently a really terrible Johnny Depp movie - yuck). Remember how disk drives work. You have what is now called the head-disk assembly (the combination of arms, actuators, and heads). It swings into position so that the several heads can each process a circle on the appropriate disk surface. Imagine the whole mechanism was invisible and we only looked at the paths of the heads. You would have a set of equal diameter circles stacked one on top of the other. With a little imagination we can see this pattern as making up a cylinder, and "cylinder" became the shorthand name for the set of circles that the heads as a group could access once they were in position.
Then you have a number of heads and you select one. Then you pick the record you wish to process among the several that may exist on the track you have picked. It's computers, so we reduce all this to numbers. We have a cylinder number, a head number, and a record number. This trio of numbers can be used to uniquely specify a single record on a disk drive. For reasons I am not going to get into, IBM used the acronym CCHHR. And at the hardware level, this is how it still works. You swing the arms holding the heads to the location specified by the CC, you select the head specified by the HH, and you select the record specified by the R. But the fact that that's how it works is now completely disguised. So let's look into the disguise a bit.
Moving forward to a more recent era but still one that is a ways from the present, Microsoft used, and to some extent still uses, something called the FAT file system. FAT stands for File Allocation Table. The original version is now called FAT12, because each entry in the allocation table was 12 bits wide. Underneath, the disk was still addressed by what I called the CCHHR above: some bits specified the cylinder number, some bits the head number, and a few bits the record number. The FAT12 system was designed to handle floppy disks. The original PC floppy had one side, hence one head. Floppies used the FBA architecture and the specification for the original floppy called for exactly 8 records on a track. That specification topped out at 160KB of data because there weren't that many cylinders either (40, if you care).
The specification was quickly tweaked to allow for two sides and 9 records per track, yielding a total capacity of 360KB. (Later iterations kicked the capacity of "floppy" disks up to 1.44 MB - or 2.88 MB in a version that never really caught on.) The bits available for addressing were easily able to handle this. But within a couple of years the hard disk came along as an accessory to the PC. The original PC hard disk had a capacity of 10MB, tiny by current standards, but beyond the capability of FAT12 to handle. So the FAT12 specification was supplemented by the FAT16 specification, which could handle 10 MB and more. But the addressing scheme was still limited to so many bits for cylinders, so many bits for heads, and so many bits for records. And after not many years this became a problem.
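The floppy capacities quoted above fall straight out of the geometry. A sketch, assuming the standard 512-byte sector size of PC floppies:

```python
SECTOR = 512  # bytes per block on a PC floppy

def floppy_bytes(cylinders, heads, sectors_per_track):
    """Total capacity is just the product of the geometry numbers."""
    return cylinders * heads * sectors_per_track * SECTOR

print(floppy_bytes(40, 1, 8) // 1024, "KB")  # 160 KB: the original floppy
print(floppy_bytes(40, 2, 9) // 1024, "KB")  # 360 KB: two sides, 9 records
```

Multiply cylinders by heads by records per track by bytes per record and the 160KB and 360KB figures pop right out.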
It particularly became a problem because there was room for lots of heads. Heads equate to platters. But hardware makers found that it was a bad idea to have lots of platters. It was much easier to squeeze more tracks onto a surface and more blocks into a track. So the embarrassing situation developed where the head field was too big while the cylinder number and/or record number field was too small. It didn't take the hardware makers long to come up with a cheat. Why not lie? Claim your disk drive has twice as many heads as it actually has but half as many cylinders. As long as the disk controller faked things up properly the computer would never notice. And that's what disk makers did and it worked for a while.
But even with this trick they only had 16 bits to work with. If you used all possible combinations of all the bits you could only have 65,536 blocks of 512 bytes each. That's 32 MB, and it didn't take disk makers very long to figure out how to make disks bigger than 32 MB. Microsoft eventually moved on to FAT32. But by this time the whole CCHHR thing looked pretty ridiculous. Why not just call the first block on the first track of the first cylinder block "0"? (Computers like to count starting from zero rather than one.) Call the next block "1", and so on. The first block on the second track just uses the next number in line. You keep counting in the same manner until you get to the end of the disk. Things all of a sudden get much simpler. A disk's capacity is just x blocks and you let the relative block number be translated by the disk controller whatever way the controller wants to. As long as the disk is relatively fast and the block with a specific relative block number always ends up in the same place, who cares where it really is?
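The translation between the old trio of numbers and the single relative block number is purely mechanical. Here is the classic formula as a sketch (the 16-head, 63-record geometry is just an example, not any particular drive):

```python
def chs_to_block(cyl, head, rec, heads_per_cyl, recs_per_track):
    """Number the blocks straight through: all of cylinder 0, then cylinder 1...
    Record numbers traditionally start at 1; cylinders and heads start at 0."""
    return (cyl * heads_per_cyl + head) * recs_per_track + (rec - 1)

print(chs_to_block(0, 0, 1, 16, 63))  # 0 - the very first block on the disk
print(chs_to_block(0, 1, 1, 16, 63))  # 63 - first block under the next head
print(chs_to_block(1, 0, 1, 16, 63))  # 1008 - first block of cylinder 1
```

The controller just runs this kind of arithmetic in reverse to find the physical spot for any relative block number it is handed.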
That's how the situation has been handled for some time now. But we have again lost something. If you know how things actually are you can use that information to improve performance. It takes time to reposition the heads. If you arrange things so that the block you want is one of those that is under the heads at their current position, you can get to it faster. Now let's say that the block we want is coming up but is under a different head than the one that is currently selected. We can select a different head at electronic speeds. That's really fast. So theoretically you can play tricks to achieve top performance if you know the details of the disk geometry and can depend on your knowledge being correct. But it turned out to be really hard to actually get improved performance by playing these kinds of tricks.
And it turns out that there are other tricks that can be played under the modern rules. It is cheap to put a little intelligence and some buffering capability into modern disk controllers. In this environment controllers can play tricks. An easy one is to copy the data to a buffer instead of immediately writing it to disk. You tell the computer the write is done immediately. Then at some convenient later time you actually write the data to disk. If nothing goes wrong everything works as expected, only faster, and nobody is the wiser.
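Here is the deferred-write trick reduced to a toy model in Python. It is a sketch of the idea only; real controller firmware is vastly more involved:

```python
class WriteBackController:
    """Toy model of a controller that acknowledges writes before committing them."""
    def __init__(self, disk):
        self.disk = disk     # stand-in for the platters: block number -> data
        self.dirty = {}      # buffered writes not yet on the platters

    def write(self, block_no, data):
        self.dirty[block_no] = data    # just buffer it...
        return "done"                  # ...and report success immediately

    def read(self, block_no):
        # The buffer must win over the platters, or a read right after a
        # buffered write would return stale data.
        if block_no in self.dirty:
            return self.dirty[block_no]
        return self.disk.get(block_no)

    def flush(self):
        # "Some convenient later time": actually commit the writes.
        self.disk.update(self.dirty)
        self.dirty.clear()
```

The "if nothing goes wrong" caveat is visible right in the sketch: pull the plug before flush() runs and the buffered writes are gone even though the computer was told they were done.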
Another simple trick is called prospective read-ahead. If a certain block is read, what's the most likely next block to be read? The next one. What's the easiest block to read next? The next one. So if the controller reads the current block and passes it along to the computer, but then also reads the next block into its buffer without being asked, what's the harm? Nothing, if the buffer is not full. But the benefit is that the next block can be passed back to the computer immediately from the buffer if the controller in fact receives a read request for it. It turns out that these simple tricks and a number of much more complicated ones can be implemented by modern disk controllers. They result in an increased effective speed of the disk drive. But you can either go with these tricks or the old CKD tricks. It is somewhere between difficult and impossible to combine the two.
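Read-ahead reduces just as nicely to a toy model. Again this is only a sketch of the idea, not any real controller's logic:

```python
class ReadAheadController:
    """Toy model of one-block read-ahead: serve block n, prefetch block n+1."""
    def __init__(self, disk):
        self.disk = disk        # stand-in for the platters: block number -> data
        self.buffer = {}        # blocks fetched before anyone asked for them
        self.buffer_hits = 0

    def read(self, block_no):
        if block_no in self.buffer:
            self.buffer_hits += 1            # served instantly from the buffer
            data = self.buffer.pop(block_no)
        else:
            data = self.disk.get(block_no)   # had to go to the platters
        # Prefetch the next block while the heads are already in position.
        nxt = block_no + 1
        if nxt in self.disk and nxt not in self.buffer:
            self.buffer[nxt] = self.disk.get(nxt)
        return data
```

Read blocks 0, 1, 2 in order and only the first read has to go to the platters for its own data; the other two are buffer hits.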
I have touched indirectly on the final topic I want to discuss. It is the one I alluded to in the introduction. As I indicated above, back in the day IBM picked 4096 bytes for their FBA block size. So where did 512 bytes, the block size now in common use, come from? It turns out that it came from Unix. Unix has always used FBA architecture for its disks. I don't know why Unix picked this number. Here are a couple of theories. Unix is the bastard stepchild of an operating system called Multics. Multics was a joint development effort by General Electric, the Massachusetts Institute of Technology, and Bell Labs. I know very little about Multics beyond that. But it is possible that Unix took FBA and a 512 byte block from Multics.
The other theory I have has to do with the hardware Unix was originally developed on. Unix was originally developed at Bell Labs, which had a number of Multics systems in house at the time. But the original version of Unix was developed on minicomputers manufactured by Digital Equipment Corporation. GE still exists but has long since exited the computer business. DEC was for a time an extremely successful computer company. At its peak it was the second-largest computer company in the world, behind only IBM (at the time a very large company) and well ahead of Microsoft (at the time a small company) and Apple (at the time a very small company). But alas, DEC is no more and I know only a little more about DEC than I know about Multics.
And it is always possible that the original developers of Unix picked 512 for some other reason. And in any case, Microsoft adopted a block size of 512 for DOS and carried that decision over to Windows. And it has been a good choice for a very long time. But time marches on and with the march of time hard drive sizes keep growing. Now anything less than 1 GB in a single hard drive is considered small. To be considered large a hard drive now has to have a capacity of 1 TB or more. And these multi-terabyte drives only cost a few hundred dollars.
A 1 TB hard drive has about a zillion 512 byte blocks on it. That's pretty ridiculous. And the solution is obvious. Go to a bigger block size. And that process is under way. The industry has settled on a new size. It is, not surprisingly, 4096 bytes or 4K. Devices conforming to this standard are sometimes called "large-sector" or "Advanced Format" drives. And the existence of this move to large-sector drives is the thing I learned as a side effect of my recent hard drive problems. A 1 TB large-sector drive will only have 1/8 of a zillion blocks of data. Don't you feel much better now?
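For the skeptics, the "zillion" is easy to pin down, remembering that drive makers use decimal units, so 1 TB means 10^12 bytes:

```python
TERABYTE = 10 ** 12              # drive makers count in decimal units

blocks_512 = TERABYTE // 512
blocks_4k = TERABYTE // 4096

print(f"{blocks_512:,}")         # 1,953,125,000 blocks - call it 2 billion
print(f"{blocks_4k:,}")          # 244,140,625 blocks
print(blocks_512 // blocks_4k)   # 8 - exactly 1/8 as many blocks
```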
The transition to 4K blocks is, well, a transition. But it is just the next one in a long sequence of transitions. And I expect it to be a pretty smooth one. Most modern software is written in "C" or one of its children, C++, Java, etc. None of the standard I/O libraries for these languages try to look directly at the hardware in the same way that software written in the old days for IBM mainframes did. Instead they just expect to deal with a string of bytes. They are totally indifferent to the fact that the continuous string of bytes that they deal with might actually be handled in blocks inside some low level device driver. They can't see in there. And given that, they definitely don't care that the block size might change from 512 bytes per block to 4096 bytes per block at some point.
There are two exceptions to this. The first exception is the OS itself (Windows, Unix, iOS, etc.). Then there are some utilities whose job is to pay attention to what the hard drive actually looks like. They will care that the data blocks on a large-sector hard drive are 4096 bytes in size rather than 512 bytes. That's their job. But even most utilities won't notice the change as they don't concern themselves with low level hard drive issues.
Changes will be required. Microsoft has already incorporated these changes into the latest versions of its operating systems. Most hard drive oriented utilities have also incorporated the necessary changes in the newer versions of their offerings. And most operating system vendors are either ahead of Microsoft or, at worst, close behind.
Even given all that I would recommend that the average user stay away from these new devices for a while. You generally don't need a disk drive that is big enough (say something over 10 GB) where the change might possibly make a difference. So for the moment stick to disks that use the old format. If you see a disk that says "4K" or "large-sector" pick a different model that doesn't. This is advice that I think will hold, say, through 2017. By that time all the necessary changes will have been rolled out and the bugs fixed. This recommendation to possibly go with the new specification at that point only pertains to people purchasing new hardware that comes with the latest (or near-latest) version of the operating system. If you are running an old OS, particularly one that is pre-2014, definitely stay away from the new hard drives.
And now you have it, the ammunition to bore to death anyone you meet at a cocktail party that needs boring to death. I try to help where I can.
I have been around for a significant part of the creation and evolution of hard disks. I routinely find that the younger generation is completely unfamiliar with what I consider to be significant milestones in the story of how things got to be the way they now are. That's fine. There are lots of people who came before me and are up on milestones that happened before I was paying attention. So I am just as guilty of that sort of thing as the next person. But I am interested in history and the evolution of technology. And I am in a good position to lay it out, at least with respect to disk drives. So I want to do that for whoever might be interested while I am in a position to do so. Where to start?
I am going to start with the invention of the phonograph in 1877 by Thomas Edison. Edison was interested in being able to save and reproduce sound. The key components of the device he came up with were a cylinder and a needle. The needle was connected to a horn assembly which funneled the energy of the sound to where the needle was. This amplified the energy available to move the needle. And all this caused the needle to move by an amount proportional to the intensity of the sound. The needle ran along a spiral path on a foil-covered cylinder, with the sound causing wiggles that mirrored its pattern.
The point I want to make, and will be continually making, is that there were choices available to Edison. He made certain selections among those choices but he could have made other selections and ended up with a working device. In Edison's case he made two selections I want to highlight. He selected a cylindrical shape and a spiral path. Focusing on the rest of the machine for a moment, these choices allowed him to keep the needle in a fixed position. This allowed him to connect an elaborate funnel mechanism to the needle. And this allowed him to focus enough sound energy that the needle moved a relatively large amount under the influence of relatively quiet sounds. Now back to the cylinder and the spiral.
In Edison's time there existed a machine called a lathe. A lathe rotates a piece of material like wood. If you move a fixed blade along the axis of rotation you can grind complex shapes into long skinny pieces of wood. A typical example of this is a baseball bat. But the profile of the finished piece can be pretty much anything you want. Lathes turned out to be very useful woodworking devices and they were common in any well equipped shop of Edison's day. Other lathes were designed to handle other materials like metal.
Putting a foil covered cylinder on a lathe-like device allowed the cylinder to move under the fixed needle along a long path. Adding something called a worm screw allowed the cylinder to shift evenly. So instead of the needle inscribing a circle as the cylinder rotated it inscribed a spiral as the cylinder simultaneously rotated and moved slowly sideways. This was the simplest mechanism available to Edison to allow a needle to move along a long groove on a relatively small device (the recording cylinder).
But it turned out that there was a better, although more complex solution available. By using a more complex mechanism the same long spiral could be inscribed on a flat circular platter. After some years competitors to Edison came out with "platter based" phonographs. The platters were more compact and quickly came to dominate the market. We will be returning to all these choices as the story continues. I am now going to fast forward to roughly the end of World War II.
About this time a magnetic technique evolved that was capable of recording sound. In its initial implementation a wire was coated with a material containing small magnetizable particles. A "recording head" could write a magnetic signal into the particles. A "reading head" could read it back. The result was something mostly lost to history called a wire recorder. But a lot of tinkering quickly morphed this into something else. Instead of glopping the magnetic material onto a wire it got glopped onto a thin strip of strong material something like movie film. In a popular incarnation up to 4 "tracks" could be recorded as paths along a long 1/4 inch wide strip. By using the first and third track to record a "left" and "right" signal a stereo recording could be made.
Flipping the tape over turned tracks two and four into tracks one and three and the tape could be played back in the other direction. This gave the tape a "B" side, analogous to the "B" side of a record. This all came together as the "reel to reel" audio tape recorder/player that was popular for several decades. After a time the tape was packaged into a "cassette" housing that made it easier to use. The concept remained the same but the repackaging gave us the audio cassette recorder/player that extended the life of audio tape machines for another couple of decades.
And fairly early people figured out that you could record computer data instead of sound using the same or similar equipment. So by the '50s you had tape recording systems that were specially adapted for computer use. Instead of using a ribbon that was a quarter of an inch across they used one that was a half an inch across. And the technology evolved over the years to increase the amount of data that could be recorded on a standard reel of computer tape. But no matter what the engineers did the result was pretty inflexible.
Suppose the tape was positioned near the front of the reel and you wanted to access data toward the back. The only thing to do was to spin through all the intervening tape to get to where you wanted to go. You had the same problem if the reel was positioned toward the back and you wanted to access data near the front. A reel of tape was a "sequential access" system. But "random access" is incredibly convenient in lots of situations. And that led, also in the '50s, to the development of the disk drive.
The disk drive was a throwback to the flat phonograph record. A rigid metal platter was smeared with the same kind of magnetic goo as that used for reels of magnetic tape. The needle was replaced by a "read head" (and a "write head") that was kept in a relatively fixed position. The platter was allowed to revolve beneath the head. There were no longer physical grooves as in a phonograph record but there was a magnetic path that performed the same function. With the phonograph record the spiral groove automatically carried the needle slowly from the outside of the record to the center. You just needed to make sure that the arm the needle was on could pivot. (That was one of the complexities that had to be added to move from the Edison "cylinder" phonograph design to the later "platter" design.) But in the disk drive case there was no groove. What to do?
This question caused early disk drive designers to make a different choice than their phonograph brethren had. They dumped the spiral design and opted for a series of concentric circles. This design choice was an attempt to emulate Edison's "keep it simple stupid" thinking. They kept the "arm that pivots" idea for holding their needle-equivalent, the read/write head. But now they could use a system that pivoted the arm to a specific angle then held it there. What would now pass under the "head" would be a circular path on the disk surface.
The head movement mechanism would be more complicated than the equivalent mechanism on a phonograph but the additional complexity was manageable. It would be designed to accurately pivot the head to any one of many specific angles then hold the head steady at that position. Reproducible results (the head could be reliably positioned to the correct circle or "track") were achievable. The circular path could be treated as if it was a short piece of magnetic tape. Within a specific "track", as they came to be called, you had the standard "sequential access" problem. But you could move to a distant piece of data by pivoting the head arm to the correct setting and waiting for the part of the track you were interested in to move under the head. It wasn't perfect but it was a big improvement.
And once you had one surface spinning away under a head it quickly became apparent that putting another head and another layer of goo on the bottom side of the platter was a good idea. This would double the capacity of your device. And since we have come this far, how about stacking multiple platters on the same spindle? Each platter would add two sides resulting in that much more capacity. And that's what happened. Devices were quickly developed consisting of several platters, all rotating around the same spindle, and with each surface having its own arm and its own read/write head. But this opened up an opportunity for another decision. Would all the heads be operated independently or would they all be "ganged" together? I don't know what thinking went into the decision but all disk drives I know of use the "ganged together" design. A single mechanism moves all the arms simultaneously to the same angle thus positioning all heads over the same track on their respective surfaces.
And implicit in all this is another design decision. Phonograph records rotate at a constant RPM. You have "33", "45", or "78" records. With phonographs this is a sensible decision. Again a rotating mechanism that runs at a constant speed is simpler. But there is something lost. The number of inches traversed in a groove along the outside of the record in one revolution is much larger than it is for an inside groove. But the amount of information that can be stored in an inch of groove is constant. So you are consuming a lot more inches of groove to store the same fixed amount of information (all rotations take the same amount of time) on an outside groove as you are on an inside groove. Engineers knew this. They took a look at how much information they could reliably store on an inside groove and used that to set the specifications for the entire phonograph record. They just lived with the loss caused by spinning the record too fast to make the best use of the outside grooves. But it did not have to be that way.
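The size of the loss is easy to estimate. Taking some roughly-right dimensions for a 12-inch "33" record (the groove diameters below are illustrative figures, not from any specification), the outer groove covers about two and a half times as many inches per revolution as the inner one:

```python
import math

# Approximate groove diameters for a 12-inch LP, in inches.
# Illustrative numbers only.
outer_diameter = 11.5
inner_diameter = 4.75

outer_per_rev = math.pi * outer_diameter   # inches of groove per revolution
inner_per_rev = math.pi * inner_diameter

print(f"outer: {outer_per_rev:.1f} in/rev")
print(f"inner: {inner_per_rev:.1f} in/rev")
print(f"ratio: {outer_per_rev / inner_per_rev:.2f}")   # about 2.4
```

Since every revolution holds the same amount of sound, the outer grooves store it at well under half the density the medium could support.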
I have a "laserdisc" player. It is a technology that didn't really catch on for reasons I am not going to go into now. The technology was an early version of the CD/DVD/Blu-ray technology that is now ubiquitous. But time had marched on between the debut of the various generations of phonograph technology represented by "33", "45", and "78" records and the debut of the laserdisc. It was now possible to vary the rotation rate of a laserdisc depending on whether you were reading from the inside or the outside of the disk. I have a fancy laserdisc player that is capable of playing "CLV" or "CAV" disks. CAV stands for Constant Angular Velocity. This is like a phonograph record playing at a fixed RPM. A CLV disk uses a higher RPM when processing inner tracks and a lower RPM when processing outer tracks. This speed variability results in a Constant Linear Velocity. Every inch of path contains the same amount of data. But the outer tracks can store more data than the inner ones so a CLV disk holds more data. This supposedly gives you a sharper picture but I was never able to tell the difference.
So it is possible to go with a CLV design for disk drives but as far as I know, no one has decided to do that. All disk drives are CAV devices. And let me revisit another design decision. Early disk drives opted for a series of concentric circular paths. To this day that is still true, as far as I know. But my laserdisc did not. It opted for the phonograph style spiral path. And so do CDs, even digital "data" ones, and Blu-rays. Again the cost of dealing with complexity has dropped precipitously. So instead of just having a mechanism that swings the arm the head is on to a specific angle a much more complex method is used.
The arm is swung to approximately the correct setting. Then the track is found using some kind of complex method. I don't know the details but we can all testify that it works. Then data is read from the track. Embedded in the data is a track number. If it's the right track, fine. Otherwise the arm is repositioned and the process repeated until the correct track is found. And remember the track is a spiral. So complex but highly reliable techniques cause the head to jump back one turn of the spiral automatically once per rotation if it is important to stay fixed on the same track. The end result is to make a spiral track behave as if it were a series of concentric circular tracks. Magic!
Next I want to take a look at what is on the track. Here again choices have been made. Let me first discuss the format used on disk drives designed for use with IBM mainframe computers back in the stone age of computers (before PCs). The acronym (there is always an acronym) is CKD. It stands for Count Key Data. Disk drives were very expensive back then and this was an attempt to squeeze out all the performance the equipment was capable of. As I indicated above, a track can be thought of as a short piece of magnetic tape. I don't know all the details but at the lowest level you have to know where the data starts and where it ends. What I do know is that the engineers solved this problem and the solution involved something called "inter-record gaps". All we need to know about these beasts is that they are necessary and that they take up space. So on a track we have inter-record gaps and we have the parts we care about.
Early disk drives were unreliable so the last part of each block of "care about" stuff was a checksum, some data that could be used to confirm that the rest of the usable stuff had not gotten garbled. That leaves us with the C, K, and D parts. The Count was just a relative record number on the track. This allowed you to confirm that you were getting the block of data that you intended to get. The Key part was something that sounded like a good idea but never worked well in practice. The idea was you could give the disk drive a command that said "get me the record that has this specific key value". This would offload activity from the very expensive computer to the hopefully far less expensive disk drive. But nobody ever came up with a way to make good use of this capability. But it stayed in the spec. The rest of the "care about" stuff, the Data part, was exactly what you would expect. It was whatever data you said you wanted the record to contain. This is yet another example of the actual process being way more complex than you would think it would be.
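The checksum part of the record is the one piece of this that has survived everywhere. The idea can be sketched in a few lines; CRC-32 is used here as a stand-in, not because the CKD hardware used that particular check code:

```python
import zlib

def make_record(data: bytes) -> bytes:
    """Append a 4-byte CRC-32 so garbling can be detected on read-back."""
    return data + zlib.crc32(data).to_bytes(4, "big")

def read_record(record: bytes) -> bytes:
    """Verify the checksum, then hand back the data."""
    data, stored = record[:-4], record[-4:]
    if zlib.crc32(data).to_bytes(4, "big") != stored:
        raise IOError("checksum mismatch: record got garbled")
    return data

record = make_record(b"payroll data")
assert read_record(record) == b"payroll data"
# Garble even a single bit and read_record() raises IOError instead of
# silently handing back bad data.
```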
And you can see this is quite a sophisticated approach. And have I mentioned that the amount of data could be anything you wanted, as long as it fit on the track (or the unused remaining part of the track, if the track already contained some records)? IBM came out with a series of disk drives over a period of several decades. Not surprisingly, the size of a disk track increased as time went by. Two track lengths that characterized devices late in this sequence were 13030 bytes and 19069 bytes. Those were not exactly obvious choices. I assume they were dictated by what the technology of the time could be made to deliver. And these "capacity" numbers represented a best case. You could write a single record of the specified size on a single track. And, of course, it could not contain a key. But doing this allowed you to squeeze absolutely the most bytes of data possible onto a specific device. If you chose to write two or more records on the same track those pesky inter-record gaps got in the way and the amount of data the track would hold went down. Fortunately, IBM provided handy tables for figuring out how many records of a specific size would fit.
And, oh by the way, this CKD approach has completely fallen by the wayside. It has been replaced by a method called FBA. FBA stands for Fixed Block Architecture. The idea here is that all the data written on the drive consists of blocks of data that are all the same fixed size. Things are simplified. There are now no complex calculations in deciding if a block of data will fit on a track. But that simplification involves a trade-off. IBM used a 4K (specifically 4096 bytes) block size on their FBA devices (actually the same physical device as its CKD sibling but with different microcode).
Consider the CKD device with a track size of 19069 bytes. We only have room for four 4K blocks totaling 16,384 bytes. (In case you are wondering, four 4K records fit even after you factor in the inter-record gaps.) We give up about 14% of the capacity of the track. Back when disk drives were really expensive that was too much of a penalty to pay in most situations and people were willing to put up with the complexity of the CKD architecture to get the additional capacity. But now disk drives are so cheap that people prefer to go with the simple but less efficient FBA approach, so they do.
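The arithmetic behind that trade-off:

```python
TRACK_BYTES = 19069      # capacity of the larger IBM track mentioned above
BLOCK = 4096             # IBM's FBA block size

blocks = TRACK_BYTES // BLOCK          # whole blocks that fit on the track
used = blocks * BLOCK
wasted = 1 - used / TRACK_BYTES

print(blocks, used, f"{wasted:.0%}")   # 4 16384 14%
```

(This ignores the inter-record gaps, which eat into both the CKD and FBA layouts, so treat the percentage as approximate.)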
Now "let's return to those thrilling days of yesteryear" (a quote from the old Lone Ranger radio and later TV show, and most recently a really terrible Johnny Depp movie - yuck). Remember how disk drives work. You have what is now called the head-disk assembly (the combination of arms, actuators, and heads). It swings into position so that the several heads can each process a circle on the appropriate disk surface. Imagine the whole mechanism was invisible and we only looked at the tracks of the heads. You would have a set of equal diameter circles stacked one on top of the other. With a little imagination we could see this pattern making up a cylinder, and "cylinder" became the shorthand name for the set of circles that the heads as a group could access, once they were ready.
Then you have a number of heads and you select one. Then you pick the record you wish to process among the several that may exist on the track you have picked. It's computers so we reduce all this to numbers. We have a cylinder number, a head number, and a record number. This trio of numbers can be used to uniquely specify a single record on a disk drive. For reasons I am not going to get into IBM used the acronym CCHHR. And at the hardware level, this is how it still works. You swing the arms holding the heads to the location specified by the CC, you select the head specified by the HH and you select the record specified by the R. That's still how it works. But the fact that that's how it works is now completely disguised. So let's look into the disguise a bit.
Moving forward to a more recent era but still one that is a ways from the present, Microsoft used, and to some extent still uses, something called the FAT file system. FAT stands for File Allocation Table. The original version is now called FAT12. That's because each entry in the allocation table was 12 bits wide, which put a ceiling on how many distinct chunks of disk the file system could keep track of. Underneath, the disk was still being addressed by the cylinder, head, and record numbers I called CCHHR above, each confined to a fixed-width field. The FAT12 system was designed to handle floppy disks. The original PC floppy had one side, hence one head. Floppies used the FBA architecture and the specification for the original floppy called for exactly 8 records on a track. That specification was capable of holding 160KB of data because there weren't that many cylinders either (40, if you care).
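The floppy capacities quoted here all fall straight out of the geometry, given that PC floppies used 512-byte records:

```python
SECTOR = 512   # bytes per record on PC floppies

# Original single-sided diskette: 40 cylinders, 1 head, 8 records per track.
print(40 * 1 * 8 * SECTOR)    # 163840 bytes = 160KB

# The double-sided, 9-record tweak that came shortly after:
print(40 * 2 * 9 * SECTOR)    # 368640 bytes = 360KB

# The later 3.5-inch "1.44 MB" diskette: 80 cylinders, 2 heads, 18 records.
# 1474560 bytes is "1.44 MB" only in the odd mixed unit of 1000 x 1024 bytes.
print(80 * 2 * 18 * SECTOR)   # 1474560
```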
The specification was quickly tweaked to allow for two sides and 9 records per track yielding a total capacity of 360KB. (Later iterations kicked the capacity of "floppy" disks up to 1.44 MB - or 2.88 MB in a version that never really caught on.) The allocation of bits in the FAT12 spec was easily able to handle this. But within a couple of years the hard disk came along as an accessory to the PC. The original PC hard disk had a capacity of 10MB, tiny by current standards, but beyond the capability of FAT12 to handle. So the FAT12 specification was supplemented by the FAT16 specification, which could handle 10 MB and more. But it was still limited to so many bits for cylinders, so many bits for heads, and so many bits for records. And after not many years this became a problem.
It particularly became a problem because there was room for lots of heads. Heads equate to platters. But hardware makers found that it was a bad idea to have lots of platters. It was much easier to squeeze more tracks on a surface and more blocks in a track. So the embarrassing situation developed where the head field was too big while the cylinder number and/or record number field was too small. It didn't take the hardware makers to come up with a cheat. Why not lie? Claim your disk drive has twice as many heads as it actually has but half as many cylinders. As long as the disk controller faked things up properly the computer would never notice. And that's what disk makers did and it worked for a while.
But even with this trick they only had 16 bits to work with. If you used all possible combinations of all the bits you could only have 65,536 blocks of 512 bytes each. That's 32 MB and it didn't take disk makers very long to figure out how to make disks bigger than 32 MB. Microsoft eventually moved on to FAT32. But by this time the whole CCHHR thing looked pretty ridiculous. Why not just call the first block on the first track of the first cylinder block "0" (Computers like to count starting from zero rather than one). Call the next block "1", and so on. The first block on the second track would just use the next number in line. You just keep counting in the same manner until you get to the end of the disk. Things all of a sudden get much simpler. A disk capacity is just x blocks and you let the relative block number be translated by the disk controller whatever way the controller wants to. As long as the disk is relatively fast and the block with a specific relative block number ends up in same place, who cares where it really is?
That's how the situation has been handled for some time now. But we have again lost something. If you know how things actually are you can use that information to improve performance. It takes time to reposition the heads. If you arrange things so that the block you want is one of those that is under the heads at their current position, you can get to it faster. Now lets say that the block we want is coming up but is under a different head than the one that is currently selected. We can select a different head at electronic speeds. That's really fast. So theoretically you can play tricks to achieve top performance if you know the details of the disk geometry and can depend on your knowledge being correct. But it turned out to be really hard to actually get improved performance by playing these kinds of tricks.
And it turns out that there are other tricks that can be played under the modern rules. It is cheap to put a little intelligence and some buffering capability into modern disk controllers. In this environment controllers can play tricks. An easy one is to copy the data to a buffer instead of immediately writing it to disk. You tell the computer the write is done immediately. Then at some convenient later time you actually write the data to disk. If nothing goes wrong everything works as expected, only faster, and nobody is the wiser.
Another simple trick is called prospective read-ahead. If a certain block is read what's the most likely next block to be read? The next one. What's the easiest block to read next? The next one. So if the controller reads the current block, passes it along to the computer but then also reads the next block into its buffer without being asked what's the harm? Nothing, if the buffer is not full. But the benefit is that the next block can be passed back to the computer immediately from the buffer if the controller in fact receives a read request for it. It turns out that these simple tricks and a number of much more complicated ones can be implemented by modern disk controllers. They result in an increased effective speed of the disk drive. But you can either go with these tricks or the old CKD tricks. It is somewhere between difficult and impossible to combine the two.
I have touched indirectly on the final topic I want to discuss. It is the one I alluded to in the introduction. As I indicated above, back in the day IBM picked 4096 bytes for their FBA block size. So where did 512 bytes, the block size now in common use, come from? It turns out that it came from Unix. Unix has always used FBA architecture for their disks. I don't know why Unix picked this number. Here are a couple of theories. Unix is the bastard stepchild of an operating system called Multics. Multics was a joint development effort by General Electric and the Massachusetts Institute of Technology. I know very little about Multics beyond that. But it is possible that Unix took FBA and a 512 byte block from Multics.
The other theory I have has to do with the hardware Unix was originally developed on. Unix was originally developed at Bell Labs, which had a number of Multics systems in house at the time. But the original version of Unix was developed on minicomputers manufactured by Digital Equipment Corporation. GE still exists but has long since exited the computer business. DEC was for a time an extremely successful computer company. For a couple of years it was so successful that it had the biggest market capitalization (stock price times number of outstanding shares) of any computer company. At that time it was bigger than IBM (at the time a very large company) and Microsoft (at the time a small company) and Apple (at the time a very small company). But alas, DEC is no more and I know only a little more about DEC than I know about Multics.
And it is always possible that the original developers of Unix picked 512 for some other reason. And in any case, Microsoft adopted a block size of 512 for DOS and carried that decision over to Windows. And it has been a good choice for a very long time. But time marches on and with the march of time hard drive sizes keep growing. Now anything less than 1 GB in a single hard drive is considered small. To be considered large a hard drive now has to have a capacity of 1 TB or more. And these multi-terabyte drives only cost a few hundred dollars.
A 1 TB hard drive has about two billion 512 byte blocks on it. Call it a zillion. That's pretty ridiculous. And the solution is obvious. Go to a bigger block size. And that process is under way. The industry has settled on a new size. It is, not surprisingly, 4096 bytes or 4K. Devices conforming to this standard are sometimes called "large-sector" drives. And the existence of this move to large-sector drives is the thing I learned as a side effect of my recent hard drive problems. A 1 TB large-sector drive will only have 1/8 of a zillion blocks of data. Don't you feel much better now?
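The arithmetic behind that zillion is easy to check (using the decimal terabyte, 10^12 bytes, that drive makers use):

```python
# How many blocks fit on a 1 TB drive at each block size?
capacity = 10**12                 # 1 TB, the way drive makers count

print(capacity // 512)            # 1953125000 -- about two billion 512 byte blocks
print(capacity // 4096)           # 244140625  -- about a quarter billion 4K blocks
print((capacity // 512) // (capacity // 4096))   # 8 -- hence "1/8 of a zillion"
```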
The transition to 4K blocks is, well, a transition. But it is just the next one in a long sequence of transitions. And I expect it to be a pretty smooth one. Most modern software is written in "C" or one of its children, C++, Java, etc. None of the standard I/O libraries for these languages try to look directly at the hardware in the way that software written in the old days for IBM mainframes did. Instead they just expect to deal with a string of bytes. They are totally indifferent to the fact that the continuous string of bytes that they deal with might actually be handled in blocks inside some low level device driver. They can't see in there. And given that, they definitely don't care that the block size might change from 512 bytes per block to 4096 bytes per block at some point.
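That byte-stream indifference is easy to demonstrate. The little Python sketch below reads the same file through two different buffer sizes; the application-level code gets identical bytes either way and never sees the blocking underneath.

```python
# Demonstration: application code asks for bytes, not blocks, and the
# result is the same no matter what buffer size sits underneath.
import os
import tempfile

tmp = tempfile.NamedTemporaryFile(delete=False)
tmp.write(bytes(range(256)) * 40)   # 10240 bytes of test data
tmp.close()

with open(tmp.name, "rb", buffering=512) as f:
    a = f.read(1000)                # an "odd" size that spans blocks
with open(tmp.name, "rb", buffering=4096) as f:
    b = f.read(1000)

print(a == b)                       # True -- the application can't tell
os.remove(tmp.name)
```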
There are two exceptions to this. The first exception is the OS itself (Windows, Unix, iOS, etc.). The second is utilities whose job is to pay attention to what the hard drive actually looks like. They will care that the data blocks on a large-sector hard drive are 4096 bytes in size rather than 512 bytes. That's their job. But even most utilities won't notice the change as they don't concern themselves with low level hard drive issues.
Changes will be required. Microsoft has already incorporated these changes into the latest versions of its operating systems. Most hard drive oriented utilities have also incorporated the necessary changes in the newer versions of their offerings. And most operating system vendors are either ahead of Microsoft or, at worst, close behind.
Even given all that I would recommend that the average user stay away from these new devices for a while. You generally don't need a disk drive big enough (say something over 10 GB) for the change to possibly make a difference. So for the moment stick to disks that use the old format. If you see a disk that says "4K" or "large-sector", pick a different model that doesn't. This is advice that I think will hold, say, through 2017. By that time all the necessary changes will have been rolled out and the bugs fixed. The recommendation to go with the new specification at that point only pertains to people purchasing new hardware that comes with the latest (or near-latest) version of the operating system. If you are running an old OS, particularly one that is pre-2014, definitely stay away from the new hard drives.
And now you have it, the ammunition to bore to death anyone you meet at a cocktail party who needs boring to death. I try to help where I can.
Thursday, August 27, 2015
Earthquakes
The New Yorker magazine published an article by Kathryn Schulz in their July 20, 2015 issue called "The Really Big One". As a piece of writing it was extremely well done. As a piece of science it was junk. I expect more from The New Yorker. They are supposed to be one of the few remaining bastions of journalistic integrity. But the magazine business is a business. And modern business principles place popularity before all else. And popularity is founded on simplicity (don't confuse the readers) and sensationalism (here's why we are all going to die). If you think I may be being too harsh on them let me quote the subtitle.
So, if the New Yorker article is just the latest example of the "a giant earthquake is going to wipe out civilization as we know it (or at least a big chunk of it) and it's going to happen real soon now" school of journalism, what's the real story? That's what this post is all about.
Seismology, the study of earthquakes, has been around for a long time now. Its origins can be easily traced back to ancient China. And devastating earthquakes have been with us for all that time. Recently we have had about 20,000 people killed in Japan. Before that an earthquake in Haiti killed more than 100,000 people. Before that the Indonesian earthquake killed about 250,000 people. And that just covers a period of about a decade. It would be nice if earthquakes could be predicted so that measures could be taken to reduce the death and destruction. But earthquake prediction has not advanced much beyond what the ancient Chinese were capable of. Why is that? Let's take a deep dive and see what we can find out.
We now know a lot more about where earthquakes, particularly the big ones, come from. They come from plate tectonics. The earth is like an onion. It has layers. At the center is the aptly named core. The core of the earth is hotter than the surface. This is caused by radioactive decay, if you were wondering, but only the "hotter" part is important here. The core is surrounded by the mantle. Floating on top of that is a thin surface skim called the crust. Heat can relatively easily escape from the crust to space through the atmosphere. This causes the crust to cool and harden into the rocky material we are familiar with.
Everything we see is part of the crust. In round numbers it is 50 miles thick. That sounds like a lot but the radius of the earth is 4,000 miles so it isn't. The crust and the inner part of the core are solid. The rest is not. What that means is that it can bend and flow. All earthquakes originate in the crust. They are caused by the fact that the crust is stiff. Earthquakes are caused by parts of the crust grinding and breaking against other parts of the crust. And that's where the plates come in.
We are all familiar with the slate pavers that are sometimes used to create a garden path. If two pavers are jammed against each other it is easy to imagine grinding and breaking going on. And that's the simplest model of earthquakes. There is one other part. What's causing the jamming? Temperature differences cause convection currents. Material flows in an attempt to equalize the temperature. What's flowing in this case is mantle material. In some areas you have an upwelling of warm material. In other areas you have a sinking zone of cool material. Finally, material flows across the top of the mantle from the upwelling areas toward the cooling zones. This mantle flow carries parts of the crust along with it. It's really as simple as that.
So that's our modern model of what is going on. Mantle flow drags pieces of the crust along. This causes the crust to grind and break into pieces that smash and bash into each other. And the smashing and bashing causes it to break into chunks called plates. Remember, this has been going on for billions of years so it has had a lot of time to settle down into what we now see. This model is relatively new. It only dates back about 50 years. And as a general model it works very well. But it doesn't tell us much about the details. And even at this very general level of detail it misleads us in an important way.
Pretty much all of us think of the various plates like they are giant slate pavers. Our mental model is of really big slate pavers. And this results in us thinking the plates are rigid like your typical slate paver. It is strong and stiff and really big. Pretty much all you can do to a slate paver is to break it and that's hard to do. So when we think of the North American plate we think of something that is extremely strong and extremely stiff and hard to break. But a big piece of something doesn't behave the same way a small piece does.
Consider the two by four. If it is a few feet long we would consider it pretty unbendable. But how about one that is twenty or forty or sixty feet long? Now all of a sudden it becomes quite bendable. The same thing is true of rock. Thinking of a piece of rock as being just like the paver works just fine if it is a foot long or maybe even a hundred feet long. But what if it is ten or a hundred or a thousand miles long? The properties change.
And slate is slate is slate. But this whole grinding and breaking thing has been going on for a long time. If you look around, even over a distance of a few miles, you will usually find different kinds of rocks and soil with different amounts of strength and stiffness and breakability. So if we study plates that are big enough to encompass an entire continent we find that things are actually much more like a garden as a whole than they are like a single slate paver. In a garden over here you have some pavers. But over there you also have some gravel or some cement or a deck. And over there you also have soil or shrubs or lawn. In a garden you have all these different kinds of things in close proximity.
If you look at tens or hundreds or thousands of miles of land you have all these same kinds of differences. Materials vary and each material can have a different amount of strength or stiffness. And this makes it hard to generalize about the attributes of an entire geologic plate. They are not just like a big piece of slate. Instead the basic properties of the plate vary wildly from place to place. Our mental model of a plate steers us wrong. Scientists try to take this into account. But it is complicated and difficult and they are not as good at it as they would like to be.
Let's now look at earthquakes in more detail. Plates move. A typical speed is an inch a year. That might not seem like much but in a thousand years it amounts to over eighty feet. That too might not seem very fast but in 100-200 million years it can and did create the Atlantic Ocean. And would you really notice if two buildings a few miles apart got an inch closer together or further apart over the course of a year? You wouldn't. It has only been in the last few decades that it has been possible to measure distances accurately enough to detect changes that small that happen that slowly.
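The arithmetic here is easy to verify:

```python
# Sanity-checking the plate-speed arithmetic in the text.
inches_per_year = 1.0                          # a typical plate speed

feet_per_millennium = inches_per_year * 1000 / 12
print(round(feet_per_millennium))              # 83 -- "over eighty feet"

# Over 150 million years (the middle of the 100-200 My range):
miles = inches_per_year * 150e6 / 12 / 5280
print(round(miles))                            # 2367 -- ocean-opening distances
```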
So plates are moving, and they are smashing and bashing into each other. What does that actually mean? Well, the edges of these plates are called fault lines. The most famous is the San Andreas fault. The San Andreas is a transform fault. The relative motion is along the fault line. The North American plate is moving south while the Pacific plate is moving north. They grind against each other as they pass. But what about faults where the motion is perpendicular to the fault line? There are three general cases. Running down the middle of the Atlantic Ocean is the Mid-Atlantic Ridge. It is completely under water except in a couple of places like Iceland. It sits on top of one of those upwellings I talked about. Warm material rises, and volcanic activity along the ridge pushes material out to either side. Crustal material is pushed away from the ridge and the Atlantic slowly gets wider. And, of course, if something is getting wider something else must be getting narrower.
Where plates are jamming together we have collisional faults. The entire country of India sits on the Indian plate. The Indian plate is moving north. And that causes it to collide with the Asian plate. The result is the Himalayan mountains. If two plates just smash into each other you get mountain ranges. What's happening in my neighborhood (and what was the subject of the New Yorker article) is a little more complex. The Juan de Fuca plate is smashing into the North American plate. But it's not just a smack. It is diving underneath it. This is called a subduction fault. This might seem less violent than the straight smack situation but it isn't. According to recent research this fault generated a magnitude 9 'quake in 1700. How big is that? The earthquake that killed 20,000 in Japan was a magnitude 9 'quake as was the Indonesian one. The Haiti 'quake was only a magnitude 7. So what do these numbers mean and how can a smaller 'quake kill more people? One question at a time.
The magnitude numbers you see in the press typically run from 1 to 9 (often with a single decimal place added - 6.2) and are based on the modern equivalent of the Richter scale. Here's my handy dandy (and completely unscientific) guide to the Richter scale:
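Whatever guide you use, one scientific fact about the scale is worth knowing: it is logarithmic. On the modern moment-magnitude scale, released energy grows by a factor of about 32 for each whole step up. A quick calculation makes the point:

```python
def energy_ratio(m_big, m_small):
    # Moment-magnitude energy scales roughly as 10**(1.5 * M), so each
    # whole step on the scale is about a 32-fold jump in released energy.
    return 10 ** (1.5 * (m_big - m_small))

print(round(energy_ratio(8, 7)))   # 32   -- one step up the scale
print(round(energy_ratio(9, 7)))   # 1000 -- Japan (M9) vs. Haiti (M7)
```

So the Japan and Indonesia 'quakes each released roughly a thousand times the energy of the Haiti 'quake, which makes the death tolls all the more striking.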
So why did 20,000 people get killed in the magnitude 9 Japan earthquake but hundreds of thousands got killed in Haiti in the much smaller magnitude 7 earthquake? That's the other thing. Generally speaking the farther you are away from the epicenter the less damage there is. The epicenter in Japan was about 50 miles offshore and about 20 miles underground. The Haiti earthquake was only about 10 miles underground and was only about 15 miles from Port-au-Prince, the Haitian capital. Another factor was that the Japanese have stringent construction standards whereas the Haitians don't. The point is that the magnitude of the earthquake doesn't tell you everything you need to know. The New Yorker article was written as if the "big one" was happening right under Seattle and Portland. In reality it would happen about 50 miles off the coast and both Portland and Seattle are significantly inland. As far as I can tell Sendai, 81 miles away and the closest large city to the epicenter of the Japan earthquake, suffered little damage.
So far we are talking history, about earthquakes that have already happened. What about earthquake prediction? Well, that's a problem. Let me first get into the new modern scientific approach. We can now measure plate movements. The basic idea is a simple one. Faults lock up but plates keep moving. The rock should act like a giant spring, building up more energy as it gets bent by plate motion. Then a plate unlocks violently, i.e. an earthquake happens. Just like letting a big spring go, the rocks shift until they are no longer bent. There they lock back up and we start the whole process over again. If we can measure plate movements we can figure out how far the rocks need to shift to relieve the stress. That tells us how much energy is involved and that tells us how big the earthquake will be. That's the theory. Of course it is hard to figure out when the fault will unlock but we should at least be able to figure out how big the quake will be when it does unlock.
But it turns out that the theory doesn't work very well. Satellite and GPS measurements allow us to measure how far land has or has not moved. That should allow the fault loading to be calculated and geologists routinely do this. But low load faults unlock and high load faults stay locked all the time. A lot of the why of this is still a mystery. Geologists have lots of theories but little in the way of methods to determine how best to sort through them. A big problem is that they can't see what's going on.
Studying the surface is pretty easy. If you fly over the right part of California you can actually see part of the San Andreas fault. But it is rare that surface features tell you much. Most of the interesting stuff is underground. It is theoretically possible to drill holes to see what's going on but the deepest hole ever drilled went less than 8 miles down and most holes go less than 5 miles down. Drilling is very expensive so it has only been done a few times for solely scientific reasons. The result is that the detailed characteristics of the rock in and around a fault are mysterious. Without these detailed characteristics it is impossible to predict how much stress the rock can take. Effectively, geologists are flying blind.
But at least geologists can accurately measure plate movement, right? True but that is less help than you would think. Again using the simple theory, plate A moves X feet while plate B moves Y feet. We can now do some trigonometry and figure out how much the rock needs to shift to relieve the stress, right? Well, what about small earthquakes? They could have moved things and relieved some stress. Ok so let's factor those in. The problem is that to do this we need a complete inventory of all the earthquakes. No problem. Cut to the seismometer records and we are good to go, right? In theory yes. In practice no. There is a world wide network that measures all large earthquakes. But this leaves out most of the small ones. The data from the big earthquakes is better than nothing but it is not nearly enough to do the accounting accurately enough to figure fault loading.
And then there are silent earthquakes. These were only discovered a few years ago. The satellites and GPS stations were catching rock motion that didn't seem to match up with earthquakes. So a spot where this was happening was studied by installing high precision seismometers locally. They picked up earthquakes that were so small and so consistent that they had been missed. But over a period of months they were causing substantial rock movement. If you are loading stress into a fault via plate movement but relieving it via swarms of silent earthquakes then that fault is not going to unlock when it is supposed to. Scientists have a long way to go and they know it. So what else is going on?
There is the historical method. This method is used more frequently by scientists than people think. But in this and many other cases it is entirely appropriate. In this case you find out what you can about past earthquakes. Japan has been keeping records of earthquakes, at least the big ones, for a couple of thousand years. Scientists were able to pin down the exact date of the 1700 'quake that happened around here by consulting Japanese records (see below). It was a big 'quake but no one around here at the time was keeping records. And even in parts of the world where they have been keeping records for a relatively long time only the biggest 'quakes got recorded until very recently. So we have only a little information about 'quakes in the written record. Fortunately, this can be supplemented by the geological record.
We keep getting better at interpreting the geological record. The first clue as to the existence of the 1700 'quake was uncovered by noticing some weird land formations out on the Pacific coast. Closer to my home is Lake Washington. It turns out there are two separate forests at the bottom of the lake. They got there because two different landslides moved a bunch of trees from the side of a hill into the lake where they promptly sank to the bottom and got preserved. Tree rings dated each slide to within a couple of years and both slides were caused by earthquakes. This is now a recurring pattern in my neck of the woods. Geologists get access to better tools. They look around and find more evidence of earthquakes. And often a newly discovered earthquake is associated with a previously unsuspected fault. It turns out there are faults all over the place. This keeps making a complicated situation even more complicated.
Let me stop for a minute and summarize. We think we know the general idea. Plate tectonics push things around. This causes stress to lock into faults. The faults unlock, break, if you prefer, and we have an earthquake. So far so good. But in order to predict earthquakes we need a lot of very detailed knowledge. That detailed knowledge would allow us to predict how much rock is going to break (size), when it is going to break (timing), and where it is going to break (location). Geologists can currently do this in only the most general way. They look at the historical record. If a place has had an earthquake in the past it is a good candidate for a future one (location). The same indications give them a general idea of how big a 'quake is likely to be (size). They can also use the geological and historical record to try to calculate a "repeat rate", how frequently 'quakes take place.
If there are no 'quakes in the historical and geological record for a location then it is likely to remain earthquake free. On the other hand, 'quakes routinely appear where they are not supposed to so calling an area 'quake free involves a bit of guessing. The record turns up lots of 'quakes in some places. So this should yield a solid repeat rate, right? Unfortunately, even in the most earthquake prone places the timing of 'quakes is very irregular. Scientists talk of a 'quake being overdue but this is just more dressed-up guesswork. So how good are geologists at this kind of guesswork? They are good enough that you should incorporate their guesswork in building codes but you shouldn't take any predictions about exactly when the next one will show up very seriously. And it is also important to keep in mind that proper engineering cannot completely earthquake-proof a building even if the 'quake is within the design range. But even if the 'quake is outside the design range it will provide some protection.
The different recent experiences of Haiti and Japan bear this out. The Japanese are perhaps the best earthquake people in the world because they have so many 'quakes. But even they did not plan for a 'quake as big as the recent one. The Japan 'quake was larger than anything in Japan's historical record. And it is very expensive to earthquake-proof against very large 'quakes. Even though it was not enough, the level of preparedness in Japan substantially reduced the death and destruction. In Haiti's case, not that much money and decent building codes, effectively enforced, would have saved literally hundreds of thousands of lives.
Finally, let me turn to a related subject, tsunamis. The Haiti story is not a tsunami story but both the Japan and the Indonesia stories are tsunami stories. The whole Fukushima nuclear disaster and much of the Indonesian death and destruction were caused not by the earthquake but by the tsunami each 'quake generated. So what's the state of the art with tsunamis? There are problems here but things are in much better shape than they are with 'quakes. First the theory. It's pretty simple. When an earthquake happens the land gets thrown around. If this happens on land, that is pretty much that. But if it happens under water the water above the epicenter gets thrown about too. Throwing a lot of water around causes tsunamis. It's as simple as that.
Well, there is a little more to it. Let's start with the bad news. How much water gets thrown around and which way does it go? This is the mystery part. Most of the guess part of this process involves guessing the location and other specific details of the earthquake. That is part of the magic and mystery of earthquakes. But let's move on. What if we know (or can guess) the exact earthquake specifics? What then? Then we are into the good part.
Oceans have been tossing giant waves around for ages. So scientists have been able to thoroughly study them and they have a good understanding of them. This means that scientists can predict their behavior very accurately. That's good news. The bad news is that water is very efficient at moving large waves long distances. So a tsunami can devastate shorelines thousands of miles from its epicenter. Scientists can accurately predict where the tsunami is going to go and how long it is going to take to get there. They just can't do anything to stop it. A tsunami can do a lot of damage a long way away. On land even a giant earthquake tails off to nothing within a few hundred miles so the damage mostly happens close to the epicenter. The same is not true when a 'quake throws a big tsunami.
With a few measurements from the tsunami close to the epicenter scientists can very accurately predict what will happen. Scientists were caught flat-footed in the case of the tsunami associated with the Indonesia 'quake. But even so they were still able to get the measurements they needed and to provide some decent guidance as to what would happen. The big problem turned out to be that there was no system in place for passing this information on to the relevant authorities. By the time of the Japan 'quake and associated tsunami much better procedures were in place. And outside of Japan itself the size of the tsunami was much more manageable so the damage outside Japan was minimal. So the modern approach is to measure where it happens and a few other specifics. From there computer models will be able to predict the where, when, and how big of the tsunami.
And about that 'quake of 1700 I was talking about above. It was an underwater 'quake that created a large tsunami. That tsunami crossed the entire Pacific Ocean and eventually hit Japan. At that point it was still powerful enough to do enough damage that the Japanese put it in their records. That information and a computer model of how long it took for the tsunami to cross the ocean allowed scientists to pinpoint the exact day the 'quake happened even though no record of it exists on this side of the ocean.
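The physics behind that kind of calculation is surprisingly simple. In the open ocean a tsunami is a "shallow water" wave (its wavelength dwarfs the ocean's depth), so it travels at roughly the square root of gravity times depth. Here is a back-of-the-envelope version; the crossing distance is a rough number I picked for illustration, not a surveyed value.

```python
import math

g = 9.8                        # gravity, m/s^2
depth = 4000                   # a typical Pacific depth, meters

# Shallow-water wave speed: sqrt(g * depth)
speed = math.sqrt(g * depth)   # roughly 198 m/s, about 440 mph
print(round(speed))            # 198

crossing = 7000e3              # ~7,000 km, a rough Cascadia-to-Japan distance
hours = crossing / speed / 3600
print(round(hours))            # about 10 hours in transit
```

Run the travel-time model backward from the Japanese arrival records and you can see how the scientists recovered the date (and even the approximate hour) of the 1700 'quake.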
If a similar sized 'quake to the 1700 one happened today at roughly the same location, it would likely rattle dishes in Seattle and Portland. It might even do some damage to the two cities. But it would not do the kind of damage outlined in the New Yorker story. And such a 'quake could and probably would create a large tsunami. And that tsunami would likely wreak havoc on the coastal communities of Washington, Oregon, and other coastal cities around the Pacific. But there are geographical barriers that would prevent it from doing much damage to Seattle or Portland.
Is there a way to wreak havoc on Seattle to the extent outlined in the New Yorker piece? (I don't know the geology of Portland well enough to answer this question with respect to that city.) Yes! It's simple. You just have one of the several faults that pass directly under Seattle rupture in a big way. There is ample evidence in the geological record for this happening in the past so it could certainly happen again. And the amount of death and destruction resulting would satisfy even the likes of Ms. Schulz. Mayhem of biblical proportions is possible. But it wouldn't have happened the way Ms. Schulz outlined. It also wouldn't have been a "really big one" sized earthquake. I guess telling it that way just wouldn't have made as good of a story.
As promised, here is the article's subtitle: "An earthquake will destroy a sizable portion of the coastal Northwest. The question is when." And here's the caption below the accompanying illustration: "The next full-margin rupture of the Cascadia subduction zone will spell the worst natural disaster in the history of the continent."
Now let me point out that the Yucatan peninsula is part of "the continent" and that a meteor hit the peninsula 65 million years ago (during the "history of the continent") and that event caused the extinction of the dinosaurs. See what I mean about sensationalism? As to the simplicity part, the article name-checks Portland and Seattle. Why? Because they are big cities that people have heard of. Would they be destroyed by a Cascadia subduction zone earthquake happening off the coast of Washington and Oregon? No! But if you need to manufacture a mega-disaster, wiping out Seaside, Oregon just doesn't cut it.
So, if the New Yorker article is just the latest example of the "a giant earthquake is going to wipe out civilization as we know it (or at least a big chunk of it) and it's going to happen real soon now" school of journalism, what's the real story? That's what this post is all about.
Seismology, the study of earthquakes, has been around for a long time now. Its origins can be easily traced back to ancient China. And devastating earthquakes have been with us for all that time. Recently we have had about 20,000 people killed in Japan, Before that an earthquake in Haiti killed more than 100,000 people. Before that the Indonesian earthquake killed about 250,000 people. And that just covers a period of about a decade. It would be nice if earthquakes could be predicted so that measures could be taken to reduce the death and destruction. But earthquake prediction has not advanced much beyond what the ancient Chinese were capable of. Why is that? Let's take a deep dive and see what we can find out.
We now know a lot more about where earthquakes, particularly the big ones, come from. They come from plate tectonics. The earth is like an onion. It has layers. At the center is the aptly named core. The core of the earth is hotter than the surface. This is caused by radioactive decay, if you were wondering, but only the "hotter" part is important here. The core is surrounded by the mantle. Floating on top of that is a thin surface skim called the crust. Heat can relatively easily escape from the crust to space through the atmosphere. This causes the crust to cool and harden into the rocky material we are familiar with.
Everything we see is part of the crust. In round numbers it is 50 miles thick. That sounds like a lot but the radius of the earth is 4,000 miles so it isn't. The crust and the inner part of the core are solid. The rest is not. What that means is that it can bend and flow. All earthquakes originate in the crust. They are caused by the fact that the crust is stiff. Earthquakes are caused by parts of the crust grinding and breaking against other parts of the crust. And that's were the plates come in.
We are all familiar with the slate pavers that are sometimes used to create a garden path. If two pavers are jammed against each other it is easy to image grinding and breaking going on. And that's the simplest model of earthquakes. There is one other part. What's causing the jamming? Temperature differences cause convection currents. Material flows in an attempt to equalize the temperature. What's flowing in this case is mantle material. In some areas you have an upwelling of warm material. In other areas you have a sinking zone of cool material. Finally, material flows across the top of the mantle from the upwelling areas toward the cooling zones. This mantle flow carries parts of the crust along with it. It's really as simple as that.
So that's our modern model of what is going on. Mantle flow drags pieces of the crust along. This causes the crust to grind and break into pieces that smash and bash into each other. And the smashing and bashing causes it to break into chunks called plates. Remember, this has been going on for billions of years so it has had a lot of time to settle down into what we now see. This model is relatively new. It only dates back about 50 years. And as a general model it works very well. But it doesn't tell us much about the details. And even at this very general level of detail it misleads us in an important way.
Pretty much all of us think of the various plates as if they were giant slate pavers. And this results in us thinking the plates are rigid like your typical slate paver. A paver is strong and stiff and really big. Pretty much all you can do to a slate paver is break it and that's hard to do. So when we think of the North American plate we think of something that is extremely strong and extremely stiff and hard to break. But a big piece of something doesn't behave the same way a small piece does.
Consider the two by four. If it is a few feet long we would consider it pretty unbendable. But how about one that is twenty or forty or sixty feet long? Now all of a sudden it becomes quite bendable. The same thing is true of rock. Thinking of a piece of rock as being just like the paver works just fine if it is a foot long or maybe even a hundred feet long. But what if it is ten or a hundred or a thousand miles long? The properties change.
And slate is slate is slate. But this whole grinding and breaking thing has been going on for a long time. If you look around, even over a distance of a few miles, you will usually find different kinds of rocks and soil with different amounts of strength and stiffness and breakability. So if we study plates that are big enough to encompass an entire continent we find that things are actually much more like a garden as a whole than they are like a single slate paver. In a garden over here you have some pavers. But over there you also have some gravel or some cement or a deck. And over there you also have soil or shrubs or lawn. In a garden you have all these different kinds of things in close proximity.
If you look at tens or hundreds or thousands of miles of land you have all these same kinds of differences. Materials vary and each material can have a different amount of strength or stiffness. And this makes it hard to generalize about the attributes of an entire geologic plate. They are not just like a big piece of slate. Instead the basic properties of the plate vary wildly from place to place. Our mental model of a plate steers us wrong. Scientists try to take this into account. But it is complicated and difficult and they are not as good at it as they would like to be.
Let's now look at earthquakes in more detail. Plates move. A typical speed is an inch a year. That might not seem like much but in a thousand years it amounts to over eighty feet. That too might not seem very fast but in 100-200 million years it can and did create the Atlantic Ocean. And would you really notice if two buildings a few miles apart got an inch closer together or further apart over the course of a year? You wouldn't. It has only been in the last few decades that it has been possible to measure distances accurately enough to detect changes that small that happen that slowly.
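The arithmetic above is easy to check. Here is a back-of-the-envelope sketch; the one-inch-per-year rate and the 150-million-year figure are round illustrative numbers, not precise measurements:

```python
# Back-of-the-envelope check of plate motion arithmetic.
# Assumed round numbers: 1 inch/year drift rate, ~150 million years
# for the opening of the Atlantic.

INCHES_PER_FOOT = 12
INCHES_PER_MILE = 63360

rate_in_per_year = 1.0

# Drift over a thousand years, in feet.
feet_per_millennium = rate_in_per_year * 1000 / INCHES_PER_FOOT
print(f"{feet_per_millennium:.1f} feet per thousand years")   # ~83 feet

# Drift over 150 million years, in miles.
miles_per_150my = rate_in_per_year * 150e6 / INCHES_PER_MILE
print(f"{miles_per_150my:.0f} miles in 150 million years")    # ~2,400 miles
```

A couple of thousand miles is the right order of magnitude for the width of the Atlantic, which is the point: imperceptibly slow motion adds up to ocean-sized changes given geologic time.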
So plates are moving, and they are smashing and bashing into each other. What does that actually mean? Well, the edges of these plates are called fault lines. The most famous is the San Andreas fault. The San Andreas is a transform fault. The relative motion is along the fault line. The North American plate is moving south while the Pacific plate is moving north. They grind against each other as they pass. But what about faults where the motion is perpendicular to the fault line? There are three general cases. Running down the middle of the Atlantic Ocean is the Mid-Atlantic Ridge. It is completely under water except in a couple of places like Iceland. It sits on top of one of those upwellings I talked about. So warm material rises and uses volcanoes to push material to the side. Crustal material is pushed away from the ridge and the Atlantic slowly gets wider. And, of course, if something is getting wider something else must be getting narrower.
Where plates are jamming together we have collisional faults. The entire country of India sits on the Indian plate. The Indian plate is moving north. And that causes it to collide with the Asian plate. The result is the Himalayan mountains. If two plates just smash into each other you get mountain ranges. What's happening in my neighborhood (and what was the subject of the New Yorker article) is a little more complex. The Juan de Fuca plate is smashing into the North American plate. But it's not just a smack. It is diving underneath it. This is called a subduction fault. This might seem less violent than the straight smack situation but it isn't. According to recent research this fault generated a magnitude 9 'quake in 1700. How big is that? The earthquake that killed 20,000 in Japan was a magnitude 9 'quake as was the Indonesian one. The Haiti 'quake was only a magnitude 7. So what do these numbers mean and how can a smaller 'quake kill more people? One question at a time.
The magnitude numbers you see in the press typically run from 1 to 9 (often with a single decimal place added - 6.2) and are based on the modern equivalent of the Richter scale. Here's my handy dandy (and completely unscientific) guide to the Richter scale:
- 1-4 - A seismometer registers it but in most cases people don't.
- 5 - You definitely feel it. It's like being in a "fender bender" car crash.
- 6 - It's like being on a roller coaster.
- 7 - It's like being in a serious car crash where the air bags go off and maybe the car rolls over.
- 8 - It's like being in a roller coaster that jumps the tracks at the top of the starting ramp and crashes to the ground. People get killed.
- 9 - There is lots of death and destruction. It's like what happens in a big disaster movie.
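One thing my guide glosses over is that the scale is logarithmic. Each whole step up is ten times the ground shaking and roughly 32 times the energy released. A quick sketch of the standard conversions:

```python
# Compare two earthquake magnitudes on the (logarithmic) magnitude scale.
# Standard relationships: each whole magnitude step is 10x the ground
# motion amplitude and 10^1.5 (~31.6x) the energy released.

def amplitude_ratio(m1: float, m2: float) -> float:
    """How much bigger the ground shaking of an m1 'quake is versus an m2."""
    return 10 ** (m1 - m2)

def energy_ratio(m1: float, m2: float) -> float:
    """How much more energy an m1 'quake releases versus an m2."""
    return 10 ** (1.5 * (m1 - m2))

# The magnitude 9 Japan 'quake versus the magnitude 7 Haiti 'quake:
print(amplitude_ratio(9, 7))  # 100x the shaking
print(energy_ratio(9, 7))     # ~1000x the energy
```

So the Japan 'quake released roughly a thousand times the energy of the Haiti 'quake, which makes the death toll comparison below all the more striking.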
So why did 20,000 people get killed in the magnitude 9 Japan earthquake but hundreds of thousands got killed in Haiti in the much smaller magnitude 7 earthquake? That's the other thing. Generally speaking the farther you are away from the epicenter the less damage there is. The Japan 'quake was centered about 50 miles offshore and about 20 miles underground. The Haiti earthquake was only about 10 miles underground and only about 15 miles from Port-au-Prince, the Haitian capital. Another factor was that the Japanese have stringent construction standards whereas the Haitians don't. The point is that the magnitude of the earthquake doesn't tell you everything you need to know. The New Yorker article was written as if the "big one" was happening right under Seattle and Portland. In reality it would happen about 50 miles off the coast and both Portland and Seattle are significantly inland. As far as I can tell Sendai, 81 miles away and the closest large city to the epicenter of the Japan earthquake, suffered little damage.
So far we are talking history, about earthquakes that have already happened. What about earthquake prediction? Well, that's a problem. Let me first get into the new modern scientific approach. We can now measure plate movements. The basic idea is a simple one. Faults lock up but plates keep moving. The rock should act like a giant spring, building up more energy as it gets bent by plate motion. Then a plate unlocks violently, i.e. an earthquake happens. Just like letting a big spring go, the rocks shift until they are no longer bent. There they lock back up and we start the whole process over again. If we can measure plate movements we can figure how far the rocks need to shift to relieve the stress. That tells us how much energy is involved and that tells us how big the earthquake will be. That's the theory. Of course it is hard to figure when the fault will unlock but we should at least be able to figure out how big the quake will be when it does unlock.
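The "giant spring" theory can be turned into numbers. Seismologists estimate the size of the eventual 'quake from the accumulated slip using the standard moment magnitude formula. Here is a rough sketch; the fault dimensions, plate rate, and rock rigidity below are round illustrative values for a Cascadia-sized subduction fault, not measured ones:

```python
import math

# Elastic rebound, turned into a magnitude estimate.
# Seismic moment: M0 = rigidity * fault_area * slip  (in newton-meters)
# Moment magnitude: Mw = (2/3) * (log10(M0) - 9.1)
# All inputs below are illustrative round numbers, not measurements.

rigidity = 3.0e10          # Pa, typical for crustal rock
fault_length = 1000e3      # m, roughly Cascadia-sized
fault_width = 100e3        # m, down-dip extent of the locked zone
slip_rate = 0.04           # m/year, ~4 cm/year of plate convergence
locked_years = 500         # years since the fault last unlocked

slip_deficit = slip_rate * locked_years              # 20 m of stored slip
moment = rigidity * fault_length * fault_width * slip_deficit
mw = (2 / 3) * (math.log10(moment) - 9.1)
print(f"Estimated magnitude: {mw:.1f}")              # ~9.1
```

With those round numbers you get a magnitude 9-ish 'quake, which is consistent with what the geological record says Cascadia produced in 1700. The catch, as the next paragraphs explain, is that the inputs are exactly the things geologists can't pin down.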
But it turns out that the theory doesn't work very well. Satellite and GPS measurements allow us to measure how far land has or has not moved. That should allow the fault loading to be calculated and geologists routinely do this. But low load faults unlock and high load faults stay locked all the time. A lot of the why of this is still a mystery. Geologists have lots of theories but little in the way of methods to determine how best to sort through them. A big problem is that they can't see what's going on.
Studying the surface is pretty easy. If you fly over the right part of California you can actually see part of the San Andreas fault. But it is rare that surface features tell you much. Most of the interesting stuff is underground. It is theoretically possible to drill holes to see what's going on but the deepest hole ever drilled went less than 8 miles down and most holes go less than 5 miles down. Drilling is very expensive so it has only been done a few times for solely scientific reasons. The result is that the detailed characteristics of the rock in and around a fault are mysterious. Without these detailed characteristics it is impossible to predict how much stress the rock can take. Effectively, geologists are flying blind.
But at least geologists can accurately measure plate movement, right? True but that is less help than you would think. Again using the simple theory, plate A moves X feet while plate B moves Y feet. We can now do some trigonometry and figure out how much the rock needs to shift to relieve the stress, right? Well, what about small earthquakes? They could have moved things and relieved some stress. Ok so let's factor those in. The problem is that to do this we need a complete inventory of all the earthquakes. No problem. Cut to the seismometer records and we are good to go, right? In theory yes. In practice no. There is a worldwide network that measures all large earthquakes. But this leaves out most of the small ones. The data from the big earthquakes is better than nothing but it is not nearly enough to do the accounting accurately enough to figure fault loading.
And then there are silent earthquakes. These were only discovered a few years ago. The satellites and GPS stations were catching rock motion that didn't seem to match up with earthquakes. So a spot where this was happening was studied by installing high precision seismometers locally. They picked up earthquakes that were so small and so consistent that they had been missed. But over a period of months they were causing substantial rock movement. If you are loading stress into a fault via plate movement but relieving it via swarms of silent earthquakes then that fault is not going to unlock when it is supposed to. Scientists have a long way to go and they know it. So what else is going on?
There is the historical method. This method is used more frequently by scientists than people think. But in this and many other cases it is entirely appropriate. In this case you find out what you can about past earthquakes. Japan has been keeping records of earthquakes, at least the big ones, for a couple of thousand years. Scientists were able to pin down the exact date of the 1700 'quake that happened around here by consulting Japanese records (see below). It was a big 'quake but no one around here at the time was keeping records. And even in parts of the world where they have been keeping records for a relatively long time only the biggest 'quakes got recorded until very recently. So we have only a little information about 'quakes in the written record. Fortunately, this can be supplemented by the geological record.
We keep getting better at interpreting the geological record. The first clue as to the existence of the 1700 'quake was uncovered by noticing some weird land formations out on the Pacific coast. Closer to my home is Lake Washington. It turns out there are two separate forests at the bottom of the lake. They got there because two different landslides moved a bunch of trees from the side of a hill into the lake where they promptly sank to the bottom and got preserved. Tree rings dated each slide to within a couple of years and both slides were caused by earthquakes. This is now a recurring pattern in my neck of the woods. Geologists get access to better tools. They look around and find more evidence of earthquakes. And often a newly discovered earthquake is associated with a previously unsuspected fault. It turns out there are faults all over the place. This keeps making a complicated situation even more complicated.
Let me stop for a minute and summarize. We think we know the general idea. Plate tectonics push things around. This causes stress to lock into faults. The faults unlock, break, if you prefer, and we have an earthquake. So far so good. But in order to predict earthquakes we need a lot of very detailed knowledge. That detailed knowledge would allow us to predict how much rock is going to break (size), when it is going to break (timing), and where it is going to break (location). Geologists can currently do this in only the most general way. They look at the historical record. If a place has had an earthquake in the past it is a good candidate for a future one (place). The same indications give them a general idea of how big a 'quake is likely to be (magnitude). They can also use the geological and historical record try to calculate a "repeat rate", how frequently 'quakes take place.
If there are no 'quakes in the historical and geological record for a location then it is likely to remain earthquake free. On the other hand, 'quakes routinely appear where they are not supposed to so calling an area 'quake free involves a bit of guessing. The record turns up lots of 'quakes in some places. So this should yield a solid repeat rate, right? Unfortunately, even in the most earthquake prone places the timing of 'quakes is very irregular. Scientists talk of a 'quake being overdue but this is more dressed up guesswork. So how good are geologists at this kind of guesswork? They are good enough that you should incorporate their guesswork in building codes but you shouldn't take any predictions about exactly when the next one will show up very seriously. It is also important to keep in mind that proper engineering cannot completely earthquake-proof a building, even when the 'quake is within the design range. But it will still provide some protection, even when the 'quake is outside the design range.
The different recent experiences of Haiti and Japan bear this out. The Japanese are perhaps the best earthquake people in the world because they have so many 'quakes. But even they did not plan for a 'quake as big as the recent one. The Japan 'quake was larger than anything in Japan's historical record. And it is very expensive to earthquake-proof against very large 'quakes. Even though it was not enough, the level of preparedness in Japan substantially reduced the death and destruction. In Haiti's case, not that much money and decent building codes, effectively enforced, would have saved literally hundreds of thousands of lives.
Finally, let me turn to a related subject, tsunamis. The Haiti story is not a tsunami story but both the Japan and the Indonesia stories are tsunami stories. The whole Fukushima nuclear disaster and much of the Indonesian death and destruction were caused not by the earthquake but by the tsunami each 'quake generated. So what's the state of the art with tsunamis? There are problems here but things are in much better shape than they are with 'quakes. First the theory. It's pretty simple. When an earthquake happens the land gets thrown around. If this happens on land that is pretty much that. But if it happens under water the water above the epicenter gets thrown about too. Throwing a lot of water around causes tsunamis. It's as simple as that.
Well, there is a little more to it. Let's start with the bad news. How much water gets thrown around and which way does it go? This is the mystery part. Most of the guess part of this process involves guessing the location and other specific details of the earthquake. That is part of the magic and mystery of earthquakes. But let's move on. What if we know (or can guess) the exact earthquake specifics. What then? Then we are into the good part.
Oceans have been tossing giant waves around for ages. So scientists have been able to thoroughly study them and they have a good understanding of them. This means that scientists can predict their behavior very accurately. That's good news. The bad news is that water is very efficient at moving large waves long distances. So a tsunami can devastate shorelines thousands of miles from its epicenter. Scientists can accurately predict where the tsunami is going to go and how long it is going to take to get there. They just can't do anything to stop it. A tsunami can do a lot of damage a long way away. On land even a giant earthquake tails off to nothing within a few hundred miles so the damage mostly happens close to the epicenter. The same is not true when a 'quake throws a big tsunami.
With a few measurements from the tsunami close to the epicenter scientists can very accurately predict what will happen. Scientists were caught flat footed in the case of the tsunami associated with the Indonesia 'quake. But even so they were still able to get the measurements they needed and to provide some decent guidance as to what would happen. The big problem turned out to be that there was no system in place for passing this information on to the relevant authorities. By the time of the Japan 'quake and associated tsunami much better procedures were in place. And outside of Japan itself the size of the tsunami was much more manageable so the damage outside Japan was minimal. So the modern approach is to measure where it happens and a few other specifics. From there computer models will be able to predict the where, when, and how big of the tsunami.
And about that 'quake of 1700 I was talking about above. It was an underwater 'quake that created a large tsunami. That tsunami crossed the entire Pacific Ocean and eventually hit Japan. At that point it was still powerful enough to do enough damage that the Japanese put it in their records. That information and a computer model of how long it took for the tsunami to cross the ocean allowed scientists to pinpoint the exact day the 'quake happened even though no record of it exists on this side of the ocean.
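The travel-time calculation that dated the 1700 'quake rests on a simple piece of physics: in the open ocean a tsunami moves at the shallow-water wave speed, the square root of gravity times water depth. A sketch with round numbers (the average Pacific depth and the Cascadia-to-Japan distance here are rough assumptions, not the values the researchers actually used):

```python
import math

# Open-ocean tsunami speed is the shallow-water wave speed: sqrt(g * depth).
# Round-number assumptions: 4,000 m average Pacific depth, ~7,500 km
# from the Cascadia coast to Japan.

g = 9.81            # m/s^2, gravitational acceleration
depth = 4000        # m, rough average Pacific depth
distance = 7500e3   # m, rough Cascadia-to-Japan distance

speed = math.sqrt(g * depth)       # ~200 m/s, roughly jet airliner speed
hours = distance / speed / 3600
print(f"About {speed * 3.6:.0f} km/h, crossing in roughly {hours:.0f} hours")
```

Roughly ten or eleven hours to cross the Pacific. That is why, given the Japanese records of when the wave arrived, scientists could work backward to the day and even the approximate hour the 'quake happened.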
If a 'quake of similar size to the 1700 one happened today at roughly the same location, it would likely rattle dishes in Seattle and Portland. It might even do some damage to the two cities. But it would not do the kind of damage outlined in the New Yorker story. And such a 'quake could and probably would create a large tsunami. And that tsunami would likely wreak havoc on the coastal communities of Washington, Oregon, and other coastal cities around the Pacific. But there are geographical barriers that would prevent it from doing much damage to Seattle or Portland.
Is there a way to wreak havoc on Seattle to the extent outlined in the New Yorker piece? (I don't know the geology of Portland well enough to answer this question with respect to that city.) Yes! It's simple. You just have one of the several faults that passes directly under Seattle rupture in a big way. There is ample evidence in the geological record for this happening in the past so it could certainly happen again. And the amount of death and destruction resulting would satisfy even the likes of Ms. Schulz. Mayhem of biblical proportions is possible. But it wouldn't have happened the way Ms. Schulz outlined. It also wouldn't have been a "really big one" sized earthquake. I guess telling it that way just wouldn't have made as good of a story.
Sunday, August 16, 2015
GOP Orthodoxy
The first GOP debate of the 2016 presidential election cycle is now in our rear view mirrors. Well, it was actually two "debates" and not much debating went on. But the lack of debating in the "debate" is typical of these kinds of events. There has been a lot of coverage of who said what and who is up and down in the polls as a result of their performance. I am going to go in a different direction. I want to focus not on what was contentious but what was orthodox.
The debates took place on August 6. They consisted of an earlier "kids table" round featuring mainstream candidates who are lower in the polls (7 of them) and a later "main event" round featuring the ten candidates that were doing the best in the polls in the days immediately preceding the debate. To keep things simple I am going to focus on the main event. I found a transcript that is relatively ad free at www.presidency.ucsb.edu/ws/index.php?pid=110489.
This post is a variant on the "Ken Ham Creationism" post I did a year or so ago. It can be found at http://sigma5.blogspot.com/2014/02/ken-ham-creationism.html. In that post I focused on Ham's beliefs on the theory that a lot of "creationists" would find that they actually disagreed with some of Ham's beliefs. This could drive a wedge into and eventually weaken the movement. In this case I want to focus not on differences but on commonalities. So, unlike with the Ham post where I refrained from challenging Ham's beliefs, here I intend to challenge them.
The debate was run by Fox News. They are the unofficial spokesmen for the conservative movement and for Republicans in general. So I looked for implicit beliefs of the Fox operation as manifested in the content of the questions the Fox anchors asked. And I looked for beliefs expressed by one or more participant that went unchallenged by the others. That constitutes GOP orthodoxy. Since I do not subscribe to that orthodoxy I will challenge it. Here goes.
Cutting taxes is always good.
The GOP contends that it is the party of fiscal conservatism and that "liberals" are spendthrifts. Fiscal conservatives believe in balanced budgets or even surpluses. You can get there in two ways. You can have a "small" government (low tax revenue) and low spending or you can have "big" government (high tax revenue) and high spending. The only thing that is necessary is that tax revenue meet or exceed spending. A combination of low taxes and high spending is not being fiscally conservative. But that is what the GOP actually delivers. Currently the GOP wants to expand military spending and perhaps cut other programs. (There is disagreement on the latter.) That is anti-conservative fiscal policy. If you are unwilling to cut spending and, in fact, want to increase spending on the military, you need to raise taxes.
Balancing the budget is good
This sounds like a good argument. But most economists don't buy it. A "reasonable" continuous deficit is the best place to be for the long term. There is certainly a lot of discussion and disagreement about what constitutes reasonable. But I am going to leave that aside and just ask if the GOP actually does what it says. The answer is no. Back at almost the beginning of this blog I wrote a post entitled "There are no fiscally conservative Republicans" (see http://sigma5.blogspot.com/2010/10/there-are-no-fiscally-conservative.html). Nothing has changed since then.
Political Correctness is bad.
This is a diversion. The GOP is just as much in favor of political correctness as the Democrats. They just believe in enforcing different political correctness rules than the Democrats. In particular the GOP accuses the Democrats of enforcing political correctness rules when the Democrats effectively challenge GOP orthodoxy. To some extent the Democrats do the same thing to the GOP but not as loudly or as effectively.
Abortion is bad.
Actually everyone agrees with this one. No one thinks abortions are a good thing. The difference is what the various sides think ought to be done. Generally speaking liberals see abortion in some cases as a necessary evil. Republicans believe it is almost always unnecessary and therefore almost always evil. The debate on the GOP side had been reduced to asking whether abortion is ok in the cases of rape, incest, or the life of the mother. There is some disagreement on this but the consensus is moving toward a "no exceptions" position. Conservatives also incorrectly characterize the position of their opponents as "pro abortion" (see "Political Correctness" above).
I think there is a reasoned and ethical anti-abortion position. I just don't see it as being held by many conservatives. If you are anti-abortion then you should be for policies that actually reduce the need for abortions. The GOP position is "no sex outside of marriage" and "abstinence only". Bristol Palin is the poster child for why these positions are idiotic. She is now pregnant with her second out of wedlock child and each child has been fathered by a different man. Whatever Sarah Palin, her mother and a former GOP Vice Presidential candidate, did, it was ineffective.
The best thing to do is to reduce unplanned and unwanted pregnancies. This eliminates the need for most abortions. There are effective ways to do this. They are sex education and birth control. A very large multi-year study was just completed in Colorado. It provided very effective birth control (IUDs) to a large group of women. The abortion rate plunged. GOP legislators have now terminated this program. And in general, there is a nearly complete overlap between conservatives who oppose abortion and conservatives who oppose sex education and birth control.
Conservatives need to stop lying about the positions held by opponents and need to start aggressively supporting sex education and birth control. Until they do we should ignore what they have to say on abortion.
Defund Planned Parenthood
This goes hand in hand with the abortion discussion above. Planned Parenthood does provide abortion services in some states. But there are laws banning them from doing this with federal money and they abide by those laws. To the extent that abortions are subsidized they are subsidized with money that does not come from the federal government. And 97% of what Planned Parenthood does is not abortion related. What they do for the most part is basic women's health and it is mostly delivered to poor women. This is a vastly underserved market. Defunding Planned Parenthood defunds these programs. In other words it is an attack on women's health in general and poor women's health in particular.
So what is going on? Well, a lot of the non-abortion services Planned Parenthood provides are sex education, birth control, and "female issues" kinds of services. Lost in the abortion hubbub is the more important discussion about sex ed, birth control, and "female issues". Most conservatives fall into one of two categories: "we hate sex ed and birth control" or "we are ignorant of and/or grossed out by female issues". Defunding Planned Parenthood advances the agenda of conservatives in either category. Defunding Planned Parenthood is somewhat about abortion. But it is mostly about the other subjects. Our pea brained media can't separate this out from the abortion fight so these more important issues go completely unreported on.
Medicaid expansion is bad
Part of Obamacare is an effort to get medical insurance to poor people. The idea was to expand Medicaid. The Supreme Court partially blocked this by giving states the option to opt out of this part of Obamacare. The GOP argument is that this is some kind of evil intrusion of federal power. Medicaid programs are administered by the states and there are a lot of federal regulations that the state programs must follow. But, while the Obamacare expansion increased the scope of Medicaid programs it did not change the basic structure of the programs.
The Medicaid expansion has been wildly successful, where it has been implemented, in getting a lot of poor people out of emergency rooms and into the standard health care system. The program is also "private" as opposed to being government run because the federal subsidies go to reducing the cost of standard private health insurance plans. So the amount of "government expansion" or "government intrusion" involved is modest. And the GOP has not advanced any alternative that would get good medical coverage to these people. They don't say it but the actual policy of the GOP is to leave these people uninsured.
In spite of the state "opt out" provision many states have opted in and expanded Medicaid. But in many states where the GOP runs the state government the state has opted out. We see large drops in the number of uninsured poor in those states that have adopted the expansion and little to no drop in those states that have opted out. The emergency room is also the most expensive way to deliver medicine. Obamacare has been very successful in cutting emergency room costs. More improvement is expected as poor people get out of the emergency room channel and into regular channels that include preventative services.
The Obamacare $700 billion bonanza
GOP candidates railed in 2012 and are still railing about the $700 billion bonanza. Medicaid expansion and other Obamacare components cost money. Obama, as a good fiscal conservative, put taxes and other "revenue enhancers" into Obamacare to the tune of about $700 billion to cover these increased costs. So far the revenue is on track but the costs are coming in lower than the estimates. But there still needs to be sufficient revenue to offset the additional costs. Conservatives have been trying to raid this piggy bank since it was enacted into law. No true fiscal conservative would do this but then the GOP is not the party of fiscal conservatism if you ignore their rhetoric and examine their actions. The current crop of GOP presidential hopefuls are continuing this tradition.
And it goes hand in hand with the "massive state costs" of the Medicaid expansion. This argument is the foundational one the GOP uses to justify denying health care to millions. But that $700 billion went in part to cover 100% of the costs of the Medicaid expansion in the first three years and 90% of the cost afterwards. Sure the states are on the hook at some point for the last 10%. But the business the 90% part brings into the state is probably enough to boost state revenues enough to cover a good chunk of the 10% so the program is close to free to the states.
Illegal Immigration is bad
We have had restrictive immigration policies dating back to roughly 1900. The US has struggled for generations with the question of how to let "good" immigrants in and keep "bad" immigrants out. And a big problem has been that what constitutes "good" and what constitutes "bad" keeps changing. For a long time we liked blacks as long as they came in slave ships. Then we didn't. We have pretty much always liked immigrants from the UK. But they have been covered by quotas as long as we have had them. We liked the Chinese when they were building the Transcontinental Railroad (middle 1800s) until we didn't but now we do again if they are tech types or rich and powerful. We were hostile to Indians from India until very recently when we weren't (again tech types). We were hostile to the Irish until there were enough of them to constitute an important voting block. The same was true for Italians.
Currently we are hostile to Mexicans ("they are rapists") except that few Mexicans are crossing our borders. And we are hostile to other Central and South American peoples. But, except for the Indians (American Indians - not tech types from India) we are all the descendants of immigrants. The argument is basically a reflection of our prejudices. But we pretend that the problem is that they are an economic drag. The evidence for an overall drag as opposed to a drag in small areas like border towns is slim to non-existent.
I am pro immigration. Our population is aging. Immigrants tend to be young, working age people and their children. They boost the economy and contribute to Social Security. There is currently a shortage of jobs but that is because of the influence of the wealthy and powerful. They like low wages for all but the 1%. The best way to achieve this is to keep employment low.
The thing that started the current immigration cycle was the importation of low wage workers to work on farms in the mid 1900s. That primed the pump. Agriculture (generally GOP leaning) and other business interests (also generally GOP leaning) actually like the current mess because it gives them access to low wage workers. The people who are already here are already here. The cheapest thing to do would be to legalize them. But then they would agitate for better wages. So the actual GOP policy is to stir the pot and make sure nothing happens.
Sanctuary Cities
These too are seen as some kind of evil plot. The fact that most sanctuary cities were set up to deal with the public safety problems caused by the terrible way Immigration enforcement was implemented is never acknowledged. Cities found they had violent crime problems in their immigrant communities. The cops couldn't get anywhere because no one wanted to deal with the cops. This was because the cops were fronting for ICE, the federal Immigration people. Sanctuary programs eased tensions and resulted in more effective policing which reduced crime. And the current argument for why sanctuary cities are so bad rests on exactly one case. Applying the same logic to guns would have eliminated the right to sell or own a gun in this country.
Build/Fix the damn fence
I am old enough to remember the Cold War. At that time it was important to contrast ourselves with the "evil commies". And they had built the infamous Iron Curtain. So we had to draw the starkest contrast we could. So we touted the fact that we had the longest undefended border in the world. At that time conservatives were staunch anti-communists so they were all in favor of our "no fence" open borders policy. And it worked just fine. There wasn't any great problem in the hundreds of years of open borders. There was even little or no problem during the many decades when our restrictive immigration policies overlapped our open borders policy. Then a decade or so ago conservatives decided there was a big problem.
And by the way there is a much more effective solution to the illegal immigrant problem than trying to build an impenetrable fence thousands of miles long. It is called a government ID card. The Nazis were famous for this sort of thing. You could be asked at any time for your "papers". If you didn't have them or they were out of order you were in deep trouble. The same thing, only now applied to illegal aliens, would for better or worse work just fine.
This requirement to carry papers is also something the "dirty commies" liked. So to maintain the contrast between us and them we were the land of the "I don't need no stinking papers" people. I don't know if it's true for new cards but my decades-old Social Security card says "not to be used for identification". A consistent thread among the libertarians and many others is that citizens should not be required to carry government ID. But without the rank and file noticing this is changing within conservative circles. It is also effectively being changed in society at large.
Thanks to conservatives in many places you now need the right ID, the modern equivalent of "papers", to vote. We have an e-verify system to validate employment status. It's not foolproof but it works pretty well. You now need a social security number for anything financial (Income tax) or medical (Social Security/Medicare/Medicaid). To get a drink you need a driver's license. We are quickly heading toward a time when a government issued ID card will be a necessity. So the libertarian idea of no government issued ID cards (Social Security cards and driver's licenses are both issued by the government as are "concealed carry" permits for guns) seems to be on the way out. Libertarians should be complaining bitterly but they are mostly quiet.
I am leaning toward the thought that a government ID card is inevitable. I already have an "enhanced ID" driver's license. The system we now have is a hodgepodge that is looking worse every day. In the meantime somehow a super-fence is supposed to be the solution to all of our Immigration related problems. It is unlikely that a super-fence is even feasible. I think the actual motivation is that contractors will build it and they are good sources for campaign contributions. And the GOP strategy is "the fence MUST be completed before anything else can be done". This is complete nonsense.
NSA Data Collection
We now know that the government is collecting massive amounts of data on innocent citizens. This program was started under the "W" Bush administration in response to 9/11 but has been continued pretty much intact by the Obama administration. Senator Paul is an outlier in the GOP in arguing that something should be done to scale it back. In the debates Governor Christie stridently defended it. My reading is that Senator Paul is pretty much alone in his stand. Certainly nothing was heard from the Fox moderators or the other participants.
This program is a clear violation of the Fourth Amendment. Many conservatives style themselves as "strict constructionists", people who try to hew closely to the words of the Constitution. Any strict constructionist should be outraged by this program. They should be loudly joined by libertarians. But there is broad support among conservatives for vacuuming up all kinds of data by and about innocent American citizens.
This is made even worse by the fact that this extra data collection (and "enhanced interrogation", an idea implicitly supported by most panelists) has so far shown itself to be completely ineffective. The 9/11 commission report demonstrated that the hijackers could and should have been caught using the techniques and procedures in place before 9/11 and before the mass data collection programs were in place. Their conclusion was that the big problem was not a lack of data or staff or computers. Instead the problem was "siloing", various agencies keeping the data they had close and not sharing it with other agencies. Between them the CIA, FBI, and NSA had everything they needed. It's just that each agency was so busy defending its turf that none of them were in a position to pull it all together.
ISIS is bad and the Obama Administration is ineffective
As with abortion you will actually get no disagreement about the first one, although conservatives pretend that liberals/Obama are somehow pro or at least soft on ISIS (more GOP political correctness on display). And the Obama Administration has not been very effective in rolling ISIS up so far. But the primary problem is there are no options that are likely to be effective. The politics of the region are extremely complex. The result is that there are no completely dependable allies. This leaves us to depend on allies like Turkey, which has its own agenda (wiping out the Kurds), or the Saudis, who see themselves as the defenders of all things Sunni (ISIS is Sunni, as are the areas it occupies), or the Iraqis, who see Sunnis and the Kurds as the enemy. It is no surprise that progress has been modest at best.
So the real question has to do with whether there is a better alternative. Senator Graham in the "kids" debate suggested putting American boots on the ground. This has been and would likely be bitterly resented. The Trump solution is to put Carly Fiorina (current GOP presidential candidate and ex-CEO of Hewlett Packard) in charge of negotiations. She bungled things at Hewlett Packard pretty badly. She has little or no foreign policy expertise and little or no expert knowledge of the region. That doesn't sound like a very good idea either. This "talk louder and threaten more" approach does not sound promising to me. For more on this consult "ISIS - Do Something Stupid Now" at http://sigma5.blogspot.com/2014/09/isis-do-something-stupid-now.html.
Obamacare is a Complete Disaster
This is and has been the standard GOP line since it passed. But all the data suggests that it has generally worked. It has moved millions into the mainstream medical system. And cost estimates have turned out to be wrong. Costs so far have been lower than forecast and there is no indication that this trend will not continue. But everyone in the GOP fold characterizes it as a "complete disaster". They fail to acknowledge that its core concepts were developed by a conservative think tank, The Heritage Foundation, and that its implementation was based closely on a successful program implemented by a Republican Governor who went on to unsuccessfully run for President, Mitt Romney. The national experience has closely paralleled the Massachusetts experience.
Republicans have also failed to propose an alternative. Instead they proclaim "repeal and replace". Then there's the $700 billion. I discussed the actual situation with respect to this money above.
It's good to be born into modest circumstances
Some of the GOP candidates have risen up from modest circumstances. But so did Bill Clinton, Hillary Clinton, and Barack Obama. Obama currently has no significant wealth and neither Clinton had any significant wealth until after Bill left office. And then there's Jeb Bush the pre-Trump favorite. He is the son of a President, the brother of a President, and the grandson of a US Senator. Or there's Mitt Romney, the 2012 GOP standard bearer. He was the son of a CEO of a major American car company (and one time GOP candidate for the Presidency). Or, going back another 4 years, there's John McCain. He was the son of a US Navy Admiral and the grandson of another US Navy Admiral. Finally, consider the current GOP front runner - Donald J. Trump. He claims to be worth TEN BILLION DOLLARS (the capital letters are his). He is the son of a successful New York real estate developer. Being born into modest circumstances doesn't seem to be a recipe for success within the GOP.
Replace the current tax code with (insert quack system here)
The standard GOP line seems to be that the current tax code is completely broken and needs to be completely replaced. Then we get one quack idea or another as to what it should be replaced with. There was the Herman Cain 9-9-9 system in the 2012 election cycle. Some variation on a "flat tax" has popped up several times. A recent example of this would be (as far as I can tell - the details are hard to find and harder to figure out) the Huckabee "tithe" 10% system. And on and on and on. No professional thinks any of them are workable or would be effective. But that doesn't stop the proposals from being floated.
There are too many of them to make it worthwhile to keep track of each one. So I just apply two tests. What does the proposal do to the overall amount of taxes corporations pay? For any GOP proposal the answer is guaranteed to be "corporations pay less". Aggregate corporate income tax revenue used to pretty much match aggregate revenue collected on the income of individuals. But that was a long time ago. Now corporations as a group pay far less than individuals as a group do. Most of the savings on the corporate side are the result of loopholes engineered to reduce the taxes of large corporations. Most large corporations pay income tax on only a small percentage of their total income. Many pay no income tax at all. Small businesses, without the same lobbying clout, get hit much harder.
The second thing I look at is how the burden shifts between low income individuals and high income individuals. All the GOP proposals I have seen lighten the burden on high income individuals and make it up by increasing the burden on low income individuals. It is important to note that wealthy individuals game the system the same way corporations do. They lobby for loopholes that reduce their taxes. That's as wrong as the shifting of the burden away from corporations and toward individuals. Looking at these two tests allows me to quickly analyze these proposals and discard them. I have yet to see a GOP tax proposal that moves things in the right direction.
There is an approach that does not involve a wholesale revision of the tax code. It also makes the current code more fair and shifts the burden in the right direction (toward large corporations and wealthy individuals). That's closing loopholes. There are dozens of loopholes any one of which amounts to more than a billion dollars in lost revenue. So loophole closing can make a real difference. On the individual side a good example is "carried interest". The details are complex but the effect is that hedge fund managers, some of whom make more than a billion dollars, get to use a low 15% Income tax rate rather than the standard 39.6% that would normally apply. This cuts a billionaire's tax bill by roughly $250 million.
Hedge fund managers claim they work hard for the money. But it is a nine-to-five office job in a very nice office with maybe some overtime thrown in. There are millions of people who work as hard or harder, often putting in more hours under much worse conditions, but earning a pittance for their hard work. Yet they pay a larger portion of their income to the Federal government in Social Security and Income tax withholding than the hedge fund manager does. And while this is going on we will be told that the government can't do this or that because it doesn't have enough money. And one of the GOP candidates is being funded by one of these hedge fund managers. It's a good investment for the hedge fund guy. Whose call do you think a politician takes - yours or the hedge fund guy's? And who does the politician want to take care of - you or the hedge fund guy?
On the corporate side we have a similar situation. The poster child for loopholes is the oil and gas industry. These are some of the most profitable companies in the world. They have enough money to pay their senior executives obscene salaries. But they need and deserve various loopholes, we are told. The one I like the best is the one that was introduced in about 1999 as the result of a typographic error in a tax bill. That typo saved oil and gas companies several billion dollars per year and it wasn't even put there on purpose. Yet that loophole has been renewed every time it was in danger of expiring.
And on and on and on. And the GOP has worked vigorously to preserve the carried interest loophole. They fight like the dickens to keep these kinds of loopholes in the code. And they also throw roadblock after roadblock in the way of closing corporate loopholes. Unfortunately, Democrats often do the same thing. But rich people and the executives who run large corporations are much cozier with GOP politicians. So GOP politicians get a lot more TLC from these people than Democrats do. And when it comes to the next loopy tax proposal from a GOP candidate, save yourself a lot of trouble and apply my two tests to it. As for the big bucks people, they know these crazy tax schemes are just red meat for the base that will never be implemented so they ignore them.
The Iran Nuclear deal is a bad one
Here the GOP has some Democratic fellow travelers like Chuck Schumer. But while most Democrats and pretty much all the experts (outside of the Israeli ones) are in favor of it the GOP is united in opposition. The general idea is similar to the Obamacare one: repeal and replace. Here the logic is just as bad. Generally two ideas are advanced. The first is that the Obama people did not negotiate hard enough. The fact that the negotiations dragged on and on is ignored. And the fact that the negotiations involved not just the US and Iran but other countries, specifically Russia and China, is also ignored.
The Bush administration had eight years to do something about the Iran nuclear program. They managed to achieve ineffective sanctions while the Obama administration managed to achieve sanctions that were extremely hard on the Iranians. The Bush administration allowed the Iranians to build out their nuclear program to the point that they had more than ten thousand centrifuges running. The Obama administration has managed to curtail Iranian centrifuges. If the deal goes into effect the number of centrifuges in use will plunge and Iran will be allowed to run only old inefficient centrifuges. The Obama administration was also able to get the Russians and the Chinese on board with the sanctions, with the negotiations, and with the final deal. The Bush administration was able to do none of this.
So during the entire Bush period the Iranians had a free hand which they used to massively expand their nuclear program. Yet nothing was said about this by conservatives, either at the time or since. The Obama administration managed to massively strengthen their hand by working with everyone to get stringent sanctions in place. They then turned those sanctions into an agreement that has been signed off on by Russia, China, apparently Iran, the Europeans, and most of the rest of the world. It hangs by a thread in the US because of united opposition by Republicans. If they weren't operating in lock step then a few Democratic defections by the likes of Schumer wouldn't matter.
The second argument is that the agreement can be fixed to make it better once it has been rejected, typically "on day one". This too is ridiculous. One of the most important parties to the current sanctions regime is Russia. Russia is a nuclear power and has a long border with Iran. And we are currently in a fight with Russia over Ukraine. And that fight has resulted in serious sanctions being imposed on Russia. Russia has a lot of reasons to want to bail on the Iranian sanctions. If the US rejects the deal they have no reason to stick with the sanctions and less than no reason to be interested in even tougher sanctions. To a lesser extent the same is true with China. We are not sanctioning them but they do a lot of business with Iran and would like to do more (i.e. buy Oil). A US rejection might cause China to dial back on their participation. And they too would have less than no interest in going along with even tougher sanctions. Without Russia and China the whole sanctions regime falls apart.
The GOP response to this is to suggest the US go it alone. We would impose even more draconian sanctions while the rest of the world would drop sanctions and normalize their relations with Iran. We tried what eventually turned into stand-alone sanctions against Cuba for over 50 years. It didn't work. It would work even less effectively with Iran. They are farther away and have many more options for getting around unilateral US sanctions than Cuba has had.
Then there's the $150 billion we would be "giving Iran". Except it's not our money. It's Iran's money. And we don't have possession of it. It's in banks around the world. So as soon as the rest of the world decides that the sanctions can come down they will direct their banks to free up the money and it will revert to Iranian control.
This whole argument is just an example of a larger problem. The "W" Bush administration adopted a "go it alone" foreign policy. The results they had with the Iranian nuclear program were typical of their overall level of success. The Obama administration has used a cooperative approach that has resulted in strong sanctions against both Iran and Russia. The strength of these sanctions rests on the fact that they are international sanctions enforced by many countries instead of go it alone sanctions like the ones we unsuccessfully used against Cuba. The current crop of GOP candidates seems for the most part to follow the "W" camp. I expect they would have no more success with a "W" style approach than the actual "W" did.
The US military is weak and Cut Foreign Aid
I am lumping these two ideas together because they both flow from ignorance. The US spends more money on its military than any other country and it does so by a large margin. We have by far the most powerful military in the world and it is extremely expensive. Yet Republicans want to grow it. And by "grow" I mean spend more money. What's going on here? There is a legitimate argument that we don't have enough men and women in uniform. I think the argument is weak but assume it is correct for a moment. What's going on? The bulk of our military budget goes to contractors mostly to buy expensive equipment not to pay for soldiers. A lot of this is wasteful spending. But it is also pork. And pork is a great source of campaign contributions.
If we cut waste we could cut the military budget while increasing troop levels. But we aren't going to do that even though, like closing loopholes, it is the fiscally responsible thing to do. So we buy tanks we don't need. We buy jet planes we don't need. We buy all kinds of very expensive high tech gadgets we don't need because some defense contractor makes a pile of money on the contract. And some of that pile of money gets recycled as campaign contributions. The Pentagon budget is yet another place where conservatives talk small government and fiscal responsibility but do big government and fiscal irresponsibility.
Foreign Aid is another long standing GOP bugaboo. Most conservative voters think the foreign aid budget is about ten times larger than it actually is and they have a very distorted idea of what it is spent on. Very little of our "foreign aid" is actual general aid to the needy and deserving. Mostly it is some kind of payoff or another. We give lots of money to Israel just because. Then we give lots of money to Egypt to encourage them to not invade Israel. We give Pakistan lots of money as a bribe so that we can get military supplies into Afghanistan. We give lots of money to Afghanistan so they won't go Taliban. And so it goes. If you subtract these kinds of bribes and sweetheart deals out there is almost nothing left. So even if the rest is badly spent, which mostly it is not, then there's not much to fight over. But the GOP continues to tilt at the Foreign Aid windmill because they have convinced their base that big bad things are happening there.
Executive orders
Pretty much every candidate promised to rescind all of the Obama executive orders on day one. That is pretty idiotic. Are there absolutely no Obama executive orders that are a good idea? This is actually part of a larger conservative trope called "executive overreach". The idea is that Obama has wildly exceeded his authority and is an "imperial president". Other than rhetoric there is absolutely no evidence to support this.
And actually the concept of an imperial presidency was firmly rooted in the "W" Bush presidency. Cheney and his group came up with this wild theory. The President is "Commander in Chief" of the military. This is normally interpreted to mean that he is a super-general that outranks everyone else in the military. That means he can order any soldier around, regardless of the soldier's rank, because he outranks all of them. But this only applies to the military. The Cheney interpretation is that the President in his "Commander in Chief" role has unlimited power to protect the "national security" and, since there is always some kind of handy national security threat around, and since everything impacts national security sooner or later, he can do whatever he wants whenever he wants to.
So the Bush people did a lot of things that were clearly unconstitutional and the courts caught them out several times. But somehow it is not the Bush administration but the Obama administration that is out of control. So far the courts, including the conservatives on the Supreme Court, have blessed Obama administration executive orders. And the Bush administration issued a raft of executive orders without a peep from conservatives in the GOP. But, since the Obama administration has issued a number of executive orders that conservatives don't like, these must be examples of executive overreach.
And then there's the fact that most people don't understand executive orders. Congress passes laws. In this modern very complex world laws don't cover everything. So the laws direct the appropriate executive branches to issue regulations covering implementation details. These laws are the actual source of most federal regulations. If GOP legislators were serious about reining in regulations they should change laws and remove executive discretion. In any case, executive orders are just part of the implementation process. In fact they are often absolutely necessary. Vowing to repeal all executive orders is mindless stupidity.
And then there are the many promises GOP candidates have made to do this or that "on day one". Many of these things cannot be achieved without passing laws. The legislative branch is the "passing laws" branch. So in many cases these promises, should an attempt be made to fulfill them, are classic examples of the very executive overreach that these same candidates claim to be so opposed to. And, should they be successful, their actions could be fairly characterized as the very same "imperial presidency" they claim to abhor. This is just another example of hypocrisy in action.
Enough
There's more I could go into. And then there are many subjects that were not touched in the debate. But I think all of us have had enough so I will leave it there.
Let me conclude by observing a common thread through all of this. Politicians striving to get their message out over-simplify. That's an important thing and not necessarily a bad thing. But in example after example I see simplification to the point where there is literally nothing left. Some problems are actually simple when you get down to their core. But many are not.
As I have laid out above, the Iran nuclear deal is an example of necessary complexity. And more generally, the middle east is an extremely complicated place right now. In conflict after conflict we find the players lining up in ways that are specific only to that conflict. We find ourselves on the opposite side from Iran in Syria but have common cause with them with respect to ISIS. We would like to support the Baghdad government in Iraq but we would also like to support the Kurds. But Baghdad hates the Kurds as do the Turks. Yet the Kurds are the most effective group, perhaps the only effective fighting group, that is actively opposing ISIS. I could keep going with Israel, Egypt, Saudi Arabia, etc., pointing out where they can be found on one side in one conflict and on the other side in a different one. Yet there is absolutely no acknowledgment of the necessary complexity this engenders by GOP candidates when they are talking about middle east issues.
The same is true with immigration, the budget, health care, abortion, you name it. Getting policies right in these areas involves acknowledging the complexities involved and accounting for them. The poster child for this kind of thinking is regulation. They are all against it all the time, or so they say. I devoted a post to the subject that you can find at http://sigma5.blogspot.com/2014/05/regulations.html. I can summarize the argument in a sentence or so. There are good regulations and bad regulations. The trick is to keep the good (and maybe add some more where they are needed) and fix or eliminate the bad. And, of course, the devil is in the details.
Yet you will be hard pressed to find anyone on the GOP side articulating the idea that an issue is complex and that a complex solution is required. All issues are simple and mostly can be fixed by shouting simple slogans loudly and frequently. I recently devoted a post to The Donald (see http://sigma5.blogspot.com/2015/07/the-donald.html). But in a nutshell, The Donald embodies this "shout simple solutions backed up by absolutely nothing loudly and often" concept better than any of the others. And he's leading in the polls by a wide margin so he must be doing something right. The Donald also frequently articulates ideas that directly contradict others of his ideas but that doesn't detract from his popularity either. Given the kind of thinking that characterizes the modern Republican party he deserves his lead. He's just doing what the other guys are. Only he's doing it bigger and better.
The debates took place on August 6. They consisted of an earlier "kids table" round featuring mainstream candidates who are lower in the polls (7 of them) and a later "main event" round featuring the ten candidates that were doing the best in the polls in the days immediately preceding the debate. To keep things simple I am going to focus on the main event. I found a transcript that is relatively ad free at www.presidency.ucsb.edu/ws/index.php?pid=110489.
This post is a variant on the "Ken Ham Creationism" post I did a year or so ago. It can be found at http://sigma5.blogspot.com/2014/02/ken-ham-creationism.html. In that post I focused on Ham's beliefs on the theory that a lot of "creationists" would find that they actually disagreed with some of Ham's beliefs. This could drive a wedge into and eventually weaken the movement. In this case I want to focus not on differences but on commonalities. So, unlike with the Ham post where I refrained from challenging Ham's beliefs, here I intend to challenge them.
The debate was run by Fox News. They are the unofficial spokesmen for the conservative movement and for Republicans in general. So I looked for implicit beliefs of the Fox operation as manifested in the content of the questions the Fox anchors asked. And I looked for beliefs expressed by one or more participant that went unchallenged by the others. That constitutes GOP orthodoxy. Since I do not subscribe to that orthodoxy I will challenge it. Here goes.
Cutting taxes is always good.
The GOP contends that it is the party of fiscal conservatism and that "liberals" are spendthrifts. Fiscal conservatives believe in balanced budgets or even surpluses. You can get there in two ways. You can have a "small" government (low tax revenue) and low spending or you can have "big" government (high tax revenue) and high spending. The only thing that is necessary is that tax revenue meet or exceed spending. A combination of low taxes and high spending is not being fiscally conservative. But that is what the GOP actually delivers. Currently the GOP wants to expand military spending and perhaps cut other programs. (There is disagreement on the latter.) That is anti-conservative fiscal policy. If you are unwilling to cut spending and, in fact, want to increase spending on the military, you need to raise taxes.
Balancing the budget is good
This sounds like a good argument. But most economists don't buy it. A "reasonable" continuous deficit is the best place to be for the long term. There is certainly a lot of discussion and disagreement about what constitutes reasonable. But I am going to leave that aside and just ask if the GOP actually does what it says. The answer is no. Back at almost the beginning of this blog I wrote a post entitled "There are no fiscally conservative Republicans" (see http://sigma5.blogspot.com/2010/10/there-are-no-fiscally-conservative.html). Nothing has changed since then.
Political Correctness is bad.
This is a diversion. The GOP is just as much in favor of political correctness as the Democrats. They just believe in enforcing different political correctness rules than the Democrats. In particular the GOP accuses the Democrats of enforcing political correctness rules when the Democrats effectively challenge GOP orthodoxy. To some extent the Democrats do the same thing to the GOP but not as loudly or as effectively.
Abortion is bad.
Actually everyone agrees with this one. No one thinks abortions are a good thing. The difference is what the various sides think ought to be done. Generally speaking liberals see abortion in some cases as a necessary evil. Republicans believe it is almost always unnecessary and therefore almost always evil. The debate on the GOP side had been reduced to asking whether abortion is ok in the cases of rape, incest, or the life of the mother. There is some disagreement on this but the consensus is moving toward a "no exceptions" position. Conservatives also incorrectly characterize the position of their opponents as "pro abortion" (see "Political Correctness" above).
I think there is a reasoned and ethical anti-abortion position. I just don't see it as being held by many conservatives. If you are anti-abortion then you should be for policies that actually reduce the need for abortions. The GOP position is "no sex outside of marriage" and "abstinence only". Bristol Palin is the poster child for why these positions are idiotic. She is now pregnant with her second out of wedlock child and each child has been fathered by a different man. Whatever Sarah Palin (her mother and a former GOP Vice Presidential candidate) did, it was ineffective.
The best thing to do is to reduce unplanned and unwanted pregnancies. This eliminates the need for most abortions. There are effective ways to do this. They are sex education and birth control. A very large multi-year study was just completed in Colorado. It provided very effective birth control (IUDs) to a large group of women. The abortion rate plunged. GOP legislators have now terminated this program. And in general, there is a nearly complete overlap between conservatives who oppose abortion and conservatives who oppose sex education and birth control.
Conservatives need to stop lying about the positions held by opponents and need to start aggressively supporting sex education and birth control. Until they do we should ignore what they have to say on abortion.
Defund Planned Parenthood
This goes hand in hand with the abortion discussion above. Planned Parenthood does provide abortion services in some states. But there are laws banning them from doing this with federal money and they abide by those laws. To the extent that abortions are subsidized they are subsidized with money that does not come from the federal government. And 97% of what Planned Parenthood does is not abortion related. What they do for the most part is basic women's health and it is mostly delivered to poor women. This is a vastly underserved market. Defunding Planned Parenthood defunds these programs. In other words it is an attack on women's health in general and poor women's health in particular.
So what is going on? Well, a lot of the non-abortion services Planned Parenthood provides are sex education, birth control, and "female issues" kinds of services. Lost in the abortion hubbub is the more important discussion about sex ed, birth control, and "female issues". Most conservatives fall into one of two categories: "we hate sex ed and birth control" or "we are ignorant of and/or grossed out by female issues". Defunding Planned Parenthood advances the agenda of conservatives in either category. Defunding Planned Parenthood is somewhat about abortion. But it is mostly about the other subjects. Our pea-brained media can't separate this out from the abortion fight so these more important issues go completely unreported.
Medicaid expansion is bad
Part of Obamacare is an effort to get medical insurance to poor people. The idea was to expand Medicaid. The Supreme Court partially blocked this by giving states the option to opt out of this part of Obamacare. The GOP argument is that this is some kind of evil intrusion of federal power. Medicaid programs are administered by the states and there are a lot of federal regulations that the state programs must follow. But, while the Obamacare expansion increased the scope of Medicaid programs it did not change the basic structure of the programs.
The Medicaid expansion has been wildly successful, where it has been implemented, in getting a lot of poor people out of emergency rooms and into the standard health care system. The program is also "private" as opposed to being government run because the federal subsidies go to reducing the cost of standard private health insurance plans. So the amount of "government expansion" or "government intrusion" involved is modest. And the GOP has not advanced any alternative that would get good medical coverage to these people. They don't say it but the actual policy of the GOP is to leave these people uninsured.
In spite of the state "opt out" provision many states have opted in and expanded Medicaid. But in many states where the GOP runs the state government the state has opted out. We see large drops in the number of uninsured poor in those states that have adopted the expansion and little to no drop in those states that have opted out. The emergency room is also the most expensive way to deliver medicine. Obamacare has been very successful in cutting emergency room costs. More improvement is expected as poor people get out of the emergency room channel and into regular channels that include preventative services.
The Obamacare $700 billion bonanza
GOP candidates railed in 2012 and are still railing about the $700 billion bonanza. Medicaid expansion and other Obamacare components cost money. Obama, as a good fiscal conservative, put taxes and other "revenue enhancers" into Obamacare to the tune of about $700 billion to cover these increased costs. So far the revenue is on track but the costs are coming in lower than the estimates. But there still needs to be sufficient revenue to offset the additional costs. Conservatives have been trying to raid this piggy bank since it was enacted into law. No true fiscal conservative would do this but then the GOP is not the party of fiscal conservatism if you ignore their rhetoric and examine their actions. The current crop of GOP presidential hopefuls are continuing this tradition.
And it goes hand in hand with the "massive state costs" of the Medicaid expansion. This argument is the foundational one the GOP uses to justify denying health care to millions. But that $700 billion went in part to cover 100% of the costs of the Medicaid expansion in the first three years and 90% of the cost afterwards. Sure the states are on the hook at some point for the last 10%. But the business the 90% part brings into the state is probably enough to boost state revenues enough to cover a good chunk of the 10% so the program is close to free to the states.
Illegal Immigration is bad
We have had restrictive immigration policies dating back to roughly 1900. The US has struggled for generations with the question of how to let "good" immigrants in and keep "bad" immigrants out. And a big problem has been that what constitutes "good" and what constitutes "bad" keeps changing. For a long time we liked blacks as long as they came in slave ships. Then we didn't. We have pretty much always liked immigrants from the UK. But they have been covered by quotas as long as we have had them. We liked the Chinese when they were building the Transcontinental Railroad (middle 1800s) until we didn't but now we do again if they are tech types or rich and powerful. We were hostile to Indians from India until very recently when we weren't (again tech types). We were hostile to the Irish until there were enough of them to constitute an important voting bloc. The same was true for Italians.
Currently we are hostile to Mexicans ("they are rapists") except that few Mexicans are crossing our borders. And we are hostile to other Central and South American peoples. But, except for the Indians (American Indians - not tech types from India) we are all the descendants of immigrants. The argument is basically a reflection of our prejudices. But we pretend that the problem is that they are an economic drag. The evidence for an overall drag, as opposed to a drag in small areas like border towns, is slim to non-existent.
I am pro immigration. Our population is aging. Immigrants tend to be young, working age people and their children. They boost the economy and contribute to Social Security. There is currently a shortage of jobs but that is because of the influence of the wealthy and powerful. They like low wages for all but the 1%. The best way to achieve this is to keep employment low.
The thing that started the current immigration cycle was the importation of low wage workers to work on farms in the mid 1900s. That primed the pump. Agriculture (generally GOP leaning) and other business interests (also generally GOP leaning) actually like the current mess because it gives them access to low wage workers. The people who are already here are already here. The cheapest thing to do would be to legalize them. But then they would agitate for better wages. So the actual GOP policy is to stir the pot and make sure nothing happens.
Sanctuary Cities
These too are seen as some kind of evil plot. The fact that most sanctuary cities were set up to deal with the public safety problems caused by the terrible way Immigration enforcement was implemented is never acknowledged. Cities found they had violent crime problems in their immigrant communities. The cops couldn't get anywhere because no one wanted to deal with the cops. This was because the cops were fronting for ICE, the federal Immigration people. Sanctuary programs eased tensions and resulted in more effective policing which reduced crime. And the current argument for why sanctuary cities are so bad rests on exactly one case. Applying the same logic to guns would have eliminated the right to sell or own a gun in this country.
Build/Fix the damn fence
I am old enough to remember the Cold War. At that time it was important to contrast ourselves with the "evil commies". And they had built the infamous Iron Curtain. So we had to draw the starkest contrast we could, and we touted the fact that we had the longest undefended border in the world. Conservatives were then staunch anti-communists, so they were all in favor of our "no fence" open borders policy. And it worked just fine. There wasn't any great problem in the hundreds of years of open borders. There was even little or no problem during the many decades when our restrictive immigration policies overlapped our open borders policy. Then a decade or so ago conservatives decided there was a big problem.
And by the way there is a much more effective solution to the illegal immigrant problem than trying to build an impenetrable fence thousands of miles long. It is called a government ID card. The Nazis were famous for this sort of thing. You could be asked at any time for your "papers". If you didn't have them or they were out of order you were in deep trouble. The same thing, only now applied to illegal aliens, would for better or worse work just fine.
This requirement to carry papers is also something the "dirty commies" liked. So to maintain the contrast between us and them we were the land of the "I don't need no stinking papers" people. I don't know if it's true for new cards but my decades old Social Security card says "not to be used for identification". A consistent thread among the libertarians and many others is that citizens should not be required to carry government ID. But without the rank and file noticing this is changing within conservative circles. It is also effectively being changed in society at large.
Thanks to conservatives in many places you now need the right ID, the modern equivalent of "papers", to vote. We have an e-verify system to validate employment status. It's not foolproof but it works pretty well. You now need a Social Security number for anything financial (income tax) or medical (Social Security/Medicare/Medicaid). To get a drink you need a driver's license. We are quickly heading toward a time when a government issued ID card will be a necessity. So the libertarian idea of no government issued ID cards (Social Security cards and driver's licenses are both issued by the government, as are "concealed carry" permits for guns) seems to be on the way out. Libertarians should be complaining bitterly but they are mostly quiet.
I am leaning toward the thought that a government ID card is inevitable. I already have an "enhanced ID" driver's license. The system we now have is a hodgepodge that is looking worse every day. In the meantime somehow a super-fence is supposed to be the solution to all of our immigration related problems. It is unlikely that a super-fence is even feasible. I think the actual motivation is that contractors will build it and they are good sources for campaign contributions. And the GOP strategy is "the fence MUST be completed before anything else can be done". This is complete nonsense.
NSA Data Collection
We now know that the government is collecting massive amounts of data on innocent citizens. This program was started under the "W" Bush administration in response to 9/11 but has been continued pretty much intact by the Obama administration. Senator Paul is an outlier in the GOP in arguing that something should be done to scale it back. In the debates Governor Christie stridently defended it. My reading is that Senator Paul is pretty much alone in his stand. Certainly nothing was heard from the Fox moderators or the other participants.
This program is a clear violation of the Fourth Amendment. Many conservatives style themselves as "strict constructionists", people who try to hew closely to the words of the Constitution. Any strict constructionist should be outraged by this program. They should be loudly joined by libertarians. But there is broad support among conservatives for vacuuming up all kinds of data by and about innocent American citizens.
This is made even worse by the fact that this extra data collection (and "enhanced interrogation", an idea implicitly supported by most panelists) has so far shown itself to be completely ineffective. The 9/11 commission report demonstrated that the hijackers could and should have been caught using the techniques and procedures in place before 9/11 and before the mass data collection programs were in place. Their conclusion was that the big problem was not a lack of data or staff or computers. Instead the problem was "siloing": various agencies keeping the data they had close and not sharing it with other agencies. Between them the CIA, FBI, and NSA had everything they needed. It's just that each agency was so busy defending its turf that none of them was in a position to pull it all together.
ISIS is bad and the Obama Administration is ineffective
As with abortion you will actually get no disagreement about the first one, although conservatives pretend that liberals/Obama are somehow pro or at least soft on ISIS (more GOP political correctness on display). And the Obama Administration has not been very effective in rolling ISIS up so far. But the primary problem is that there are no options that are likely to be effective. The politics of the region are extremely complex. The result is that there are no completely dependable allies. This leaves us to depend on allies like Turkey, which has its own agenda (like wiping out the Kurds), or the Saudis, who see themselves as the defenders of all things Sunni (ISIS and the areas it occupies are Sunni), or the Iraqis, who see the Sunnis and the Kurds as the enemy. It is no surprise that progress has been modest at best.
So the real question has to do with whether there is a better alternative. Senator Graham in the "kids" debate suggested putting American boots on the ground. This has been and would likely be bitterly resented. The Trump solution is to put Carly Fiorina (current GOP presidential candidate and ex-CEO of Hewlett Packard) in charge of negotiations. She bungled things at Hewlett Packard pretty badly. She has little or no foreign policy expertise and little or no expert knowledge of the region. That doesn't sound like a very good idea either. This "talk louder and threaten more" approach does not sound promising to me. For more on this consult "ISIS - Do Something Stupid Now" at http://sigma5.blogspot.com/2014/09/isis-do-something-stupid-now.html.
Obamacare is a Complete Disaster
This is and has been the standard GOP line since it passed. But all the data suggests that it has generally worked. It has moved millions into the mainstream medical system. And cost estimates have turned out to be wrong: costs so far have come in lower than forecast, and there is no indication that this trend will not continue. But everyone in the GOP fold always characterizes it as a "complete disaster". They fail to acknowledge that its core concepts were developed by a conservative think tank, The Heritage Foundation, and that its implementation was based closely on a successful program implemented by a Republican Governor who went on to unsuccessfully run for President, Mitt Romney. The national experience has closely paralleled the Massachusetts experience.
Republicans have also failed to propose an alternative. Instead they proclaim "repeal and replace". Then there's the $700 billion. I discussed the actual situation with respect to this money above.
It's good to be born into modest circumstances
Some of the GOP candidates have risen up from modest circumstances. But so did Bill Clinton, Hillary Clinton, and Barack Obama. Obama currently has no significant wealth and neither Clinton had any significant wealth until after Bill left office. And then there's Jeb Bush, the pre-Trump favorite. He is the son of a President, the brother of a President, and the grandson of a US Senator. Or there's Mitt Romney, the 2012 GOP standard bearer. He was the son of a CEO of a major American car company (and one time GOP candidate for the Presidency). Or, going back another 4 years, there's John McCain. He was the son of a US Navy Admiral and the grandson of another US Navy Admiral. Finally, consider the current GOP front runner - Donald J. Trump. He claims to be worth TEN BILLION DOLLARS (the capital letters are his). He is the son of a successful New York real estate developer. Being born into modest circumstances doesn't seem to be a recipe for success within the GOP.
Replace the current tax code with (insert quack system here)
The standard GOP line seems to be that the current tax code is completely broken and needs to be completely replaced. Then we get one quack idea or another as to what it should be replaced with. There was the Herman Cain 9-9-9 system in the 2012 election cycle. Some variation on a "flat tax" has popped up several times. A recent example of this would be (as far as I can tell - the details are hard to find and harder to figure out) the Huckabee "tithe" 10% system. And on and on and on. No professional thinks any of them are workable or would be effective. But that doesn't stop the proposals from being floated.
There are too many of them to make it worthwhile to keep track of each one. So I just apply two tests. What does the proposal do to the overall amount of taxes corporations pay? For any GOP proposal the answer is guaranteed to be "corporations pay less". Aggregate corporate income tax revenue used to pretty much match aggregate revenue collected on the income of individuals. But that was a long time ago. Now corporations as a group pay far less than individuals as a group do. Most of the savings on the corporate side are the result of loopholes engineered to reduce the taxes of large corporations. Most large corporations pay income tax on only a small percentage of their total income. Many pay no income tax at all. Small businesses, without the same lobbying clout, get hit much harder.
The second thing I look at is how the burden shifts between low income individuals and high income individuals. All the GOP proposals I have seen lighten the burden on high income individuals and make it up by increasing the burden on low income individuals. It is important to note that wealthy individuals game the system the same way corporations do. They lobby for loopholes that reduce their taxes. That's as wrong as the shifting of the burden away from corporations and toward individuals. Looking at these two tests allows me to quickly analyze these proposals and discard them. I have yet to see a GOP tax proposal that moves things in the right direction.
There is an approach that does not involve a wholesale revision of the tax code. It also makes the current code more fair and shifts the burden in the right direction (toward large corporations and wealthy individuals). That's closing loopholes. There are dozens of loopholes, any one of which amounts to more than a billion dollars in lost revenue. So loophole closing can make a real difference. On the individual side a good example is "carried interest". The details are complex but the effect is that hedge fund managers, some of whom make more than a billion dollars, get to use a low 15% income tax rate rather than the standard 39.6% that would normally apply. This cuts a billionaire's tax bill by roughly $250 million.
Hedge fund managers claim they work hard for the money. But it is a nine-to-five office job in a very nice office with maybe some overtime thrown in. There are millions of people who work as hard or harder, often putting in more hours under much worse conditions, but earning a pittance for their hard work. Yet they pay a larger portion of their income to the Federal government in Social Security and Income tax withholding than the hedge fund manager does. And while this is going on we will be told that the government can't do this or that because it doesn't have enough money. And one of the GOP candidates is being funded by one of these hedge fund managers. It's a good investment for the hedge fund guy. Whose call do you think a politician takes - yours or the hedge fund guy's? And who does the politician want to take care of - you or the hedge fund guy?
On the corporate side we have a similar situation. The poster child for loopholes is the oil and gas industry. These are some of the most profitable companies in the world. They have enough money to pay their senior executives obscene salaries. But they need and deserve various loopholes, we are told. The one I like the best is the one that was introduced in about 1999 as the result of a typographic error in a tax bill. That typo saved oil and gas companies several billion dollars per year and it wasn't even put there on purpose. Yet that loophole has been renewed every time it was in danger of expiring.
And on and on and on. And the GOP has worked vigorously to preserve the carried interest loophole. They fight like the dickens to keep these kinds of loopholes in the code. And they also throw roadblock after roadblock in the way of closing corporate loopholes. Unfortunately, Democrats often do the same thing. But rich people and the executives who run large corporations are much cozier with GOP politicians. So GOP politicians get a lot more TLC from these people than Democrats do. And when it comes to the next loopy tax proposal from a GOP candidate, save yourself a lot of trouble and apply my two tests to it. As for the big bucks people, they know these crazy tax schemes are just red meat for the base that will never be implemented so they ignore them.
The Iran Nuclear deal is a bad one
Here the GOP has some Democratic fellow travelers like Chuck Schumer. But while most Democrats and pretty much all the experts (outside of the Israeli ones) are in favor of it the GOP is united in opposition. The general idea is similar to the Obamacare one: repeal and replace. Here the logic is just as bad. Generally two ideas are advanced. The first is that the Obama people did not negotiate hard enough. The fact that the negotiations dragged on and on is ignored. And the fact that the negotiations involved not just the US and Iran but other countries, specifically Russia and China, is also ignored.
The Bush administration had eight years to do something about the Iran nuclear program. They managed to achieve ineffective sanctions while the Obama administration managed to achieve sanctions that were extremely hard on the Iranians. The Bush administration allowed the Iranians to build out their nuclear program to the point that they had more than ten thousand centrifuges running. The Obama administration has managed to curtail Iranian centrifuges. If the deal goes into effect the number of centrifuges in use will plunge and Iran will be allowed to run only old inefficient centrifuges. The Obama administration was also able to get the Russians and the Chinese on board with the sanctions, with the negotiations, and with the final deal. The Bush administration was able to do none of this.
So during the entire Bush period the Iranians had a free hand which they used to massively expand their nuclear program. Yet nothing was said about this by conservatives, either at the time or since. The Obama administration managed to massively strengthen their hand by working with everyone to get stringent sanctions in place. They then turned those sanctions into an agreement that has been signed off on by Russia, China, apparently Iran, the Europeans, and most of the rest of the world. It hangs by a thread in the US because of united opposition by Republicans. If they weren't operating in lock step then a few Democratic defections by the likes of Schumer wouldn't matter.
The second argument is that the agreement can be fixed to make it better once it has been rejected, typically "on day one". This too is ridiculous. One of the most important parties to the current sanctions regime is Russia. Russia is a nuclear power and has a long border with Iran. And we are currently in a fight with Russia over Ukraine. And that fight has resulted in serious sanctions being imposed on Russia. Russia has a lot of reasons to want to bail on the Iranian sanctions. If the US rejects the deal they have no reason to stick with the sanctions and less than no reason to be interested in even tougher sanctions. To a lesser extent the same is true with China. We are not sanctioning them but they do a lot of business with Iran and would like to do more (i.e., buy oil). A US rejection might cause China to dial back on their participation. And they too would have less than no interest in going along with even tougher sanctions. Without Russia and China the whole sanctions regime falls apart.
The GOP response to this is to suggest the US go it alone. We would impose even more draconian sanctions while the rest of the world would drop sanctions and normalize their relations with Iran. We tried what eventually turned into stand-alone sanctions against Cuba for over 50 years. It didn't work. It would work even less effectively with Iran. They are farther away and have many more options for getting around unilateral US sanctions than Cuba has had.
Then there's the $150 billion we would be "giving Iran". Except it's not our money. It's Iran's money. And we don't have possession of it. It's in banks around the world. So as soon as the rest of the world decides that the sanctions can come down they will direct their banks to free up the money and it will revert to Iranian control.
This whole argument is just an example of a larger problem. The "W" Bush administration adopted a "go it alone" foreign policy. The results they had with the Iranian nuclear program were typical of their overall level of success. The Obama administration has used a cooperative approach that has resulted in strong sanctions against both Iran and Russia. The strength of these sanctions rests on the fact that they are international sanctions enforced by many countries instead of go-it-alone sanctions like the ones we unsuccessfully used against Cuba. The current crop of GOP candidates seems for the most part to follow the "W" camp. I expect they would have no more success with a "W" style approach than the actual "W" did.
The US military is weak and Foreign Aid should be cut
I am lumping these two ideas together because they both flow from ignorance. The US spends more money on its military than any other country and it does so by a large margin. We have by far the most powerful military in the world and it is extremely expensive. Yet Republicans want to grow it. And by "grow" I mean spend more money. What's going on here? There is a legitimate argument that we don't have enough men and women in uniform. I think the argument is weak but assume it is correct for a moment. Even so, the bulk of our military budget goes to contractors, mostly to buy expensive equipment, not to pay for soldiers. A lot of this is wasteful spending. But it is also pork. And pork is a great source of campaign contributions.
If we cut waste we could cut the military budget while increasing troop levels. But we aren't going to do that even though, like closing loopholes, it is the fiscally responsible thing to do. So we buy tanks we don't need. We buy jet planes we don't need. We buy all kinds of very expensive high tech gadgets we don't need because some defense contractor makes a pile of money on the contract. And some of that pile of money gets recycled as campaign contributions. The Pentagon budget is yet another place where conservatives talk small government and fiscal responsibility but do big government and fiscal irresponsibility.
Foreign Aid is another long standing GOP bugaboo. Most conservative voters think the foreign aid budget is about ten times larger than it actually is and they have a very distorted idea of what it is spent on. Very little of our "foreign aid" is actual general aid to the needy and deserving. Mostly it is some kind of payoff or another. We give lots of money to Israel just because. Then we give lots of money to Egypt to encourage them to not invade Israel. We give Pakistan lots of money as a bribe so that we can get military supplies into Afghanistan. We give lots of money to Afghanistan so they won't go Taliban. And so it goes. If you subtract these kinds of bribes and sweetheart deals out there is almost nothing left. So even if the rest is badly spent, which mostly it is not, then there's not much to fight over. But the GOP continues to tilt at the Foreign Aid windmill because they have convinced their base that big bad things are happening there.
Executive orders
Pretty much every candidate promised to rescind all of the Obama executive orders on day one. That is pretty idiotic. Are there absolutely no Obama executive orders that are a good idea? This is actually part of a larger conservative trope called "executive overreach". The idea is that Obama has wildly exceeded his authority and is an "imperial president". Other than rhetoric there is absolutely no evidence to support this.
And actually the concept of an imperial presidency was firmly rooted in the "W" Bush presidency. Cheney and his group came up with this wild theory. The President is "Commander in Chief" of the military. This is normally interpreted to mean that he is a super-general that outranks everyone else in the military. That means he can order any soldier around, regardless of the soldier's rank, because he outranks all of them. But this only applies to the military. The Cheney interpretation is that the President in his "Commander in Chief" role has unlimited power to protect the "national security" and, since there is always some kind of handy national security threat around, and since everything impacts national security sooner or later, he can do whatever he wants whenever he wants to.
So the Bush people did a lot of things that were clearly unconstitutional and the courts caught them out several times. But somehow it is not the Bush administration but the Obama administration that is out of control. So far the courts, including the conservatives on the Supreme Court, have blessed Obama administration executive orders. And the Bush administration issued a raft of executive orders without a peep from conservatives in the GOP. But, since the Obama administration has issued a number of executive orders that conservatives don't like, these must be examples of executive overreach.
And then there's the fact that most people don't understand executive orders. Congress passes laws. In this modern, very complex world laws don't cover everything. So the laws direct the appropriate executive branches to issue regulations covering implementation details. These laws are the actual source of most federal regulations. If GOP legislators were serious about reining in regulations they should change laws and remove executive discretion. In any case, executive orders are just part of the implementation process. In fact they are often absolutely necessary. Vowing to repeal all executive orders is mindless stupidity.
And then there are the many promises GOP candidates have made to do this or that "on day one". Many of these things cannot be achieved without passing laws. The legislative branch is the "passing laws" branch. So in many cases these promises, should an attempt be made to fulfill them, are classic examples of the very executive overreach that these same candidates claim to be so opposed to. And, should they be successful, their actions could be fairly characterized as the very same "imperial presidency" they claim to abhor. This is just another example of hypocrisy in action.
Enough
There's more I could go into. And then there are many subjects that were not touched in the debate. But I think all of us have had enough so I will leave it there.
Let me conclude by observing a common thread through all of this. Politicians striving to get their message out over-simplify. That's understandable and not necessarily a bad thing. But in example after example I see simplification to the point where there is literally nothing left. Some problems are actually simple when you get down to their core. But many are not.
As I have laid out above, the Iran nuclear deal is an example of necessary complexity. And more generally, the middle east is an extremely complicated place right now. In conflict after conflict we find the players lining up in ways that are specific only to that conflict. We find ourselves on the opposite side from Iran in Syria but have common cause with them with respect to ISIS. We would like to support the Baghdad government in Iraq but we would also like to support the Kurds. But Baghdad hates the Kurds as do the Turks. Yet the Kurds are the most effective group, perhaps the only effective fighting group, that is actively opposing ISIS. I could keep going with Israel, Egypt, Saudi Arabia, etc., pointing out where they can be found on one side in one conflict and on the other side in a different one. Yet there is absolutely no acknowledgment of the necessary complexity this engenders by GOP candidates when they are talking about middle east issues.
The same is true with immigration, the budget, health care, abortion, you name it. Getting policies right in these areas involves acknowledging the complexities involved and accounting for them. The poster child for this kind of thinking is regulation. GOP candidates are all against it all the time, or so they say. I devoted a post to the subject that you can find at http://sigma5.blogspot.com/2014/05/regulations.html. I can summarize the argument in a sentence or so. There are good regulations and bad regulations. The trick is to keep the good (and maybe add some more where they are needed) and fix or eliminate the bad. And, of course, the devil is in the details.
Yet you will be hard-pressed to find anyone on the GOP side articulating the idea that an issue is complex and that a complex solution is required. All issues are simple and mostly can be fixed by shouting simple slogans loudly and frequently. I recently devoted a post to The Donald (see http://sigma5.blogspot.com/2015/07/the-donald.html). But in a nutshell, The Donald embodies this "shout simple solutions backed up by absolutely nothing loudly and often" concept better than any of the others. And he's leading in the polls by a wide margin so he must be doing something right. The Donald also frequently articulates ideas that directly contradict others of his ideas, but that doesn't detract from his popularity either. Given the kind of thinking that characterizes the modern Republican party he deserves his lead. He's just doing what the other guys are. Only he's doing it bigger and better.