This post is short and punchy, at least for me. And when I say "binging" I am NOT talking about drinking. Binge drinking is definitely a bad thing. Rather, I am talking about binge watching entertainment-type TV shows.
I am old enough to remember "those thrilling days of yesteryear" (that's a steal from the introduction to an old radio (and later TV) show called "The Lone Ranger"). Back when I was a kid the typical TV show aired thirty-nine new episodes per year. You got a new episode every week from Labor Day to Memorial Day. Reruns were confined to the "Summer" season that ran between those two major holidays.
Then, as now, it took about a week to shoot a half hour of scripted TV. So back then most shows ran a half hour. And shows of this period had no story arc. Each episode was pretty much a stand-alone affair. To take a typical example, "Peter Gunn" was a detective show.
Every week Pete would chase after a different bad guy. His gal pal Edie would croon a number and his cop buddy Lieutenant Jacoby would be world-weary as he stepped in at key moments to help Pete out. There were some other repeat characters. But they were mostly there to provide atmosphere and comic relief. You could watch episodes in pretty much any order you wanted and everything would make perfect sense.
But audiences yearned for shows with plots that could not be resolved in a half hour, so hour-long shows appeared. In the beginning these were often anthologies where each story hewed to the same theme or genre but the cast was often completely different from show to show. This made it possible to shoot parts of two or more shows at once. That, in turn, allowed the shooting of an individual episode to stretch across two weeks or occasionally a little more.
But the "soap opera" had been invented for radio in the thirties. These "daytime dramas" featured elaborate story arcs. And that was one of the features that made them popular. These shows aired daily so they chewed through a lot of story in a year. These elaborate story arcs, when combined with a lot of repetition and recapitulation, allowed writers to churn out enough material make a year's worth of shows in a year in spite of the fact that soaps aired daily year round with no reruns.
And an episode of any kind of show could be churned out quickly on radio. Once the script was ready all you needed was some actors huddled around a microphone. And they didn't need to memorize their lines. They could just read straight from the script. All of this became impossible when radio morphed into TV.
All of a sudden you needed sets, costumes, locations, etc. And the actors now needed to learn their lines. Somehow soaps succeeded in making the transition. One trick was to use a big cast. A specific story line would only involve part of the cast. This allowed parts of the show to be broken into semi-autonomous units. Another trick was to write dialog on cue cards. These were positioned so that the camera couldn't see them but each actor could.
A soap production had to be economical so they generally stuck to pre-constructed sets that were located on sound stages. This limited the number of locations but meant that the show was not dependent on the weather or the availability or suitability of a particular location.
Audiences would put up with this on a daytime soap but they expected more from an evening show. They demanded a more movie-like experience. And this required custom sets, exterior locations, actors not woodenly reading lines off cue cards, etc.
And even with the use of the most efficient and economical production methods, it turned out that it took 5-6 days to shoot everything necessary to assemble a single half hour episode of a scripted night-time TV show. You could employ teams of writers, directors, editors, etc. But people tuned in for the cast so you were stuck with only one cast. Or were you?
An early and very popular TV show was "Maverick". It was a "western" built around a roving gambler named Bret Maverick. Regardless of what tricks anyone came up with, it proved impossible to turn out the requisite number of hour-long episodes each year. So a brother, Bart, was introduced. This meant that two episodes could be shot at a time. One featured James Garner as Bret and the other featured Jack Kelly as Bart.
The problem was that Garner was much more popular than Kelly. I went back and watched a bunch of episodes a couple of years ago. Kelly was actually quite good. He just wasn't as well liked by audiences as Garner. This haunted Kelly and he didn't do much acting after the series wrapped. He just couldn't shake the reputation of being the guy who wasn't as good as Garner. And that's a bad thing because he really was very talented.
Using tricks like Bret/Bart made it possible to put out 39 hour-long episodes per year. But there was another problem. It was fantastically expensive. So shows first cut back to 32 episodes per year. Then they cut back to 26. Finally, they settled on 22 episodes as a "full season". If it takes two weeks to shoot an episode, that means 44 weeks of work followed by some time off for the cast. No modern show shoots more than 22 episodes per year. And many shoot fewer. It is now not unusual for as few as ten episodes to be shot per year and still be labeled a "full season".
And one way or another, this idea of story arcs spanning multiple episodes got tried out and gained quick acceptance with audiences. Possibly this grew out of mini-series. These became popular in the '70s. "Roots" was one of the most popular and memorable. It was a dramatization of the book of the same name by Alex Haley.
People had been turning books into movies for a long time. "Gone With the Wind" premiered in 1939. It ran 3 1/2 to 4 hours, depending on the version. But that was hard to pull off so most movies made from books clocked in at about two hours. You have to throw a lot of a book away to do this.
And it is possible to get it down to a single hour if the material is of the right kind. A popular TV show of this early period was "Perry Mason". It was a TV series with pretty much the same cast (and many of the same sets) showing up in each episode. And many of the "Perry Mason" books that Erle Stanley Gardner wrote were each translated into a single one-hour episode. But the books were short and the puzzle that Mason had to solve could be simplified so that all the key material could be condensed to fit into a single 44-minute (if you ignore the titles and commercials) "hour" of TV.
But over time story arcs, often lasting an entire season, and sometimes spanning multiple seasons, became the norm. "How to Get Away with Murder" is a classic contemporary example of this. You can watch just a single episode and make sense of it without having watched previous episodes. But you get much more out of it if you have watched enough previous episodes to be familiar with the several multi-episode arcs the show features. And that means you are not getting the full experience the show offers if you watch an episode in isolation rather than in its proper sequence.
Another problem arises because it is common for four, eight, or even more weeks to elapse between the airing of consecutive new episodes. Twenty-two weekly episodes fill only about five months of the year. So if a show airs four episodes per month in October and November to get viewers hooked, then there isn't much left.
If, for instance, four episodes are aired in February, a "sweeps" month (a month where more intense ratings information is gathered), and another four are aired in May, another sweeps month, then that leaves only six other episodes to slot in anywhere else. And, assuming no new episodes air in the Summer, that means that there are two different two-month gaps and a four-month gap each year. That's a lot of opportunity for viewers to forget about the show, or lose track of the current status of various story arcs, or otherwise become disconnected from the show.
And there's still another problem. If you like a type of show you tend to watch several of them. And often the shows with the most elaborate and convoluted story arcs are the most popular. So, it becomes very easy to mix up the details of one show with the details of another. We can all sort this out over time. But it takes effort, and the whole point of TV is that it demands little effort of us.
But binge watching avoids these problems. Once you finish watching an episode of a particular show, you can immediately move on to the next one. All the characters, details of various story arcs, etc., are already at the front of your mind. You can easily get deeply immersed in all aspects of a particular show. Once you have finished binging one show you can move on to binging a different show.
On paper, binging has been possible for a long time now. Decades ago I collected sets of episodes of old TV shows I liked on videotape. The technology then moved on to DVD. DVDs were more compact and the picture quality was much better, but it was the same idea. Now we have streaming services. I currently subscribe to Amazon Prime and Netflix. They are probably the two most popular options. But more and more streaming services are popping up all the time. Disney is about to get into the game in a big way, for instance.
And very recently I got a chance to do a "side by side" comparison of binging versus the traditional model. I have watched every episode of a show called "Lucifer". Its first three seasons aired in the traditional manner on a regular TV channel.
This show is mostly episodic. It is primarily a standard "police procedural" but other elements have been added to spice it up. So we get the "crime of the week" to solve. Each weekly crime is pretty much independent of the other crimes in other episodes. In that aspect it follows the old "Perry Mason" model. But, unlike "Perry Mason", it also has "arc" aspects that play out over several episodes, and even multiple seasons.
But then the regular TV version of "Lucifer" got cancelled. Luckily for me, Netflix swooped in almost immediately and picked it up. Netflix made "season 4" available for streaming recently. As a result of Netflix moving so quickly the "between season" gap has been about the same as it was previously. So I binged a bunch of episodes from the new season.
And it is very much an "apples to apples" comparison. The original "TV" structure has been carried over unchanged. You can tell where the "commercial breaks" are because the screen goes black for a second or so in all the places a regular show would have to break for commercials. There aren't any commercials because it's Netflix. But the "commercial break" structure has been retained anyway. So, in terms of how the show is structured and shot, nothing has changed.
But the viewing experience is quite different. It is much easier to get into the show, re-establish and then maintain my connections to the various characters, and to follow the various "arc" components of the show. It is just a better way to experience the show. It also helps, of course, to not be yanked out of the show for a couple of minutes every 10-15 minutes by a block of commercials.
Something that is an even better showcase for binge watching is a show called "Bosch". It too is pretty much a standard cop show. And, like "Perry Mason", it is based on a series of popular books. In this case, the author is Michael Connelly. But in spite of this high degree of similarity there are points where it diverges significantly from "Perry Mason" or even "Lucifer".
This is not because a typical season is extracted from three books. That does allow for a more interesting and varied show, but you wouldn't even notice it if you hadn't read the books. Instead the differences follow from the fact that "Bosch" is built from the ground up to be streamed. As a result, watching a season of "Bosch" is very much like reading a "Bosch" book. It can go into the same kind of depth as a book can. It can explore character more thoroughly, like a book can.
Since it was made to be streamed, the length of each episode varies slightly. The creators have the luxury of letting the amount of story they want to cover in an episode dictate its exact length. Each episode is created to be about an hour long. But it no longer has to have a run time that exactly fits a "one hour" time slot on a TV channel. As a result the show flows more smoothly than it otherwise would.
There are also no artificial mini-climaxes every 10-15 minutes designed to hold an audience across a commercial break. Some scenes are long. Some are short. But each no longer needs to be constructed to lead us into or out of a commercial break. This lets the show's creators focus on the needs of the story and characters.
And, of course, what should be the most bingable show out there is not, in fact, bingable. That's "Game of Thrones". GoT is on HBO and HBO is sticking with the old "an episode a week" model left over from TV. HBO has been around long enough that it long predates streaming. Back in the day they needed to conform to some of the TV rules, like airing an episode a week and slotting it into the same day and time every week. They have not been able (or perhaps willing) to deviate from that old model.
HBO has figured out that this is an old model that is losing ground to streaming services. So they are moving away from it a step at a time. They now offer "HBO Go", a way to stream HBO shows. It is an open question whether they can survive as a stand-alone entity. I expect them to go "all streaming" at some point.
But they will probably eventually end up as a part of some bigger streaming service. If they time it right, they can probably sell themselves for a lot of money to some service that is having trouble breaking through the clutter and needs something to get them on the map. But HBO has been selling their content on DVD for some time. Neither Netflix nor Amazon Prime does this. If you want access to their content you have to sign up with their service.
The easiest call ever is that streaming is the future when it comes to entertainment programming. Traditional TV still does the better job for news and live sports. But the future does not look good for traditional TV and cable stations that currently depend primarily on entertainment programming.
ESPN has demonstrated that there is an "all sports" business model that works, or at least used to. And we have "all news" cable channels that are also doing fine at the moment. But things look bleak for "local" TV stations and the networks that feed them. And things look even bleaker for entertainment oriented cable channels. Unless, of course, they are able to successfully transition to streaming.
Friday, April 19, 2019
On Writing
These are some observations about writing. They are based in part on my own experiences and in part on the studying I have done about the process of writing. Here goes.
Writing is hard work. Physically it is easy. We now sit at a keyboard and hammer away. If you are looking for a good way to take some pounds off, the writing process will be absolutely no help. More active forms of exercise (or a good diet) will be required. But the mental processes involved in writing are hard work. I believe that is true even for those who are gifted writers. It is one of those professions where the best advice is:
If you are not driven to do it, don't.
This is often applied to the profession of acting. It is well known that in the acting business a tiny few make bales of money. A larger but still small number of additional people make a bare-bones living. Everybody else starves to death or, worse yet, never even scores a single paid acting job.
Writing is not as bad as that but it is close. There are probably a few more writers who earn enough at writing to do it full time than there are full-time actors. But neither group will ever constitute an employment category large enough to justify its own dedicated set of statistics. There are just too few in either category who earn enough to do it full time.
And, as I said, writing is hard. I dabble at it, mostly for my own interest and edification. I feel a compulsion to "go on record". And I am retired and well enough off that I can afford to. But at no time did I ever consider trying to write for a living. Sure, before I retired I wrote the odd study or position paper, but that was it. The time I spent writing constituted such a small part of my work day that it did not merit a mention in the job description for any job I have ever held.
That said, I do have a brother who has managed to earn a living by writing. That's an impressive feat. Congratulations, Jim.
So what was my path from there to here?
The earliest writing related event in my life that I remember happened to me in the ninth grade. I got a grade of C+ on an essay I wrote for class. I talked to my teacher. I said "I think it is a fair grade but, if it's not too much trouble, could you give me some pointers on how I could have done better?"
She returned the paper a day or so later. She had slightly increased my grade (not what I asked for) but provided no guidance (the thing I actually had asked for). For whatever reason, that was a very discouraging moment when it came to my interest in writing. I decided then and there that writing was not my thing. It was a long time before that changed.
Looking back on my first year in college I now realize that something happened then that should have given me encouragement as a writer but it didn't. I started out as an "Arts and Sciences" major (I later changed to Engineering). At the time there was a requirement that all A&S majors take three one-credit English classes. You took them in sequence and each class required you to submit a series of essays.
This was the late '60s and the cutting edge writing technology of the day was the typewriter. Being a poor struggling college student I had a cheap "manual" (entirely mechanical - see any number of movies from the '30s and '40s if you are not sure what I am talking about) typewriter. The key attribute of all but a few fancy (and expensive) electric "office" typewriters was that they didn't permit you to go back and correct anything.
Even the fancy models would only give you a way to fix the odd typographical error. Whatever first hit the page, that's what you were stuck with. Unless, of course, you were willing to retype the whole thing over from the beginning. So that was a problem. I was a terrible typist and I had a low opinion of my writing ability. It would have been nice to be able to revise my first draft but at the time there was no practical way to do that. Even if I was willing to retype the whole thing there was not enough time (see below).
The other thing I learned at the time was that words would not flow unless I was "on deadline". If it had gotten to the point where, even typing like hell, I had barely enough time to finish the paper in time to turn it in at the start of class, then the words flowed. Before that it was all "writer's block" all the time. Sitting down the night before, or even several hours before class started, was a complete waste of time.
As you can imagine, this did not lead me to believe that what I was turning in was any good. In fact, one time I accidentally left page 2 behind in my dorm room when I turned the paper in. I almost threw it away figuring the paper was so bad it would not make any difference. Fortunately, I changed my mind. I turned the missing page in at the start of the next class and the instructor was gracious enough to accept it. So the instructor at least had the whole paper to evaluate.
But if you did well enough in the first two classes you could skip the third one. I did. That should have told me that whatever I was doing was working better than I thought. But it didn't. I was happy to not have to take the third class but did not conclude, as I should have, that there was actually some hope for me as a writer.
I soon moved over from A&S to the School of Engineering, specifically the Electrical Engineering Department and, most specifically of all, to the Computer track. I left the world of Arts and Sciences and all that entailed behind. Well, not entirely. The School of Engineering had a series of "social studies for Engineers" classes that we students were required to take. The School of Engineering thought that its graduates shouldn't be totally clueless with respect to "the finer things in life".
And I actually liked these classes. I thought they did a good job of paring away a lot of the BS and focusing on the heart of the matter. And one of these classes was called "Technical Report Writing". It was only a one credit course but I thought it was excellent.
The course was organized around a number of questions and observations. These were designed to focus you not on the technical details of whatever you were writing about, but on how to make your written communication effective. And the most important of these was "Who is your audience?"
The object of a report is to communicate information effectively. To do so it is important to focus on who will be reading the report. What do they know? What don't they know? What do they need to learn? These are critical questions. You can't answer them unless you first know who your most important readers will be. Then you must figure out how to answer these questions with respect to those people. I very much took that lesson to heart.
And a good many years later I found myself writing a series of reports, explainers, and recommendations as part of my job. The key member of the audience for these writings was the IT Director. So I did my best to answer the above questions as they related to him.
Then a magical thing happened. He stopped by one day and said "I like your reports. Keep them coming." That turned my life around when it came to how I felt about writing. I now believed I could do an at least adequate job and I no longer feared writing. Of course, the technology had advanced light years since my freshman college days. It was now easy to "revise and extend" to your heart's content. And we now had spell-check.
So I have internalized that experience from my college days. It is now "put something down, anything. You can always fix it later." I no longer worry if my first draft is good or not. It is only a starting point. I think I am a good editor. I can figure out what's working and what isn't. Then I can set about fixing what needs improvement. And, by the way, spell-check is a good starting point. But it catches far from everything.
If the word on the page is the wrong word but it is spelled correctly a spell-checker will not flag it. So you have to go through your writing to make sure the word on the page is the one you intended and not some correctly spelled but entirely inappropriate word. For instance, I often type "form" when I mean "from". A spell checker is completely happy with "form".
So when I am composing one of these blog posts I start by just banging something out. Sometimes this first draft works pretty well except for the odd word that got past the spell checker and perhaps a few other things that are pretty easy to spot and fix. But some first drafts require a lot of rework. It just depends on the subject and how the juices are flowing that day.
And I used to be able to compose, revise, and publish one of these posts in a single long session. Now I find it often takes me two days to complete a first draft and then whip it into decent shape. (This piece is taking about an average amount of editing and rework.) And, since I do this mostly for my own interest and edification, at some point I decide "it's close enough" and hit the "Publish" button.
Is it the best it could be at that point? No! But I want to avoid the "this is getting to be more trouble than it is worth" stage and that involves deciding that "enough is enough" and letting the world see whatever warts may remain. If I was doing this for a living I couldn't get away with that. But, if I was doing this for a living, I would have an editor to help me out. Trust me. Editors perform a valuable service.
And along the way I have developed an interest in the art and craft of writing. I have read some books and I regularly listen to a podcast called "Writing Excuses". The final line of every episode is "you are out of excuses - now go write". I don't write fiction. But let me pass along some of what I have learned about writing fiction. Why? Because it's fun.
There are two basic approaches to writing fiction. One approach is called "discovery" writing and the other is called "outlining". Most writers are not pure one or the other but they tend to lean more heavily toward one approach over the other.
And lots of writers use one approach in some situations and the other approach in others. Brandon Sanderson, one of the "Writing Excuses" regulars, is an outliner when it comes to the overall structure and most of the plot of his books. But he does a lot of discovery writing to develop and fill out his characters. This "one from column A and one from column B" approach is very common. But I am going to ignore that for the moment and explore the writing process used by two popular and successful authors.
John Grisham is an outliner. I heard an extended interview he once gave in which he talked at length about his writing process. He keeps a file. He puts every idea he comes across or thinks up, good, bad, or indifferent, and big or small, into the file. Then when he sits down to start a new book he combs through the file. He is looking for one or two big ideas he can hang a book on. This is very obvious if you look at his books. Here's a one sentence summary of several of them:
A Time to Kill - What if someone murders a guilty person who is likely to get away with it?
The Firm - What if a "respected and legitimate" law firm is actually a front for the mob?
The Pelican Brief - What if someone murdered a Supreme Court Justice to hide a crime?
The Client - What if someone knew something dangerous to the mob and that someone was a kid?
You get the picture. He starts out with one or two big ideas. Then he goes through his file a second time looking for small ideas he can use to spice the book up and keep it from being a straight line march from crime to conviction. Then he outlines his story. He puts down a summary of the important content of each chapter. It may only be a single sentence. At most it is a short paragraph.
Once he has about 60 of these chapter summaries he is ready to sit down and start writing. He works on one chapter at a time and not necessarily in sequence. He expands his short summary out to a full chapter. He adds description and detail. He may also add extraneous detail. There are two key points to keep in mind while he is writing. The first is to make sure that whatever was in the summary ends up in the chapter. The second is that none of the "filler" material screws up any of the rest of the outline.
Occasionally he will run into a serious problem when writing a chapter. This may necessitate revising the outline. And this in turn may necessitate rewriting some chapters that have already been written. He tries to avoid this and, I presume, he is generally successful. And sometimes he has to abandon an idea because he can't figure out how to make it work.
But this almost always happens at the outline stage. At that point he doesn't have that much invested because he has yet to write a single word of the actual book. Abandoning what originally seemed like a good idea but that ultimately did not pan out comes with only a modest cost. Fortunately, this has rarely happened. And, when it comes to expanding his chapter description into a full blown chapter, he rarely runs into trouble. And when he does usually only a small part of the rest of the book needs to be rewritten.
Someone whose approach is the complete opposite is Lee Child, author of the "Jack Reacher" books. Child let Andy Martin follow him around for a year while he wrote "Make Me". The result is a very interesting book called "Reacher Said Nothing: Lee Child and the Making of 'Make Me'". Child, whose real name is James D. "Jim" Grant, started out in TV in the UK. This gave him an excellent visual sensibility. And that informed his approach to writing the Reacher books.
Everything starts as an image. In "Make Me" he started by imagining that it is eleven PM. A train has just pulled into the station of a "wide spot in the road" town in the middle of the American prairie. This is a very evocative image and the prose description he creates from it gets the book off to a good start.
A mostly dark train rolls into a mostly dark station with its windows all lit up. The station itself consists of pools of light separated by the vague outlines of buildings that can barely be made out. Now throw in a couple of mysterious characters as Child does. And, of course, Reacher. If he isn't present there is no book. So Reacher gets off the train. Why? Because that's what he does. He looks around and chapter one is now in the can.
Who are these people and what's going on? At this point Child has no idea. But he moves on to the next scene and the next chapter. Reacher needs someplace to sleep for the night. So that gets us down the street and into a seedy motel. And some more mysterious characters are introduced. What are they up to? We don't know and, at this point, Child doesn't either.
The next day Reacher gets up and wanders the town. That gives Child more opportunities to turn images of what Reacher might see into prose. And it gives Child more opportunities to introduce more characters and have them do interesting things. And, since this is an "action" book, at various points Reacher gets into fights, which he inevitably wins. Why? 'Cause that's just the kind of guy he is and we love him for it.
This process of "where would Reacher go next?" and "what would he see?" and "who would he meet?" and "what would they do?" continues. Child keeps making it up as he goes along. He has a strong visual sense so he keeps maneuvering Reacher into interesting places and situations.
Reacher has to come across good guys, or more likely girls, that he can defend and protect, and bad guys that he can get into fights with. But at this point the construction of the book is driven by this process of stringing together interesting scene after interesting scene. And each scene starts out as a picture in Child's imagination.
Somewhere around the middle of the book Child stops and reviews what he has written so far. The two problems he has to solve at this point are "who are the main bad guys?" and "what are they up to?". Once he settles on the answers to these questions the rest of the book starts to take shape.
This means that Child is now more constrained. He still has a lot of options but he must eventually maneuver Reacher into a situation where we can get to the climax, the big fight in which he defeats the bad guys. Finally, in true Western Movie style, he rides (the bus or train or walks or hitchhikes) off into the sunset in the last chapter.
Child prides himself on writing the book in sequence from beginning to end. And he almost never goes back and makes substantial revisions to earlier parts of the book. It happens but not often.
In the outline method the author knows where she is going before she writes the first word of the first sentence. That is restrictive but it results in a coherent book that seems to have a sense of where it is going from start to finish. Discovery writing allows for more creativity, at least in the early parts of the book. At that point there are literally no restraints. But it may be hard to get to a satisfactory conclusion and for the book to have a clear "thru line".
Child is good enough to pull it off but most writers aren't. And the kinds of books Child writes make it easier to hide the fact that he has literally made it up as he went along. So most writers stumble into the outline method for creating the spine of their book after repeatedly writing themselves into a corner they can't find a way out of. Child is one of the few writers who seems to be consistently able to "work without a net".
But notice that there is an "outline" to Child's work. Reacher starts off somewhere. He wanders around and gets involved. At some point he figures out who the bad guys are and defeats them. Then he rides off into the sunset.
It is certainly not as detailed an outline as the one Grisham uses. On the other hand, I suspect that Grisham does a lot of discovery writing at the chapter level. He knows the chapter needs to hit a couple of key beats. But it also needs some local color and some action, much of which will not end up contributing anything to the final resolution. That leaves a lot of room for discovery writing when filling out the details of the location, the characteristics of minor characters who will not return, and so on.
So now that I have become an expert in the writing of fiction am I about to go out and create the next great American novel, or at least a successful thriller? No! As I said previously, writing is way too much work. And this blog is more than enough to scratch my writing itch.
Finally, some homework. The "Writing Excuses" podcast always assigns homework right before enunciating their sign-off line. Your assignment is to pick a book. It has to be a work of fiction that was written since, say, 1980. (This outliner/discovery dichotomy was well known among writers by then.) And it must NOT have been written by Sanderson or Grisham or Child. Read the book and determine if the author was primarily an outliner or primarily a discovery writer.
You don't have to actually turn your homework in. (Being a teacher, someone who actually cares whether someone else does their homework, is another of those "much harder than I want to work at this point in my life" professions.) Instead, we will operate on the "honor system" here. So, you are the only person who will actually know if you did it or not. We'll just assume you did it and got it right. Also, since this is not a blog about how to write fiction you don't have to "now go write". Unless, of course, that's what you want to do.
Tuesday, April 9, 2019
Modern Monetary Theory
Just a couple of months ago I said, in effect, that I was done with Economics. Here's the link: http://sigma5.blogspot.com/2019/02/metaeconomics-wrap-up.html. And, while I normally recommend you go back and check out my older posts, I do not recommend it in this case. What the post I linked to boils down to is "there is a problem with economics - it doesn't work". That's all you really need to know about that post or the previous posts I linked to in that post.
But Alexandria Ocasio-Cortez (AOC) is a phenomenon and, at least at the moment, a media darling. And she has recently been talking about something I had never heard of before called "Modern Monetary Theory" (MMT). That sounds like economics, and not the old, dusty Keynes/Friedman stuff we have been arguing about for at least the last fifty years. So, I decided to check it out and let you know what I found.
MMT is an approach to making sense of fiscal/monetary policy. In doing so it relies on some basic ideas from accounting. As an example, the accounting equivalent of Newton's famous "for every action there is an equal and opposite reaction" is "for every Asset there is an equal and offsetting Liability". MMT analysis applies to countries like the US and the UK that control their currencies but not to countries like France and Germany that don't.
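To put that rule in symbols (my own restatement, not notation drawn from the MMT literature): every financial claim appears on two balance sheets at once, as an asset on one and a liability on the other, so summed across all sectors of the economy the financial claims cancel out:

$$\text{Assets} \equiv \text{Liabilities} + \text{Net Worth}$$

$$\sum_{\text{all sectors}} \left( \text{Financial Assets} - \text{Financial Liabilities} \right) = 0$$

Real assets like land and machines don't cancel this way. Only the financial claims net to zero, and that netting is the fact the bookkeeping below leans on.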
MMT then introduces something called "Circuit theory" and posits that it applies to "Currency". The economy consists of a "simple circuit" and Currency (their term for money) is confined to it. There are basic versions (simple circuits) and more elaborate versions that add more paths. But MMT argues that the simple circuit actually works pretty well in terms of telling us what's going on. The more elaborate circuits, while superficially representing the real economy more accurately, don't really tell us much that can't be derived from studying the simple circuit model. So I am going to stick pretty much with the simple circuit.
In the simple circuit model the economy consists of two boxes. The first box is the government, in our case the US Federal Government including the Federal Reserve. The other box consists of what I am going to call the economy, roughly everything else. "Currency", the preferred term for money in MMT, circulates between the two boxes. Currency is created by transference from the government to the economy. It is destroyed by transference from the economy to the government. (BTW, this is not how money actually works.)
Put another way, government spending creates Currency and injects it into the economy and paying taxes takes Currency out of the economy and destroys it. (I am sticking with the definitions and usage of these terms and ideas in MMT so these statements do not always mean what they do in normal conversation.) To actually understand what's going on we need to drill down into the details but only a little.
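To make the bookkeeping concrete, here is a minimal toy sketch of the simple circuit in Python (my own illustration with made-up numbers, not code from any MMT source). Spending creates Currency in the economy box, taxing removes and destroys it, and what we call government "debt" is just the running total of Currency issued but not yet taxed back:

```python
# Toy version of the MMT "simple circuit": two boxes, one Currency.
# All numbers are made up purely for illustration.

class SimpleCircuit:
    def __init__(self):
        self.currency_in_economy = 0  # Currency held by the non-government box
        self.cumulative_deficit = 0   # net Currency issued, i.e. "government debt"

    def spend(self, amount):
        # Government spending creates Currency and injects it into the economy.
        self.currency_in_economy += amount
        self.cumulative_deficit += amount

    def tax(self, amount):
        # Taxation withdraws Currency from the economy and destroys it.
        # The government can't tax back Currency that was never issued.
        amount = min(amount, self.currency_in_economy)
        self.currency_in_economy -= amount
        self.cumulative_deficit -= amount

circuit = SimpleCircuit()
circuit.spend(100)  # deficit spending injects 100 units of Currency
circuit.tax(60)     # taxes drain 60 of them back out
print(circuit.currency_in_economy)  # 40
print(circuit.cumulative_deficit)   # 40 -- always equals the Currency outstanding
```

Notice that the two totals can never diverge. The Currency the economy holds and the government's cumulative deficit are the same number seen from the two sides of the circuit, which is the point the next few paragraphs make in accounting language.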
In this model we look at assets and liabilities. Lumping everything into a few general categories works just fine. So we have "physical assets and financial claims owed to the government" (all one thing). We also have "monetary liabilities held by banks" (one kind of liability), and "other liabilities" (the only other kind of liability we need at this point).
We eventually end up needing some other things like "G", the government sector, "DP", the domestic private sector, and eventually "F", the "foreign" (external to the US) sector. We also need "FA", financial assets, "RA", real assets, "FL", financial liabilities, and "NW", net worth. Finally, throw in a "T" for taxes. But that's all we really need. Not much to model the entire US (or, in the elaboration that adds "F" to the mix, the economy of the whole world).
That's not much but MMT contends it is enough (if we add some high school algebra) to understand how and why things work the way they do. I am not going to go into how all this is done. I am just going to skip to the results. It is these results that are why MMT is popular in some circles. They shed light on important economic issues and provide some interesting policy guidance. So let me list some of them:
The government has an unlimited ability to provide funds. As such insolvency and bankruptcy are impossible. It can always pay what it owes in full. Constraints on borrowing and spending are artificial and can always be changed or eliminated given a little political will. What this means is there is no reason not to run a budget deficit.
In fact, because of the "closed loop" nature of the model, if the private sector is going to be allowed to grow by generating profits, savings, retained earnings, etc., this surplus is automatically offset by government debt and deficits. The private sector can't grow unless the government sector goes into debt. So government debt is actually a good thing. To a certain extent, the more, the merrier. (There is a small arithmetic sketch of this offset right after this list.)
It is a necessary condition for currency to have value that the government accept it as the vehicle by which taxes and fees owed to the government are paid. MMT observes that this "you must pay your taxes in dollars (in the case of the US)" rule is what underpins the fact that the dollar is generally accepted as a repository of value and becomes accepted for payment by the private sector. MMT loudly proclaims that this is a "necessary" condition but MMT makes no claim that it is a sufficient condition. History seems to bolster this argument.
Taxing, borrowing, and money creation are not exclusive methods of funding the government. In other words, an increase in one does NOT require a decrease in one of the others. This means that none of the traditional reasons for running a balanced budget are actually true. And this is a good thing. Because, as noted above, running a deficit is actually good for the economy. This is also where MMT people say this whole "family budget" model for how the government should be run is wrong so it shouldn't be used.
Now we get to the interesting part. MMT demonstrates that there is no reason not to maintain a full employment posture. Full employment can be maintained while meeting other objectives like low inflation.
In fact, MMT supporters believe that the ideal Fed Funds Rate (FFR - the rate at which banks can borrow money over short terms) should be zero. This, in turn, implies an inflation rate at or near zero. If you follow the MMT logic it is entirely possible to simultaneously have a growing economy, low inflation, and full employment. There are enough "economic knobs" to do all of this at the same time.
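Before moving on, here is the "closed loop" offset mentioned above, reduced to bare arithmetic. This is just my own illustration of the accounting claim, with invented numbers:

    # The "closed loop" offset from the list above, as bare arithmetic.
    # Illustrative numbers only; this is accounting, not a forecast.
    G_spending = 100  # Currency flowing from the government box to the economy box
    T_taxes = 90      # Currency flowing back to the government box

    # Every unit of Currency left in the economy is an asset of the domestic
    # private sector (DP) and an offsetting liability of the government (G):
    DP_net_financial_assets = G_spending - T_taxes   # +10
    G_net_position = T_taxes - G_spending            # -10
    assert DP_net_financial_assets + G_net_position == 0
    print(DP_net_financial_assets)  # 10: the private surplus IS the deficit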
I think you can now see why MMT is so popular in some circles. And if MMT was the whole story then we should all jump on the bandwagon. But, alas, there is no such thing as a free lunch.
The first, and perhaps most important, question to ask is - does the economy behave as MMT says it should? The answer should not be a surprise. It is "no". So should we toss MMT out the window? Also, "no". MMT does provide a lot of useful insight. And remember that its older and more mainstream competitors also get it wrong regularly. If we only looked at economic theories that always worked we would have no economic theories to study.
The first and most basic problem with MMT is a problem that plagues all economic theories, the "rational person" assumption. Economic theories assume that people always act rationally and always act to advance their own interests. So how does this play out with respect to MMT?
Government injection of money ("Currency", in MMT speak) is supposed to result in increased economic activity. So if you inject more and more money into the private sector it hires and hires and eventually the economy will reach full employment. That, in short (and not only in MMT but in pretty much any economic theory), is one way to get to full employment.
Is this how the real world works? To answer that question let's start by ignoring company activities and instead just focus exclusively on workers. (We will add companies back later so don't panic just yet.) We will arbitrarily divide them into three groups, the rich, the poor, and the middle class. Now let's examine the behavior of a typical member of each group if given more money.
If we give more money to poor people they will go out and spend it immediately. They have many heretofore unmet needs (food, clothing, shelter) and the extra money allows them to better satisfy those needs. So, from an economic perspective, pretty much all the money goes right back into the economy, where it generates additional economic activity, including additional hiring. (Giving more money to poor people is a good way to move the economy rapidly toward full employment.)
But rich people don't have unsatisfied needs. They have had the money necessary to buy everything they want or need for some time so they have already bought it. As a result, the extra money is likely to go into savings and investment. So the extra money has little economic impact. And we see little or no movement toward full employment.
The middle class case falls somewhere in between. Their needs are not as acute as those of poor people. But they do have unmet wants and needs. So a good chunk of the new money they get will be put right back into the economy with only some of it going toward savings and investment.
Rich people on a per-capita basis spend more and thus put more into the economy than a poor or middle class person. But there are only a few of them so the aggregate economic impact of making the rich richer is small. Putting money into the pockets of the poor gives the economy an immediate and substantial boost. The boost attained from adding to the incomes of middle class people is not as pronounced as in the case of poor people but there are a lot of them (we hope) and most of the added income goes back into the economy relatively quickly.
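To see why this matters in aggregate, here is a crude back-of-the-envelope in Python. The head counts and spending propensities are entirely made up; only the shape of the result matters:

    # First-round effect of handing $1,000 to everyone in each group.
    # Head counts and spending propensities are invented for illustration.
    groups = {
        # group:    (people,      share of each new dollar spent)
        "poor":     (50_000_000,  0.95),
        "middle":   (120_000_000, 0.70),
        "rich":     (5_000_000,   0.10),
    }
    transfer = 1_000  # dollars per person
    for name, (people, spend_share) in groups.items():
        back_into_economy = people * transfer * spend_share
        print(f"{name}: ${back_into_economy / 1e9:.0f} billion spent")
    # The rich group's small head count keeps its aggregate boost small, even
    # though each rich person spends more in absolute dollars than shown here.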
The same is true of businesses and corporations. Many large companies generate a lot of "free cash" which they can use to take care of pretty much all of their spending and investing needs internally. So making it easier for these businesses to get a loan or forcing interest rates low makes little or no difference to their behavior. They just keep on keeping on without their behavior being influenced much by the availability or cost of loans.
Small businesses behave much like poor people. If loans are easier to get or interest rates are lowered they tend to borrow more and immediately put the borrowings back into the economy. The economy grows.
Medium sized businesses behave analogously to middle class people. So the above analysis is easily extended to realistically cover the whole economy.
Another problem with MMT is Currency versus money. They sound like the same thing. And the "Monetary" in MMT has to do with money. But money in the real economy does not behave like Currency does in MMT.
In the real economy money is created by making loans. Consider bank A. Alice deposits $1,000 in bank A. Now the books show total deposits of $1,000. Bob now gets a $500 loan from bank A. This is okay because the bank holds Alice's $1,000. But if we look at the bank's books total deposits now amount to $1,500. $500 has just been created out of thin air. Now, if Bob pays Charlie $100 his balance goes down by $100 and bank A's total deposits are now $1,400. But most likely Charlie will deposit the $100 into his account in bank B. If we add the total deposits of both banks together we get the same $1,500. So Bob paying Charlie changes nothing.
Similarly, if Bob uses the loan for a productive purpose he soon has some extra money so he pays off $50 of his loan. This causes Bank A's total deposits to go down by $50. The loan payment has caused $50 of money to be destroyed. If Bob also pays $5 in interest that money just goes into a different account in bank A. So, like when Bob paid Charlie and nothing changed, the interest payment changes nothing.
In short, money is created by the origination of new loans and money is destroyed as a loan's principal is paid off. And this loan origination/payoff happens in the "economy" box and does not involve the movement of Currency between the boxes. So Currency does NOT behave like money. And MMT explicitly says that Currency is created and destroyed only as it moves around the loop. Since loan origination and payment happens within the "economy" box no Currency is created or destroyed. What this means is that the Currency of MMT is not the "money supply" that people usually talk about.
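Here is the whole Alice/Bob/Charlie story as a toy deposit ledger, just to make the create/destroy bookkeeping explicit. This is my own sketch, with invented account names:

    # The Alice/Bob/Charlie story as a toy ledger. Only loan origination and
    # principal repayment change the total; everything else just moves
    # deposits around. Account names are invented.
    deposits = {"alice@A": 1_000, "bob@A": 0, "charlie@B": 0, "bankA_income": 0}

    def money_supply():
        return sum(deposits.values())

    deposits["bob@A"] += 500        # loan origination: $500 created
    assert money_supply() == 1_500

    deposits["bob@A"] -= 100        # Bob pays Charlie, who banks at B...
    deposits["charlie@B"] += 100    # ...nothing is created or destroyed
    assert money_supply() == 1_500

    deposits["bob@A"] -= 50         # principal repayment: $50 destroyed
    assert money_supply() == 1_450  # (it extinguishes the loan, not a deposit)

    deposits["bob@A"] -= 5          # interest just moves to the bank's account
    deposits["bankA_income"] += 5
    assert money_supply() == 1_450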
Government spending can be directed to the public good like investments in infrastructure, education, aid to the poor, support of retired people, etc. And the MMT idea that the government can run substantial deficits while the economy purrs along means that the "we can't do that because it will run up the deficit" argument is a specious one.
That said, the MMT people apply "rational person" thinking to government policy. The officials in charge will do reasonable things, they opine. So inflation, for instance, is a manageable problem. But if recent political history has demonstrated anything, it's that you can't count on the government sticking to the reasonable and avoiding the unreasonable.
And that's a problem with the MMT approach to both full employment and inflation. I think it is important to remember that the last time inflation was a serious problem in the US was in the late '70s. A lot of MMT advocates weren't even born at that time so they have no personal experience to guide them.
On the other hand, for the entire time these same people have been alive getting and keeping good, well paying jobs has never been easy. They have a deep personal understanding of what it feels like to do a bad job with employment policy. A government policy of guaranteeing full employment sounds like a critical priority if it can be done in a responsible way. And MMT says it can.
There are MMT solutions to controlling inflation if it looks like it is getting out of hand. And, on paper, there is nothing wrong with what they recommend. The problem is that the political will may be lacking when the time comes.
To see the problem we need only look at the "Fed", the Federal Reserve System. The Fed has a responsibility to manage the economy to avoid extremes of either growth or the lack thereof. Inflation is closely tied to an inappropriate rate of growth. Historically, the Fed has used interest rate manipulation to do this. More recently, it has also used "quantitative easing" (QE) when interest rate manipulation proved inadequate.
Generally speaking, the solution to too little growth or inflation is to increase economic activity. Generally speaking, the solution to too much growth or inflation is to decrease economic activity. Lowering interest rates is supposed to result in an increase in economic activity and vice versa. But this assumes the level of economic activity is influenced by interest rates.
Historically, it has been. But our modern income structure means that rich people and large corporations are insensitive to interest rates. We saw in the crash of '08 that driving interest rates to zero was insufficient to get the economy growing. A near zero interest rate was supposed to produce more loan origination which was supposed to increase the rate of economic activity. But it is important to understand that this is an indirect effect.
Lowering interest rates does not in and of itself increase economic activity. Instead it is supposed to change behavior (the low rates cause people and companies to borrow more money and spend it). This increased spending increases economic activity.
But, for the reasons I outlined above, this indirect effect, which had been completely reliable in the past when wealth was far less concentrated, was missing in action in '09. So the Fed had to resort to adding QE to the mix. Fortunately, that eventually worked. So now the economy is in good shape and the Fed should be recharging its batteries.
This in turn means raising the interest rate so that it will be possible to substantially lower it when a recession looms. It also means unwinding QE so that it too is available to be deployed again when it is needed. But as a result of political influence the Fed has changed course.
Last year they were gradually raising the interest rate and gradually unwinding QE. The interest rate was still below historical norms and by the end of the year a lot of QE had yet to be unwound. The Fed moved slowly and carefully enough in both cases that the economy was still able to do pretty well last year.
But, in a short term effort to goose the economy this year, the White House has pressured the Fed to stop increasing the interest rate and to slow or halt the unwinding of QE. It looks like the Fed is going to go along even though it puts at risk the Fed's ability to deal with future problems. When the next recession looms on the horizon, which it inevitably will at some point, the Fed may not have the tools it needs to manage it.
When it comes to full employment I think New York City's experience with rent control (another idea MMT people like) is very informative. Many decades ago NYC instituted rent control policies in an effort to provide affordable housing for people of modest means. They expanded the program several times over the decades because the problem kept getting worse and worse.
But eventually the whole thing became unsustainable. Without rent control, developers built buildings because they thought they could make a profit and landlords maintained them so they could keep rents high enough to make a profit. But with rent control maintenance no longer made economic sense (you couldn't get your money back through higher rents) and developers only built buildings for rich people because those buildings were exempted from rent control.
So buildings that people could afford were allowed to deteriorate. Over time the deterioration was so severe that they became unsafe and had to be torn down, and no new buildings that people of moderate means could afford were built to replace them. The only new buildings that went up were targeted at rich people. Things got worse instead of better in terms of the availability of decent moderately priced housing. Eventually NYC was forced to dismantle most of its rent control regime.
I am worried that the same thing would happen with the MMT people's preferred method of getting to full employment. Instead of continuing to goose the economy until full employment was achieved they envision the government becoming the hirer of last resort. The government would directly hire anyone who wanted to work and couldn't find employment elsewhere. They would be paid (under one formulation) the minimum wage, and presumably be put to work on worthwhile projects. But the Federal minimum wage is now $7.25 per hour. That is a ridiculously low number. It is not a living wage in even the most depressed economic areas.
Well, the "reasonable person" solution to that problem is to raise the Federal minimum wage. But it is currently so low because it has proved politically impossible to raise. There is a shortage of reasonable people in Congress. But assume that problem somehow gets magically fixed and the program starts working as advertised. Everybody who wants to work but can't find a job gets a government job on some worthwhile project. So far so good.
But a big concern I now have is with structural problems in the job market. We currently have, for instance, a bunch of unemployed (or underemployed) coal miners. The coal mining industry is contracting so there are fewer jobs than there used to be. This means they can't even move somewhere else to get a coal mining job. What should the country do? Frankly, we already have too many coal miners. We need to change things so there are fewer coal miners. This means things like retraining programs and not programs that put coal miners to work as coal miners.
More broadly, a big employment problem the economy has is structural. We have lots of people who are ready, willing, and able to perform jobs that don't need doing. At the same time there are other jobs that go begging because there is a shortage of people with the necessary skills. If a job has only temporarily gone away it makes sense to park people somewhere for a short while and wait for things to right themselves. Then the person can go back to doing what they know how to do.
That's what unemployment insurance does and does well. What it doesn't do is match skills training to actual needs. And an "employer of last resort" system doesn't do that either. This is a problem with the structure of the job market, not with the number of jobs on offer. MMT people have nothing to say about this problem. And even if you are able to create the kind of program MMT envisions, over time it will be subject to the same forces that eventually doomed the NYC rent control system.
The NYC rent control program was initially well suited to the problem at hand. But over time, structural changes kept making it less and less effective, and the will did not exist to make the changes necessary to keep up. It was always easiest to put changes off for another year or so, especially if the changes were unpopular, and they usually were. I am concerned that even an initially well designed jobs program would suffer the same fate over time.
So does that mean that MMT is absolutely useless and should just be ignored? No! I think it has many useful things to say. It is a theory that rests on some simple assumptions. If the assumptions were correct then it would make complete sense. But I think it can be very illuminating to explore exactly how reality and the MMT model world differ.
It is definitely true that, as MMT says, most of the discussion about debt and the deficit is nonsense and should be rejected. And MMT gets us closer to how the real world actually works when it comes to full employment and its economic impacts.
Conventional economic theory says that certain unemployment numbers represent effective full employment. Dropping below them was supposed to immediately result in all kinds of bad things (a big increase in inflation, an immediate economic downturn, etc.). None of those things happened. We are currently at historic lows when it comes to the official unemployment rate. But, again, the bad things that are supposed to be happening aren't happening. The economy is instead behaving like we are a substantial distance away from full employment.
And there are a number of areas where MMT analysis is valuable and illuminating. The current problems the European countries that use the Euro are having are much easier to understand if you apply MMT to the situation.
I also found what MMT had to say about the ways the Fed and the Treasury department informally work together in a joint effort to make sure both departments can perform effectively in their separate roles, for instance, quite illuminating. I could go on but I won't.
Can MMT help make sense of this and many other things? Definitely yes, in some cases, and maybe in others. (I am going to skip over the several other areas where I found MMT analysis quite helpful.) So, let's give it a chance. It is no nuttier than other economic theories I have seen.
Monday, March 18, 2019
50 Years of Science - Part 13
This post is the next in a series dating back several years. In fact, it has been going on long enough that, as of this year, it would be more accurate to call it "60 Years of Science". But I am going to continue to stick with the old title. Chalk it up to nostalgia. And, as the title indicates, this is the thirteenth post in the series. You can go to: http://sigma5.blogspot.com/2017/04/50-years-of-science-links.html for a post that contains links to all the entries in the series. I will update that post to include a link to this entry right after I post this entry.
I take Isaac Asimov's book "The Intelligent Man's Guide to the Physical Sciences" as my baseline for the state of the science when he wrote the book (1959 - 60). In these posts I am reviewing what he reported and what's changed since. For this post I am starting with the section he titled "The Nuclear Atom". I will then move on to the section he titled "Isotopes". Both are from the chapter he titled "The Particles".
The book was written at an interesting time in the evolution of our understanding of things subatomic. As Asimov notes "it was known by 1900 that the atom was not a simple, indivisible particle". By the time Asimov wrote the book the situation had reached maximum complexity. Roughly a hundred subatomic particles had been identified. This drove nuclear physicists nuts as there are only about a hundred different elements. The subatomic world was supposed to be simpler (i.e., composed of fewer parts and pieces) than the atomic world, not more complicated.
The impasse was broken a few years later by the introduction of "Quark theory". Quark theory made sense out of this large zoo of subatomic particles. One component of this idea was to organize them into families. Auto makers have developed "lines" of cars. Ford, for instance, used to have (it has now been discontinued) the Ford "Crown Victoria", the Mercury "Grand Marquis", and the Lincoln "Town Car".
To a great extent they were the same car. The Crown Victoria was the least expensive "base line" version for the economy end of the market. The Town Car was the most expensive "luxury" version for the carriage trade. And the Grand Marquis was midway between the two, both in terms of price and in terms of "trim level" and other features. It was fancier (and more expensive) than the Crown Victoria but not as fancy (or expensive) as the Town Car. But all three shared a lot of common design elements, parts, etc.
Nuclear physicists determined that there were similar familial relationships between subatomic particles. Particle families were grouped into "generations". In the case of one family of particles, the first generation was the Electron. Its second generation was the "Muon", originally called the "Mu Meson". Both particles shared a lot of attributes. The principal difference between the two was their mass. The Muon was much heavier and, therefore, held a lot more energy. The third generation was represented by the "Tau", originally called the "Tau Lepton". Again, the principal difference between it and the other two generations was a mass and, therefore, energy that was much larger than the other particles in the same family.
And with the introduction of the generations concept it became possible to line up various generations of one family of particles with the appropriate generational member of other families. So the cousin of the Electron that was a member of the Neutrino family ended up being named the "Electron Neutrino". Similarly, the second generation particle was eventually named the "Muon Neutrino". Unsurprisingly, the third generation ended up being named the "Tau Neutrino".
The second component of the new theory was the Quarks themselves. In the same way that atoms were composed of subatomic particles, some (but not all) of what had been thought to be indivisible subatomic particles like the Proton, turned out to be composites of new and heretofore unsuspected truly fundamental particles. And these newly discovered truly fundamental particles were called Quarks. And, cutting to the chase, Quarks could also be put into the same kind of "three generations" structure I have talked about above. But that's getting ahead of the story. Back to Asimov.
The Electron was identified by J. J. Thomson in about 1900. He was also the first to propose a model of the atom. It was like a cookie, specifically like an oatmeal raisin cookie. An atom consisted of some unspecified material playing the role of the oatmeal batter. Into it were stuck the Electrons, which played the role of the raisins. This model didn't last long but you have to start somewhere. Things quickly got complicated due to the study of radioactivity.
Becquerel did a lot of the early work. He quickly determined that in a lot of cases radioactivity looked like a particle shooting out of the atom. And some of these particles seemed to be Electrons. So far, so good. But another kind of emission was what came to be called an "Alpha" particle. It had a positive charge so was presumably a chunk of the oatmeal part of the atomic cookie. There were definitely other kinds of emissions. Following the same convention, one class was named "Beta" particles and another class "Gamma" rays. It didn't take long to determine that a Beta particle and an Electron were the same thing but the name "Beta" stuck and is still in use. And it also turned out that "Gamma" emissions looked like high energy X-Rays but the name "Gamma ray" also stuck and is still in use.
Good experimental work determined that Alpha particles were at least twice as heavy as Hydrogen atoms. More good experimental work soon determined that they were a form of Helium that weighed four times what a single atom of Hydrogen weighed. Other scientists followed up other clues and identified the Proton at about the same time the Alpha work was being done. Protons and electrons have equal but opposite charges. But a proton is roughly two thousand times as heavy as an electron. This large difference in masses was a puzzle that had no solution at the time of Asimov's book.
But the identification of the Proton led to the next iteration of the model for the atom. Now it consisted of Electrons orbiting a "nucleus" consisting of Protons. This was analogous to the solar system where the Sun is in the center (nucleus) and planets (Electrons) orbit it. This model led to a lot of questions. But it also led to some answers. The identity of an element was tied to the number of Protons in the nucleus. Hydrogen is Hydrogen because it has a nucleus containing one Proton. Helium is Helium because its nucleus contains two Protons. Lithium is Lithium because its nucleus contains three Protons. And so on.
Also, chemistry is all about Electrons. They occupy the outer regions of the atom so when two atoms come close to each other, what they mostly see is the other's Electrons. Remove the Electron from a Hydrogen atom and it is still a Hydrogen atom. It just has a net positive electrical charge that attracts the electrons in the outer regions of other atoms. And that is the basis of how chemical bonds work. Similarly, an Alpha particle is a Helium atom from which both electrons have been removed. It has a positive electrical charge that is twice as strong as that of a Hydrogen atom whose single electron has been stripped away. This was real progress.
One question that was quickly identified was the "mass" question. The Helium atom should weigh twice as much as a Hydrogen atom but it actually seemed to weigh roughly four times as much. Other, similar discrepancies popped up all over the place. One quick fix to this problem was to assume that a nucleus also contained Electrons. If a Helium nucleus contained four Protons and two Electrons then the mass would come out about right because the electrons weighed so little. And the charge would come out right because the two Electrons would cancel out two of the four Protons.
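Just to check the arithmetic of that fix, here is the Helium case worked out. This sketch is mine, not Asimov's, and the masses are approximate:

    # Arithmetic check of the "Protons plus nuclear Electrons" model of Helium.
    # Masses are approximate, in units of the Hydrogen atom's mass.
    proton_mass, electron_mass = 1.0, 1.0 / 1836  # the ~2,000x ratio noted earlier
    proton_charge, electron_charge = +1, -1

    protons, nuclear_electrons = 4, 2
    charge = protons * proton_charge + nuclear_electrons * electron_charge
    mass = protons * proton_mass + nuclear_electrons * electron_mass

    print(charge)          # 2 -> the atomic number of Helium, as required
    print(round(mass, 3))  # ~4.001 -> about four Hydrogens, as observed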
And there was another, more subtle version of this problem. According to this theory a Helium nucleus contained four Protons and two Electrons. But the weight of the Electrons could be neglected so the mass of the Helium nucleus should be exactly four times that of Hydrogen. But it was off by a bit. All masses for all atoms were off by a bit, a little bit in some cases, and a lot in others. What was going on? Answering that question is the business of the next section. So let's move on to "Isotopes".
The obvious base for calculating the relative weights of the various elements is Hydrogen. But, as we have seen with Helium, that doesn't work very well. Helium does not end up having a weight that is exactly four times that of Hydrogen. Various things were tried and eventually it was decided to use Oxygen as the base. It seemed to be the least worst choice. (The reason for this will be explained below.) It was given a standard weight of 16. The weight of other elements then often fell close to an integer number. But not always. Chlorine, for instance, came in at 35.457 instead of a nice round 35. It took a while to figure out what was going on.
Becquerel found that if you purified Uranium, then left it lying around undisturbed for a while, it actually got more radioactive. He speculated that somehow a small portion of the not very radioactive Uranium was mysteriously transforming itself into highly radioactive "Uranium X". And if you carefully separated out the Uranium X then the remaining "regular" Uranium would, over time, just make some more Uranium X.
Rutherford found out that the same thing happened with Thorium. And it had already been determined that Radium, if left alone, would somehow create Radon gas. As this general phenomenon was further investigated it slowly dawned that elements were miraculously transforming themselves into other elements. And in every case radioactivity was involved.
Soddy in 1913 finally cottoned on to what was happening. If a radioactive transformation involved the emission of an Alpha particle then the source element was transformed into a different element that was down two places in the periodic table. What was happening was that the nucleus was breaking into two pieces. One of them was an Alpha particle that carried off two Protons. The element that remained behind retained all of the other Protons but was now a different element due to the smaller number of Protons its nucleus now contained.
There were obviously multiple versions of elements. They all had the same number of Protons so they had to differ in some other way. He called these different versions "Isotopes" without worrying about what the difference was. An obvious "fix" was to assume that "the nucleus consists of a certain number of Protons and a certain number of Electrons". If we added extra Protons but also added the same number of Electrons to the nucleus at the same time then the atomic number stays the same. This trick allowed us to account for all of the then known nuclear transformations.
We still ignore the masses of the nuclear Electrons as they are so light that their effect on the mass of the nucleus is what accountants call "not material". But we now have a new number, the "mass number". The mass number, according to our new theory is the total or "gross" number of Protons. The atomic number is the net Proton count, all nuclear Protons minus however many nuclear Electrons are present. Two isotopes of the same element have the same atomic number (net number of Protons) but different mass numbers (gross number of Protons).
And this isotope business helped to explain why the weight of a particular element did not end up being a round number. If a typical sample of, say, Helium contained some Helium-3 (atomic number 2, mass number 3) and Helium-4 (atomic number 2, mass number 4) then the atomic mass of the sample could come out anywhere between three and four depending on the ratio of the two isotopes. Things became clearer when the "mass spectrometer" was invented.
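Before getting to the spectrometer, it is worth checking that this averaging story actually reproduces the odd Chlorine number from earlier. A quick back-of-the-envelope of my own, using modern (rounded) abundance figures:

    # Weighted average for Chlorine, using integer mass numbers and the modern
    # measured abundances (rounded). The small gap between this and the
    # measured 35.457 comes from the isotope masses themselves not being
    # exact integers -- a point taken up again below.
    abundance = {35: 0.758, 37: 0.242}  # Cl-35 and Cl-37
    atomic_weight = sum(mass * share for mass, share in abundance.items())
    print(round(atomic_weight, 2))  # ~35.48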
You turn your sample into a gas, then you "ionize" it (strip one or more Electrons off of each atom so it has an electric charge). Then you make it fly through a magnetic field at a constant speed. The magnetic field will make the trajectory of each atom bend. How much will it bend? Well, that depends on the mass, the speed, and the electric charge. If we can keep the speed and electric charge constant then if the mass is higher the sample's trajectory will be bent by a smaller amount. If we can pull this trick off (which is very hard to do in lots of circumstances) then we can weigh each individual particle.
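What the apparatus exploits is that a particle of mass m, speed v, and charge q moving across a magnetic field B follows a circle of radius r = mv/(qB), so at fixed speed and charge a heavier ion bends less. A toy calculation (the speed and field values are invented, but physically plausible):

    # Radius of an ion's circular path in a magnetic field: r = m*v / (q*B).
    AMU = 1.66e-27       # kg per atomic mass unit
    E_CHARGE = 1.60e-19  # coulombs, for a singly ionized atom

    def bend_radius(mass_number, speed, field):
        return (mass_number * AMU) * speed / (E_CHARGE * field)

    for mass_number in (35, 37):  # the two Chlorine isotopes again
        r = bend_radius(mass_number, speed=1.0e5, field=0.5)
        print(mass_number, f"{r * 100:.1f} cm")
    # 35 -> ~7.3 cm, 37 -> ~7.7 cm: the heavier isotope bends less, so the
    # two isotopes land at separate, measurable spots.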
The mass spectrometer allowed many individual isotopes of many elements to be weighed. And by measuring how much of each isotope a representative sample contained, the "isotopic composition" of various elements could be determined. And, as an interesting side effect of this work, it was determined that some isotopes of some elements were "stable": they never engaged in radioactive decay. And, of course, some isotopes were determined to be mildly radioactive (they "decayed" slowly into other elements and isotopes) and others were highly radioactive (they quickly decayed into other elements and isotopes).
And it turned out that even Oxygen, the standard against which other elements were weighed when Asimov wrote his book, was a combination of isotopes. It's just that it was 99.9% Oxygen-16 and only a tiny amount of other isotopes of Oxygen. Since Asimov wrote the book, the reference standard against which the relative atomic mass of each isotope of each element is compared has been changed from "Oxygen" to "Carbon-12".
Carbon is carefully separated out and pure Carbon-12 is isolated. Then it is weighed and given an arbitrary "atomic mass" of 12. The relative atomic mass of other isotopes relative to that of Carbon-12 is determined and that ratio is used to determine that isotope's atomic mass. This resulted in a small change in the atomic masses assigned to other elements.
This change was made because it improved the situation. It brought a lot of atomic masses closer to being integral numbers once isotope ratios were accounted for. And each isotope was now handled separately for the purposes of determining its atomic mass. Most discrepancies are now small, but with the exception of Carbon-12, none of them is an exact integer. The reason for this had been solved by the time Asimov wrote his book. But that's something he gets into later.
So physicists were pretty happy at this point. The "nucleus is a mix of protons and electrons" theory worked very well. But then Rutherford came up with an experimental setup that allowed him to probe the nucleus in new ways. He figured out how to fire Alpha particles at a target. The target was made from Zinc Sulfide which would "scintillate" (throw off a spark of light that could be seen with the naked eye) when hit by an Alpha particle.
He then put a metal disk in the path to see what would happen. At first the scintillations stopped. But then he added Hydrogen to the mix and things changed. He concluded that single Protons, presumably from the Hydrogen, were now striking the target because they had enough energy to penetrate his metal disk. Very interesting.
He tried some different things before switching to what is now called a "Wilson cloud chamber". If you have air with a lot of water vapor in it then lots of things will cause the water vapor to condense into small droplets that are visible using just your naked eye. By carefully tweaking the apparatus you can see the path of ionized particles. If you then add a magnetic field the paths of the ionized particles will bend just like they do in a mass spectrometer. Because you can see the paths of ionized particles you can take the same kinds of measurements. This is a classic example of a better apparatus leading directly to better science.
A careful analysis of an Alpha particle striking the nucleus of a Nitrogen atom led to a determination that the Nitrogen nucleus could sometimes absorb the Alpha particle. It immediately threw off a Proton and transmuted into Oxygen. The Proton's path could be easily seen because at this point it was ionized.
This is the first example of a man-made process that could transmute one element into another. Alchemists had hoped to transmute "base metal", by which they meant lead, into gold. This can now be done. But the process is fantastically expensive. You are far better off just buying gold in the first place.
The method of viewing a cloud chamber as it made the paths of charged particles visible using a "mark one eyeball" was quickly replaced by taking photographs. Photographs could capture more detail and resulted in a permanent record that could be reviewed by others.
Asimov notes that the scientist who nailed down the Nitrogen to Oxygen transmutation had to take and examine 20,000 photographs to find 8 in which the event he was interested in occurred. By the time Asimov's book was published scientists were employing rafts of graduate students to examine hundreds of thousands of photographs looking for interesting events.
But the rate at which scientific instruments could churn out photographs, all of which had to be examined for events of potential interest, kept increasing. It soon reached a practical limit. Fortunately, at about the time the practical limit was reached solid state devices came along that were capable of replacing the cloud chamber.
Detectors capable of collecting the same kind of data (particle path, speed, mass, etc.) that had been extracted from cloud chamber photographs now exist. And they work pretty well for charged particles. But there is only a very limited capability to observe and measure the attributes of uncharged particles. In some cases it is possible to detect the presence of an uncharged particle. It is also sometimes possible to measure the energy it carries. But that's about it. Still, that's better than nothing. There is no doubt that the business of detecting and tracking uncharged particles doesn't work nearly as well as scientists would like.
But it is now possible to hook detectors up to computers and have them look for and measure events. That gets grad students out of the business of going blind by looking at zillions of photographs. It might sound like that puts them out of work, but don't worry. They still have lots to do.
Even after computers do a lot of preliminary work it is still necessary for a trained person to look at the result. The detectors at CERN, the home of the LHC, the largest particle detector in the world, can generate the equivalent of those 20,000 photographs in a small fraction of a second. Even with all the computer filtering the LHC can turn out hundreds of potentially interesting events per day. That's why the staff of each detector runs into the thousands.
We have now reached the point where we have an atom with a nucleus of protons and, as far as we know at this point, some electrons. The nucleus is surrounded by electrons in some mysterious configuration. This is just the beginning of the story. But there is where I must leave it in this installment. To be continued . . .
I take Isaac Asimov's book "The Intelligent Man's Guide to the Physical Sciences" as my baseline for the state of the science when he wrote the book (1959-60). In these posts I am reviewing what he reported and what's changed since. For this post I am starting with the section he titled "The Nuclear Atom". I will then move on to the section he titled "Isotopes". Both are from the chapter he titled "The Particles".
The book was written at an interesting time in the evolution of our understanding of things subatomic. As Asimov notes "it was known by 1900 that the atom was not a simple, indivisible particle". By the time Asimov wrote the book the situation had reached maximum complexity. Roughly a hundred subatomic particles had been identified. This drove nuclear physicists nuts as there are only about a hundred different elements. The subatomic world was supposed to be simpler (i.e. composed of fewer parts and pieces) than the atomic world, not more complicated.
The impasse was broken a few years later by the introduction of "Quark theory". Quark theory made sense out of this large zoo of subatomic particles. One component of this idea was to organize them into families. Auto makers have developed "lines" of cars. Ford, for instance, used to have (the line has since been discontinued) the Ford "Crown Victoria", the Mercury "Grand Marquis", and the Lincoln "Town Car".
To a great extent they were the same car. The Crown Victoria was the least expensive "base line" version for the economy end of the market. The Town Car was the most expensive "luxury" version for the carriage trade. And the Grand Marquis was midway between the two, both in terms of price and in terms of "trim level" and other features. It was fancier (and more expensive) than the Crown Victoria but not as fancy (or expensive) as the Town Car. But all three shared a lot of common design elements, parts, etc.
Nuclear physicists determined that there were similar familial relationships between subatomic particles. Particle families were grouped into "generations". In the case of one family of particles, the first generation was the Electron. Its second generation was the "Muon", originally called the "Mu Meson". Both particles shared a lot of attributes. The principal difference between the two was their mass. The Muon was much heavier and, therefore, held a lot more energy. The third generation was represented by the "Tau", originally called the "Tau Lepton". Again, the principal difference between it and the other two generations was a mass (and, therefore, an energy) much larger than that of the other particles in the same family.
And with the introduction of the generations concept it became possible to line up various generations of one family of particles with the appropriate generational member of other families. So the cousin of the Electron that was a member of the Neutrino family ended up being named the "Electron Neutrino". Similarly, the second generation particle was eventually named the "Muon Neutrino". Unsurprisingly, the third generation ended up being named the "Tau Neutrino".
The second component of the new theory was the Quarks themselves. In the same way that atoms were composed of subatomic particles, some (but not all) of what had been thought to be indivisible subatomic particles like the Proton, turned out to be composites of new and heretofore unsuspected truly fundamental particles. And these newly discovered truly fundamental particles were called Quarks. And, cutting to the chase, Quarks could also be put into the same kind of "three generations" structure I have talked about above. But that's getting ahead of the story. Back to Asimov.
The Electron was identified by J. J. Thomson in 1897. He was also the first to propose a model of the atom. It was like a cookie, specifically like an oatmeal raisin cookie. An atom consisted of some unspecified material playing the role of the oatmeal batter. Into it was stuck the Electrons, which played the role of the raisins. This model didn't last long but you have to start somewhere. Things quickly got complicated due to the study of radioactivity.
Becquerel did a lot of the early work. He quickly determined that in a lot of cases radioactivity looked like a particle shooting out of the atom. And some of these particles seemed to be Electrons. So far, so good. But another kind of emission, the "Alpha" particle (the name is due to Rutherford), had a positive charge so was presumably a chunk of the oatmeal part of the atomic cookie. There were definitely other kinds of emissions. Following the same convention, one class was named "Beta" particles and another class "Gamma" particles. It didn't take long to determine that a Beta particle and an Electron were the same thing but the name "Beta" stuck and is still in use. And it also turned out that "Gamma" emissions looked like high energy X-Rays but the name "Gamma ray" also stuck and is still in use.
Good experimental work determined that Alpha particles were at least twice as heavy as Hydrogen atoms. More good experimental work soon determined that they were a form of Helium that weighed four times what a single atom of Hydrogen weighed. Other scientists followed up other clues and identified the Proton at about the same time the Alpha work was being done. Protons and electrons have equal but opposite charges. But a proton is roughly two thousand times as heavy as an electron. This large difference in masses was a puzzle that had no solution at the time of Asimov's book.
But the identification of the Proton led to the next iteration of the model for the atom. Now it consisted of Electrons orbiting a "nucleus" consisting of Protons. This was analogous to the solar system where the Sun is in the center (nucleus) and planets (Electrons) orbit it. This model led to a lot of questions. But it also led to some answers. The identity of an element was tied to the number of Protons in the nucleus. Hydrogen is Hydrogen because it has a nucleus containing one Proton. Helium is Helium because its nucleus contains two Protons. Lithium is Lithium because its nucleus contains three Protons. And so on.
Also, chemistry is all about Electrons. They occupy the outer regions of the atom so when two atoms come close to each other, what they mostly see is the other's Electrons. Remove the Electron from a Hydrogen atom and it is still a Hydrogen atom. It just has a net positive electrical charge that attracts the electrons in the outer regions of other atoms. And that is the basis of how chemical bonds work. Similarly, an Alpha particle is a Helium atom from which both electrons have been removed. It has a positive electrical charge that is twice as strong as that of a Hydrogen atom whose single electron has been stripped away. This was real progress.
One question that was quickly identified was the "mass" question. The Helium atom should weigh twice as much as a Hydrogen atom but it actually seemed to weigh roughly four times as much. Other, similar discrepancies popped up all over the place. One quick fix to this problem was to assume that a nucleus also contained Electrons. If a Helium nucleus contained four Protons and two Electrons then the mass would come out about right because the electrons weighed so little. And the charge would come out right because the two Electrons would cancel out two of the four Protons.
And there was another, more subtle version of this problem. According to this theory a Helium nucleus contained four Protons and two Electrons. But the weight of the Electrons could be neglected so the mass of the Helium nucleus should be exactly four times that of Hydrogen. But it was off by a bit. All masses for all atoms were off, a little bit in some cases and a lot in others. What was going on? The next section tells us what was going on. So let's move on to "Isotopes".
The obvious base for calculating the relative weights of the various elements is Hydrogen. But, as we have seen with Helium, that doesn't work very well. Helium does not end up having a weight that is exactly four times that of Hydrogen. Various things were tried and eventually it was decided to use Oxygen as the base. It seemed to be the least worst choice. (The reason for this will be explained below.) It was given a standard weight of 16. The weight of other elements then often fell close to an integer number. But not always. Chlorine, for instance, came in at 35.457 instead of a nice round 35. It took a while to figure out what was going on.
Becquerel found that if you purified Uranium, then left it lying around undisturbed for a while, it actually got more radioactive. He speculated that somehow a small portion of the not very radioactive Uranium was mysteriously transforming itself into highly radioactive "Uranium X". And if you carefully separated out the Uranium X then the remaining "regular" Uranium would, over time, just make some more Uranium X.
Rutherford found out that the same thing happened with Thorium. And it had already been determined that Radium, if left alone, would somehow create Radon gas. As this general phenomenon was further investigated it slowly dawned that elements were miraculously transforming themselves into other elements. And in every case radioactivity was involved.
Soddy in 1913 finally cottoned on to what was happening. If a radioactive transformation involved the emission of an Alpha particle then the source element was transformed into a different element that was down two places in the periodic table. What was happening was, in effect, a miniature fission. An element broke into two pieces. One of them was an Alpha particle that carried off two Protons. The element that remained behind retained all of the other Protons but was now a different element due to the smaller number of Protons its nucleus now contained.
There were obviously multiple versions of elements. They all had the same number of Protons so they had to differ in some other way. He called these different versions "Isotopes" without worrying about what the difference was. An obvious "fix" was to assume that "the nucleus consists of a certain number of Protons and a certain number of Electrons". If we added extra Protons but also added the same number of Electrons to the nucleus at the same time then the atomic number would stay the same. This trick allowed us to account for all then-known nuclear transformations.
We still ignore the masses of the nuclear Electrons as they are so light that their effect on the mass of the nucleus is what accountants call "not material". But we now have a new number, the "mass number". The mass number, according to our new theory, is the total or "gross" number of Protons. The atomic number is the net Proton count: all nuclear Protons minus however many nuclear Electrons are present. Two isotopes of the same element have the same atomic number (net number of Protons) but different mass numbers (gross number of Protons).
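To make that bookkeeping concrete, here is a minimal sketch in Python of the old "Protons plus nuclear Electrons" model. The class name and the Uranium example are mine, chosen for illustration; this is the model as it was believed then, not as we understand nuclei now.

# Illustrative sketch of the old "protons plus nuclear electrons" bookkeeping.
class OldStyleNucleus:
    def __init__(self, gross_protons, nuclear_electrons):
        self.gross_protons = gross_protons          # sets the mass number
        self.nuclear_electrons = nuclear_electrons  # cancel some of the charge

    @property
    def mass_number(self):
        # Electrons are too light to matter, so mass is the gross Proton count.
        return self.gross_protons

    @property
    def atomic_number(self):
        # Net charge: each nuclear electron cancels one proton.
        return self.gross_protons - self.nuclear_electrons

    def emit_alpha(self):
        # An Alpha particle carries off 4 units of mass and 2 of charge, so in
        # this model it takes 4 gross protons and 2 nuclear electrons with it.
        return OldStyleNucleus(self.gross_protons - 4, self.nuclear_electrons - 2)

# Uranium-238 in the old model: 238 gross protons, 146 nuclear electrons.
uranium = OldStyleNucleus(238, 146)
thorium = uranium.emit_alpha()
print(uranium.atomic_number, uranium.mass_number)  # 92 238
print(thorium.atomic_number, thorium.mass_number)  # 90 234

Note how emitting an Alpha particle moves the element down two places in the periodic table, exactly as Soddy observed.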
And this isotope business helped to explain why the weight of a particular element did not turn out to be a round number. If a typical sample of, say, Helium contained some Helium-3 (atomic number 2, mass number 3) and some Helium-4 (atomic number 2, mass number 4) then the atomic mass of the sample could come out anywhere between three and four depending on the ratio of the two isotopes.
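The arithmetic behind this is just a weighted average. Here is a quick sketch; the half-and-half Helium mixture is invented for illustration (real Helium is almost entirely Helium-4), but the Chlorine figures are close to the real ones and account for the odd 35.457 mentioned earlier.

# Weighted-average atomic mass of a sample from its isotopic composition.
def sample_atomic_mass(isotopes):
    """isotopes: list of (mass_number, fraction) pairs; fractions sum to 1."""
    return sum(mass * fraction for mass, fraction in isotopes)

# A hypothetical half-and-half mix of Helium-3 and Helium-4:
print(sample_atomic_mass([(3, 0.5), (4, 0.5)]))      # 3.5

# Chlorine: roughly 76% Chlorine-35 and 24% Chlorine-37.
print(sample_atomic_mass([(35, 0.76), (37, 0.24)]))  # 35.48, close to 35.457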
Things became clearer when the "mass spectrometer" was invented. You turn your sample into a gas, then you "ionize" it (strip one or more Electrons off of each atom so it has an electric charge). Then you make it fly through a magnetic field at a constant speed. The magnetic field will make the trajectory of each atom bend. How much will it bend? Well, that depends on the mass, the speed, and the electric charge. If we can keep the speed and electric charge constant then a higher mass means the trajectory is bent by a smaller amount. If we can pull this trick off (which is very hard to do in lots of circumstances) then we can weigh each individual particle.
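The governing formula is the textbook one: a particle of mass m, speed v, and charge q in a magnetic field B follows a circle of radius r = mv/qB. Here is a small sketch; the speed and field strength are round numbers I picked for illustration.

# Radius of curvature of a charged particle in a magnetic field: r = m*v/(q*B).
# Heavier ions bend less (larger radius), which is how isotopes get separated.
AMU = 1.66e-27      # kg, one atomic mass unit (approximate)
CHARGE = 1.60e-19   # C, one elementary charge (a singly ionized atom)

def bend_radius(mass_amu, speed, field_tesla, charge=CHARGE):
    return (mass_amu * AMU * speed) / (charge * field_tesla)

# Helium-3 vs Helium-4 at the same speed, charge, and field strength:
for mass in (3, 4):
    r = bend_radius(mass, speed=1e5, field_tesla=0.1)
    print(f"mass {mass}: radius = {r:.4f} m")  # 0.0311 m vs 0.0415 m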
The mass spectrometer allowed many individual isotopes of many elements to be weighed. And by measuring how much of each isotope a representative sample contained, the "isotopic composition" of various elements could be determined. And, as an interesting side effect of this work, it was determined that some isotopes of some elements were "stable"; they never engaged in radioactive decay. And, of course, some isotopes were determined to be mildly radioactive (they "decayed" slowly into other elements and isotopes) and others were highly radioactive (they quickly decayed into other elements and isotopes).
And it turned out that even Oxygen, the standard against which other elements were weighed when Asimov wrote his book, was a combination of isotopes. It's just that it was nearly all (about 99.8%) Oxygen-16, with only a tiny amount of other isotopes of Oxygen. Since Asimov wrote the book, the reference standard against which the relative atomic mass of each isotope of each element is compared has been changed from "Oxygen" to "Carbon-12".
Carbon is carefully separated out and pure Carbon-12 is isolated. Then it is weighed and given an arbitrary "atomic mass" of exactly 12. The mass of every other isotope is then measured relative to Carbon-12, and that ratio determines the isotope's atomic mass. This resulted in a small change in the atomic masses assigned to other elements.
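In other words, everything reduces to one measured ratio. A tiny sketch of the idea (the Oxygen-16 ratio below is a real, rounded value):

# The Carbon-12 standard: define its mass as exactly 12, then express every
# other isotope's mass as a measured ratio to it.
CARBON_12_MASS = 12.0  # exact, by definition

def atomic_mass(ratio_to_carbon12):
    return CARBON_12_MASS * ratio_to_carbon12

# A mass spectrometer finds Oxygen-16 about 1.33291 times as heavy as Carbon-12:
print(atomic_mass(1.33291))  # ~15.9949, not the old "exactly 16"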
This change was made because it improved the situation. It brought a lot of atomic masses closer to being integers once isotope ratios were accounted for. And each isotope was now handled separately for the purposes of determining its atomic mass. Most discrepancies are now small but, with the exception of Carbon-12, none of the atomic masses is an exact integer. The reason for this had been worked out by the time Asimov wrote his book. But that's something he gets into later.
So physicists were pretty happy at this point. The "nucleus is a mix of protons and electrons" theory worked very well. But then Rutherford came up with an experimental setup that allowed him to probe the nucleus in new ways. He figured out how to fire Alpha particles at a target. The target was made from Zinc Sulfide which would "scintillate" (throw off a spark of light that could be seen with the naked eye) when hit by an Alpha particle.
He then put a metal disk in the path to see what would happen. At first the scintillations stopped. But then he added Hydrogen to the mix and things changed. He concluded that single Protons, presumably from the Hydrogen, were now striking the target because they had enough energy to penetrate his metal disk. Very interesting.
He tried some different things before switching to what is now called a "Wilson cloud chamber". If you have air with a lot of water vapor in it then lots of things will cause the water vapor to condense into small droplets that are visible using just your naked eye. By carefully tweaking the apparatus you can see the path of ionized particles. If you then add a magnetic field the paths of the ionized particles will bend just like they do in a mass spectrometer. Because you can see the paths of ionized particles you can take the same kinds of measurements. This is a classic example of a better apparatus leading directly to better science.
A careful analysis of an Alpha particle striking the nucleus of a Nitrogen atom led to a determination that the Nitrogen nucleus could sometimes absorb the Alpha particle. It immediately threw off a Proton and transmuted into Oxygen. The Proton's path could be easily seen because a bare Proton carries an electric charge.
This was the first example of a man-made process that could transmute one element into another. Alchemists had hoped to transmute "base metal", by which they meant lead, into gold. This can now be done. But the process is fantastically expensive. You are far better off just buying gold in the first place.
Viewing a cloud chamber with the "mark one eyeball" as it made the paths of charged particles visible was quickly replaced by taking photographs. Photographs could capture more detail and resulted in a permanent record that could be reviewed by others.
Asimov notes that the scientist who nailed down the Nitrogen to Oxygen transmutation had to take and examine 20,000 photographs to find 8 in which the event he was interested in occurred. By the time Asimov's book was published scientists were employing rafts of graduate students to examine hundreds of thousands of photographs looking for interesting events.
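It is worth pausing on how bad those odds are. With 8 events in 20,000 photographs, a little probability arithmetic (sketched below) says you need to examine about 7,500 photographs just to have a 95% chance of catching a single one.

import math

# 8 events in 20,000 photographs: how many must be examined for a 95%
# chance of seeing at least one? P(at least one in n) = 1 - (1 - p)**n.
p = 8 / 20_000
target = 0.95

n = math.ceil(math.log(1 - target) / math.log(1 - p))
print(n)  # about 7,500 photographs for a 95% shot at one event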
But the rate at which scientific instruments could churn out photographs, all of which had to be examined for events of potential interest, kept increasing. It soon reached a practical limit. Fortunately, at about the time the practical limit was reached solid state devices came along that were capable of replacing the cloud chamber.
Detectors capable of collecting the same kind of data (particle path, speed, mass, etc.) that had been extracted from cloud chamber photographs now exist. And they work pretty well for charged particles. But there is only a very limited capability to observe and measure the attributes of uncharged particles. In some cases it is possible to detect the presence of an uncharged particle. It is also sometimes possible to measure the energy it carries. But that's about it. Still, that's better than nothing. There is no doubt that the business of detecting and tracking uncharged particles doesn't work nearly as well as scientists would like.
But it is now possible to hook detectors up to computers and have them look for and measure events. That gets grad students out of the business of going blind by looking at zillions of photographs. It might sound like that puts them out of work, but don't worry. They still have lots to do.
Even after computers do a lot of preliminary work it is still necessary for a trained person to look at the result. The detectors at CERN, the home of the LHC, the largest particle accelerator in the world, can generate the equivalent of those 20,000 photographs in a small fraction of a second. Even with all the computer filtering the LHC can turn out hundreds of potentially interesting events per day. That's why the staff of each detector runs into the thousands.
We have now reached the point where we have an atom with a nucleus of Protons and, as far as we know at this point, some Electrons. The nucleus is surrounded by Electrons in some mysterious configuration. This is just the beginning of the story. But that is where I must leave it in this installment. To be continued . . .
Saturday, March 9, 2019
From Ito to Ellis
Like a lot of my posts, this one builds on previous work. In my last post I suggested you NOT go back and read my previous work on the subject. This time around I can strongly recommend you do the opposite, that you reread my previous post. And the reason is simple. One of the two people I am featuring in this post is someone most people have forgotten about. His name is Lance Ito. Who? My point exactly.
Mr. Ito was the presiding judge in "the trial of the century". The century in question is not our current one but the one that immediately preceded it, the twentieth century. Specifically, the trial ran from November of 1994 to June of 1995. The trial was a murder case that was handled by the State of California and the defendant was one Orenthal James "OJ" Simpson. He was acquitted even though most observers, including myself, thought he was guilty.
The trial was "the trial of the century" in one sense. It was covered more extensively and more intensively than any other trial from any century. The cable news landscape was quite different at the time. CNN had been founded in 1980 but both MSNBC and Fox News only date back to 1996. There were a couple of "business news" channels around (CNBC, founded in 1989, and Bloomberg Television, founded just in time at the beginning of 1994). But the business news channels covered business, and the OJ trial had no "business" hook. And CNN considered itself a "serious news" channel at the time. They devoted a lot of coverage to the OJ trial but did not go "wall to wall".
But the trial, and the events leading up to it, were covered intensively, not only in southern California, where all the events happened, but nationally. All of the network affiliated TV stations in my market (Seattle) broke in to cover the infamous "Bronco chase" (see my previous post for details). This left viewers no alternative, so you could literally flip from channel to channel to channel and see roughly the same feed on all of them.
Okay. That should give you enough information to convince you that you should definitely read my previous post. So here's the link: http://sigma5.blogspot.com/2014/06/the-oj-trial.html. These events happened long enough ago that even people who were paying attention at the time have forgotten the details. In my previous post I covered many aspects of the trial. Here I want to focus on only one of them.
Judge Ito is still alive, although he retired from the bench a few years ago, in 2015. Going into the trial he had a good reputation. He emerged from the trial with his reputation in tatters. It never recovered. The OJ trial is a prime example of the influence, for good or ill, that a judge can have on judicial proceedings.
And the OJ trial was unlike most trials in one aspect, an aspect that turned out to be critical. The state of California gives judges the option of letting proceedings be televised. Judge Ito okayed television coverage. So literally everyone could effectively sit in the courtroom and observe the proceedings.
Courts have been experimenting with TV coverage as long as TV has been around. A couple of early experiments led to a circus atmosphere. For one thing, the lights had to be extremely bright for the TV cameras of the time to work. For another, lots of people played to the cameras rather than observing standard courtroom decorum. That led to most courts banning TV cameras most of the time.
But by the time of the OJ trial those sorts of problems had been ironed out. TV cameras had gotten a lot better so lighting did not need to be especially bright for everything to work. And everybody had gotten used to the process. Judges had developed effective techniques for keeping everyone in line. So the presence of TV cameras did not have a direct effect on the proceedings.
What did have an effect was that the audience could see and judge the behavior and effectiveness of the various players. And they did. And the score card had a profound effect on several of the key players. As I reported in my previous post, the reputation of F. Lee Bailey, up to this point considered a superstar lawyer, plummeted. He just didn't seem to have a good grasp of what an effective defense strategy would be. The person who did was Johnnie Cochran. He went from being someone that no one had heard of to having the kind of superstar reputation that Bailey lost.
Bailey lost the most but Judge Ito was also a big loser. Again, as I laid out in my previous post, Ito received a failing grade from most observers for the way he managed his courtroom. Rather than rehash my previous post let me make two observations. The key prosecution witness was a cop named Mark Fuhrman. It turns out Ito's wife had been his supervisor at one point and Fuhrman was on record as having said derogatory things about her.
The other observation I want to make is that Ito's behavior had a substantial negative effect on the prosecution's case and a substantial positive effect on the defense's case. This is probably a good thing in many cases, as the prosecution has substantial resources and most defendants have few or none. The prosecution should be forced to make their case. On the other hand, if the prosecution has a solid case they should be allowed to put it on. Ito really didn't let the prosecution do this.
And that's not the worst of it. Fuhrman entered Simpson's property, even though it was secured, by citing a combination of "probable cause" and "exigent circumstances". It is likely that neither existed. If the evidence obtained this way had been thrown out then it is not clear that the prosecution had a case. In my opinion, ruling that the evidence, and all evidence that followed from it, was inadmissible would have resulted in the prosecution losing fairly.
Finally, Mr. Simpson was not a normal defendant. He was well known, had many powerful friends, and had substantial financial resources. In these cases there is no justification for the judge being anything but even handed.
Now let me move on to the second person named in my subject line, Mr. Ellis. He is Federal Judge Thomas Selby Ellis III. He is a "Senior" judge, which means he is semi-retired. This is appropriate because he is close to eighty years in age and has been a Federal judge for more than thirty years. What has brought Judge Ellis to my attention is that he was the presiding judge on one of two cases recently brought against Paul Manafort. It turns out that there are a surprising number of similarities between the Manafort and the OJ cases.
In both cases the Judge was much harder on the prosecution than on the defense. In both cases the defendant had the wealth and power to mount a substantial, well resourced, defense. And in both cases the prosecution was burdened with putting on a complicated case while the defense had a much easier time of it.
The venue in which Judge Ellis serves is notorious as being the home of the "rocket docket". Judges try hard to move cases through quickly. They try hard to get both sides to pare their cases down to a few key items in dispute and to provide a minimum of support for their view of the issue.
Again, in the abstract, this is a good thing. It is unjust to force defendants to wait long periods of time, often in jail, simply waiting for their case to come up. If cases are short then more cases can be heard in a year and the backlog can stay short and cases spend minimal time in scheduling limbo.
But again it is incumbent on the Judge to let the prosecution prove their case if they can. In the OJ case a serious argument can be made that the prosecution did not make their case. This is definitely not true in the Manafort case. The prosecution presented substantial and compelling evidence to support each and every aspect of their case. And, unlike in the OJ case, the Manafort team provided little or no reason to call any of the prosecution's case into question.
In the OJ case, Judge Ito was a fan of Johnnie Cochran, the eventual lead defense attorney. In the Manafort case, the Judge regularly expressed the opinion that the crimes Manafort was being charged with were "chicken feed" (not his exact words, but equivalent to what he did say). He also opined that the only reason Manafort was in front of him was because he was a stepping stone to bigger fish. And not all of this happened out of the hearing of jurors.
And a case can be made that the Judge was right. It has long been the case that white collar crimes, anything illegal perpetrated by men in suits and not using violence, tend to result in light sentences. It is probably true that the Judge could find earlier cases prosecuted in his district that had resulted in sentences being handed down that were roughly in line with the sentence Manafort received.
But that's the problem. Lots of people have received far harsher sentences than Manafort's for crimes most of us would characterize as far less serious. There are lots of people serving hard time in the Federal Prison system for non-violent drug crimes. But that just makes it worse.
And the decades long effort to make sure that sentencing is uniform and appropriate is a response to this. That's why the federal sentencing guidelines exist. They can get it wrong. But this only comes about if there is something in a particular case that is not appropriately handled by the guidelines. There was nothing like this in Manafort's case.
The guidelines start with the "type" of each crime the defendant has been convicted of and assign a score. The scores are added up to produce a preliminary sentencing range. Then adjustments are made based on "mitigating" and "aggravating" circumstances. This is an entirely mechanical process of so much for this and so much for that.
If a defendant does this (e.g. enters into and executes a "cooperation agreement" with the authorities, demonstrates an understanding of his crime and shows true remorse, etc.) then, based on the type of mitigation, the sentencing range is adjusted down. If a defendant does that (e.g. is a repeat offender, attempts to tamper with a witness, etc.) then a similar process is used to adjust the sentencing range up.
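To see just how mechanical this is, here is a toy sketch. Every number and category below is invented for illustration; the real federal guidelines use a detailed offense-level table and a sentencing grid, but the add-this-subtract-that flavor is the same.

# A toy, invented sketch of guideline-style sentencing arithmetic.
BASE_SCORES = {"bank_fraud": 7, "tax_fraud": 6, "witness_tampering": 14}

ADJUSTMENTS = {
    "cooperation_agreement": -3,         # mitigating
    "acceptance_of_responsibility": -2,  # mitigating
    "repeat_offender": +2,               # aggravating
    "obstruction": +2,                   # aggravating
}

def offense_level(crimes, factors):
    level = sum(BASE_SCORES[c] for c in crimes)
    level += sum(ADJUSTMENTS[f] for f in factors)
    return level

# Several convictions, an aggravating factor, and no mitigation add up fast:
print(offense_level(["bank_fraud", "tax_fraud"], ["obstruction"]))  # 15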
This process was done in the Manafort case. In short, there were several aggravating factors and no mitigating factors. The defense team did not challenge any of the findings that went into the sentencing recommendation. The Judge had spent the entire trial haranguing the prosecution to speed things up and not take any detours. So the prosecution kept it short and said "we agree with the sentencing guidelines as is" rather than spending a lot of time on the subject.
So what did the Judge do? He in effect threw the sentencing guidelines out the window and, on his own, issued a sentence that was roughly 20% as long as the guidelines called for. Judges are given wide discretion to reduce sentences but they are expected to provide justification. Technically, the Judge did provide a justification. But the justification was wholly inadequate.
The first thing he did was ignore or grossly mischaracterize facts entered into the record as the case proceeded. He characterized Manafort as having led a "blameless" life. The trial record says differently. Evidence was introduced of Manafort engaging in various criminal activities over at least a decade. These crimes were perpetrated solely to increase the wealth and power of Mr. Manafort. In short, they were the kinds of things a Mafia kingpin would do.
He made a lot of money promoting the activities of various thugs and criminals who spent a lot of time and effort in opposing the interests of the United States. So Manafort was manifestly anti-American. These people also spent a lot of time undermining and subverting the institutions on which civilization depends, things like the very court system that Judge Ellis is a key part of. This is hardly the behavior of a blameless man.
And then there is all the lawlessness he engaged in after he was convicted of eight crimes and pled guilty to a slew of others. He then chose to enter into a cooperation agreement with the authorities and violate it repeatedly.
On the other hand, the Judge assigned heavy weight to a number of routine letters of support. Anyone as rich, powerful, and well connected as Manafort would have no trouble wrangling such letters. I'm sure Mafiosi could too. And they would be similarly effusive. And similarly meaningless.
The only part of this whole sorry mess that holds up is the Judge's contention that white collar crimes usually result in light sentences. This, unfortunately, is true. Horrible damage was done to the economy and the lives of many thousands of people by the wretched excesses of Wall Street that led to the crash of '08. Nobody, with the possible exception of a single low level flunky, went to jail. There were almost no prosecutions.
And that means that the people who perpetrated that great disaster, and other lesser ones, have little reason to change their behavior. And they get paid outrageous sums to keep doing the same thing. We should not be surprised that rich and powerful people often engage in bad behavior. They have every reason to do so and, thanks to people like Judge Ellis, there is little likelihood that they will pay a heavy price for their bad behavior even if they are caught and convicted.
The reforms that led to the sentencing guidelines that the Judge ignored were one feeble attempt to put things right. And the Manafort case is the poster child for why prosecutors are reluctant to bring these kinds of cases. They are hard to develop. They take a lot of hours of work by skilled people to put together. They require the prosecution to place a complicated case before jurors, keep them from getting confused, and prove all the elements.
That is very hard to do. It is particularly hard if a Judge Ito is permitting the defense to throw in interruption after interruption. Or if a Judge Ellis is disparaging the fact that you even brought the case in the first place while simultaneously saying "move things along" and "stick only to the essentials".
And you have the results in these cases. OJ gets off completely. Manafort gets a ridiculously light sentence. And the Manafort case in particular was a slam dunk. The prosecution had extensive documentation (which they were repeatedly told to keep to a minimum) and compelling witnesses like "salt of the earth" employees of small businesses that Manafort did business with. The OJ case could have been a slam dunk if the LAPD had done high quality police work. But they didn't. Even so, that case was still pretty compelling.
In many cases white collar crimes are much messier. Multiple bad actors can be blamed. This is definitely true in the crash of '08. It wasn't caused by a single individual but by a whole corrupt system. But cases normally need to be brought individual by individual. The Manafort case involved substantial, voluminous, and substantially complete documentation. (Manafort's number two, Rick Gates, flipped and was able to provide invaluable assistance). In white collar crimes the record is often far from complete.
But white collar crimes are often more damaging to society than other types of crimes that are routinely dealt with far more harshly. And the Manafort case brought the dual nature of our justice system into sharp focus. There is one system of justice for Manafort and others with wealth, power, and a network of well connected friends. Then there is an entirely different system of justice for the poor, marginalized, and powerless.
Those people do not have the resources to mount the kind of defense Manafort did. His defense was incredibly weak. They did not seriously challenge a single aspect of the government's case. They certainly had the resources to locate and exploit any weaknesses. The only thing I can conclude is that there were no weaknesses in the prosecution's case.
But in the end what they did do was effective. They said Manafort was a nice, well educated, and successful man who had not been caught before and who knew a bunch of people who would attest to the fact that he was a family man and the kind of guy they liked to be associated with. That turned out to be enough to get 80% of Manafort's sentence to go away.
Ultimately the OJ case accelerated the change in the "news" away from news and toward sensationalism. It had no impact on how the LAPD did business or how courts, either at the state level or anywhere else, operated. OJ is out of jail, finally, but he is now an old man.
It remains to be seen what impact this particular case will have. I think it will not have much. Manafort was convicted of crimes in two jurisdictions. The sentencing phase has not yet taken place in the second jurisdiction. More importantly, "Manafort" is only a small star in the much larger galaxy of scandals that is the Trump Administration. If substantial change is going to occur, it will most likely be a consequence of the gravitational pull of the black hole that is at the center of this large assemblage.
Finally, both cases make even more clear that "justice" is in the eye of the beholder. There are lots of people that feel that the OJ case was decided correctly. There are lots of people that feel that the Manafort case was decided correctly. It no longer matters much what the facts in either case are. All too often today beliefs are held not because of the facts but rather in spite of the facts. That, more than anything else, needs to change.