Bad Biography versus Good: James Herriot

I also discuss bad versus good biographies here. This particular review--for the last book on the A-Z List (Part 4)--is about James Herriot.

The review compares Graham Lord's biography of James Herriot to Jim Wight's biography of his father. I highly recommend the second; avoid the first.

Below is the review that I wrote on Amazon about Graham Lord's book. I comment on the review at the end.

* * *

The most positive thing about [Lord's] book is that it shows you what Jim Wight (James Herriot's son) was up against when he wrote his memoir. I highly recommend Jim Wight's memoir for anyone who is interested in learning about James Herriot (Alf Wight).

I think Mr. Lord may have been well-meaning when he wrote James Herriot: The Life of a Country Vet, but the book is really appallingly bad [actually, I now think that Graham Lord was jealously trying to capitalize on a tiny bit of knowledge of Alf Wight/James Herriot].

Mr. Lord has no feel for the WWII period, has done no practical research, seems to have little to no perception of human character and relies almost exclusively on gossip and word-of-mouth. One gets the impression that Mr. Lord decided before writing his book what he was going to find and proceeded to twist or ignore any information to the contrary. He relies on those "witnesses" who will tell him what he wants to hear without taking into consideration the inherent complexity of human beings. Single witnesses do not capture the entire truth--it is a gross error in judgment to think that one person can fully, and accurately, explain another person.

Alf and Jim Wight
The lack of reliable facts results in Mr. Lord relying almost exclusively on guesswork, and the assumptions inherent in that guesswork are almost all negative. For instance, because he, Mr. Lord, couldn't find evidence that Alf Wight's parents were musicians, he concludes that they weren't, and therefore that Alf Wight was lying when he referred to his parents as professional musicians. The point may be debatable, but in the interests of good writing, the assumption is not enough. If Mr. Lord wasn't willing to do the required research to prove the point conclusively one way or the other, he should have left it out. [Jim Wight conclusively shows through artifacts and a paper trail that his grandparents were musicians who played professionally in local venues.]

Mr. Lord strikes one as the kind of man who is continually surprised by the inconsistencies of human nature. He reports with something like glee that Alf once told someone that his father died in 1961, instead of 1960. This becomes evidence for...the mind boggles. I'm not sure Mr. Lord himself has a clue what he is trying to accomplish in this book. Whatever it is, it suffers from an utter lack of scholarship and is therefore deeply insulting both to Alf Wight's memory and to the reader.

* * *

At this later date, I think that Graham Lord was trying to "reveal" how "celebrities" twist "facts" to aggrandize themselves. Except, as Jim Wight shows, this is not how his father operated at all. James Herriot didn't change names, combine stories, and alter chronology to aggrandize himself or, even, remarkably enough, to make his life MORE DRAMATIC. He did it because he was part of a community; protection and self-protection required some degree of obfuscation. If one actually reads his books, rather than simply responding to him as a "celebrity," his natural, sweet self-effacement ("Okay, this is what I remember now about that event") shines through.

And Jim Wight points out how much his father actually didn't alter. Siegfried (Donald Sinclair) was in fact more extreme than even James Herriot paints him. The first season of the classic BBC series All Creatures Great & Small showcases the "real" Siegfried wonderfully. He is portrayed by one of my favorite actors of all time: Robert Hardy, who knew Donald Sinclair and portrayed him accurately. He toned the character down in later seasons due to the real man's grumbles. But everyone who actually knew the family agreed that James Herriot and Robert Hardy were faithful in their depictions.

I highly recommend The Real James Herriot by Jim Wight. A book written by a son about a father may sound too, too maudlin, but in fact, Jim Wight is as level-headed, fair-minded, and fastidious in his writing as his father. The book is the book to go to . . .

After James Herriot's own books, of course.

The A-Z lists will continue in 2020. This time: Children's Picture Books!

New England History: Legend and Reality

The next section in the A-Z List, the 900s, tackles history. Like the 300s, this section is too big for me to adequately narrow down, so I may come back to it again.

After a very quick perusal through the stacks, I decided to focus on New England history although, technically, the first book refers to events in New Jersey, namely, Washington Crossing the Delaware.

And here is the legend, captured by Emanuel Leutze:

The reality is far more interesting. It is well-captured by The Crossing, starring Jeff Bridges as well as Roger Rees and Sebastian Roche. The movie is based on the book The Crossing by Howard Fast, 973.332 (in my local library), a quick read.

One of my favorite aspects of the movie--and the book--is the timing. The painting looks properly heroic. The truth is that the Marblehead watermen had to load and ferry the soldiers across the river on several trips. It took all night and is one of the most heroically deliberate acts of the American Revolution.

The other 900 book is Salem Possessed, which occasionally shows up in the 300s as well as the 900s. Although the book by Paul Boyer and Stephen Nissenbaum is often debated now, it was the first serious work on Salem that didn't fall back on psychology and conspiracy theories to explain what happened.

Boyer and Nissenbaum examine Salem Town and Salem Village as working environments with extensive histories. The accusers and accused become real people with family and business ties, long-running grudges, and strong religious opinions. With their work, Boyer and Nissenbaum updated the myth of "superstitious Puritans with Freudian tendencies" to "very real human beings who nowadays would sue each other in civil court"--far more comprehensible.

The third 900 book is Sarah Vowell's The Wordy Shipmates, about the Puritans of the Massachusetts Bay Colony (as distinct from the Mayflower Pilgrims). I don't necessarily agree with Sarah Vowell's politics, but I love her dry tone (for those not in the know, she is the voice of Violet in The Incredibles) as well as her willingness to see multiple sides of any story. Her own family history is complexly American; consequently, she doesn't pit one group against the other. She wants to understand all her ancestors. I find this so refreshing--and insightful--that I happily recommend any of her books:



Time to Talk About Tolkien (Not the Movie)

For the 800s (Literature), I am re-posting thoughts related to Tom Shippey's book J.R.R. Tolkien: Author of the Century.

The original post can be found on my Tolkien page. I have altered the below post slightly to be more of a review of the book (rather than thoughts on The Lord of the Rings generally).

Haven't seen the movie Tolkien yet--plan to!

* * *

In his excellent book, J.R.R. Tolkien: Author of the Century, Tom Shippey tackles Tolkien's use of evil. He makes the valid and fascinating point that Tolkien treads a line between Boethian and Manichaean views of evil, between a view of evil as coming from inside us (humans create evil through their bad desires and choices) versus coming from outside us (goodness and badness are in eternal and external conflict; every individual has to pick a side).

Shippey points out that Tolkien deliberately treads this line between the two views. Tolkien's approach isn't clumsy; rather, he uses the inherent tension between the two views to reflect on the human condition--because sometimes the badness we face is the result of our choices, and sometimes the badness we face seems to come from something external, bigger than our simple, petty selves.

Shippey also tackles critics' arguments over Tolkien's use of the one ring, pointing out that it is not simplistic magic. He proposes the metaphor of "addiction" to explain how the ring works throughout Tolkien's trilogy.

I agree that addiction is a good metaphor for the ring's effects--with a caveat. I think it comes down to addiction + ambition.

The addiction element is definitely at work: the more the ring is used, the more it enslaves its owner; it engenders "cravings"; the closer it gets to its "source" (think stash), the stronger the cravings and more concentrated the dose. Consequently, the ring has less effect in the Shire than it does in Gondor.

At the same time, the intrinsic character of the user, specifically regarding ambition, is a major factor. Bilbo, after all, uses the ring quite often in The Hobbit, and again in The Lord of the Rings before he gives it away (Meriadoc learns about the ring years before Frodo inherits it; he saw Bilbo use it to avoid encountering the Sackville-Bagginses!).

Bilbo's protection/immunity is that he exhibits zero interest in running anything: a dwarf company, village, town, kingdom . . . He additionally demonstrates a complete readiness to hand over his "estate" to Frodo when the time comes. Only giving up the ring proves difficult.

In other words, Bilbo is the ultimate Libertarian-mind-my-own-business-leave-me-to-my-hobbies kind of guy. Consequently, the ring has little power over him. He is the only person other than Sam (who is quite similar in personality) to give up the ring voluntarily.

Addiction + character explains why Gandalf so resolutely refuses to even touch the ring. And why Galadriel, after a struggle, refuses it so graciously.

Gandalf, troubled: the ring is THAT ring.
The point with Galadriel and Gandalf is that leadership and ambition are not by themselves evil . . .

RATHER . . .

The ring does greater harm to those with worldly ambitions. 

Its corruption is inevitable since, unchecked and unbalanced, the desire to get ahead can become a demand for respect, then for compliance, then for domination, followed by an obsessive need to enforce a particular vision or plan "for the good of others" no matter what the cost.

Some LOTR philosophers argue that a truly perfect being, or at least one entirely without ambition--like Tom Bombadil--could never be corrupted by the ring. Possibly. But most of Middle-earth is filled with beings who behave like, well, people. Whether it be "original sin" or the "natural man," the desire to bonk YOU on the head for the sake of MY PRECIOUS will eventually overcome any fellow feeling. Even Bilbo only gives up the ring at Gandalf's insistence. (Bilbo's lack of lasting corruption is where free will comes in--as well as Gandalf's refusal to take the ring by force.)

The ring's insidious nature is necessary to Tolkien's plot. The entire trilogy rests on the belief that the ring is bad and MUST be destroyed. If the ring can corrupt such good and noble beings as Gandalf and Galadriel, the author must be telling the truth.

Back to Shippey's book: it is the kind of part-biography-part-literary-analysis that I prefer to straight biography. If I pick up a biography and it immediately starts listing the dates of the subject's parents' births and marriages--YAWN (I can look up that stuff on Wikipedia, thank you very much).

Shippey's Tolkien is similar to The Narnian by Alan Jacobs--also a very good book--an analysis of writing and personal philosophy.

In addition, check out A Hobbit, A Wardrobe, and a Great War by Joseph Loconte. I'm not usually a fan of "every author's experience shows up directly in that author's writing" but the truth is, experience does shape people, even authors. The book is worth a read.

The Counterintuitive Yet Elegant Philosophy of Non-Action

For the A-Z List, I am reposting my review of Moneyball for the 700s. The 700s cover sports and hobbies. I've also read The Blind Side by Michael Lewis, but I consider Moneyball more interesting and substantive.

* * *

Recently, I read Michael Lewis's Moneyball. It is a remarkable book, especially considering that I know little about statistics and less about baseball--or little about baseball and less about statistics: take your pick.

But the book is well-written and gripping. In many ways it reminds me of Seabiscuit by Laura Hillenbrand which kept me on the edge of my seat even though, by the time I'd read it, I'd seen the movie several times. It takes a good writer to surprise you with a known outcome.

But the part of the book that strikes me most is not the ending. About halfway through the book, Lewis discusses Bill James, hits, and runs. He basically goes through the math of showing the worth of a player who is willing to be walked versus a player who flails away at anything (which is pretty much the way I play the game; shoot, I start waving the bat AFTER the ball crosses the plate).

He goes on to examine the Oakland A's unique willingness to "hire" (buy?) players willing to let pitches go by, even to take called strikes, in order to get on base. David Justice--whose age had lowered his apparent worth (yeah, men in sports over 30 slow down)--still had the supreme gift of being patient at the plate. The A's, through Paul DePodesta, had discovered that "an extraordinary ability to get on base was more likely to stay with a player to the end of his career than, say, an extraordinary ability to hit home runs."
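To make the arithmetic concrete, here is a minimal sketch of my own (not from Lewis's book) using the standard formulas--batting average = H / AB; on-base percentage = (H + BB) / (AB + BB), ignoring hit-by-pitches and sacrifice flies--with invented stat lines. The free swinger looks better by batting average; the patient hitter makes far fewer outs per plate appearance:

def batting_average(hits, at_bats):
    return hits / at_bats

def on_base_pct(hits, at_bats, walks):
    # Simplified OBP: ignores hit-by-pitch and sacrifice flies.
    return (hits + walks) / (at_bats + walks)

# Invented stat lines, purely for illustration.
free_swinger = {"hits": 150, "at_bats": 550, "walks": 15}
patient_hitter = {"hits": 135, "at_bats": 500, "walks": 80}

for name, s in [("free swinger", free_swinger), ("patient hitter", patient_hitter)]:
    print(name,
          "BA: %.3f" % batting_average(s["hits"], s["at_bats"]),
          "OBP: %.3f" % on_base_pct(s["hits"], s["at_bats"], s["walks"]))

# free swinger   BA: 0.273   OBP: 0.292  <- looks better by batting average
# patient hitter BA: 0.270   OBP: 0.371  <- but reaches base far more often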
The Two Hattebergs

Then came the part of the book that made my hair stand on end, the chapter about Hatteberg:
Hatteberg's was a more subtle, less visible strength. He was unafraid of striking out and this absence of fear showed itself in how often he hit with two strikes...The A's hitting coaches had to drill into hitters' heads the idea that there was nothing especially bad about striking out...It angered [Hatteberg] far less to take a called strike than to swing at a pitch he couldn't do much with, and hit some lazy fly or weak grounder.
Lewis discusses how counter-intuitive this business of striking out/allowing for strikes is in a culture where hitting, hitting anything, has become the end-all-be-all of a player's life.

It isn't just counter-intuitive in baseball--it is counter-intuitive in life. At least American life.

I read what Lewis wrote about the worth of not reacting precipitously and thought, "But that's just about every single argument I've tried to make with institutionalized groups of people throughout my life."

And I've lost--because our culture says that to do something, ANYTHING, is better than not doing something. Administrations/heads/bosses/supervisors/political acquaintances feel a constant compulsion to rejigger things, overview stuff, review whatever. Change what happened before. Add extra steps. BIG BIG BIG. MORE MORE MORE.

I knew I was a libertarian before reading Lewis's book. I had no idea it was more than just a distaste for punditry and political excess.

For example, I have always instinctively worked to keep my courses from becoming morasses of "little work"--continual small homework assignments and projects with endless readings and recourse to the textbook. I always figured, if I can't convey basic writing principles in a single semester using a minimum of assignments/texts, I'm not doing my job. The more I've taught, the more I've come to believe this.

Likewise, I have worked for charitable organizations where I got frustrated because instead of doing the minimum required by the organization's rules--and doing that minimum well, i.e. not burning people out--the organization wanted to reinvent the moon: take everyone to Hawaii, create the biggest celebration ever. Any balking on my part was perceived as indifference, a lack of fellow feeling and compassion.

Except the desire to do more ("if less is more, just think how much more, more would be," Frasier argues) doesn't necessarily result in happier, more helped, more fulfilled people.

Constant motion doesn't automatically achieve anything (nope, not even throwing money at the problem: check out the true end of most lottery winners). And Moneyball proves this. Or maybe Bill James did before the Oakland A's put it to the test and Lewis wrote about it. But Lewis says it in a way that instantly clicked in my head:

There's nothing automatically meritorious about doing-something-for-the-sake-of-the-doing. There's nothing to be gained from creating more and more hoops for people to jump through for the sake of showing how hard other people are working. There's nothing worthwhile about inventing or exaggerating supposed needs just so administrators can then fulfill those needs.

There's no merit in appearing caring and charitable, especially if the care and charity is merely meant to show off the personalities and character of the responders.

"We have piped unto you, and ye have not danced; we have mourned unto you, and ye have not lamented," states Matthew 11:17. Seems like an odd sentence to show up in the New Testament but boy, does it say a lot about human nature, especially the political mindset, wherever that mindset appears.

Hearts Over Time: Romance Has Always Been With Us

The next book on the A-Z list, Part 4, is The Amorous Heart by Marilyn Yalom. The call number is 611.12, which means it belongs in the Applied Sciences section.

In truth, the book should probably be in the 300s--psychology, customs, and culture. Or even the 800s--literature. The 600s are supposed to be more about health. The book does discuss the heart as an organ but only very briefly.

On the other hand, Yalom argues persuasively that since the beginning of writing and song and poetry, people have linked love directly to the organ--as in the physical heart, as in the feelings/reactions that people have when they are excited, lustful, passionate, uncertain, etc. etc. etc.

The connection to the heart icon 💕💕💕💕 came about much later, as I discuss below.

Yalom's book makes several fascinating points:

1. Stories about beating, frantic, heavy, loving, jealous, committed hearts are very old, going all the way back to the Egyptians. And the expression of those emotions re: the heart is surprisingly consistent. That is, romance has been with us for a very long time; the ways we think about romance even longer. Probably the first cave-people occupied their evenings by talking about how Bob was felled by a mammoth AND about how Bob's wife was cozying up to Igor now.

2. In Christian medieval Europe, amorous feelings were naturally explored through poetry and songs. The amorous heart had a contender--the heart committed to Yahweh, Christ or, in Islam, to Allah. However, as in the Song of Songs, the terms used to describe a religious passion for deity often borrowed from the amorous tradition (and vice versa: see Cohen's "Hallelujah").

Love, Medieval-Style
Yalom is a trustworthy historian and points out that despite the erotic nature of medieval religious poetry, it would be a mistake to get all Freudian about it: "[L]ove as experienced and expressed by medieval mystics should not be reduced to sublimated sexuality; they loved God in a style that was in keeping with the religious culture of their times."

I agree with Yalom. Still--having read some of the included passages, I can only conclude that people in the past were a lot less hung up on the physical aspects of devotion than people are now. I maintain elsewhere that the true split between the body and the mind happened in the nineteenth century, and the poetry of medieval clerics would back me up.

3. Yalom makes the fascinating point that the heart icon appeared early on in history but that does not mean it represented love. It was a shape in nature and therefore a decorative shape in paintings where it clearly meant nothing more than "hey, I'm a decorative shape" (like paisley).

In other words--attaching the emotion of love to the physical location of the heart took place entirely separately from (and earlier than) attaching the emotion of love to the decorative symbol of 💟.

So if you've ever thought there was no connection because, well, just look at the real thing...hey, you were right! After all, the icon itself has been explained as being anything other than a human organ: it represents buttocks or fruit or flowers, etc. etc. The connection to love, and therefore to the physical heart itself, didn't happen until late in the medieval era.

And the icon didn't truly take off until the nineteenth century. In our current culture, it may be hard to believe that the heart symbol was once merely one of several symbols of love and passion and marriage (human beings tend to reason backwards from their own experience and assume that what is now always has been). The truth is the heart symbol has not always been the prevalent image it is today.

The location of the heart within the body and its connection to the physical reactions of love HAS. So maybe that is why the book is in the 600s--physiology eclipses greeting cards.

Packing for Mars with Mary Roach

For the 500s, I picked up The Quantum Universe by Brian Cox and Jeff Forshaw and Packing for Mars: The Curious Science of Life in the Void by Mary Roach.

Both books have the merit of containing clear-headed, straightforward explanations of complex subjects. There's something to be said for a discussion of quantum mechanics which states, "For those readers who find the maths difficult, our advice is to skip over the equations without worrying too much . . . The maths is included mainly because it allows us to really explain why things are the way they are."

And can I mention how much I like the British "maths"?

I started The Quantum Universe and intend to finish it, but it's rather dense--as in, I have a hard time watching sitcoms AND reading it at the same time.

Besides which, Mary Roach is a bit more my style. Because, yes, people and chimps in space are absolutely enchanting. However, there is no real need to make comparisons. Mary Roach and Brian Cox are two sides of a coin that says, "Science is great! People can understand it! Even you!!"

What Packing for Mars brings home is how much any endeavor, like going into space, involves a plethora of human problems. For one, there's an entire chapter on vomiting.

One of the most touching examples of how human beings are part of the science equation is Roach's reaction to Devon Island, a place scientists and astronauts go to prepare for the moon. She keeps gawking and reports: "Concerned mission planners built gawp time into the minute-by-minute schedules. 'We're allowed two quick looks out the window,' Gene Cernan reminded Harrison Schmitt as they prepared to descend to the moon's surface during Apollo 17."

Roach also makes clear how much early astronauts and their scientists didn't know about what would happen to them once they entered space. Actually, the astronauts were less worried about the weirdness of space (and proved to be correct) than the scientists who worried that a lack of gravity would completely deform the human form--hence the chimps. We are so blithe about our supposed knowledge now, we forget how worried (and dumb) humans can be about the unknown.

And I have to add that Roach's chapter about whether mammals can give birth in space gives credence to the possibility that something about space--whether it be radiation or the inability of a female to contract properly during labor--could prove extremely problematic for long space trips.

I enjoy many of Mary Roach's books. Here's a list:
Grunt
My Planet
Gulp
Bonk
Spook
Stiff

Wild Wonder of Language 2: We Are Not Victims of Lingual Programming

In The Language Instinct, Steven Pinker tackles the idea of linguistic determinism, the idea that language determines how people think. If our culture uses the phrase, "Where no man has gone before," we have no choice but to be sexist! We are only saved from our sexist thoughts when some inventor suggests, "Where no one has gone before."

Pinker points out the utter absurdity of this. He starts, naturally, by focusing on basics. Nearly all cultures have words that relate to fundamental colors ("black, white, red, etc."). People in varying cultures tie the words for those fundamental colors to the same external colors (i.e. everybody picks the same crayon for "red"). In other words, the differing words for the colors don't create a differing understanding of color. Human wiring precedes language.

Pinker tackles the myth that the Eskimos have ever so many words for snow. First of all, the myth is false. The Eskimos have as many differing words for frozen falling water as, well, everybody does (snow, sleet, blizzard, etc.). Pinker quotes from an extremely funny essay by Geoffrey Pullum, who points out how utterly non-amazing this apparent cultural glorification of snow would be--if it was even true:
Horsebreeders have various names for breeds, sizes, and ages of horses; botanists have names for leaf shapes; interior decorators have names for shades...printers have many different names for fonts. [Take] the earnest assertion "It is quite obvious that in the culture of the Eskimos...snow is of great enough importance to split up the conceptual sphere that corresponds to one word..." Imagine reading: "It is quite obvious in the culture of printers...fonts are of great enough importance to split up the conceptual sphere that corresponds to one word..." Utterly boring, even if true. 
C.S. Lewis's Boxen images
Language accommodates human need, not the other way around. Our thoughts are in fact more tangled than what language can satisfy.

As proof, Pinker points out that many people trace their process of thought--a process that ends with lingual communication--backwards NOT to language but to an image. Two examples from my own reading: C.S. Lewis reported that he began the Narnia series with an image of a faun in the snow, and Stephen King compares writing novels to uncovering the buried bones of a dinosaur.

Faraday
A stunning number of scientists would agree. The few mentioned in Pinker's book include Michael Faraday, James Clerk Maxwell, Nikola Tesla, Friedrich Kekule, Ernest Lawrence, James Watson & Francis Crick, and Albert Einstein. All of them "saw" solutions as images before writing them down in the available language. Based on my own research, mathematician Alicia Boole Stott apparently also visualized and crafted dimensions before she wrote down her theories and discoveries in the language of mathematics.

A geometer
Pinker doesn't make the comparison to religion (at least not in the particular chapter that I read) but the gap between what one "sees" (through dreams, visions, revelations, what-have-you) and what language is available to the writer (dreamer, visionary, revelator, etc.) explains a great deal of scripture, including the Book of Isaiah. It is far easier to understand such passages if one actually doesn't let the so-called left brain get in the way. Listen to Handel's Messiah first. Then try to reason it out. The part of the brain that understands without attaching a label to everything might actually get there first.

The point is: Our brains are not composed of a bunch of refrigerator word magnets inserted there by our culture. Language is way too symbolic, at its core, to reside in the brain quite so literally.

Since it is the Christmas season . . .

The Wild Wonder of Language, Part 1

My chosen book for the 400s is The Language Instinct by Steven Pinker, and it is so packed with ideas and insights that I will be posting about it in parts.

Pinker's basic argument is that language acquisition--how people learn and use language--is the ultimate pro-nature defense. Rather than being a skill imposed from above through education (nurture), children come equipped with the wiring to not only acquire but improve language.

Children brought up in households where the language is grammatically limited improve the language beyond the ability of their parents. Here are two stunning examples from the book: a child brought up by two parents who spoke pidgin did not in fact speak pidgin but altered the language of his parents to speak creole (which includes more grammatical complexity). Likewise, a severely deaf child brought up by two parents with rough ASL skills nevertheless improved upon and surpassed his parents' limited sign language.

Ah, but those children must have had contact with a peer group! is the nurture argument. However, in the latter case, the child was almost entirely limited to what he received from his parents. Pinker does acknowledge that children need exposure to language in order to be able to effectively utilize it (for the same reason that adults have so much trouble learning a language later in life). The point is: even the mildest exposure results in the child's ability to go further, to make the language what the child needs it to be to communicate a range of needs and ideas.

All the imposed education in the world cannot do what the brain's wiring does naturally and effectively. Likewise, human beings still struggle to create a computer program that can produce language as naturally and effectively as the brains of the humans creating that program (though Google mail sure is trying). As Pinker states, "Higher percentages of grammatical sentences [were discovered during the study] in working-class speech than in middle-class speech. The highest percentage of ungrammatical sentences was found in the proceedings of learned academic conferences."

Fight the Fear: Review of The Monarchy of Fear

The 300s of the Dewey Decimal System are devoted to how humans respond to the world. Some of my favorite books end up in the 300s, so making a selection was difficult.

I chose a new book, The Monarchy of Fear: A Philosopher Looks at Our Political Crisis by Martha Nussbaum because it tackles the problem of "doomsdaying."

"Doomsdaying" is my term  for the tendency of human beings to insist that things are getting worse (see "Talking about Politics: The 6 Reasons It Stinks").

I do not perceive myself as a rebel, yet I become a maverick in just about any group I associate with because I persistently refuse to adopt the premise that evil influences are trying to destroy our children, big businesses are trying to tear us down, the world is falling apart, Armageddon is right around the corner, eventually there will be nothing good left, how can you not see it?! 

When I resist, the people with whom I am trying hard not to argue will pull out their "facts" (which usually consist of stating that X number of people have died or caught diseases in the last 24 hours--I consider these types of facts misleading at best), occasionally religious texts (depending on the group), and almost always the latest news story.

I have never been the kind of person who could pull random information out of the air to refute false claims. So I end up simply shaking my head, leading the arguers to assume that I am oh, so, incredibly naive.

But here's the thing: no matter with whom I am arguing--atheists or fundamentalists, liberals or conservatives--the rhetoric is fear-based. The world twenty, thirty, or a hundred years ago was a paradise and is now horrible.

The group getting blamed varies. I've heard lovers of doomsdaying blame the left, the right, corporations, 1 percenters, the news, Hollywood, Clinton 1, Clinton 2, the Bushes, Trump, religious people, atheists, etc. etc. etc.  The rhetoric is always the same.

Nussbaum provides a Freudian explanation for this tendency to panic--hey, being born into a scary world is difficult--while Hans Rosling of the delightful Factfulness calls on evolutionary psychology. They both argue (Nussbaum more philosophically; Rosling more jovially) that human beings share this tendency: it's bred into our bones, it's part of our internal makeup.

Fear keeps us alive. It also keeps us stupid. Rosling points out that when he gave a test about the world's conditions to Nobel Laureates, they still produced more wrong answers than chimpanzees picking at random would have. And Nussbaum states:
"On both the left and the right, panic doesn't just exaggerate our dangers, it also makes our moment much more dangerous than it would otherwise be, more likely to lead to genuine disasters. It's like a bad marriage, in which fear, suspicion, and blame displace careful thought about what the real problems are and how to resolve them. Instead, those emotions, taking over, become their own problem and prevent constructive work, hope, listening, and cooperation."
Nussbaum's best analysis is when she delves into the problem of envy. She argues that yes, there is a place for pointing out inequalities in our culture/country/world. But pointing out inequalities constructively with the express desire of righting those wrongs is quite different from destructively trying to tear down people who have more money, more privileges, more stuff.

Rosling makes the same point when he castigates the use of fear to bring about change. It's bad manners. It doesn't work. And it often ends up causing more harm than good.

And they are right! I see the use of fear and "you'd better do what we say or you won't make it to paradise" threats--in any venue--as having a boomerang effect. Sure, the fear and threats work temporarily, but the resulting pushback almost always produces more of the thing the fear-mongers fear. The Terror after the beginning of the French Revolution didn't result in reform throughout the European world. Instead, places like Britain pushed back on reform movements in their own countries out of fear that what happened in France would happen to them.

You can see their point.

I am personally pleased by the increase in books trying to lower fear or at least understand it. I recently watched the documentary Won't You Be My Neighbor?, which I'll review at a later date. For now, let's just say, a lot of people are getting tired of meanness and fear.

So maybe eventually, I won't feel like so much of a maverick after all.

Non-Fiction Review: Simply Good News

My decision for what to read in the 200s is a great window into the difficulties of categorizing books.

I considered reviewing N.T. Wright's Paul: A Biography, which I'm currently reading. In WorldCat, some libraries list this book under the 200s (religion), while others list it under the 900s (history), while still others list it under "Biographies" (which makes sense, considering the title).

Since my local library places it in "Biographies," I decided that using it as my 200 book would be cheating. So I got out another N.T. Wright book (230 Wright).

All in all, the experience reminded me of Haddon's The Curious Incident of the Dog in the Night-time, a book that ended up being categorized as both adult and teen--such are the vagaries of determining audience!

N.T. Wright is an Anglican bishop and writer on Christianity. He is a cross between Rodney Stark (history in context) and C.S. Lewis (ecumenical gospel explanations). I quite liked Simply Good News, which points out that Jesus and the apostles perceived the events of the New Testament precisely in those terms: "news" rather than "advice." The news? The Messiah, Jesus Christ, fulfilled ancient covenants in order to be present to the entire world. His object is not (necessarily) to take us to heaven, but to bring heaven to earth. Isn't that amazing?!

This summarizes much of Wright's view.
Sounds fairly simple (hence the title). N.T. Wright argues that most Christian churches spend too much time translating "gospel" into "advice" and also fail to teach the historical context for "Messiah" and "covenant." Therefore, most Christians don't know what the "good news" actually is.

I didn't entirely disagree with him regarding this last claim, but since I knew the context (I was raised by amateur--in the best sense of that word--Bible scholars), I bridled a little at the beginning of the book when he kept telling me that I didn't.

In other words, Wright uses a fairly standard teaching approach of "most people don't know!" to prepare the ground for his presentation. I'm always skeptical of this approach. In grad school, I had a professor who used it, and when I protested that I did, in fact, know the information, he snapped at me, "That's just you!"

Eh . . . no. But then, too, most people's understanding of anything is piecemeal. If the grad school professor had used the "most people don't know" approach about Antarctica (rather than American history), I wouldn't have protested! (I know nothing about Antarctica. Yet.)

Charlie (voice over): Entropy. Parameter of disorder...energy broken down in irretrievable heat. What might appear to be chaos, even decay, is really a system's way of smoothing out differences--its search for equilibrium. Uncorrelated parts interact...find their connections in an evolving system...so, from one perspective, entropy is a clock...charting the irreversible.
I also disagreed with Wright's insistence that Christianity holds different principles from those of the Enlightenment. Granted, proponents of the Enlightenment tend to also see themselves as distinct from proponents of Christianity, but I've never been a fan of the either-or version of history. As people like Rodney Stark and Karen Armstrong point out, the beginning of the Common Era saw a change in perspective within religion, which sent ripples throughout the human experience. Claiming that this change had no impact on later human developments, good and bad, is foolish, both for secular humanists and for religious apologists. And wrong. History doesn't work in compartments.

However, Wright does do a decent job pinpointing the flaw in much secular humanism, basically the self-centered Victorian belief that humans have reached some pinnacle of perfection or degradation ("it was the best of times; it was the worst of times") and the unfortunately still prevalent belief that "theory" can be plastered onto human behavior.

Well, that's like just so much hokum, isn't it? I mean, Maxwell's Demon is a thought experiment, right? Granted, there are theoretical applications, but, um, when the window breaks, the cold air still rushes in. Gears fail, oil leaks. Sooner or later, that engine is gonna break down.
Wright wraps up by discussing the human tendency to not think outside the box, to insist that God is one way, namely an angry, absent, vengeful landlord rather than a loving creator. Instead, Wright argues (echoing Paul), God loves us so much He sent us Jesus Christ and when we love God we'll be happier because we'll be closer to God as He truly is--and our happiness will affect others.

Or to put all this in C.S. Lewis's terms, God is not a tame lion.

The Good News isn't something we can categorize; it simply is.

Overall, I recommend N.T. Wright as a sincere believer who is neither too cloying nor too erudite. I find his works quite refreshing despite continuing to disagree with some of his more dogmatic statements.

A-Z: Dewey Decimal 100: Figuring Out Our Futures

For the fourth A-Z list, I'm tackling non-fiction, Dewey Decimal style.

For the 0-99s, I read Abominable Science by Daniel Loxton and Donald Prothero.

For the 100s, I read Stumbling on Happiness by Daniel Gilbert: 152.42 Gilbert at Portland Public Library.

The title sounds like a self-help book. It isn't. It belongs in the section that tackles philosophy. In this case, Gilbert is discussing the science of how people make decisions, how the brain operates. He doesn't actually get to happiness until almost the end of the book.

His ultimate point: human beings are exceptionally good at imagining the future; since they are also exceptionally bad at imagining the future correctly, they are exceptionally bad at predicting what will make them happy.

Gilbert's point is the reason that science-fiction, even Asimov's, revolves around the world the authors know; thus, 1960s Star Trek has beeping buttons, and 1990s Star Trek has flat computer screens, and so on. (The most truly prescient part of 1960s Star Trek was the communicators, but even those had a kind of current-day cousin in the form of huge walkie-talkies.)

Gilbert details the reasons that humans are so bad at imagining the future by referencing multiple studies as well as studies combined with neurological scans--quite frankly, it's the kind of sociology I can get behind.

It comes down to something that proponents and antagonists of AI often seem to misunderstand. However Spock-like/Sheldon-like human beings pretend to be, emotions are a part of decision making. The reactions we have to events in the real world and to imagined events both involve emotional judgment; until computers can mimic this, they truly can't form judgments at all.

Gilbert doesn't talk about computers; he talks about the brain's remarkable ability to separate an imagined emotional judgment from an emotional judgment related to reality. That is, when we are faced with a real choice--a red light--we hugely favor it over an imagined choice--a green light--because the brain wants us to survive.

However, since an imagined event and a real event both entail actual emotional responses, those actual emotional responses can get massively confused, especially since human beings are remarkably bad at realizing that time will alter an emotional response. Gilbert shows that people often feel far less awful after a traumatic event--losing a job, receiving a terrible medical diagnosis--than they imagined they would.

Humans rely on comparisons to understand the value of things, but those comparisons, by necessity, are all performed in the present. That is, we compare experiences to those we are currently undergoing or have already undergone, not to experiences that may or may not happen in the future--even when we are imagining the future.

Unfortunately, when we imagine future experiences by comparing them to similar past experiences, we encounter the problem not only of confabulation (mixing up memories) but of cropping those past experiences, even to the point of employing a narrative. In one study, men and women recorded their emotions during a period of time. When they were asked to look back and remember how they felt, they remembered their past feelings based on gender expectations (women "remembered" feeling more sensitive; men "remembered" feeling less) even though at the time their feelings were far more gender neutral. The same type of thing happened between Asian Americans and European Americans. The Asian Americans actually recorded more positive emotions but "reported that they had felt less happy and not more." (Ah, European Americans and their endless pursuit of happiness.)

In sum, people are BAD at predicting what will make them happy.

The solution: since people are relatively good at knowing what they feel in the present (if not the future or the past), the best approach is to request impressions ("How do you feel?") from someone currently experiencing what we might want to do in the future. Studies have shown that this is a remarkably effective tool (i.e. people actually do feel better for knowing spoilers).

I used this technique when I contemplated getting a Ph.D. On the one hand (negative), it would entail far more debt, which makes me want to die inside. On the other (positive), it could mean working at a university, which could entail (supposedly) greater job security. On the first hand, I would be competing with people with three Ph.D.s and four M.A.s (how many degrees was I going to have to get in this rat race?). On the other, I would satisfy social pressure (which is quite powerful) by becoming a "real" professor who worked at a "university". On the first hand, I would have to move, which I had no desire to do, and go through another bout of being a student. On the other hand, I would prove that I was ambitious and could accomplish a long-term goal.
Who decides what is relevant?

I could imagine myself doing what I currently loved: working at a community college, teaching English classes.

I could imagine myself teaching at a university like Barbra Streisand in The Mirror Has Two Faces.

I knew enough to know that imagination wasn't going to help me. I could end up totally miserable in either case. So I went online and did research. Truthfully, I knew where I was heading. The negative column obviously increased my tension, and the positive column seemed mostly composed of making other people happy--which my personal philosophy says is a bad reason to do stuff (though Gilbert points out that believing untrue realities--such as "children make us happy"--proves beneficial to society overall).

However, I wanted to be sure, and I continued to waver until I came across well-written comments by a person who had gotten a Ph.D. in the Humanities; he or she (I can no longer remember) clearly and non-aggressively expressed the downsides and upsides of that decision and wrapped up with how they currently felt. It was the final push I needed to make up my mind.

And now I get to teach people who truly need help rather than being tied to a higher-education system that increasingly fills me with dismay as students pay more and more money so that the system can continue to charge them even more.

Gilbert is right: asking advice works!

Of course, Gilbert would point out that once people make decisions, their brains expend extra effort to justify those decisions.

Or, as a children's counselor once told me, "Children and adults do things for the same reasons. Adults are better at rationalizing those reasons."

Ah, well, whatever works.

A-Z List, Part 4

Actually, it would be more accurate to call this the 000-999 List.

Yup, it's time for non-fiction, and I'm using the Dewey Decimal System!

I'm passably familiar with the Library of Congress System. By the time I left college (both times) I could more or less find the sections for topics like "Religion" and "Science." Unfortunately, at this point, I only remember that "P" is "Humanities."

But the Dewey Decimal System and I are old pals. I will be reviewing a book from each set of 100. (See list above).

For 000-099, I chose Abominable Science by Daniel Loxton and Donald Prothero (001.944). The book refutes the existence of famous cryptids, such as Bigfoot and the Loch Ness Monster, while also explaining how myths about these creatures came to be. Loxton and Prothero write separate chapters (Loxton covers Bigfoot and Loch Ness, for example, while Prothero tackles the Yeti and the Congo Dinosaur).

The book is quite readable, Loxton's portions slightly more than Prothero's, and beautifully illustrated.

The theme: It isn't science if you can't test it.

Show me a million so-called samples of Bigfoot's fur (which inevitably turn out to be bear or human). If I have nothing to compare it against, so what?

Wishing doesn't make something so. And the human capacity to wish is tremendous.

Loxton, who writes with a sweet-natured glint in the eye, is more tolerant of the desire to find big monsters (despite the fact that a real plesiosaur like Nessie would have long ago eaten up all the loch's food sources). As a lover of cryptids in childhood, he understands the fascination with the unknown--after all, in the late nineteenth century, Great White Sharks were considered something of a myth! Sometimes, the capacity to wish reveals astonishing truths and inventions.

In comparison, Prothero gets downright testy about the insistence on the existence of certain cryptids. It is possible that he uses a tetchy tone as a deliberate contrast to Loxton's friendlier one. But the difference in tone may also be innate. In the last chapter, the two authors present their differing opinions regarding the pursuit of cryptids, Loxton maintaining that it is harmless and can lead to an interest in the natural world; Prothero arguing that it is harmful to real science.

I mostly agree with Loxton, but I must admit that when I recently saw a non-fiction children's book in the library that presented Bigfoot as a given, I was appalled at the publisher.

Oh, well, part of a free society is learning to distinguish good information from bad.

The most consistent fact that emerges from the creation of these myths is that humans love to hoax! (which is why science requires testing). Any hint of any big thing lurking in the shadows appears to bring out the junior-high boy in a truly astonishing number of people (who sometimes recant, then change their minds when they start earning bucks).

As Loxton writes, "There seems little need to conjure a central conspiracy when good humor, expectation, and simple human error could so easily provide ample fuel to spark a monster myth" (141).

(Though there is a very funny Dharma & Greg episode where Greg tries to convince the alien conspiracy theorists that they have been taken in by a conspiracy to defraud them of money. They shake their heads at the poor, deluded fool.)

Ultimately, although I highly recommend the book, I doubt it is read by many "true" believers (excluding the ones who like to hunt for one tiny mistake in a text, then say, "See! I knew they were wrong!").

However, also ultimately, Loxton and Prothero are right: if it can't be tested and it relies on the argument "Everyone else is lying!" then it's a hobby, not a scientific discovery.

* * *
Loxton, Daniel, and Donald R. Prothero. Abominable Science! Origins of the Yeti, Nessie, and Other Famous Cryptids. New York: Columbia University Press, 2013.