Things I should have doubted: The atomic bomb ended WWII

(A key part of starting from doubt is changing your mind when confronted with new evidence. I find having this happen to be both frustrating and exciting: frustrating that I was wrong, exciting that I’ve learned something new.)

There is a narrative that dropping the atomic bombs on Hiroshima and Nagasaki was the defining action that ended World War II. It seems plausible, and I never questioned it until reading an argument against it. The article “Did the atomic bombings of Hiroshima and Nagasaki really end the war?” (National Post, Aug. 4, 2017) provides a salient overview. I was particularly struck by this paragraph:

The minutes to Japanese war meetings barely mention the bombings.
For his 2011 book Hiroshima Nagasaki, Australian historian Paul Ham pored over the minutes of high-level Japanese meetings and discovered that the country’s ruling military elite had a shocking indifference toward the atomic bombings. On Aug. 9, Japan’s six-member supreme war council was meeting in a bunker under Tokyo when word was first received that Nagasaki had been destroyed. Engrossed in discussions about the Soviet invasion, the assembled men did not seem to care. “A runner comes in and says ‘Sir, we’ve lost Nagasaki, it’s been destroyed by a new ‘special’ bomb’ … and the sort of six Samurai sort of said, ‘thank you, and run along,’ ” Ham told an interviewer in 2011.

Other points provided in that article include:

  • The bombings coincided with one of the largest invasions in history.
  • The United States had been destroying Japanese cities for months.
  • Japanese officials were utterly staggered by war with Russia.
  • Before the bombings, the United States knew that the Japanese were contemplating surrender.

There is a reasonable argument that the narrative was purposely provided to the public in order to justify horrific acts.

The day after I wrote this, the 50th nation ratified the U.N. Treaty on the Prohibition of Nuclear Weapons (TPNW). The stated goal is to eliminate all nuclear weapons. Reaching 50 ratifications means the treaty officially enters into force – but it binds only the nations that have joined it. Of course, the U.S. and the other original nuclear powers oppose the treaty.

There is quite a discussion of this online that is easily found by searching on “Did the atomic bomb end WW2?”

Disrespect for RBG by Another View

Keywords: RBG, afterlife, Judaism, Christian privilege

This is a letter I had published in the Eugene Register-Guard (Sep. 27, 2020, p. 8A). On the 22nd, the Another View editorial cartoon showed Ruth Bader Ginsburg in judge’s robes walking through the pearly gates on a floor of clouds. Interestingly, this was the first time I ever got an email back from the editor, mostly in response to the parenthetical question (which was removed from print with my permission). They pointed out that the current environment for newspapers does not leave time to consider “nuances” (their word) and that they had very quickly grabbed something out of the choices they had. (Unfortunately, this is an indication of the sad state of affairs for newspapers.) To a secular activist, this is not very nuanced. From what I saw, the RG was not alone in “honoring” RBG using Christian iconography. I even saw one article asking whether people should use “RIP” in reference to RBG – probably not.

As submitted:

I assume that the Another View on Sept. 22nd was intended to show respect for RBG and to unite those that mourn her death. It does neither. The gates and clouds are classic iconography for the Christian heaven. Although I couldn’t find anything specifically about her belief in an afterlife, since she was a non-observing Jew, there is reason to doubt. The Old Testament does not discuss an afterlife and one of the biggest schisms in Judaism was, and is, over its existence. Depicting her as a judge in a religious setting disrespects her support of church/state separation. Using Christian iconography disenfranchises the roughly 30% of the U.S. population that are non-Christian. Implying an afterlife alienates the more than 25% of the U.S. population who do not believe in one.

The assumption that everyone believes in an afterlife and uses religious observance as part of mourning is an example of Christian and religious privilege. (Did anyone involved in publishing this even consider these issues, let alone research them?) I find it disingenuous to use the death of such an honorable person to promote Christianity and religion.

How do we manage a mountain of knowledge?

Keywords: tools of doubt, experts, peer review, amount of knowledge

Try to imagine, if you can, how much knowledge there is in the world. Here’s an image I was given to illustrate just how awesomely large that amount is. Imagine collecting a set of textbooks that would give a reasonable overview of all of mathematics (or, if that turns you off, pick a subject more palatable to you). Just selecting one textbook from each reasonably separate area of math (geometry, trigonometry, topology, combinatorics, etc.) would easily fill a football field. One copy of each math textbook ever written would stuff the average town library. And this is just for one subject! Another image is to imagine the largest library you’ve ever been in or seen in a movie, and then realize that it holds only a tiny fraction of all the books ever written. Even the Library of Congress (shown below, with an estimated 167 million books, maps, recordings, etc.) is only a small portion of all the documents ever created. The point is, there is a lot of knowledge. And just because I’m a nerd who likes to throw big numbers around: the first article I came across regarding the total amount of data in the world gave an estimate of 18 zettabytes in 2018 (1 ZB = 1000⁷ bytes = 10²¹ bytes = 1,000,000,000,000,000,000,000 bytes = 1000 exabytes). The article’s prediction for 2025 is 175 zettabytes which, “at the average current internet connection speed, would take 1.8 billion years to download.” Isn’t that awesome?

Library of Congress (image from Wikipedia)
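
Just for fun, here is a quick back-of-the-envelope check of that download estimate, written as a few lines of Python. It is only a sketch: the article doesn’t say what speed it assumed, so the 25 megabits per second “average connection speed” here is my own assumption.

    # Rough check of the "1.8 billion years to download 175 ZB" claim.
    # The 25 Mb/s average connection speed is an assumption, not a figure
    # taken from the article.
    ZETTABYTE = 10**21                  # bytes
    total_bits = 175 * ZETTABYTE * 8    # projected 2025 data, in bits
    speed_bps = 25e6                    # assumed average speed: 25 Mb/s
    seconds = total_bits / speed_bps
    years = seconds / (365.25 * 24 * 3600)
    print(f"{years:.2e} years")         # ~1.77e+09, i.e., about 1.8 billion years

So, at least under that assumed speed, the article’s number checks out.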

(In the face of all this information, how is it that some people think a single ancient text contains the wisdom of the ages?)

The point is that nobody has the time to research everything. Nobody can know everything. So, how do we know what the facts are? Where do we find trustworthy information that we can act on? The answer, of course, is to use experts. Realistically, we don’t need to depend on experts for everything. We are all experts in some things; we couldn’t survive the day without expertise about basic daily actions. But there are so many things to know that we have no choice but to accept expert opinion, even if it is simply something like trusting a mechanic to fix your car. But one of the conundrums of being a doubter is that argument from authority (argumentum ab auctoritate) is not valid. Just because someone says it is true doesn’t mean it is. So, we have to be careful in how and when we choose an expert. How do we know who to trust?

Wikipedia is a good case study for evaluating this question. As useful as Wikipedia is, you have to be cautious; there are risks to crowdsourcing information. If you don’t know, just about anyone who is willing to learn the ropes of contributing to Wikipedia (and has internet access) can do so. So non-experts (even nefarious people) can add to Wikipedia. There are mechanisms to moderate entries, including behind-the-scenes debates that can be reviewed. But there are some really good indications of the quality of Wikipedia entries. Two main indicators are warnings at the top of the page and the number of citations. The moderators actively put comments at the top of pages that seem a little thin or suspicious. Sometimes there are also warnings on the right-hand side, such as “considered pseudoscience.” And, in terms of any claim of expertise, clear citation of research or original works is a must. Interestingly, I recently came across an entry that had both warnings and a ton of references, so the two aren’t mutually exclusive.

These indicators have analogs when evaluating the expertise of individual people or organizations. Are there major criticisms of them? Have they authored peer reviewed papers or are they able to cite published peer reviewed papers to support their theories and opinions? Also, do the people making criticisms, or the publishers involved, have the credentials to be taken seriously?

A person’s education can also be a good indicator. Not every highly educated person has a degree, but degrees and certificates are still an indicator of expertise – assuming the bestowing organization has its own pedigree. Someone with a PhD really is an expert – or, at least, more expert than the average person – in their subject of study. But that expertise can be very narrow. Here’s an illustration I was shown of the knowledge gained while obtaining university degrees. Draw a circle representing all the knowledge in a particular subject (like math). Put the smallest dot you can in the center of the circle. That dot represents the amount of knowledge gained by obtaining a bachelor’s degree (traditionally four years). Now draw the smallest circle you can around that dot. That represents the amount of knowledge gained by obtaining a master’s degree (often two more years). Now draw a line from the center of the circle to the perimeter. This represents the knowledge gained by obtaining a PhD (often three to six more years). Now draw a tiny arc over the end of the line where it intersects the perimeter. This represents what a PhD candidate contributes to the body of knowledge in order to receive the degree. PhD students are very focused.

Experience also plays a major part in developing expertise, so non-academic activities should be considered. After getting a PhD, it is not unusual to go into a different area of study than the one covered by the thesis. And one of the main things a PhD student learns is how to learn. So post-PhD work can expand expertise quite a bit. But, in general, experience is harder to evaluate (since certificates aren’t always given out). Still, the same types of criteria can be used. Did the person work in a single area for a long time (the 10,000-hour hypothesis)? Was their career distinguished? Do other people recognize their expertise?

Unfortunately, it is not unusual for well-educated and experienced people to claim expertise beyond what they truly have. That’s why it’s important to not rely on a single source of expertise unless you have to.

A cautionary tale: I was once at a science-based symposium where a presenter made what seemed to be questionable assertions. So I asked for references and was given the name of a supposed expert. I found the “expert’s” website and, although they were selling books they had written, there were no other cited publications at all (let alone peer reviewed papers). Further, I found evidence that they had, at one time, placed “PhD” after their name but stopped doing so when it was pointed out that they didn’t actually have one. It is unfortunate that an otherwise credible symposium included a not-so-credible presentation.

Another cautionary tale: One of the topics that came up during my yoga teacher training was ayurvedic “science” or “medicine.” This is an ancient practice that bases medicine on the five elements of water, fire, earth, air, and spirit (or variations thereof). Many of the claims made seemed questionable, so I went looking for peer reviewed papers and found a curious situation. There are plenty of magazines dedicated to ayurveda that contain articles on medicine. What is curious is that every one I found was dedicated to ayurveda and covered no other topic. Further, I could find no articles on ayurveda in any established peer reviewed medical publication. Having worked for some large companies, I once came up with what I think of as a group version of the Peter Principle (“people in a hierarchy rise to their level of incompetence” – the problem being that people can be really good at their job but then get promoted beyond their abilities). My generalization is that an organization can grow to the point of having so much internal paperwork that it ceases to provide value to the rest of the company (I’ve never come up with a cute name for this). The ayurveda literature (and community) is self-contained. It feeds itself to the point of having no value outside itself. This closed self-validation (among other things) makes these articles not credible.

A major point here is that you have to check both the credentials of an expert and the credentials of those credentials – potentially with multiple iterations. Fortunately, this chain is usually not very long, since there are well established, credible publications and sources. One of the practices of a doubter is to develop a list of experts (people and organizations) that have a history of credibility.

Florence Howe: “Mother of Women’s Studies”

Keywords: women’s studies, feminist, woman authors, The Feminist Press

(There are many influential people in history who are not generally known. Many of them can arguably be said to have changed history because they doubted something and acted on that doubt. Here is someone I recently came across who fits both categories.)

I sometimes get struck when confronted with “a first”. You know, the first Black president, or the first openly bisexual governor. Intellectually I understand that, in some sense, there is always a first. So I think my reactions are usually more from the fact that so many firsts I hear about are so recent (e.g., the above examples.) Why didn’t these things happen long ago? This extends to fields of study. How can there be whole areas of reality that haven’t been interesting enough to study until recently? And thus I was struck by the very concept that there was a “mother of women’s studies”.

Why is it only in the 1960s or so that the idea of women’s studies was even thought about? The answer, of course, is millennia of patriarchy. As the quote below implies, who would have been interested in such a thing?

“I was teaching women’s studies at Goucher College in Maryland at the time, and there weren’t enough materials,” Ms. Howe told The New York Times in 1972. “The publishers I spoke to all said, ‘Wonderful idea, but there’s no money in it.’”

One of her best-known activities was establishing The Feminist Press, which provided exposure for many women authors. More generally, she was an active and honored academic, an early feminist, and the author of influential essays.

Florence Howe doubted the patriarchal attitude towards women and spawned an entire field of study to counter it. That’s awesome!

See also (or, as always, do your own search):

New York Times article

Wikipedia article

Am I a figment of your imagination?

Keywords: reality, solipsism, modelism, arrogance, rudeness, philosophical foundations

I have been somewhat distressed by how many people I’ve talked to who claim, “We create reality.” This is a very dangerous idea, as well as being, quite frankly, rude.

I wish I could recall what science fiction story I read (so that I could give credit) where a philosophical bartender was discussing Descartes. The main idea was that Descartes was trying to figure out if there is anything that he could be absolutely positive about; is there some single indisputable fact? After thinking about it, Descartes realized that, if nothing else, he was thinking! And thus his famous phrase. Although they might be considered assumptions rather than facts, I argue that the following three statements are necessary to engage in any exploration of reality. So, here is my philosophical foundation – the starting point of moving beyond doubt.

  1. Cogito, ergo sum (I think therefore I am),
  2. Something exists besides self (objective reality),
  3. Sensory perceptions interpret, rather than capture, reality (modelism).

To deny the first statement is to contradict yourself: an “I” has to exist in order to implement the action of denial. Although it is possible to get into discussions about what each of the words (“I”, “think”, “therefore”, “am”) mean, at this philosophical level, to do so is to go down a rabbit hole. Denying this statement leaves no room for further discussion of anything.

The second statement points out the fallacy of “We create reality.” I make a distinction between “subjective” and “objective” reality. We each create our own subjective reality, but no one creates objective reality. Although there are many definitions of objective reality, it is most fundamentally something that exists beyond me (although it includes me as well). This is in contrast to my perceptions – my model, my subjective reality – of both me and not-me (the third statement).

Solipsism is the belief that you are the only thing that exists and that everything else is a figment of your imagination. (There are other, less strict, definitions, but this is the one I’m using.) It is rightly said that there is no valid argument against solipsism; any argument might just be me making it up.  At the deepest level denying solipsism has to be an assumption. My argument against solipsism is that it hurts to try to walk through walls. (Another way of saying this is that objective reality is that which kicks back when you kick it.) If you truly create reality, why hurt yourself? Why limit yourself?

The statement “We create reality” is a contradiction. There can only be one solipsist – only one person that creates reality. The moment you say “we” you have acknowledged that there is something other than yourself. To claim solipsism is also to claim godhood – which is, perhaps, the epitome of arrogance. It is also rude to tell me that I am a figment of your imagination. If you are unwilling to acknowledge my existence – separate from yours – then there is no reason for me to talk to you. Please go have a nice conversation with yourself.

Importantly, claiming solipsism or, in general, denying the existence of a reality outside your self is dangerous. It is dangerous to yourself in that you will hurt yourself walking into walls. But you can also cause great harm by ignoring the reality of how your actions (not your imagination) affect other people.

The third statement mostly follows from the second. If there is something other than myself, I cannot incorporate that other something into myself (else it is no longer other). It is important to codify this phenomenological point because it is a key concept. Exploring reality is done only through the perceptions of the physical senses. What I do with these perceptions is create a model of reality; I do not create reality itself. The essence of doubt is realizing that there is always a disconnect between my subjective model and objective reality – they are never exactly the same. My (and I think many people’s) purpose in exploring reality is to make my subjective model as close to objective reality as I can.

But to have even a modicum of success in making our subjective models close to objective reality, we cannot work alone. To think that you can is, again, perhaps the epitome of arrogance. This is where “consensus reality” comes in. In order to communicate with other people, there has to be a common ground of understanding. This is the group version of subjective reality. Or, better yet, in order to leave the word “reality” to mean “objective reality,” let us call these objects “subjective models” and “group consensus models.” Major forms of group models are cultural and scientific. Cultural models are those that we share with the people we grew up with and with the people around us. These are sometimes called “world views”; I include religious views in this category. There isn’t much question that the world view of a Hindu living in poverty in India is significantly different from that of a rich Christian living in the Western world. Science is an attempt to go beyond cultural models. By prodding objective reality – by testing it in consistent and reproducible ways – science gradually and continuously makes its model closer to reality.

Of course, it is important to remember that any individual’s subjective version of a group consensus model is never identical to anyone else’s. But this level of doubt and self-awareness doesn’t preclude us from acting. It just means we need to be cautious and that we can only, ever, do the best we can based on incomplete models.

Proper Pronunciation and Suicide Prevention

Keywords: tools of doubt, communication, social courtesy, gender neutral pronouns, suicide prevention

Do you get annoyed when someone corrects your pronunciation? If so, why? Correcting your pronunciation is a simple example of starting from doubt. Am I saying things properly? If not, then I can change what I do. I’m amazed at how often I find I’m not pronouncing words properly. (A couple of examples: I used to pronounce “diaspora” with emphasis on the first and third syllables. I used to pronounce both the u and i as separate syllables in “ennui.”) When in doubt, I use https://howjsay.com.

What about people’s names? Have you ever made an effort to pronounce someone’s name properly? The names of two mathematicians, Euler and Ramanujan, are often mispronounced. It seems like a simple courtesy and, in professional settings, it can be in your own self-interest to pronounce names properly. Teaching is a good example of where this can be an issue. I remember one class where, looking through the roster, I saw the name “Dung.” Instead of attempting to pronounce it, I asked the student how to. (The D is pronounced as Y. How does it happen that converting words between writing systems can produce what seems like such a blatant error?) This illustrates a very basic approach to doubt – and social courtesy: When in doubt, ask!

Beyond pronunciation is whether or not you call people by the name of their choice. In grade school it seems common for children to make up nicknames or purposely mispronounce other people’s names. Unfortunately, and discourteously, some adults continue to do so. I grew up being called “Chuck” or “Chuckie.” It never really bothered me, and there is some argument for the family doing so, since I had a grandfather and a cousin who went by “Charlie.” But when I moved away to college I started introducing myself as “Charles.” Perhaps there was some teenage rebellion in there, but primarily I just decided to call myself by what was on my birth certificate. Somewhat interestingly, it appears that the use of “Charles,” rather than the various nickname versions of it, has become much more common than when I was a child. A few decades ago I was sitting in a public hot tub with three or four other people. I don’t remember how it came up, but one of them said they felt that “Charles” was too formal. It was both funny and poignant that three of us in that hot tub went by “Charles.” It is still not uncommon for me to introduce myself as “Charles” and then have the person use a related nickname. I usually point out the discrepancy, mostly because if you refer to me as “Chuck,” other people won’t know who you’re talking about. (Words are about communication.) BdiJ doesn’t always correct people about their name, depending on how transitory the relationship is – do what works for you. It is also not unusual for people (at any age) to want to change their preferred moniker. I had a friend do this in their 40s, yet we had a mutual friend who absolutely refused to use the new name. Childhood family nicknames perhaps present a sticky wicket. Do you insist that your parents or other close relatives change what they call you? Do you, as a close relative, make the effort to do so? My answer is: discuss it, come to an agreement, and don’t get upset or embarrassed if there are relapses.

Before reading further, I ask that you seriously consider the question: Isn’t it simple courtesy (and a communication best practice) to refer to people the way they want to be referred to?

I think this simple courtesy extends to the use of pronouns – regardless of whether or not there are more than two genders. (I’m convinced there are enough scientific results to support gender as a spectrum. I might write another blog post on this, but if you want to research the debate, a starting point is “The Science of Gender and Science: Pinker vs. Spelke, a Debate.” And here’s a discussion of biological sex vs. gender.) In case you haven’t heard, there is a push for the use of gender neutral pronouns (and a smaller movement for an expansion beyond trinary gender pronouns). Of note is that Merriam-Webster’s Word of the Year for 2019 was “they,” as used as a singular pronoun. A consequence of this push is the practice of including your preferred pronouns when you introduce yourself; common examples are she/her, he/him, and they/them.

I have to admit I still struggle with implementing this. When I first came across the use of “they” in the singular, I even wrote something that I’m not likely to publish. My main thought was that there should be a better option; this seemed like too much of an in-your-face usurpation of a common word. I still wish there were a better option, but I’ve come to accept that this is a very good option – maybe even the best option. First, I think it is courteous to acknowledge people’s right to determine what they are called. Second, “they” is an existing word that people know and use. Third, words morph their meaning and usage quite regularly. Fourth, sometimes an in-your-face approach is needed to get people’s attention. So, why not?

Something that helped me get past my gut reaction to the singular “they” was a science fiction story I read in Analog Science Fiction and Fact magazine. The main character used they/them pronouns (not all characters did). The narrative style used a lot of pronouns and their various forms (I have to admit some of the forms feel weird). By the end of the story, the use of “they” as singular felt much more comfortable to me. Reading the story also made me realize that a character’s gender is rarely relevant to a story – even in regard to romantic or sexual relationships. It really is a question of normalizing usage. People can get used to it if they are exposed often enough.

Having presented the courtesy argument, here is BdiJ’s argument for the use of nonbinary pronouns: if their use helps keep a child (or adult) from attempting suicide or from developing lifetime mental health issues, it’s worth it. If you don’t know, suicides are increasing in general, including among teens. Certainly this isn’t the only factor in suicides, and there isn’t a well understood explanation for the recent increases, but every little bit helps.

When in doubt, choose courtesy and compassion.

Intellectual Humility

Keywords: humility, self-reflection, foundations of doubt

Is it possible to be humble and say you are? Is it arrogance if what you say about yourself is true? Without getting into hairsplitting definitions, I’ll say that, from the viewpoint of doubt, it’s better to say you try to be humble (and mean it), and that there is a difference between arrogance, self-promotion, and acknowledging your strong points. Arrogance includes some level of proclaiming yourself superior to others. In contrast, there are times, such as job interviews, when you need to promote yourself. Also, acknowledging that you are better at something can be very useful when doing a project. (I find it mildly interesting to consider people who are arrogant about their weaknesses. I put people who loudly proclaim they are sinners in this category, with flagellants – e.g., people who whip themselves, as do some members of Opus Dei – high on the list.)

I recently came across the concept of “intellectual humility” in the September/October 2020 issue of Skeptical Inquirer. Practicing intellectual humility helps us doubters keep ourselves in check, as well as helping with how we present ourselves – and our evidence – to non-doubters. When you step beyond doubt and claim something is true, it is easy to step into arrogance as well. The article presents some “representative items from several questionnaire measures of intellectual humility.” (A small sketch after the list shows how the “reversed in scoring” items are typically handled.)

  1. I reconsider my opinions when presented with evidence (Leary et al. 2017).
  2. I am willing to hear others out, even if I disagree with them (Krumrei-Mancuso and Rouse 2016).
  3. When I realize that someone knows more than me, I feel frustrated and humiliated (reversed in scoring) (Alfano et al. 2017).
  4. I feel comfortable admitting my intellectual limitations (Haggard et al. 2018).
  5. I try to reflect on my weaknesses to develop my intelligence (Porter and Schumann 2018).
  6. I often get defensive if others do not agree with me (reversed in scoring) (McElroy et al. 2014).
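
In case the “(reversed in scoring)” notes are unfamiliar: on those items, agreeing indicates less humility, so the response is flipped before being added to the total. Here is a minimal Python sketch of that bookkeeping; the 1-to-5 scale and the example answers are my own assumptions, not data from any of the cited scales.

    # Minimal sketch of scoring a humility questionnaire with reverse-keyed items.
    # The 1-5 Likert scale and these example responses are assumptions for
    # illustration only.
    LIKERT_MIN, LIKERT_MAX = 1, 5

    # (item number, response, reverse-scored?) matching the six items above
    responses = [
        (1, 5, False),
        (2, 4, False),
        (3, 2, True),   # item 3 is reverse-scored
        (4, 4, False),
        (5, 3, False),
        (6, 2, True),   # item 6 is reverse-scored
    ]

    def score(response, reverse):
        # Flip reverse-keyed items so a higher score always means more humility.
        return (LIKERT_MIN + LIKERT_MAX - response) if reverse else response

    total = sum(score(r, rev) for _, r, rev in responses)
    print(total)  # 24 out of a possible 30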

I like to think that I do well on all of these (spoken humbly). Number 2 might be my most difficult, mainly because I’ve come to a lot of conclusions based on evaluations of research and arguments. Where do you draw the line between hearing others out and saying, “I’ve heard the arguments before”? This rolls over to number 6, although I probably get more insistent than defensive (which can look like being defensive). I don’t recall thinking about 4 and 5 before reading the article. I find I have difficulty with the use of the word “intelligence” in 5; I want to rephrase it as “I reflect on my weaknesses to increase my knowledge.” For 4, I want to ask whether the limitations are of intelligence or of time and effort. Although I am comfortable admitting I’m not a genius, I tend to think that, given the time and resources, I can understand just about any specific thing (and my PhD in Mathematics gives some evidence of this). And this gets into the definition of “intelligence.” Can you change it? Does intelligence require ability in breadth as well as depth? Does intelligence require instantaneous comprehension of new material?

I generally consider number 1 – changing your mind in the presence of counter evidence – to be the hallmark of starting from doubt. But all of these represent ways in which doubt applies to self as well as to the rest of reality.

References (taken from the article; I haven’t read them):

Alfano, M., K. Iurino, P. Stey, et al. 2017. Development and validation of a multi-dimensional measure of intellectual humility. PLoS ONE 12(8).

Haggard, M., W. Rowatt, J. Leman, et al. 2018. Finding middle ground between intellectual arrogance and intellectual servility: Development and assessment of the limitations-owning intellectual humility scale. Personality and Individual Differences 124: 184–193.

Krumrei-Mancuso, E., and S. Rouse. 2016. The development and validation of the comprehensive intellectual humility scale. Journal of Personality Assessment 98(2): 209–221.

Leary, M., K. Diebels, E. Davisson, et al. 2017. Cognitive and interpersonal features of intellectual humility. Personality and Social Psychology Bulletin 43(6): 793–813.

McElroy, S., K. Rice, and D. Davis. 2014. Intellectual humility: Scale development and theoretical elaborations in the context of religious leadership. Journal of Psychology and Theology 42(1): 19–30.

Porter, T., and K. Schumann. 2018. Intellectual humility and openness to the opposing view. Self and Identity 17(2): 139–162.
