Saturday, May 30, 2015

The Good Life


We all know that to live a good life, one must not steal, rape, murder, molest, abuse, etc.

But that's not enough.

Living a good life also means helping the less fortunate.

In 2015, it's just impossible to claim ignorance. We know far too much about far too many who are far less fortunate than us.

And that's why you should support me in my quest to raise a total of $10,000 in the Calgary Run for Water. On June 13, at the age of 45 and with wide-open aortic regurgitation*, I will attempt to run 10 km and raise $1,000 for each kilometre. The proceeds go towards helping the people of Dorze Bele in Ethiopia obtain access to a real game changer: clean water. Imagine if 1 in 5 of our children failed to reach the age of 12 because of water-borne disease. That's the state of affairs in Dorze Bele, and we can change it.

I already donated $1,000 when I registered. Join me in trying to live a better life. Click here and give generously. Help me to raise the remaining $9,000.

Two years ago, with your help, I raised over $7,000 in this event. I know we can all do better this year.

We can live better lives by helping others to live better lives.



*My cardiologist has given me the green light to do this. Aerobic exercise seems safe in my condition, maybe even beneficial.

Wednesday, May 13, 2015

On Disagreement. Part 3


So far in this series, I’ve considered two straightforward instances of disagreement and argued that in each instance, the rational thing to do because of the disagreement is suspend belief (see here and here). Today, I’d like to summarize what I think are the circumstances where disagreement requires suspension of belief.

Quite simply, one should suspend belief whenever, as far as one can know (from an epistemic perspective), the probability that the belief is true is roughly equal to the probability that it is false.

Not all disagreement presents such a situation. For example, Dr. Rik Willems is an expert in the treatment of slow heart rhythm disorders with cardiac pacemakers. If a first-year medical student on her first clinical cardiology rotation thinks that a patient should have a pacemaker implanted, and Dr. Willems disagrees, the probability that Dr. Willems is right is considerably greater than the probability that the medical student is. After all, medical students are supposed to get their plans for patients vetted by attending physicians, not the other way around!

Dr. Willems and the medical student are not epistemic peers. That is, they are not in equally good positions to make judgments about pacemaker therapy. This is not to say that because Dr. Willems is in a superior position to make such judgments, his opinion must be right. The rational thing for him and the student to do is explain to each other the reasons for their opinions. Maybe Dr. Willems has contracted viral encephalitis and evidence of his cognitive dysfunction will be disclosed in the conversation. More likely, however, the medical student has missed an important detail of the patient’s situation, or misinterpreted the available evidence addressing pacing in that situation. This conversation comprises a process known as “full disclosure”; it represents the best possible attempt for disagreeing parties to consider and share the reasons for their own belief and the reasons for the opposing belief. In many such instances, the reasons on one side of the disagreement really will be better and the disagreement will be resolved. We can all, medical students included, learn a great deal this way, even though not all disagreements end so educationally and amicably.

The disagreeing clocks left little to no room for consideration of which time reading was more likely to be correct. Electronic quartz clocks these days are all remarkably accurate, so these two machines are “epistemic peers”. Maybe one had suffered a power loss that the other had not. Maybe somebody spilled a Coke into the one on the night table and caused a malfunction. Or maybe steam and humidity from the adjacent shower caused a malfunction in the bathroom clock. Since the clocks can’t speak and arrive at full disclosure, it seems quite clear that the weight that one must put on the reading of each clock is about equal, and so one must suspend belief about what the time actually is.

The disagreement about the dress also leaves little to no room for consideration of which opinion is more likely to be correct. If just two individuals disagreed, they’d have at least a few things to discuss. Is one looking at the monitor from a particular angle, or in a room with a particular reflection that is affecting her perception? Is one color blind? Is one deceiving the other? But since the disagreement occurred on a global scale, all of these possibilities even out among the two disagreeing camps. Upon becoming aware of the scale of the disagreement, one really is left with no good reason to think that one perception is more likely to be correct than the other, and the rational thing to do is suspend belief. Since the weight of one perception is, as far as anyone can tell, equal to the weight of the other, the circumstances are not unlike considering a coin flip, and this is true even when both parties are disagreeing on the very private evidence of perception.

Why can’t the parties agree to disagree? For the simple reason that both parties have, in the genuine opposition of the other, a good reason to believe that their own perception is, as far as either can tell, just as likely to be the wrong one as the right one. Had the opposing belief resided in your own mind – a situation people sometimes find themselves in when they are torn between two equally strong but opposing beliefs – you’d be perfectly agnostic. The fact that the opposing belief resides in another mind is, as far as either can tell, arbitrary, and therefore not sufficient to render one belief more likely to be true.

So there we have it. If epistemic peers disagree after full disclosure, and there remains no good reason independent of the disagreement itself to consider one belief more likely to be correct than the other, the rational thing to do is to suspend belief and try to find other information that will settle the question. If further deciding information is unavailable, either in principle or in practice, then the question will just have to remain open, and cognizers will just have to remain agnostic, at least until new reasons become available.

If you think about that for a moment, you should realize that if you accept it, you're going to have to suspend belief about a whole lot of things. This approach to disagreement leads to a significant amount of skepticism, though not, at least as far as I can see, the kind of sweeping philosophical skepticism that is intellectually crippling. We can still believe, for example, that a computer screen is in front of us, that Kennedy was assassinated in the sixties, that OJ was probably guilty (even if that belief isn't beyond all reasonable doubt) and that the gene is the unit of inheritance. But what should the minimum wage be? What should be done about income inequality, anthropogenic global warming, and ISIS? Is Allah or Jesus God? These kinds of questions would seem to require the humble approach of agnosticism, and further argumentation, experimentation, and evidence. Sometimes, we are forced to act despite being agnostic, but notice that there's nothing wrong with taking a "best guess" when that's all that is available.

In part 4, I’ll apply this reasoning to a case of disagreement in the Cardiology community and explain how it is being addressed. Chime in now with your own disagreement and you just might find me addressing it in part 5, when I will consider some criticisms of approaching disagreement in the logical fashion I have been describing.

Monday, May 11, 2015

On Disagreement. Part 2


Here's a pretty dull picture of a dress with gold and white stripes, right? As you probably know, this past February, a Tumblr user posted this photo and it went viral. Why? Because of disagreement.

If, like me, you saw gold/white stripes, then you were rational to believe that the stripes actually are gold/white. But what are you rational to believe when you realize that a huge population of people disagree with you? While looking at the same picture, they see black and blue stripes. How does your awareness of that disagreement influence the rationality of your original belief?

It turns out that the dress in the photo has been identified, so evidence that will settle the question exists. However, while one is aware of the genuine disagreement and before one is aware of what that definitive evidence shows, we can and should ask the following questions:

- Are those who see gold/white stripes rational in continuing to believe that the stripes are gold/white?

- Are those who see black/blue stripes rational in continuing to believe that the stripes are black/blue?

- Or should both camps suspend belief and conclude that there is something fishy about this situation - something that's preventing either group from rationally forming a belief about the actual colors of the stripes?

It seems to me that just as the disagreeing clocks in my previous post prevent rational belief regarding the actual time, so does the disagreement that captivated the world-wide-web prevent rational belief regarding the actual colors of the stripes on the dress. Until further evidence is available to settle the question, anybody who insists that the dress stripes actually are as they appear to them in the face of that disagreement is just special pleading.

The definition of arrogance is displaying a sense of superiority, self-importance, or entitlement. Without a reason for one group to think that their perception of the dress colors is more likely to be correct than the other, any member of each group who is aware of the genuine disagreement that exists, yet who insists that the colors actually are as they appear, is being arrogant. The humble thing to do here is the epistemically right thing to do, and that is to recognize that one simply can't rationally believe that the dress colors are as they appear. Not, at least, until further evidence settles the issue. The rational thing to do here is to remain agnostic on the question, despite the deliverance of your senses.

Let me explain. Assent to a belief is only rational when it is more likely that the belief is true than false. Since there is no reason to think that one group is more likely than the other to have true beliefs about the dress stripes, the principle of insufficient reason (also known as the principle of indifference) suggests that the probability that either group is correct is no better than 0.5 (after all, both groups could be wrong). Accordingly, the genuine disagreement in this case prevents rational belief. Again, the rational thing to do is to remain agnostic about the dress. One could humbly say that one's best guess is that the colors are as they appear, but one could not rationally claim to believe that the dress colors actually are as they appear.
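The probabilistic point can be made precise with a small Bayesian sketch. This is my own illustration, not part of the post; it assumes both camps are equally reliable perceivers, each with the same chance r of reporting the true colors, and an even prior:

```python
# A minimal sketch of why genuinely symmetric peer disagreement leaves the
# rational credence at 0.5. Assumption (mine): each observer independently
# reports the true colors with the same probability r, and the prior is even.

def posterior_white_gold(r, prior=0.5):
    """Posterior that the stripes are white/gold, after one observer reports
    white/gold and an equally reliable observer reports black/blue."""
    # P(these two reports | white/gold): first correct (r), second wrong (1 - r)
    like_wg = r * (1 - r)
    # P(these two reports | black/blue): first wrong (1 - r), second correct (r)
    like_bb = (1 - r) * r
    return prior * like_wg / (prior * like_wg + (1 - prior) * like_bb)

for r in (0.6, 0.8, 0.99):
    print(r, posterior_white_gold(r))  # always 0.5: the likelihoods cancel
```

However reliable the perceivers are assumed to be, the two likelihoods are identical, so the disagreement washes out and the posterior never moves off 0.5. Only a reason to treat the observers asymmetrically, independent of the disagreement itself, would break the tie.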

What if someone perceiving the colors as white/gold were to think to themselves something like this: "Maybe those people seeing black/blue stripes have something wrong with their visual systems? Maybe they are falling prey to an illusion? Accordingly, I can rationally continue to believe that I'm right and they are wrong." Would this kind of argumentation provide a good reason for rationally maintaining the belief that the dress colors actually are white/gold?

Well, if those seeing black/blue stripes are falling prey to a visual illusion, then those seeing white/gold stripes are rational to continue to hold their belief that the stripes are white/gold, but the disagreement calls that very conditional into question! Assuming that the other group is the one falling prey to a visual illusion is a classic case of begging the question (also known as circular reasoning). To avoid this fallacy, one would have to avoid assuming that the other group is likely to be wrong; that is, one would need reason(s) independent of the disagreement itself to believe that the other group is likely to be wrong.

I've now considered two instances of disagreement: (a) quartz clocks displaying different times, and (b) two large groups of people disagreeing about the colors of a dress in a photo. In both instances, there was no good reason to think that one clock, or one group, was more likely to be correct than the other, and in both instances, assent to belief was irrational, or so I have argued.

Next time, I'll try to summarize what I think are the logical principles involved in considering how disagreement should affect the rationality of one's belief(s). This is the time to chime in if you think that I've made some mistake in my reasoning so far. This is your time to disagree.

By the way, here's a picture of the actual dress in the photo:

And here's a link to a great discussion of the explanation for this disagreement by Canadian experimental psychologist and cognitive scientist, Dr. Steven Pinker.

Sunday, May 3, 2015

On Disagreement. Part 1



Suppose you’ve just completed a series of flights lasting more than 20 hours to an exotic location. You’re exhausted. When you get to your hotel, you close the curtains tight, curl up in a cool, crisply made bed, and finally fall into a delicious sleep. After what seems like an eternity, when you stir again, you crack open one eye and see that the bedside alarm clock reads 07:00 am. Refreshed and remembering that you have a busy day ahead, you pop out of bed, planning your day.

When you reach the bathroom to take a shower though, something strange catches your eye. The clock on the bathroom wall says 09:45 am.

Hmmm.

Up until you walked into the bathroom, it was quite reasonable (i.e., rational) for you to believe that it was 7 am. What’s it now rational for you to believe upon seeing the virtually simultaneous reading of 09:45 am on the second clock?

I don’t give a flying fruit what the actual time is, and I don’t mean for a second to suggest that if the scenario I just posed were to actually happen in real life, one ought to go back to bed and deliberate at length about the question I asked. There’s no doubt that you can just pick up the phone and ask the front desk what time it is, or check your smartphone, which synchronizes automatically over WiFi.

Boring. 

I care about what it’s rational to believe before sorting the problem out. Why? Because many disagreements that we regularly face are not so easily resolved and it is precisely those that are the most interesting and challenging disagreements to handle. For example:  “I thought we should pay down our mortgage, but my sister said it’s better to save for retirement.” “I really think I should marry him but my parents think otherwise.” “My cardiologist thinks I should put off having my valve replacement surgery, but the cardiac surgeon said that the operation is called for now.” I suggest that there might be something for us to learn from simple cases of disagreement that we might - no, we should - apply to the more complicated and important disagreements with which life is brimming.

So please stop and consider for a moment what impact the disagreement between the two clocks has on what one can rationally believe. Remember, you were rational to believe that it was 7 am right up until you saw the second clock. Should you (a) continue to believe that it’s 7 am, concluding that the second clock must be wrong? Should you (b) believe that it’s 9:45 and conclude that the first clock must be wrong? Should you (c) think that it’s probably halfway between the two times (8:22:30)? Should you (d) believe that you have no idea at all what time it is? Should you (e) believe that it’s probably morning?

You probably felt compelled to seek out further information as you contemplated the situation I posed, and that should be a good indication that (a) and (b) are not reasonable. After all, there’s no reason to think that one clock is more likely to be correct than the other. Perhaps it’s reasonable to believe that it’s morning, but notice that had the second clock read 7 pm, you’d be completely lost and you’d have to settle on (d).
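Option (c) can at least be made precise. As a small illustration of my own (not part of the post), if the two clocks are taken to be equally reliable and their errors symmetric, the even-handed point estimate is simply the midpoint of the two readings:

```python
# Midpoint of two equally weighted clock readings. Assumption (mine): both
# clocks are equally reliable, so neither reading deserves extra weight.
from datetime import datetime

t1 = datetime(2015, 5, 3, 7, 0, 0)    # bedside clock: 07:00
t2 = datetime(2015, 5, 3, 9, 45, 0)   # bathroom clock: 09:45
midpoint = t1 + (t2 - t1) / 2
print(midpoint.time())  # 08:22:30, as in option (c)
```

Of course, a point estimate is not a belief: equal weighting tells you where a forced best guess should land, not that you are rational to believe it is actually 8:22:30.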

It seems obvious to me – a fact of rationality itself – that the awareness of the disagreement of the second clock must dramatically reduce the confidence that one rationally had in initially believing that it was 7 am.

In Part 2, we’ll explore some more complicated disagreements, but this is an important time to chime in if you think that my conclusion is mistaken. I’ll repeat it one more time: the instant you become aware of the significant and mutually exclusive disagreement of the second clock, you have a very good reason to drop your belief that it’s 7 am. You suddenly have a very good reason to doubt that you can tell anything reasonable about the time, except that maybe, it’s morning, and that'll just have to do until you gather information that will settle the question. I think that if you agree with me here, you’ll have to admit that disagreement ought to have a much greater impact upon the confidence we have in our beliefs than it seems to have. Join me in the rest of this series on disagreement to see if I’m right, or if you disagree!