Terminal and Instrumental Beliefs

As you may know from my past posts, I believe that probabilities should not be viewed as uncertainty, but instead as weights on how much you care about different possible universes. This is a very subjective view of reality. In particular, it seems to imply that when other people have beliefs different from mine, there is no sense in which they can be wrong; they simply weight the possible futures differently than I do. I will now try to argue that this is not a necessary conclusion.

First, let’s be clear about what we mean by saying that probabilities are weights on values. Imagine I have an unfair coin which gives heads with probability 90%. I care 9 times as much about the possible futures in which the coin comes up heads as I do about the possible futures in which the coin comes up tails. Notice that this does not mean I want the coin to come up heads. What it means is that I would prefer getting a dollar if the coin comes up heads to getting a dollar if the coin comes up tails.
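To make the weighting concrete, here is a minimal sketch in Python (the bet representation and function names are my own illustration, not anything from the original setup): each bet pays off in some worlds and not others, and its value to me is the payoff in each world scaled by how much I care about that world.

```python
# A minimal sketch of "probabilities as caring weights"; names are illustrative.

weights = {"heads": 0.9, "tails": 0.1}  # I care 9x as much about heads-worlds

def weighted_value(payoffs):
    """Value of a bet: payoff in each world, scaled by how much I care about that world."""
    return sum(weights[world] * payoff for world, payoff in payoffs.items())

dollar_if_heads = {"heads": 1.0, "tails": 0.0}
dollar_if_tails = {"heads": 0.0, "tails": 1.0}

print(weighted_value(dollar_if_heads))  # 0.9
print(weighted_value(dollar_if_tails))  # 0.1
# The first bet is worth more to me, even though I have no preference
# about which way the coin actually lands.
```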

Now, imagine that you are unaware of the fact that it is an unfair coin. By default, you believe that the coin comes up heads with probability 50%. How can we express, in the language of values, the fact that I have a correct belief and you have an incorrect one?

We will take advantage of the language of terminal and instrumental values. A terminal value is something that you try to get because you want it. An instrumental value is something that you try to get because you believe it will help you get something else that you want.

If you believe a statement S, that means that you care more about the worlds in which S is true. If you terminally assign a higher value to worlds in which S is true, we will call this belief a terminal belief. On the other hand, if you believe S because you think that S is logically implied by some other terminal belief, T, we will call your belief in S an instrumental belief. 

Instrumental values can be wrong: you can be factually mistaken about whether the instrumental value actually helps achieve your terminal values. Similarly, an instrumental belief can be wrong if you are factually mistaken about whether it really is implied by your terminal beliefs.

Your belief that the coin will come up heads with probability 50% is an instrumental belief. You have a terminal belief in some form of Occam’s razor, which causes you to believe that coins are likely to behave similarly to how coins have behaved in the past. In this case, that inference was not valid, because you did not take into account the fact that I chose the coin for the purpose of this thought experiment. Your instrumental belief is therefore wrong in this case. If your belief in Occam’s razor is terminal, however, then it is not possible for Occam’s razor to be wrong.
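To see how an instrumental belief can inherit an error from the reasoning that produced it, here is a small sketch (the uniform prior over coin biases is my stand-in for “some form of Occam’s razor”; it is an assumption for illustration, not the post’s formalization). The 50% belief falls straight out of the prior, and it goes wrong precisely because the derivation ignores how the coin was selected.

```python
# Sketch: an instrumental belief derived from a more basic (terminal) one.
# The uniform prior over biases is an illustrative stand-in for Occam's razor.

import numpy as np

biases = np.linspace(0.0, 1.0, 1001)  # candidate heads-probabilities for the coin
prior = np.ones_like(biases)
prior /= prior.sum()                  # symmetric prior: no bias favored

# Derived ("instrumental") belief about the next flip:
instrumental_p_heads = float(np.sum(prior * biases))
print(instrumental_p_heads)           # 0.5

# The derivation silently assumed the coin was not chosen adversarially.
# In the thought experiment it was, so the derived belief is simply wrong:
actual_p_heads = 0.9
print(actual_p_heads - instrumental_p_heads)  # 0.4 -- the size of the error
```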

This is probably a distinction that you are already familiar with. I am talking about the difference between an axiomatic belief and a deduced belief. So why am I viewing it like this? I am trying to strengthen my understanding of the analogy between beliefs and values. To me, they appear to be two different sides of the same coin, and building up this analogy might allow us to translate some intuitions or results from one view into the other view.

Comments

  1. That is a very interesting analogy. Also, if beliefs and values are similar, why do the questions “which is your most important belief?” and “what is your most important value?” return two different answers? (I am not sure that they really do return two different answers; it just seems that they do at first glance.)

    • I do not know what the importance of a belief is, or even what an individual belief is. Your beliefs are a huge probability distribution, and it is not obvious how to decompose it into individual beliefs. (A similar thing is true for values.) I would say that, if anything, important beliefs are strong beliefs: things that you are very confident about. These are the things for which you do not really pay attention to the worlds that do not satisfy them. Similarly, strongly valuing the outcomes in one hypothetical world over another is expressed as not really paying attention to the hypothetical worlds you do not value strongly.

      The important thing to note here is that the types of “values” that manifest themselves as beliefs are values of the form “I care more about this model of the world (or this hypothetical future) than about other models.”
