A conversation between Brian Tomasik and Luke Muehlhauser

Luke and Mr. Tomasik found that they agreed about the following:

  •  Physicalism and functionalism about consciousness.
  •  Specifically, Mr. Tomasik endorses “Type A” physicalism, as described in his
    article “Is There a Hard Problem of Consciousness?” Luke isn’t certain he
    endorses Type A physicalism as defined in that article, but he thinks his
    views are much closer to “Type A” physicalism than to “Type B” physicalism.
  •  Consciousness will likely turn out to be polymorphic, without a sharp dividing
    line between conscious and non-conscious systems, just like (say) the line
    between what does and doesn’t count as “face recognition software.”
  •  Consciousness will likely vary along a great many dimensions, and Luke and
    Mr. Tomasik both suspect they would have different degrees of moral caring
    for different types of conscious systems, depending on how each particular
    system scores along each of these dimensions.


A core disagreement

In Luke’s view, a system needs to have certain features interacting in the right way in order to qualify as having non-zero consciousness and non-zero moral weight (if one assumes consciousness is necessary for moral patienthood).

In Mr. Tomasik’s view, various potential features (e.g. ability to do reinforcement learning or meta-cognition) contribute different amounts to a system’s degree of consciousness, because they increase that system’s fit with the “consciousness” concept, but all things have non-zero fit with the “consciousness” concept.

Luke suggested that this core disagreement stems from the principle described in Mr. Tomasik’s “Flavors of Computation Are Flavors of Consciousness”:

It’s unsurprising that a type-A physicalist should attribute nonzero consciousness to all systems. After all, “consciousness” is a concept — a “cluster in thingspace” — and all points in thingspace are less than infinitely far away from the centroid of the “consciousness” cluster. By a similar argument, we might say that any system displays nonzero similarity to any concept (except maybe for strictly partitioned concepts that map onto the universe’s fundamental ontology, like the difference between matter vs. antimatter). Panpsychism on consciousness is just one particular example of that principle.
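The centroid argument can be made concrete with a toy numeric sketch. Everything here is invented for illustration — the two feature dimensions, the coordinates, and the similarity function 1/(1 + distance) — and comes from neither author; the only point it demonstrates is that any finite distance from a cluster centroid yields a nonzero (if tiny) degree of fit:

```python
import math

def similarity_to_concept(point, centroid):
    """Toy 'fit' measure: similarity decays with Euclidean distance to the
    concept's cluster centroid, but never reaches zero for any finite point
    in thingspace."""
    distance = math.dist(point, centroid)
    return 1.0 / (1.0 + distance)

# Hypothetical feature coordinates, e.g. (reinforcement learning,
# meta-cognition), scored 0..1. These values are made up.
consciousness_centroid = (0.9, 0.8)

human_like = (0.85, 0.75)   # close to the cluster center: high fit
thermostat = (0.05, 0.0)    # far from it, yet its fit is still > 0

assert similarity_to_concept(human_like, consciousness_centroid) > \
       similarity_to_concept(thermostat, consciousness_centroid)
assert similarity_to_concept(thermostat, consciousness_centroid) > 0
```

On this sketch, Luke’s view would correspond to a threshold (or a gating structure of required features) below which moral weight is zero, while Mr. Tomasik’s view takes the raw, everywhere-positive similarity itself as the degree of consciousness.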
