In “Freedom under naturalistic dualism” I have argued in detail that consciousness is radically noumenal: it is the most real (perhaps the only real) thing in the Universe, yet totally impossible for others to observe (non-phenomenal). In my view this strongly limits our knowledge of sentience, with important consequences for animal welfare, which I will discuss in this post.

We have direct access to our own stream of consciousness, and given our physical similarity to other humans and the existence of language, we can confidently accept the consciousness of other humans and their reports of their mental states.

Under physicalist epiphenomenalism (which is the standard approach to the mind-matter relation), the mind is superimposed on reality, perfectly synchronized and parallel to it. Understanding why some physical systems make an emergent consciousness appear (the so-called “hard problem” of consciousness), or finding a procedure that quantifies the intensity of the consciousness emerging from a physical system (the so-called “pretty hard problem” of consciousness), is impossible: the most Science can do is build a Laplace's demon that replicates and predicts reality. But even the Laplacian demon is impotent to assess consciousness. In fact, regarding Artificial Intelligence we are already in the position of Laplace's demon: we have the perfectly predictive source code, but we don't know how to use this full scientific knowledge of the system for consciousness assessment.

In my view Integrated Information Theory (IIT) is the best theory of consciousness available, because it recognizes that a theory of consciousness can only be the formalization (ideally, the mathematical axiomatization) of our prior intuitions. Any theory of consciousness can only be tested on a very limited “circle of epistemic trust”: the set of beings so similar to us that we can accept their consciousness as obvious, and that can report to us, so that we can compare predictions with pseudo-observations (that is, trustworthy accounts of experience; I call reports on states of consciousness “pseudo-observations” because the only full observations of consciousness are those of one's own states). Beyond humans, our understanding of other minds decays exponentially. We don't know, and we really cannot know, “What Is It Like to Be a Bat”.

Moral weights depend on the intensity of conscious experience. Surprisingly, moral weight estimates often suggest some degree of conservation of consciousness: when you examine the tables, ten animals with 100-gram brains have roughly the moral weight of one animal with a 1 kg brain. To me this is absurd. The organization of matter into larger and more complex structures is what (likely) creates consciousness. The maximum amount of consciousness you can make with 1.2 kg of biological matter is that of a human brain, by a large margin.

That is, for me it is obvious (remember, “obvious” is the most I can say in a world of noumenal consciousness: no observations are available) that consciousness intensity grows far more than linearly with the number of nodes, connections, and the speed of the underlying neural network: it is strongly super-additive. Any plausible estimate of consciousness intensity should recognize the only intuition we share: that consciousness is related to complexity, and that the economies of scale in consciousness are large.
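As a toy illustration of the difference between the two aggregation rules (the power-law form and the exponent below are illustrative assumptions of mine, not something the moral weight tables commit to):

```python
# Toy comparison: "conservation of consciousness" vs. super-additivity.
# C(m) is a hypothetical consciousness intensity for a brain of mass m (kg).

def linear(m):
    return m  # conservation: intensity simply proportional to brain mass

def superadditive(m, alpha=2.0):
    return m ** alpha  # any alpha > 1 gives super-additivity

for C in (linear, superadditive):
    ten_small = 10 * C(0.1)  # ten animals with 100 g brains
    one_large = C(1.0)       # one animal with a 1 kg brain
    print(f"{C.__name__:>13}: ten small = {ten_small:.2f}, one large = {one_large:.2f}")

# linear:        10 x 0.10 = 1.00 vs 1.00 -- the "absurd" conservation result
# superadditive: 10 x 0.01 = 0.10 vs 1.00 -- the large brain dominates
```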

In my view it is likely that large vertebrates can feel direct physical pain with an intensity commensurate with that of humans, because both have large and complex brains, and pain and pleasure are very simple functions. So I can accept some “saturation” of the super-additivity of sentience regarding pain in large vertebrates. In any case, the deep extension of the moral circle (beyond the large vertebrates) requires a relatively clear measure of brain complexity and some hypotheses about the relation between that measure and the intensity of sentience.
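One way to write down the kind of saturation I have in mind (purely a sketch; the functional form and the parameter values are arbitrary choices for illustration):

```python
import math

def pain_intensity(m, alpha=2.0, m0=0.3, c_max=1.0):
    """Hypothetical pain intensity for a brain of mass m (kg).

    Grows super-additively, roughly like (m/m0)**alpha, while m is small,
    but saturates toward c_max, so all large vertebrate brains end up
    within the same order of magnitude."""
    return c_max * (1.0 - math.exp(-((m / m0) ** alpha)))

for m in (0.01, 0.1, 0.5, 1.3):  # a range of brain masses, small mammal to human
    print(f"{m:5.2f} kg -> {pain_intensity(m):.3f}")
# 0.01 kg -> 0.001, 0.10 kg -> 0.105, 0.50 kg -> 0.938, 1.30 kg -> 1.000
```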

The easy world of “one man, one vote” that ethicists are used to for very similar (human) beings cannot be extended. Before extending the moral circle, we need at least clear and distinct (i.e. quantitative) hypotheses relating brain size and complexity to consciousness.

Unlike John Maynard Keynes, I am all for being as precisely wrong as possible.

Comments

"Under physicalist epiphenomenalism (which is the standard approach to the mind-matter relation)" <- can you give support for this parenthetical claim? It has been some years since I took a philosophy of mind course, but when I did, I got the impression that epiphenomenalism is not the most popular view on the mind-body problem.

What are the alternatives? As long as you accept the autonomy of matter (this is physicalism), there are no degrees of freedom left.

I don't know what the current majority view is, but physicalism is clearly the majority position among scientists, and once you are a physicalist, you are either an epiphenomenalist or an eliminativist. There is probably a majority of self-reported eliminativists, but I take the charitable position of thinking that they don't really understand the issue.

There are forms of physicalism that are not eliminativist (they see consciousness as something real, for example as a kind of information processing) and are not epiphenomenalist (they hold that mental states can affect the physical world). I hold a view like this, and I would guess most non-dualist philosophers of mind do too.

Personally I think that eliminativism (at least in its most extreme forms) and epiphenomenalism are both intuitively implausible. They contradict my firsthand experience that my consciousness exists and has effects on the observable physical world. So I'm unlikely to accept either of them without a strong argument.

Your will has effects on the world, of course, but it is determined by a physical system.

I developed that position in the first reference of this post (Freedom under naturalistic dualism).

The fact that the mind is determined by a physical system does not necessarily entail epiphenomenalism. My best analogy is the difference between the object language and the metalanguage. In mathematics (number theory, Gödel's theorem), the metalanguage can be embedded in the object language (https://en.wikipedia.org/wiki/Metalanguage#Embedded). In this sense, the metalanguage supervenes on (and is determined by) the object language, but it is not an epiphenomenon (and it is not eliminated either).
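A minimal sketch of the embedding idea (my own toy encoding, far cruder than a real Gödel numbering):

```python
# Toy Gödel-style embedding: metalanguage statements *about* formulas of the
# object language become arithmetical statements *inside* the object language.

def godel_number(formula: str) -> int:
    """Encode a formula as one integer (base-256 positional encoding)."""
    n = 0
    for ch in formula:
        n = n * 256 + ord(ch)
    return n

def decode(n: int) -> str:
    """Invert the encoding, recovering the original formula."""
    chars = []
    while n > 0:
        n, r = divmod(n, 256)
        chars.append(chr(r))
    return "".join(reversed(chars))

f = "0=0"
g = godel_number(f)
assert decode(g) == f
# The metalevel claim "the formula '0=0' has exactly three symbols" becomes
# the purely arithmetical claim 256**2 <= g < 256**3: the metalanguage is
# determined by (supervenes on) the object language without being eliminated.
print(g)  # 3161392
```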

The field equations describing the universe as a dynamical system (plus the randomness from the Born rule) leave no room for anything other than an epiphenomenal consciousness. Of course, when you “want” your arm to move, it moves, because the system that creates the “will” and the system that “moves” the arm are intertwined, so you can describe the movement both in purely materialistic terms (as the Laplace demon would do) and in terms of a cascade of “conscious decisions”. But the whole point of epiphenomenalism is that, matter being autonomous, the materialistic description is consistent and sufficient by itself to describe and predict the events.

Executive summary: The author argues that consciousness is a fundamental yet unobservable reality, and that its intensity likely grows more than linearly with the complexity of neural networks, a property referred to as the super-additivity of consciousness. This has implications for understanding sentience in animals and for assessing their moral weight.

Key points:

  1. The author proposes that consciousness is a noumenal reality, the most real thing in the universe, but cannot be observed by others.
  2. Our understanding of consciousness is limited to our own experiences and those reported by other humans, due to physical similarities and communication.
  3. Under physicalist epiphenomenalism, scientific understanding of consciousness is fundamentally limited; even perfect predictive knowledge of a system, as with artificial intelligence, does not allow an assessment of its consciousness.
  4. Integrated Information Theory (IIT) is endorsed as the best theory of consciousness, as it acknowledges that theories of consciousness are formalizations of our intuitions.
  5. The author suggests that intensity of consciousness is related to complexity and is likely super-additive, growing more than linearly with the number of nodes/connections/speed of neural networks.
  6. Moral weights are related to the intensity of conscious experience, and the author advocates careful consideration of sentience in animals, especially large vertebrates, when extending the moral circle.


This comment was auto-generated by the EA Forum Team. Feel free to point out issues with this summary by replying to the comment, and contact us if you have feedback.
