Omnizoid

Comments

I don't think this is right. We could imagine a very simple creature that experiences very little pain but is totally focused on it. It's true that creatures like us normally tend to focus more on more intense pain, but that doesn't mean attention is the relevant benchmark for intensity. My claim is that the causal arrow goes the other way: greater intensity causes more attention, not vice versa.

But if I did, I think this would make me take animal consciousness even more seriously. For simple creatures, pain takes up their whole world.

RP (Rethink Priorities) had some arguments, which I found pretty convincing, against conscious subsystems affecting moral weight very significantly.

Regarding your first point, I don't see why we'd think that degree of attention either correlates with neuron counts or determines the intensity of consciousness.

Interesting! I intended the post largely as a response to someone with views like yours. In short, I think the considerations I provided based on how animals behave are very well explained by the supposition that they're conscious. I also find RP's arguments against neuron counts completely devastating.

Gotcha, makes sense!  And I now see how to manipulate the spreadsheet. 

I tried to do that but ended up a bit confused about which numbers I was using for what (I never properly learned how spreadsheets work). If I agree with you about the badness of excruciating pain but think you underrated disabling pain by ~1 order of magnitude, do the results still show shrimp welfare beating the other interventions?
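To make the question concrete, here's a minimal sketch in Python of the kind of sensitivity check I have in mind. All the weights, hours, and intervention names below are made-up placeholders (I don't have the actual spreadsheet values); the point is just the shape of the calculation:

```python
# Weights: how many times worse than an hour of normal life each pain
# category is (hypothetical numbers, not the real analysis's values).
baseline_weights = {"annoying": 1, "hurtful": 10, "disabling": 50, "excruciating": 10_000}

# Hypothetical hours of each pain category averted per dollar, by intervention.
interventions = {
    "shrimp_welfare": {"annoying": 0.0, "hurtful": 2.0, "disabling": 0.5, "excruciating": 0.01},
    "other_charity":  {"annoying": 5.0, "hurtful": 1.0, "disabling": 0.1, "excruciating": 0.001},
}

def score(hours_averted, weights):
    """Weighted pain-hours averted per dollar."""
    return sum(weights[category] * hours for category, hours in hours_averted.items())

print("Baseline ranking:")
for name, hours in interventions.items():
    print(f"  {name}: {score(hours, baseline_weights):.1f}")

# Bump disabling pain by one order of magnitude and re-check the ranking.
bumped_weights = dict(baseline_weights, disabling=baseline_weights["disabling"] * 10)
print("With disabling pain 10x worse:")
for name, hours in interventions.items():
    print(f"  {name}: {score(hours, bumped_weights):.1f}")
```

With the real spreadsheet numbers plugged in, re-ranking under a 10x bump to one weight would be a couple of lines rather than a whole new analysis.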

I liked your analysis. No worries if this would be too difficult, but it might be helpful to make a website where you can easily adjust the numbers for how the different kinds of suffering compare to each other and see the resulting totals.

I agree with most of your estimates, but I think you probably underrated how bad disabling pain is; it's plausibly ~500 times worse than normal life. I'm not sure how that would affect the calculations.
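For a rough sense of what a 500x weight does (the weight for the milder category here is made up, just for illustration):

```python
# Toy arithmetic: at a 500x weight, even brief disabling pain outweighs
# long stretches of milder suffering.
disabling_weight = 500   # my estimate: times worse than an hour of normal life
hurtful_weight = 10      # made-up weight for a milder pain category

print(1 * disabling_weight)   # 1 hour of disabling pain -> 500 weighted pain-hours
print(24 * hurtful_weight)    # 24 hours of hurtful pain -> 240 weighted pain-hours
```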

But then wouldn't this mean my brain has a bunch of different minds? How can the consciousness of one overlap with the consciousness of another?

It may be that certain mental subsystems wouldn't be adequate by themselves to produce consciousness. But surely some of them would. Consider a neuron in my brain and name it Fred. Absent Fred, I'd still be conscious. So why isn't my brain minus Fred conscious on its own? The other view makes consciousness weirdly extrinsic: whether some collection of neurons is conscious depends on how it's connected to other neurons.
