When comparing different problems, I ran into moral questions about what we should care about, and the answers directly affect how large those global problems appear to be. 80k suggests handling this through moral uncertainty, which (as I understand it) consists in maximizing the expected value of our decisions. To do that, it seems necessary to estimate the probability of moral claims (for example, that non-human animal lives matter as much as human lives), but I haven't found resources on how to do this. I would be very grateful if someone could point me in the right direction.
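
To make the question concrete, here is roughly how I picture the expected-value approach (the notation is mine, not 80k's): $T_i$ are the competing moral views, $P(T_i)$ is my credence in each, and $V_{T_i}(a)$ is the value that view $T_i$ assigns to an action $a$:

$$\mathrm{EV}(a) = \sum_i P(T_i)\, V_{T_i}(a)$$

So if, purely for illustration, I put 30% credence on "animal lives count equally" (weight 1) and 70% on "they count a hundredth as much" (weight 0.01), the expected moral weight of one animal life would be $0.3 \times 1 + 0.7 \times 0.01 \approx 0.31$. Those numbers are completely made up, and that is exactly my problem: I don't know where credences like 30% and 70% are supposed to come from.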