Moral uncertainty is uncertainty about how to act given a lack of certainty in any one moral theory, as well as the study of how we ought to act under such uncertainty.
We are sometimes uncertain about empirical facts, such as whether it will rain tomorrow. But we can also be uncertain about moral facts, such as whether it is wrong to steal, or about how we should value the well-being of animals. Uncertainty of the first kind is uncertainty about moral or normative facts, while uncertainty of the second kind is uncertainty about axiological or value facts.
Can moral uncertainty be rational, and what should we do in response to it? We might think that it can never be rational to be uncertain about normative or axiological facts, because such facts are, like mathematical facts, knowable a priori. Nevertheless, it seems that agents like ourselves are in fact uncertain about non-trivial mathematical facts, and that we are likewise uncertain about normative and axiological facts. Given this, it seems necessary to develop some account of how we ought to act under moral uncertainty.
Several such accounts have been formulated in recent years:
Two important problems face the view that moral uncertainty can affect how we ought to act. The first is the regress problem: we will be uncertain not only about typical moral questions, but also about which approach to moral uncertainty is correct, and so on ad infinitum.[5] The second is the problem of intertheoretic comparisons: there seems to be no principled way to make comparisons between different moral theories.[6]
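To see why intertheoretic comparisons matter, consider a rough sketch of one widely discussed account, maximizing expected choiceworthiness. The notation here is illustrative rather than drawn from this article: $p(T_i)$ stands for the agent's credence in moral theory $T_i$, and $CW_i(a)$ for how choiceworthy that theory deems action $a$.

$$
EC(a) = \sum_{i} p(T_i)\, CW_i(a)
$$

On this sketch, the agent is to choose the action with the highest expected choiceworthiness $EC(a)$. But the sum is only well defined if the $CW_i$ values assigned by different theories can be placed on a common scale, and whether any principled common scale exists is precisely what the problem of intertheoretic comparisons calls into question.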
...