My theoretical studies at Chalmers University of Technology exposed what I see as a neglected yet significant gap between our knowledge of how to change the world and our knowledge of how to steer that change. Despite rapidly increasing human influence on our environment, the results seem to be dominated by chaos and unforeseen outcomes.
I'm advocating dramatically different ways of making non-local decisions. This framework is a conservative utilitarianism. It's conservative because I see our emotions and values as biases and heuristics that nascent technologies make problematic and dangerous, and I don't believe these evolutionarily shaped systems can adapt at the pace of this technology-driven world.
LinkedIn: https://www.linkedin.com/in/markusbredberg/
Contact: markus.bredberg@hotmail.se
I am looking for any research opportunity on ethics and new technology.
So, in a general sense, the locality problem is the problem of deciding how much room we should allow for personal decisions, as opposed to impersonal decisions.
As I've argued, personal -- or local, or subjective -- decisions are intuitive and feel meaningful, but they are also intrinsically linked with biases. Impersonal -- or Universal, or objective -- decisions are always more correct, but they don't allow for any self-expression. The locality problem becomes more urgent -- there is more at stake -- when we increase the scope of our decisions, which we naturally do when we try to achieve more of any (good) thing. Increasing the scope, in turn, tends to make any action affect more people, and tends to do so in a less intuitive way, which makes Universal decision making better suited. The downside is that less subjective decision making leaves people feeling less important and pushes them to suppress their emotions. This dilemma -- how personal we can allow our decisions to be -- is the locality problem.