zekesherman

220 karma · Joined

Comments (10)

Signup seems broken because the Doodle no longer works. I would like to sign up as an accountability partner, purely in the sense of holding someone else to account; I don't think I need someone to hold me to account (for now, at least). PM me for my email.

I love this. I want to be their ambassador and give speeches in elementary schools.

You're absolutely right, David - that's a better way to live and I'm happy for you. Cheers, and thank you for the kind words.

Thanks for the kind words, Madhav, but I do disagree:

Recommend that early-career folks try out university programs with internships/co-ops in the field they think they'd enjoy. This would help error-correct earlier rather than later.

I imagine that's already suggested somewhere in the career guides; in any case, it's exactly what I did. As I pivoted my goals in the final year of undergrad, I became a computer science research assistant, took courses like linear algebra and intro to machine learning, and then did a data science bootcamp over the summer. From that experience, I believed I knew that these paths were tough but survivable.

I think most people would have error-corrected in the same situation; few people would be as stubborn/selfless as I was.

Adjust the articles on high-visibility sites to focus less on finding the "most" impactful career path and more on finding one of many impactful career paths.

My impression of public EA career advice is that it is mostly fine. At the time, I sometimes derided it for being too meek, and consciously held myself to a stricter standard than the vibe of 80k Hours. Had I read your rewrites I would have ignored them. I believed in utilitarianism long before I read 80k Hours.

Can you clarify why you think your three criteria are enough to ascribe benign intentions the majority of the time?

It seems right based on all my experience talking to people, seeing what they say, considering their beliefs, and observing their behavior.

The point I was trying to get at was that there's no relation between thinking a lot about how to make the world a better place and making sacrifices to achieve that, AND having benign intentions towards other groups. People can just more narrowly define the world that they are serving.

Well in EA we don't "just more narrowly define the world that we are serving". We have philosophical rigor.

A concrete example of how believing women have less worth than men could be harmful in evaluating charities: one charity helps women by X utils, and one charity helps men by X utils. (Perhaps charity #1 decreases the amount of work women need to do by providing a well for water, etc.) Believing women have less worth than men would lead to charity #2 strictly dominating charity #1, when they should actually be equally recommended.
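To make the arithmetic of this quoted example explicit, here is a minimal sketch; the weight $w$ and the utility terms are my own notation, not the commenter's:

```latex
% Hypothetical formalization of the two-charity example (notation is mine,
% not the commenter's).
% Let $X > 0$ be the utility each charity produces for its beneficiaries,
% and let $w \in (0, 1]$ be the weight an evaluator places on women's
% welfare relative to men's. The evaluator scores the charities as:
\[
  U_1 = wX \quad \text{(charity helping women)}, \qquad
  U_2 = X \quad \text{(charity helping men)}.
\]
% With impartial weights ($w = 1$), $U_1 = U_2$ and the charities tie.
% With any $w < 1$, $wX < X$, so charity #2 strictly dominates charity #1
% even though the underlying benefits are identical.
```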

There are people in EA who believe that animals have a lot of value, so they don't give money to charities that help women (or men). Are they harming women? What should we do about them?

It’s not a core EA belief that women are equal to men.

What do you mean by equal? It's a core EA belief that the interests of women are equally valuable to the interests of men.

Also, your claim is that we should hide facts from people in order to prevent them from achieving their goals. This is only truly justified if people are actively trying to make life worse for women, which is obviously antithetical to EA. The mere fact that someone thinks women should be treated or prioritized a little differently doesn't necessarily mean that giving them facts will make their behavior worse under your view.

When you say we give the benefit of the doubt for the sake of the EA project, what you’re saying is that demographic minorities need to accept some level of malevolence

EA is not for malevolent people; EA is for people who are trying to make the world better.

If you are worried about people lying to infiltrate EA, that's not going to change no matter what we do - people could lie to infiltrate any group with any rules.

in exchange for the privilege of contributing to the EA cause

The EA cause is not a privilege. It's a duty.

Why should the burden be on them? Why not place the burden (if you can even call it that) on individuals who don’t have to worry about this systematic malevolence — which is what this document suggests we do — to think about what they say before they say it?

In my original comment, I explicitly said that it's a two-way street.

The reason that it's a two-way street is that when these kinds of issues are only resolved by making demands on the offending party, the conflict never ends - there is a continued spiral of new issues and infighting.

My question: is this belief because you are an effective altruist, or because these criteria are sufficient to indicate positive intentions towards minorities 100% of the time?

The criteria by themselves are sufficient to indicate that benign intentions are 90% likely. The remaining 10% chance is covered by the fact that we are Effective Altruists, so we extend the benefit of the doubt for a greater purpose.

An example: a devout traditionalist Buddhist male might also believe that they are trying to improve the world as much as possible, given the guidelines of their religious/spiritual tradition. They might very well also make personal sacrifices to that end, and they may well be doing so in a paradigm of philosophical rigor. Buddhists might also claim that the way they achieve these things is scientifically rigorous (there's a famous quote by the Dalai Lama where he says that if science refutes part of his teachings, the teachings must be rejected). But if said (male) Buddhist were raising questions about whether women deserve equal rights to men, does the fact that he satisfies your three criteria mean we should assume he has positive and respectful intentions towards women?

If we were Buddhists, then yes - except that I am mainly talking about the offensive things that people really say, like "the variability hypothesis explains the gender disparity in academia" or "women tend to have worse policy preferences than men" and so on, which are not simple cases of rights and values.

For the most incendiary issues, there is a point where you would expect any EA to know that the PR and community costs exceed the benefits, and at that point you should no longer give them the benefit of the doubt. And I would expect such a person, if they support EA, to say "ah, I understand why I was censored about that; it is too much of a hot potato, a very understandable thing for a moderator to do." Again, this applies only to the most incendiary things, which people across the political spectrum view as offensive. Some kinds of unequal rights would be like that. But there are some issues, like maternity leave or child custody or access to combat roles in the military, where people commonly support unequal rights.

If you harbor an implicit belief that other genders are inferior to men, for example, then no matter how much you care about bettering the world, and how many sacrifices you make to better the world, your intentions would still be about bettering the world for {approved of population/men}.

Even if you held such a belief, it does not follow that you would disregard the rights and well-being of women. You might give them less weight, but it would not matter for most purposes: the same charities and jobs would generally still be effective.

Philosophical and scientific rigor don't help either; although I'm not well versed in the history of racism, I do know that science and philosophy have been used to espouse discriminatory views plenty of times in the past.

Science has improved; we know far more about people than we used to. I presume you would agree that the best science doesn't give people reasons to assign unequal rights. Every wrong view in the history of science was justified with science, so what do we do about that? Well, we have to do science as well as we can. There are no shortcuts to wisdom. In hindsight, it's easy to point at ways that science went wrong in the past, but that's no good for telling us about current science.

If someone has the right philosophy, then sharing better information with them will generally just help them achieve the same values that you want. If they don't have the right philosophy, then all bets are off; you might be able to manipulate them to act rightly by feeding them an incomplete picture of the science. But that's why the EA/not-EA distinction clears things up here.

I did not think you were trying to say either of those things. Again, I'm aware of these things that people say. I was just talking about the philosophical basis for judging strategies here.

Well, that is what I said someone would say, and what I said I wasn't going to argue about. I am aware of the kinds of arguments you're making. And I have lived in places where everything around me was very much not built around my comfort - I was in the military, where I could not switch off in the slightest.

Thanks for your thoughts.

We do not want to spend the majority of this document talking about truth, because we feel that if people in underrepresented groups are truly seen as equal, then their feeling comfortable in this space should be seen as a goal in itself, not just a means to reach truth.

I'm a white male, and I view my own comfort in debate spaces merely as a means to reach truth; I welcome attempts to trade the former for the latter. Of course, you may be thinking "that's easy for you to say, because you're a white male!" And there's no point arguing with that, because no one will be convinced of anything. But I'm at least going to say that it is the right philosophy to follow.

Our ideas about certain groups are informed by a history of oppression in which some groups have been seen as inferior to others. There are still systematic demographic differences in who holds the majority of economic and political power, and we all still hold conscious and unconscious biases. This has to be taken into consideration in order to reach truth. In this context, assessing arguments for their strength must include thinking about the biases that exist against certain groups. Having discussions where all valuable input is heard must include thinking about who has more and less power.

Note that even if we ignore all these considerations, there are still educated people and uneducated people, kind people and unkind people, persuasive people and impertinent people, and all kinds of other differences that will lead to irrational collective epistemology. So I don't see how this calls for any meta-level shift in discussion rules and norms. In the same way that we informally figure out that, say, people with biology PhDs know more about biology, we can informally figure out that women know more about what it's like to be a woman, and so on.

We believe that adapting to the context of societal oppression when we decide how to have certain discussions will get us closer to truth. This is because:

1. We will be better at seeing our own and others' biases in the discussion.
2. We will listen more fairly to people with experience and/or expertise in the discussion.
3. People with experience and/or expertise are more likely to voice their opinion because they won't feel threatened by the way we are conducting the discussion.
4. If we believe that talented and altruistic minds are distributed across all demographic groups, it follows that alienating one of these groups will limit the ability of EA as a whole to find truth - because we are not retaining the "best" minds.

The reverse of these things can happen when oppressive/offensive speech is stifled. But of course, there is a balance to be struck if you want to appeal to as many people as possible; I'm not going to be a free-speech absolutist.

For instance, it can be tempting to think that our own understanding of which types of debates are offensive is objective

Yes, but this goes both ways. There are political and cultural groups where the same ideas are not viewed as so offensive, and the way that we treat these ideas can change our perceptions of how offensive they are. The best example of this is trigger warnings, which research has shown can actually increase how triggered people are by the unwanted content. You can also look at the differences in how the same nominally offensive things are perceived by people of the same demographic in different cultures - such as conservative and religious culture in the United States, or people outside the West; basically anywhere besides contemporary liberal and leftist circles. So while we obviously can't ignore facts about people being offended, it's worth taking some time to think about what might be done to mutually build resilience and trust in EA, rather than sliding one way into political-correctness treadmills or internecine identity politics.

And we see the key issue here:

Because of the history of this view I believe that it is likely to affect many people negatively if this view is spread. I want to protect myself and others from this. I think more specifically about the women who may see this, particularly the younger ones who are new to EA. I imagine that for some of them, it will lead them away from EA ideas and all the value EA engagement could have for them and for the world. Worse still, others may absorb the idea that the way women are treated is fair and shrink themselves, diminishing their own self-worth and diminishing the positive impact they could have on the world.

One of the major problems driving social justice fear and offense in the US right now is the failure of right-wing and centrist actors to credibly demonstrate that they're not secretly harboring bias and hate. If I were going to pick something that activists for underrepresented demographics need to revise when they look at EA, it's that they should stop applying their default suspicions to the people in EA. If you think that Charles Murray or whoever has got to be shut up because the angry masses are looking for a demagogue to give them an excuse to oppress people, fine - that's a judgment about society writ large; it's on you, and I don't know whether you're wrong or right. But when people are already committing to improve the well-being of the world as much as possible, and when they are making personal sacrifices to be involved in this effort, and when they are accepting our paradigm of philosophical and scientific rigor, the least you can give them is a basic element of trust regarding their beliefs and motivations. They aren't bad people. It's OK if you don't fix their beliefs. They're not going to set fire to Western society. There are way too many smart, diverse and educated people in EA for that sort of thing to happen.

And then you'll feel better about them! And they won't assume that you're just here to shut out wrongthink, and so on for every dispute that people can have. It's an important part of building solidarity in a movement.

Of course this is a two-way street: on the flip side, people need to tackle politically incorrect topics in ways that make it easy for other people to extend the benefit of the doubt. If people are idly speculating (or worse, ranting) about very controversial issues, that's pretty tough. What we want is for people to start with serious important questions about the world, and then use arguments to answer the question if-and-only-if they help, because (a) it is way more productive, and (b) it credibly demonstrates that people's motivations are coming from a good place and their takeaways will be good.

I moderate an EA social media group. In the (probably) thousands of posts that I've gotten on it, not one person has ever done the former - "hey guys, what do you think of the IQ gap between races?" and so on. If they did, I'd delete it, because it causes all sorts of problems (as you helpfully point out) while lacking a clear relation to EA issues. But there was exactly one case where someone asked about a controversial side of a specific EA issue, whether lifesaving in Africa is bad because it would purportedly lower average global intelligence. I didn't delete it, because that person's motivations and takeaways related to the very important issue of cause prioritization. Instead, I wrote one comment that persuaded them to rethink the issue, and that was the end of it. For a variety of reasons, I am skeptical of the idea that it's unduly hard to "educate" (interesting word choice) people about these race and gender issues, compared to convincing them on any other topic in EA.

As you might imagine, I don't personally worry much about this since it comes up so rarely in my own lane (it sounds like things are different in some other places). But I will try to remember the things you've written in the future!
