harfe
849 karma · 1 post · 146 comments

Comments, sorted by New
the child is adding an expected value of $27,275 per year in social surplus.

It would take $133,333 per year to raise a child to adulthood for it to not be worthwhile

I think the comparison of "social surplus" to effective donations is mistaken here. A social surplus of $27,275 (in the US) does not save 5 lives, but an effective donation of that size might.
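To make the arithmetic behind this comparison explicit, here is a minimal sketch. The ~$5,500 cost-per-life figure is an illustrative assumption (roughly what the "5 lives" claim implies), not a number stated in the original post:

```python
# Comparing an annual social surplus to an effective donation of the
# same size. The cost-per-life figure is an assumption for illustration.
social_surplus = 27_275   # annual social surplus per child in the US (quoted figure)
cost_per_life = 5_500     # assumed cost for an effective charity to save one life

# If the same amount were donated effectively, it could plausibly save:
lives_saved = social_surplus / cost_per_life
print(round(lives_saved, 1))  # ~5 lives

# The point of the comment: diffuse social surplus does not buy lives
# the way a targeted effective donation of equal size might.
```

This is only a back-of-the-envelope sketch; the disagreement in the comment is about what "social surplus" buys, not about the division itself.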

There used to be such a system: https://forum.effectivealtruism.org/posts/YhPWq784eRDr5999P/announcing-the-ea-donation-swap-system It got shut down 7 months ago (see the comments on that post).

One outstanding question is at what point AI capabilities are too close to loss of control. We propose to delegate this question to the AI Safety Institutes set up in the U.K., U.S., China, and other countries.

I consider it clickbait if you write "There Is a Solution" but then say that these AI safety institutes will figure out the crucial details of the solution at some point in the future.

edit: the issue raised in this comment has been fixed

In my opinion you have not really argued why it is neglected. As a starting point, they seem to spend roughly $35 million per year: https://projects.propublica.org/nonprofits/organizations/824506840. $14 million of those are for salaries, so I would be surprised if new features are strongly bottlenecked by money and talent.

I am just guessing on these issues, but my suspicion as to why some features, such as "live location sharing" and "displaying past encrypted messages from a group chat you were not a member of but only just now joined", are not (yet) implemented is that they do not fit well into Signal's approach to security/privacy.

Could you provide examples of political discussions on the EA Forum that appear to have negatively impacted the forum’s environment or impaired its ability to achieve its objectives?

As far as I remember, the political discussions have been quite civilized on the EA Forum. But I think this is because of the policies and culture the EA Forum has. If political discussions were a lot more frequent, the culture and discussion styles could get worse. For example, it might attract EA-adjacent people or even outsiders to fight their political battles on the EA Forum. Maybe this can be solved by hiring additional moderators though.

Also, politics can absorb a lot of attention that would be better spent elsewhere. For example, this post about Trump generated 60 comments, and I am not sure whether it was worth it.

Could you expand a bit on what this would look like? How are they being "shunted", and what kind of roles are low-level roles? (E.g. your claim could be that the average male EA CS student is much less likely than a female EA CS student to hear "You should change from AI safety to community-building".)

That is, many (most?) people need a break-in point to move from something like "basically convinced that EA is good, interested in the ideas and consuming content, maybe donating 10%" to anything more ambitious.

I am under the impression that EAGx can be such a break-in point, and has lower admission standards than EAG. In particular, there is EAGxVirtual (Applications are open!).

Has the rejected person you are thinking of applied to any EAGx conference?

I think sounding like a salesman is ok here.

Maybe something like "Which orgs would benefit from my unpaid labor?" or "Offer: I can work on your project for free for some months (Tech/Operations/Management)".
