In an internal report we wrote recently about this, we felt more concerned about whether rejection makes it less likely for people to apply in the first place (but we think we can reduce this with clearer comms about the admissions bar).
I'm not sure I agree with Scott that EAG should be open access, but since you mention this as a concern, I thought I'd mention that, yep, I haven't bothered applying to EAG for several years. The discussion around EAG over the last few years made it seem incredibly obvious that I wouldn't be wanted anyway, so I didn't even bother weighing the pros and cons of trying to attend.

Now that I actually think about it, I'm not at all sure I should have been so convinced I couldn't get in. I attended EAG in 2018 as a volunteer because I was told that the organizers couldn't find anyone more qualified to run a discussion group about EA and religion, and I still have my 2018 EAG name tag that labels me a "Speaker". In terms of more recent involvement, I won a second prize in the recent EA Forum writing contest, I'm theoretically a mod for the EA Corner Discord server, and I've been working on putting together an essay about the most effective ways of preventing miscarriages for people who place high credence on the possibility of unborn children having moral worth (though I'm still contacting various people involved in that work and getting cost estimates of their operations, so it's not ready yet).
...but I figured that Everyone Knows you don't get a spot unless you're professionally involved in EA direct work, have been involved in one of the various formal EA fellowships, or have a bunch of personal brand recognition. So I never seriously considered applying; I assumed that I couldn't attend, and immediately turned my attention to being okay with not being an important EA member like some of my friends are.
Not sure this is anyone's fault, or whether I would have wanted to go to EAG even if I could have - I assume there's an attendance fee, and I might not have wanted to shell out - but I saw your comment and wanted to mention it as an experience that some people do have right now.
Weighing in as a Christian (raised evangelical Protestant, currently Catholic), I worry that if this had been my introduction to AI risk, it would have made me less likely to take concerns about AI seriously.
One, the argument seems like a stretch - any human can already claim to be Jesus, and it doesn't mean the end times are here. A bot that makes the same claim is currently no more convincing than a human trying the same tactic. I won't say that there are no Christians who will take this concern seriously, but it has the sound of a conspiracy theory or the seed of a cult (as do many attempts to draw parallels between Revelation and current events, especially when paired with a specific call to action that isn't already found in scripture). While some evangelicals certainly do go in for that stuff, I think a larger number of them actually have antibodies against it - they've seen arguments like this come from their own communities, and they know that the people espousing them often turn out to be involved in something culty.
Two, while I do think there are some real and important conversations to be had about how AI might end up affecting religious people (or how religious people's priorities might be ill-served by the current set of people doing AI work), this does not read like a good-faith attempt to start serious discussions of that sort. I don't think any of us should be in the business of trying to manipulate religious beliefs we don't share in directions that are personally convenient for us, at least not unless we're trying to convince people of what we genuinely believe to be true. It seems dishonest, and I think the most thoughtful and insightful people - the ones we should most want to convince to take this seriously - will be able to tell.