Since when is EA about buying bednets being the bare minimum? That seems like an unusual definition of EA. Many EAs think obligation framings around giving are wrong or not useful. EA is about doing as much good as possible. EAs try to figure out how to do that, and fall short, and that's to be expected, and great that they try! But an activity one knows doesn't do the most good (directly or indirectly) should not be called EA.
I think "do as much good as possible" is not the best framing, since it means (for example) that an EA who eats at a restaurant is a bad EA, since they could have eaten ramen instead and donated the difference to charity. I think it's counterproductive to define this in terms of "well, I guess they failed at EA, but everyone fails at things, so that's fine"; a philosophy that says every human being is a failure and you should feel like a failure every time you fail to be superhuman doesn't seem very friendly (see also my response to Squark above).
My interpretation of EA is "devote a substantial fraction of your resources to doing good, and try to use them as effectively as possible". This interpretation is agnostic about what you do with the rest of your resources.
Consider the decision to become vegetarian. I don't think anybody would think of this as "anti-EA". However, it's not very efficient - if the calculations I've seen are correct, then despite being a major life choice that seriously limits your food options, it's worth no more than a $5 to $50 donation to an animal charity. This isn't "the most effective thing" by any stretch of the imagination, so are EAs still allowed to do it? My argument would be yes - it's part of their personal morality that's not necessarily subsumed by EA, and it's not hurting EA, so why not?
I feel the same way about offsetting nonvegetarianism. It may not be the most effective thing any more than vegetarianism itself is, but it's part of some people's personal morality, and it's not hurting EA. Suppose people in fact spend $5 offsetting nonvegetarianism. If that $5 wasn't going to EA charity, it's not hurting EA for the person to give it to offsets instead of, I don't know, a new bike. If you criticize people for giving $5 in offsets, but not for any other non-charitable use of their money, then that's the fallacy in this comic: https://xkcd.com/871/
Let me put this another way. Suppose that somebody who feels bad about animal suffering is currently offsetting their meat intake, using money that they would not otherwise give to charity. What would you recommend to that person?
Recommending "stop offsetting and become vegetarian" results in a very significant decrease in their quality of life for the sake of gaining them an extra $5, which they spend on ice cream. Assuming they value not-being-vegetarian more than they value ice cream, this seems strictly worse.
Recommending "stop offsetting but don't become vegetarian" results in them donating $5 less to animal charities, buying an ice cream instead, and feeling a bit guilty. They feel worse (they prefer not feeling guilty to getting an ice cream), and animals suffer more. Again, this seems strictly worse.
The only thing that doesn't seem strictly worse is "stop offsetting and donate the $5 to a charity more effective than the animal charity you're giving it to now". But why should we be more concerned about making them give the money they're already using semi-efficiently to a more effective charity, as opposed to starting with the money they're spending on clothes or games or something, and having the money they're already spending pretty efficiently be the last thing we worry about redirecting?
I don't think ethical offsetting is antithetical to EA. I think it's orthogonal to EA.
We face questions in our lives of whether we should do things that harm others. Two examples are taking a long plane flight (which may take us somewhere we really want to go, but also releases a lot of carbon and contributes to global warming) and eating meat (which might taste good but also contributes to animal suffering). EA and the principles of EA don't give us a good guide on whether we should do these things or not. Yes, the EA ethos is to do good, but there's also an understanding that none of us are perfect. A friend of a friend used to take cold showers, because the energy that would have heated her shower would be made by a polluting coal plant. I think that's taking ethical behavior in your personal life too far. But I also think that it's possible to take ethical behavior in your personal life not far enough, and counterproductively shrug it off with "Well, I'm an EA, who cares?" But nobody knows exactly how far is too far vs. not far enough, and EA doesn't help us figure that out.
Ethical offsetting is a way of helping figure this out. It can be either a metaphorical way, eg "I just realized that it would only take 0.01 cents to offset the damage from this shower, so forget about it", or a literal way, eg "I am actually going to pay 0.01 cents to offset the costs of this shower."
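To make the "literal way" concrete, here is a minimal sketch of the offset arithmetic in Python. Every number in it (shower heating energy, grid carbon intensity, offset price) is an assumption picked for illustration, not a figure from this thread or from any offset provider; with different assumptions the answer can easily land anywhere from a fraction of a cent to a few cents per shower. The structure is just "estimated harm times price to undo a unit of that harm":

```python
# Sketch of the "literal offset" calculation for a hot shower.
# All three constants are assumptions chosen for illustration only.

ENERGY_PER_SHOWER_KWH = 2.0        # assumed heating energy for one hot shower
KG_CO2_PER_KWH = 0.5               # assumed carbon intensity of the energy source
OFFSET_PRICE_USD_PER_TONNE = 10.0  # assumed price to offset one tonne of CO2

def shower_offset_cost_usd(showers: int = 1) -> float:
    """Estimated cost, in dollars, to offset the CO2 from `showers` hot showers."""
    tonnes_co2 = showers * ENERGY_PER_SHOWER_KWH * KG_CO2_PER_KWH / 1000.0
    return tonnes_co2 * OFFSET_PRICE_USD_PER_TONNE

if __name__ == "__main__":
    # Under these assumptions: about a cent per shower,
    # and a few dollars for a year of daily showers.
    print(f"One shower: ${shower_offset_cost_usd(1):.4f}")
    print(f"One year of daily showers: ${shower_offset_cost_usd(365):.2f}")
```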
As such, I think all of your objections to offsetting fall short:
The reference class doesn't particularly matter. The point is that you worried you were doing vast harm to the world by taking a hot shower, but in fact you're only doing 0.01 cents of harm to the world. You can pay that back to whoever it most soothes your conscience to pay it back to.
Nobody is a perfectly effective altruist who donates 100% of their money to charity. If you choose to donate 10% of your money to charity, that remaining 90% is yours to do whatever you want with. If what you want is to offset your actions, you have just as much right to do that as you have to spend it on booze and hookers.
Ethical offsetting isn't an "anti-EA meme" any more than "be vegetarian" or "tip the waiter" are "anti-EA memes". All of these involve having some sort of moral code other than buying bednets, but EA isn't about limiting your morality to buying bednets; it's about that being a bare minimum. Once you've done that, you can consider what other moral interests you might have.
People who become vegetarian feel that, on top of their charitable donations, they are morally pushed to be vegetarian. That's okay. People who want to offset meat-eating feel that, on top of their charitable donations, they are morally pushed to offset not being vegetarian. That's also okay. As long as they're not taking it out of the money they've pledged to effective charity, it's not EA's business whether they want to do that or not, just as it's not EA's business whether they become vegetarian or tip the waiter or behave respectfully to their parents or refuse to take hot showers. Other forms of morality aren't in competition with EA and don't subvert EA. If anything, they contribute to the general desire to build a more moral world.
A suggestion for Giving What We Can: on [their pledge drive page](https://www.givingwhatwecan.org/post/2015/12/do-something-incredible-this-new-year/), the majority of the screen is taken up by a pop-down talking about their fundraising drive, designed so that it's unclear it's a pop-down and not obvious how to dismiss it. Not only does that make it hard to read the pledge drive page, but it's really confusing for people referred there about their pledge drive - are they asking for pledges, for donations, or what? It just looks like too much of a trivial inconvenience in the way of people reading their request for pledges. I would recommend that the pop-down be disabled, at least on that page.
I gave the example of giving 10% to bednets because that's an especially clear example of a division between charitable and non-charitable money - eg I have pledged to give 10% to charity, but the other 90% of my money goes to expenses and luxuries, and there's no cost to EA in giving that to offsets instead. I know many other EAs work this way too.
If you believe this isn't enough, I think the best way to take it up with me is to suggest I raise it above 10%, say to 20% or even 90%, rather than to deny that there's any such thing as a charitable/non-charitable division at all. That way lies madness and mental breakdowns as you agonize over every purchase taking away money that you "should have" given to charity.
But if you're not working off a model where you have to agonize over everything, I'm not sure why you should agonize over offsets.