This is not surprising. I know it sucks, but the queer community needs to learn how to use the Fediverse and other alternatives. Trying to fly under the radar on these pro-fascist technobro platforms is only going to lead to more censorship. Even if the owners don’t explicitly want it (which I’d argue is rare), they’re more than happy to bend over backwards at the request of the government or a special interest group.
Pixelfed is the direct analog to Instagram. Help your queer friends migrate. It works great, and many apps meant for Mastodon or Lemmy can also handle Pixelfed feeds.
There really isn’t great FOSS ActivityPub software that can properly house a lot of these people, because the art they create (porn, erotica, smut, etc.) requires commercialization to stay viable.
Currently, only centralized platforms and payment processors can make it all work. I seriously doubt Pixelfed will catch on because of this.
I can understand some of that when talking about platforms like YouTube or TikTok, but afaik, Instagram isn’t paying people to share their stuff. Functionally, Pixelfed seems to be on similar footing, unless I’m missing some kind of revenue stream.
If Instagram only pays in exposure, then that’s exactly why I say people need to learn how to use Pixelfed. I’m aware that getting people to move is its own problem, but as an Instagram replacement specifically, I think Pixelfed can work.
“We never engaged – I’ll be very clear – in any kind of sexual whatever,” Storment said. “A lot of my business dealings would happen through Instagram because I’d be hiring people based on their profiles.”
When Storment was using Instagram to help organize a party at Public Works during Pride weekend 2024, Meta flagged a conversation with a queer creator (who wishes to remain anonymous) being hired for the party as Storment “hiring someone to do sex work” and shut down all his accounts, Storment said.
Algorithms simply encode the biases we already have; they do not transcend them. Algorithmic moderation is an AWFUL IDEA. Of course, automated tools to flag content or help moderators sort through problematic posts and comments have been and will continue to be useful, but the idea that these moderation systems could do anything other than punch down is absurd. Moderation of human conversations requires humans, and it requires a diversity of humans and a diversity of moderation entities. This IS the hardest problem in social media to solve, and AI like this will never be able to address it meaningfully.