mstdn.social is one of the many independent Mastodon servers you can use to participate in the fediverse.
A general-purpose Mastodon server with a 500 character limit. All languages are welcome.

Server stats: 10K active users

#algorithmicbias

0 posts · 0 participants · 0 posts today
The Internet is Crack<p>💡 How AI Learns Old Bias—and Spreads It</p><p>This week on The Internet Is Crack, Dr. Suresh Venkatasubramanian explains why algorithms trained on historical data keep reproducing injustice—and what real accountability looks like.</p><p>🎧 <a href="https://youtu.be/GQiFnpK7Wyo" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="">youtu.be/GQiFnpK7Wyo</span><span class="invisible"></span></a></p><p><a href="https://mastodon.social/tags/AIEthics" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AIEthics</span></a> <a href="https://mastodon.social/tags/AlgorithmicBias" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AlgorithmicBias</span></a> <a href="https://mastodon.social/tags/DataJustice" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>DataJustice</span></a> <a href="https://mastodon.social/tags/TechPolicy" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TechPolicy</span></a> <a href="https://mastodon.social/tags/TheInternetIsCrack" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TheInternetIsCrack</span></a></p>
The-14<p>Here’s why the public needs to challenge the ‘good AI’ myth pushed by tech&nbsp;companies<br><a href="https://mastodon.world/tags/Tech" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Tech</span></a> <a href="https://mastodon.world/tags/AI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AI</span></a> <a href="https://mastodon.world/tags/GoodAI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>GoodAI</span></a> <a href="https://mastodon.world/tags/AIMyth" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AIMyth</span></a> <a href="https://mastodon.world/tags/TechEthics" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TechEthics</span></a> <a href="https://mastodon.world/tags/DataBias" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>DataBias</span></a> <a href="https://mastodon.world/tags/PrivacyRights" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>PrivacyRights</span></a> <a href="https://mastodon.world/tags/SurveillanceCapitalism" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>SurveillanceCapitalism</span></a> <a href="https://mastodon.world/tags/AlgorithmicBias" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AlgorithmicBias</span></a> <a href="https://mastodon.world/tags/ResistAI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ResistAI</span></a> <a href="https://mastodon.world/tags/CriticalTech" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CriticalTech</span></a> <a href="https://mastodon.world/tags/EthicalAI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>EthicalAI</span></a> <a href="https://mastodon.world/tags/DigitalRights" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>DigitalRights</span></a> <a href="https://mastodon.world/tags/TechJustice" 
class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TechJustice</span></a><br><a href="https://the-14.com/heres-why-the-public-needs-to-challenge-the-good-ai-myth-pushed-by-tech-companies/" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">the-14.com/heres-why-the-publi</span><span class="invisible">c-needs-to-challenge-the-good-ai-myth-pushed-by-tech-companies/</span></a></p>
BiyteLüm<p>🔒 Myth-busting: AI isn’t always intelligent—it reflects the data it’s fed. Biased training data can lead to discriminatory outcomes in hiring, policing, even credit scores.</p><p>🧠 Always ask: Who trained the AI? On what data?</p><p>Transparency &amp; accountability matter.<br><a href="https://mastodon.social/tags/PrivacyAware" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>PrivacyAware</span></a> <a href="https://mastodon.social/tags/AIethics" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AIethics</span></a> <a href="https://mastodon.social/tags/DataProtection" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>DataProtection</span></a> <a href="https://mastodon.social/tags/TechJustice" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TechJustice</span></a> <a href="https://mastodon.social/tags/AlgorithmicBias" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AlgorithmicBias</span></a></p>
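The questions the post above urges ("Who trained the AI? On what data?") have a quantitative counterpart: auditing a system's outcomes by group. The sketch below illustrates one common audit, the selection-rate ratio behind the "four-fifths rule" used in US hiring-discrimination guidance. The data and the 0.8 threshold are illustrative, not drawn from any post here.

```python
# Minimal sketch of a group-outcome audit: compare selection rates
# across groups and flag a large gap (the "four-fifths rule").

def selection_rates(decisions):
    """decisions: iterable of (group, selected) pairs -> rate per group."""
    totals, picks = {}, {}
    for group, selected in decisions:
        totals[group] = totals.get(group, 0) + 1
        picks[group] = picks.get(group, 0) + int(selected)
    return {g: picks[g] / totals[g] for g in totals}

def disparate_impact(decisions):
    """Ratio of lowest to highest group selection rate (1.0 = parity)."""
    rates = selection_rates(decisions)
    return min(rates.values()) / max(rates.values())

# Fabricated audit sample: group A selected 6/10, group B selected 3/10.
sample = [("A", s) for s in [1] * 6 + [0] * 4] + \
         [("B", s) for s in [1] * 3 + [0] * 7]
ratio = disparate_impact(sample)
print(ratio < 0.8)  # below the conventional 0.8 threshold -> flag for review
```

A check like this measures outcomes only; it cannot say *why* the gap exists, which is why the post's call for transparency about training data still matters.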
Dr Robert N. Winter<p>In the final instalment of this edition of the Talent Aperture Series, I continue the case that hiring isn't procurement—it's stewardship—and explore:</p><p>🧠 How we reclaim human judgement in hiring<br>📈 Why blind recruitment and contextual interviews are gaining ground<br>💎 What good decision-making really demands in a world drunk on metrics.</p><p><a href="https://robert.winter.ink/the-talent-aperture-reopened/" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">robert.winter.ink/the-talent-a</span><span class="invisible">perture-reopened/</span></a></p><p><a href="https://social.winter.ink/tags/Discernment" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Discernment</span></a> <a href="https://social.winter.ink/tags/EthicalHiring" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>EthicalHiring</span></a> <a href="https://social.winter.ink/tags/AlgorithmicBias" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AlgorithmicBias</span></a> <a href="https://social.winter.ink/tags/HumanJudgement" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HumanJudgement</span></a> <a href="https://social.winter.ink/tags/ResponsibleAI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ResponsibleAI</span></a> <a href="https://social.winter.ink/tags/TalentEthics" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TalentEthics</span></a> <a href="https://social.winter.ink/tags/StrategicRecruitment" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>StrategicRecruitment</span></a> <a href="https://social.winter.ink/tags/HiringPractices" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HiringPractices</span></a></p>
Nebraska.Code<p>Heather Hartman, MS, PMP presents 'Technoethics, AI Dilemmas' July 24th at Nebraska.Code().</p><p><a href="https://nebraskacode.amegala.com/" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="">nebraskacode.amegala.com/</span><span class="invisible"></span></a></p><p><a href="https://mastodon.social/tags/Technoethics" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Technoethics</span></a> <a href="https://mastodon.social/tags/ArtificialIntelligence" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ArtificialIntelligence</span></a> <a href="https://mastodon.social/tags/AI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AI</span></a> <a href="https://mastodon.social/tags/MachineLearning" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>MachineLearning</span></a> <a href="https://mastodon.social/tags/Automation" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Automation</span></a> <a href="https://mastodon.social/tags/BigData" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>BigData</span></a> <a href="https://mastodon.social/tags/Privacy" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Privacy</span></a> <a href="https://mastodon.social/tags/SurveillancePractices" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>SurveillancePractices</span></a> <a href="https://mastodon.social/tags/AlgorithmicBias" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AlgorithmicBias</span></a> <a href="https://mastodon.social/tags/AIOmaha" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AIOmaha</span></a> <a href="https://mastodon.social/tags/Nebraska" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Nebraska</span></a> <a href="https://mastodon.social/tags/lincolnne" 
class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>lincolnne</span></a> <a href="https://mastodon.social/tags/TechConference" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TechConference</span></a> <a href="https://mastodon.social/tags/TechnologyTrends" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TechnologyTrends</span></a> <a href="https://mastodon.social/tags/LincolnNE" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>LincolnNE</span></a> <a href="https://mastodon.social/tags/Programming" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Programming</span></a> <a href="https://mastodon.social/tags/softwaredevelopment" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>softwaredevelopment</span></a> <a href="https://mastodon.social/tags/softwareengineering" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>softwareengineering</span></a> <a href="https://mastodon.social/tags/EmergingTechnologies" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>EmergingTechnologies</span></a> <a href="https://mastodon.social/tags/TechTalk" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TechTalk</span></a></p>
ResearchBuzz: Firehose<p>The Conversation: Women’s sports are fighting an uphill battle against our social media algorithms. “Algorithms, trained to maximise engagement and profits, are deciding what appears in your feed, which video auto-plays next, and which highlights are pushed to the top of your screen. But here is the problem: algorithms prioritise content that is already popular. That usually means men’s […]</p><p><a href="https://rbfirehose.com/2025/05/12/the-conversation-womens-sports-are-fighting-an-uphill-battle-against-our-social-media-algorithms/" class="" rel="nofollow noopener" target="_blank">https://rbfirehose.com/2025/05/12/the-conversation-womens-sports-are-fighting-an-uphill-battle-against-our-social-media-algorithms/</a></p>
Michaël | HouseStationLive.com<p>THE ALGORITHM VS. THE HUMAN MIND: A LOSING BATTLE<br>¯</p><p>_<br>NO RECOGNITION FOR THE AUTHOR</p><p>YouTube does not reward consistency, insight, or author reputation. A comment may become a “top comment” for a day, only to vanish the next. There’s no memory, no history of editorial value. The platform doesn’t surface authors who contribute regularly with structured, relevant input. There's no path for authorship to emerge or be noticed. The “like” system favors early commenters — the infamous firsts — who write “first,” “early,” or “30 seconds in” just after a video drops. These are the comments that rise to the top. Readers interact with the text, not the person behind it. This is by design. YouTube wants engagement to stay contained within the content creator’s channel, not spread toward the audience. A well-written comment should not amplify a small creator’s reach — that would disrupt the platform’s control over audience flow.<br>¯</p><p>_<br>USERS WHO’VE STOPPED THINKING</p><p>The algorithm trains people to wait for suggestions. Most users no longer take the initiative to explore or support anyone unless pushed by the system. Even when someone says something exceptional, the response remains cold. The author is just a font — not a presence. A familiar avatar doesn’t trigger curiosity. On these platforms, people follow only the already-famous. Anonymity is devalued by default. Most users would rather post their own comment (that no one will ever read) than reply to others. Interaction is solitary. YouTube, by design, encourages people to think only about themselves.<br>¯</p><p>_<br>ZERO MODERATION FOR SMALL CREATORS</p><p>Small creators have no support when it comes to moderation. In low-traffic streams, there's no way to filter harassment or mockery. Trolls can show up just to enjoy someone else's failure — and nothing stops them. 
Unlike big streamers who can appoint moderators, smaller channels lack both the tools and the visibility to protect themselves. YouTube provides no built-in safety net, even though these creators are often the most exposed.<br>¯</p><p>_<br>EXTERNAL LINKS ARE SABOTAGED</p><p>Trying to drive traffic to your own website? In the “About” section, YouTube adds a warning label to every external link: “You’re about to leave YouTube. This site may be unsafe.” It looks like an antivirus alert — not a routine redirect. It scares away casual users. And even if someone knows better, they still have to click again to confirm. That’s not protection — it’s manufactured discouragement. This cheap shot, disguised as safety, serves a single purpose: preventing viewers from leaving the ecosystem. YouTube has no authority to determine what is or isn’t a “safe” site beyond its own platform.<br>¯</p><p>_<br>HUMANS CAN’T OUTPERFORM THE MACHINE</p><p>At every level, the human loses. You can’t outsmart an algorithm that filters, sorts, buries. You can’t even decide who you want to support: the system always intervenes. Talent alone isn’t enough. Courage isn’t enough. You need to break through a machine built to elevate the dominant and bury the rest. YouTube claims to be a platform for expression. 
But what it really offers is a simulated discovery engine — locked down and heavily policed.<br>¯</p><p>_<br>||<a href="https://hear-me.social/tags/HSLdiary" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HSLdiary</span></a> <a href="https://hear-me.social/tags/HSLmichael" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HSLmichael</span></a> </p><p><a href="https://hear-me.social/tags/YouTubeCritique" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>YouTubeCritique</span></a> <a href="https://hear-me.social/tags/AlgorithmicBias" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AlgorithmicBias</span></a> <a href="https://hear-me.social/tags/DigitalLabour" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>DigitalLabour</span></a> <a href="https://hear-me.social/tags/IndieCreators" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>IndieCreators</span></a> <a href="https://hear-me.social/tags/Shadowbanning" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Shadowbanning</span></a> <a href="https://hear-me.social/tags/ContentModeration" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ContentModeration</span></a> <a href="https://hear-me.social/tags/PlatformJustice" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>PlatformJustice</span></a> <a href="https://hear-me.social/tags/AudienceManipulation" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AudienceManipulation</span></a></p>
ResearchBuzz: Firehose<p>The Register: Open source AI hiring bots favor men, leave women hanging by the phone. “Open source AI models are more likely to recommend men than women for jobs, particularly the high-paying ones, a new study has found. While bias in AI models is a well-established risk, the findings highlight the unresolved issue as the usage of AI proliferates among recruiters and corporate human resources […]</p><p><a href="https://rbfirehose.com/2025/05/04/the-register-open-source-ai-hiring-bots-favor-men-leave-women-hanging-by-the-phone/" class="" rel="nofollow noopener" target="_blank">https://rbfirehose.com/2025/05/04/the-register-open-source-ai-hiring-bots-favor-men-leave-women-hanging-by-the-phone/</a></p>
woxx<p><strong>“In reality, people decide how algorithms work”</strong></p><p><strong>Researcher Yasaman Yousefi deals with the question of how algorithmic systems and artificial intelligence can be made fairer. She will <a href="https://ewb.lu/event/algorithmic-discrimination-reasons-consequences/" rel="nofollow noopener" target="_blank">be a guest in Luxembourg at Erwuessebildung next week</a>. woxx spoke to her about the AI hype.</strong></p><p>This <a href="https://www.woxx.lu/forscherin-zu-algorithmischer-diskriminierung-in-wirklichkeit-entscheiden-menschen-wie-algorithmen-funktionieren/" rel="nofollow noopener" target="_blank">interview is also available in German</a>. Follow <a href="https://ewb.lu/event/algorithmic-discrimination-reasons-consequences/" rel="nofollow noopener" target="_blank">this link for more information about the event Algorithmic Discrimination: Reasons &amp; Consequences (29.04.2025)</a>.</p> <p></p><p class="">Yasaman Yousefi researches <span>fairness in algorithmic decision-making.</span> (Photo: private)</p> <p class=""><strong><span>What is algorithmic discrimination?<br></span></strong><span>Algorithmic discrimination is a phenomenon that happens when automated systems start treating people unfairly because of certain characteristics they might have. Algorithms make correlations based on those characteristics. These could for example be the race, gender, or socioeconomic status of a person, or even correlated categories, like “sad teenagers” or “dog owners”. This is what algorithms do: they find common points and group characteristics. That kind of </span><span>grouping</span><span>, which is harmless per se, can turn into discrimination because the data or the historical practices these algorithms have been trained with might be biased. This can lead to unequal access to opportunities and services, or even information in some cases. For example, there was the case of Amazon’s hiring algorithm. 
It wasn’t specifically designed in that way, but it did end up being biased against women. Why? Because the algorithm had been trained on historical data from Amazon</span><span><span><span>’s</span></span></span><span> hiring practices of the past 20 years, when more men had been hired than women. So, the algorithm decided that if the word “woman” was mentioned in a candidat</span><span><span><span>e’s </span></span></span><span>CV, it would ignore the CV and not pass it on to HR. That’s how algorithmic discrimination happens, even if it’s not necessarily intentional.</span></p><p class=""><span><strong>A more subtle form of how algorithms can influence our lives is through social media or shopping sites. How does this manipulation work?</strong><br></span><span>The algorithms on these sites and apps are designed to optimize engagement and profit, especially when it comes to shopping. On social media, they prioritize content that triggers a stronger reaction. Usually, that means polarizing views or even misinformation. In the case of online shopping, whether through social media or general recommendation systems, the algorithms influence our choices because they personalize recommendations. By looking at our data, they show us targeted ads. I</span><span><span><span>’ll give you an</span></span></span><span> example: I’m a cat owner. All of my Instagram, as you can imagine, is full of cat content </span><span><span><span>–</span></span></span><span> that’s exactly the way it works. Sometimes you might feel like the algorithm</span><span><span><span>’s</span></span></span><span> reading your mind; if you just thought about an item or you were talking about something with your friends, and soon after that you see an ad for it. But the algorithms correlate what you did online with other factors, such as your age group, income level, interests, etc. 
They nudge our behaviour without us even being fully aware of it.</span></p><p class=""><strong><span>We interact, knowingly or unknowingly, with a lot of algorithmic systems on a daily basis </span><span><span><span>–</span></span></span></strong><span><strong> often without realizing that they have the potential to amplify societal inequalities. How does this happen?</strong><br></span><span>The problem is not algorithms as such, it’s the fact that they’re trained on data. Data reflects a lot of inequalities. There was a case in the US where an algorithm was used in judicial decision-making to predict the likelihood of a criminal reoffending. It started favouring white people over Black people, even if the crime had been the same. It started recommending to the judges to give Black people more time in prison. This shows us how biases can be reinforced and magnified by algorithms. These biases are not new, nor something the algorithms create – they amplify the biases that are already present in the data. This can lead to problems everywhere: marginalized communities, like women or people from different racial and ethnic backgrounds, could receive worse credit offers or worse mortgage offers, or face less job visibility or stricter checks at airport security.</span></p><p class=""><span><strong>There is a trend of anthropomorphising algorithmic systems – not only with AI, but also with social media. For example, when we discuss our social media strategy, people tend to say “Instagram doesn’t like X, Instagram likes Y”. Is this hiding who is really accountable for the actions of these algorithms?</strong><br></span><span> This is a very good point. By humanizing algorithms we are giving the illusion that these platforms have a will of their own. In reality, the decisions about how these algorithms work, and how they’re made, are human decisions and reflect all the power balances that exist in society. 
The idea that it is an algorithm or an app that likes or doesn’t like something is really manipulative because it takes the focus away from the people behind the tech: engineers, designers, developers – and does not hold them accountable. I think this narrative of ‘Siri says so’ or ‘ChatGPT thinks so’ is what we should really be afraid of. </span></p><p class=""><span><strong>With so-called AI, this tendency is even bigger. A chatbot is seen as if it had intelligence, or even agency. Is that a clever ruse to avoid responsibility then?<br></strong></span><span>Absolutely. The first time I ever thought about my research topic, I was still a master’s student and had read an article about Siri’s voice being female. This triggered the question of why is it even female? Who chose it to be female? Siri is meant to be your assistant and you are the boss. So it was designed with a female voice to sound subordinate – this builds on and reinforces the stereotype that women are good for assistant roles. So this tendency to provide AIs with a ‘personality’ is a way to avoid scrutiny about design choices, about data sources and about power structures that shape all of these systems. In a way, the people behind the AI are creating this beautiful loophole, so you end up blaming issues on the tech, but not the policy behind its design.</span></p><p class=""><span><strong>What risks could this be hiding</strong>?<br></span><span>Let me just be clear that I don’t think AI is all risks. With good education, ethical design and solid legislation to protect users, AI could actually be very beneficial to society. Besides the perpetuation of social biases and discrimination, though, one of the biggest risks for me is the lack of transparency and the issue of black boxes. These bring the potential for erosion of human oversight. We could fall into the trap of automation bias and over-rely on AI recommendations, which would let power dynamics stay the same, but without us noticing. 
New forms of AI discrimination can arise that we aren’t prepared for. There is also the matter of privacy and data protection, which isn’t solved, despite EU legislation like the GDPR and the AI Act. The latter is a good start to tackle these issues, but the law always evolves more slowly than technology.</span></p><p class=""><span><strong>You mentioned these black boxes, do you think companies could be forced to open these and provide transparency on how their algorithms work?</strong><br></span><span>I think opening the black boxes entirely is too big an ask, because even engineers sometimes don’t understand why a decision was made by such a system. Laws could push for better transparency and accountability, and this is what they’re doing. For example, the AI Act stipulates requirements for explainability, transparency and human oversight. These maybe don’t open the black box, but do make it possible to show the reasoning behind it. We should also consider that the more complex these systems become, the more difficult it is to open them. Another issue to raise here is that the human mind is kind of a black box too: we can’t look inside a brain and see exactly how it works, human decisions aren’t always explainable. So I think with both humans and AI, it’s important to continue questioning and ask for the reasoning behind a decision.</span></p><p class=""><span><strong>At the moment, there seems to be a gigantic hype around everything that has to do with AI. The tech industry, but also governments are spending millions to build “AI factories”. Do you think it’s possible to deploy these technologies in an ethical way?<br></strong></span><span>By adopting ethical design and inclusive development methods, by engaging civil society and introducing strong regulations, we could actually have ethically designed technologies. This is what many organisations are trying to achieve now. 
For example, I work as a consultant with a multidisciplinary team at a company called DEXAI, which aims to build ethical AI systems. Especially with the AI Act in the picture, the obligations on ethical AI deployment are stricter now. </span><span>I think alongside the focus on ethical design, we should also ask ourselves this important question: Is AI really necessary here? In this field of deployment, do we really need it or can we do without? In some cases, we can really do without. There is this book called “From Pessimism to Promise: Lessons from the Global South on Designing Inclusive Tech”, which includes an interesting example of this happening across several countries in Africa. There is a big investment in AI systems to combat the poaching of wildlife. These systems alert the rangers, so they can get there and intervene. But some of the rangers say that if the money had been invested in better equipment, they</span><span><span><span>’d be able to </span></span></span><span> do their work </span><span><span><span>–</span></span></span><span> without the help from AI. We should use AI up until it’s useful, and not just push for more and more, simply because there is money in it. But that’s unfortunately what’s happening.</span></p><p class=""><strong><span>Are there any rays of hope?<br></span></strong><span>On a more positive note, I think technology in general, if given equal access to, could, in some cases, really be the saviour for a lot of societies and communities. I look at it from a personal perspective: where I am today is because of technology. The languages I learned to speak I learned because of my access to technology. Equal access should be given to marginalized communities for maximizing the benefits of technology. I see a light of hope for people in communities where inequalities are way worse than in Europe. I see them reaching out to the rest of the world and speaking their mind. 
I think we should focus more on these positive sides and not just on the negative. Equal access to and inclusive design of tech could even correct some human actions, which could lead to a brighter future.</span></p><p class=""><span><b>About Yasaman Yousefi</b></span></p><p class=""><strong>Dr. Yasaman Yousefi</strong> has a PhD in Law, Science, and Technology (LAST-JD) from the University of Bologna, in collaboration with the University of Luxembourg. She recently defended her doctoral dissertation: <em>The Quest for AI Fairness: Ethical, Legal, and Technical Solutions</em>. Her research takes an interdisciplinary approach to fairness in algorithmic decision-making, bridging ethical, legal, and technical perspectives. Yasaman is now working as a postdoctoral fellow at the University of Bologna, where she is researching the risks of General-Purpose AI systems. She also works as a consultant and researcher on the ethical management and legal compliance of AI systems in several EU-funded Horizon projects, in collaboration with DEXAI.</p>
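The Amazon hiring case described in the interview above can be illustrated with a toy model. This is a hypothetical sketch of the mechanism, not the actual system (which was never published): a screener that scores resume tokens by how strongly they correlate with past hire decisions will learn to penalize a token like "womens" if historical hires skewed male. All data below is fabricated.

```python
# Toy resume screener: score tokens by their association with
# historical hire decisions, then score new resumes by summing weights.
from collections import defaultdict

# Fabricated history: hires skew male, so tokens correlated with
# female applicants appear mostly in rejections.
history = [
    ("python java leadership", 1),
    ("python sql leadership", 1),
    ("java sql mentoring", 1),
    ("python womens chess club captain", 0),
    ("sql womens coding society", 0),
    ("java mentoring", 1),
]

def token_weights(history):
    """P(hired | token) - P(hired): positive helps, negative hurts."""
    base = sum(label for _, label in history) / len(history)
    counts = defaultdict(lambda: [0, 0])  # token -> [seen, hired]
    for text, label in history:
        for tok in set(text.split()):
            counts[tok][0] += 1
            counts[tok][1] += label
    return {t: hired / seen - base for t, (seen, hired) in counts.items()}

def score(resume, weights):
    return sum(weights.get(tok, 0.0) for tok in set(resume.split()))

weights = token_weights(history)
a = score("python sql leadership", weights)
b = score("python sql leadership womens chess club", weights)
print(a > b)  # identical skills, lower score with the extra tokens
```

The model never sees gender as a feature; the penalty emerges purely from correlations in the biased history, which is exactly the failure mode the interview describes.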
Victoria Stuart 🇨🇦 🏳️‍⚧️<p>pseudoscience: 🇨🇦 Controversial admissions test which experts say lacks evidence<br><a href="https://www.cbc.ca/news/gopublic/casper-test-medical-school-1.7507308" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://www.</span><span class="ellipsis">cbc.ca/news/gopublic/casper-te</span><span class="invisible">st-medical-school-1.7507308</span></a><br>Critics say claims that Casper predicts student performance are unsubstantiated</p><p><a href="https://mastodon.social/tags/MCAT" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>MCAT</span></a> <a href="https://mastodon.social/tags/PsychologyTesting" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>PsychologyTesting</span></a> <a href="https://mastodon.social/tags/Casper" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Casper</span></a> <a href="https://mastodon.social/tags/AcuityInsights" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AcuityInsights</span></a> <a href="https://mastodon.social/tags/bias" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>bias</span></a> <a href="https://mastodon.social/tags/pseudoscience" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>pseudoscience</span></a> <a href="https://mastodon.social/tags/AlgorithmicBias" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AlgorithmicBias</span></a></p>
Fedizen ⁂ Fediverse News<p><a href="https://mastodon.social/tags/DemocraticPoliticians" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>DemocraticPoliticians</span></a> Should Leave <a href="https://mastodon.social/tags/ElonMusk" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ElonMusk</span></a>’s <a href="https://mastodon.social/tags/XTwitter" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>XTwitter</span></a>.</p><p>Undermining Democratic Discourse 🤬 Since <a href="https://mastodon.social/tags/Musk" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Musk</span></a>’s takeover, <a href="https://mastodon.social/tags/Twitter" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Twitter</span></a> has become a platform for <a href="https://mastodon.social/tags/misinformation" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>misinformation</span></a>, <a href="https://mastodon.social/tags/hatespeech" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>hatespeech</span></a>, and <a href="https://mastodon.social/tags/algorithmicbias" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>algorithmicbias</span></a>, threatening democratic values.</p><p>Amplification of Extremism 📢 By removing <a href="https://mastodon.social/tags/contentmoderation" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>contentmoderation</span></a> safeguards, <a href="https://mastodon.social/tags/X" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>X</span></a> allows <a href="https://mastodon.social/tags/farright" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>farright</span></a> voices and <a href="https://mastodon.social/tags/conspiracymyths" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>conspiracymyths</span></a> to spread unchecked, 
distorting public debate. (1/3)</p>
Miguel Afonso Caetano<p>"After the entry into force of the Artificial Intelligence (AI) Act in August 2024, an open question is its interplay with the General Data Protection Regulation (GDPR). The AI Act aims to promote human-centric, trustworthy and sustainable AI, while respecting individuals' fundamental rights and freedoms, including their right to the protection of personal data. One of the AI Act's main objectives is to mitigate discrimination and bias in the development, deployment and use of 'high-risk AI systems'. To achieve this, the act allows 'special categories of personal data' to be processed, based on a set of conditions (e.g. privacy-preserving measures) designed to identify and to avoid discrimination that might occur when using such new technology. The GDPR, however, seems more restrictive in that respect. The legal uncertainty this creates might need to be addressed through legislative reform or further guidance."</p><p><a href="https://www.europarl.europa.eu/thinktank/en/document/EPRS_ATA(2025)769509" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://www.</span><span class="ellipsis">europarl.europa.eu/thinktank/e</span><span class="invisible">n/document/EPRS_ATA(2025)769509</span></a></p><p><a href="https://tldr.nettime.org/tags/EU" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>EU</span></a> <a href="https://tldr.nettime.org/tags/AI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AI</span></a> <a href="https://tldr.nettime.org/tags/AIAct" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AIAct</span></a> <a href="https://tldr.nettime.org/tags/GDPR" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>GDPR</span></a> <a href="https://tldr.nettime.org/tags/DataProtection" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>DataProtection</span></a> <a 
href="https://tldr.nettime.org/tags/AlgorithmicDiscrimination" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AlgorithmicDiscrimination</span></a> <a href="https://tldr.nettime.org/tags/AlgorithmicBias" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AlgorithmicBias</span></a> <a href="https://tldr.nettime.org/tags/Privacy" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Privacy</span></a></p>
ResearchBuzz: Firehose<p>The Conversation: Unrest in Bangladesh is revealing the bias at the heart of Google’s search engine. “…while Google’s search results are shaped by ostensibly neutral rules and processes, research has shown these algorithms often produce biased results. This problem of algorithmic bias is again being highlighted by recent escalating tensions between India and Bangladesh and cases of […]</p><p><a href="https://rbfirehose.com/2025/02/17/the-conversation-unrest-in-bangladesh-is-revealing-the-bias-at-the-heart-of-googles-search-engine/" class="" rel="nofollow noopener" target="_blank">https://rbfirehose.com/2025/02/17/the-conversation-unrest-in-bangladesh-is-revealing-the-bias-at-the-heart-of-googles-search-engine/</a></p>
PUPUWEB Blog<p>As JD Vance criticizes EU's AI regulation, 12+ US states are considering algorithmic discrimination bills strikingly similar to the EU's AI Act. <a href="https://mastodon.social/tags/AIRegulation" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AIRegulation</span></a> <a href="https://mastodon.social/tags/AlgorithmicBias" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AlgorithmicBias</span></a> <a href="https://mastodon.social/tags/TechPolicy" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TechPolicy</span></a> <a href="https://mastodon.social/tags/JDVance" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>JDVance</span></a> <a href="https://mastodon.social/tags/USStates" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>USStates</span></a> <a href="https://mastodon.social/tags/AIAct" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AIAct</span></a> <a href="https://mastodon.social/tags/Discrimination" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Discrimination</span></a> <a href="https://mastodon.social/tags/GovTech" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>GovTech</span></a> <a href="https://mastodon.social/tags/ArtificialIntelligence" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ArtificialIntelligence</span></a></p>
Michael Vera<p><a href="https://mastodon.michaelvera.org/tags/epartheid" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>epartheid</span></a>: Social engineering under the guise of curating discourse.</p><p><a href="https://mastodon.michaelvera.org/tags/fediverse" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>fediverse</span></a> <a href="https://mastodon.michaelvera.org/tags/decentralization" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>decentralization</span></a> <a href="https://mastodon.michaelvera.org/tags/socialmedia" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>socialmedia</span></a> <a href="https://mastodon.michaelvera.org/tags/censorship" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>censorship</span></a> <a href="https://mastodon.michaelvera.org/tags/onlinefreedom" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>onlinefreedom</span></a> <a href="https://mastodon.michaelvera.org/tags/techethics" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>techethics</span></a> <a href="https://mastodon.michaelvera.org/tags/privacy" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>privacy</span></a> <a href="https://mastodon.michaelvera.org/tags/internetgovernance" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>internetgovernance</span></a> <a href="https://mastodon.michaelvera.org/tags/digitalrights" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>digitalrights</span></a> <a href="https://mastodon.michaelvera.org/tags/platformbias" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>platformbias</span></a> <a href="https://mastodon.michaelvera.org/tags/algorithmicbias" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>algorithmicbias</span></a> <a 
href="https://mastodon.michaelvera.org/tags/freeexpression" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>freeexpression</span></a> <a href="https://mastodon.michaelvera.org/tags/independentmedia" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>independentmedia</span></a> <a href="https://mastodon.michaelvera.org/tags/echochamber" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>echochamber</span></a> <a href="https://mastodon.michaelvera.org/tags/onlinecommunity" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>onlinecommunity</span></a> <a href="https://mastodon.michaelvera.org/tags/techaccountability" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>techaccountability</span></a> <a href="https://mastodon.michaelvera.org/tags/contentmoderation" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>contentmoderation</span></a> <a href="https://mastodon.michaelvera.org/tags/communitygovernance" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>communitygovernance</span></a> <a href="https://mastodon.michaelvera.org/tags/digitalexclusion" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>digitalexclusion</span></a> <a href="https://mastodon.michaelvera.org/tags/isolationchamber" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>isolationchamber</span></a></p>
Michael Vera<p><a href="https://mastodon.michaelvera.org/tags/epartheid" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>epartheid</span></a>: The illusion of inclusion, the reality of exclusion.</p>
Michael Vera<p><a href="https://mastodon.michaelvera.org/tags/epartheid" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>epartheid</span></a>: Echo Chambers labeled as "Safe Spaces."</p>
Michael Vera<p><a href="https://mastodon.michaelvera.org/tags/epartheid" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>epartheid</span></a>: Moderation policies designed to silence dissent.</p>
Michael Vera<p><a href="https://mastodon.michaelvera.org/tags/epartheid" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>epartheid</span></a>: Censorship rebranded as Community Guidelines.</p>
Michael Vera<p>@Mastodon <a href="https://mastodon.michaelvera.org/tags/Mastodon" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Mastodon</span></a></p><p><a href="https://mastodon.michaelvera.org/tags/epartheid" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>epartheid</span></a>: digital isolation disguised as moderation.</p>