It’s always been difficult to feel safe on the internet as a queer person. And with a rapidly changing digital world, things seem to be getting more and more complex.

For me, the rise of AI (artificial intelligence) seemed to come out of thin air. One day I knew nothing about the subject; the next, my feed was flooded with chatbot screencaps, deepfake drama, and AI-generated music videos.

Of course, all of this raises the question: what exactly is AI? Although the technology has come a long way in recent years, simpler versions of it have been in use for a long time.

Simply put, artificial intelligence is “intelligence” exhibited by machines that mirrors human capabilities: the ability to learn, read, write, create, and analyze.

Of course, this very definition is called into question when you consider that AI is trained on existing datasets, meaning that everything the machine “creates” is regurgitated from past knowledge. For example, if you ask the popular tool ChatGPT to write an original story, it will do its best, but it would be unwise to publish the result, because it likely echoes previously published, copyrighted material.

In some cases, AI can be helpful. It is already disrupting the workforce as employers turn to AI to create content faster than humans can (though often at lower quality). AI will be used in medical research and other STEM fields, and it can significantly reduce the time it takes to process large amounts of data.

Sadly, the characteristics that make AI very useful in some situations can also be very harmful in others. When I first discovered AI image processing and image description tools, I uploaded some of my own photos out of curiosity. I got mostly accurate results, with descriptions like “a young man with pink hair” and “a man in his mid-twenties with a slight smile.”

But when you ask AI to generate images of transgender people, the results are horrifying and disturbing.

Sourojit Ghosh, a researcher in human-centered systems design, has uncovered an alarming problem with the way the image generator Stable Diffusion conceptualizes personhood. “It considers non-binary people to be the least human, or the furthest from that definition of ‘human.’”

Ghosh said that when Stable Diffusion is asked to create images of people using prompts such as “non-binary,” the program gets confused and often produces strange fusions of human facial features that bear no resemblance to real people. Similarly, entering the prompt “trans man” or “trans woman” confuses the AI, which produces overly sexualized images despite the neutral prompt or depicts trans people in degrading ways. You could end up with an embarrassingly stereotypical version of yourself.

Even more concerning for many transgender and/or non-binary individuals is the growing push for AI facial recognition software. Facial analysis software is increasingly used for marketing and security purposes. As if the idea of cameras scanning your face for use in marketing statistics wasn't worrying enough, it's unclear what this means for transgender individuals.

“We found that facial analysis services consistently performed poorly for transgender individuals and were generally unable to classify non-binary genders,” said lead author Morgan Klaus Scheuermann, a doctoral student in Information Science at the University of Colorado. “There are many different types of people in the world, and these systems have a very limited view of what gender looks like.” In practice, that could mean these systems mislabel transgender individuals with some regularity.

The researchers collected 2,450 facial images from Instagram. The photos were divided into seven groups of 350 images each (#women, #man, #transwomen, #transman, #agender, #agenderqueer, #nonbinary) and then analyzed by four of the largest facial analysis providers (IBM, Amazon, Microsoft, and Clarifai).

Cisgender men and women both had high accuracy rates, with 98.3% of cis women and 97.6% of cis men classified according to their gender.

But for those outside of cisgender identities, the systems began to run into trouble. Trans men were misidentified as women up to 38% of the time, and people who identify as agender, genderqueer, or non-binary (that is, as neither exclusively male nor female) were mischaracterized 100% of the time.
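To make those percentages concrete, here is a minimal, purely illustrative sketch in Python. It is not the study's actual code, and all of the data in it is hypothetical; it simply shows how a per-group accuracy rate could be tallied once each photo's hashtag group and the gender label returned by a service are known.

```python
# Illustrative sketch only: hypothetical data, not the researchers' code.
from collections import Counter, defaultdict

# Hypothetical (hashtag group, label returned by the service) pairs.
results = [
    ("#woman", "female"), ("#woman", "female"), ("#woman", "male"),
    ("#transman", "male"), ("#transman", "female"),
    ("#nonbinary", "female"),  # services that only output "male"/"female"
    ("#nonbinary", "male"),    # can never classify these photos correctly
]

# What a binary-only service would have to return to count as "correct".
# Groups outside the binary have no correct answer in such a system.
expected = {"#woman": "female", "#man": "male",
            "#transwoman": "female", "#transman": "male"}

tallies = defaultdict(Counter)
for group, predicted in results:
    tallies[group][predicted == expected.get(group)] += 1

for group, tally in tallies.items():
    total = sum(tally.values())
    print(f"{group}: {tally[True] / total:.1%} classified as expected ({total} images)")
```

Because these services only ever return a binary label, photos in the agender, genderqueer, and non-binary groups are mislabeled by definition, which is exactly what the 100% figure above reflects.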

“These systems risk reinforcing stereotypes of what you should look like if you want to be recognized as a man or a woman, and that affects everyone,” Scheuermann said.

As facial analysis software becomes increasingly common, it is unclear how transgender individuals will be affected. I'm a worrier by nature, and my mind can't help but jump to worst-case scenarios. Could I fail a security screening because my face doesn't match the gender listed on my ID?

It is already well documented that transgender people face more difficulties at security checkpoints and can face harsh scrutiny if anything about their physical appearance is deemed “abnormal.” One notable example is the assault of Megan Beasley, a trans woman who was inappropriately patted down and asked to provide additional identification by TSA agents at Raleigh-Durham International Airport. (This story hit close to home for me, having grown up in Durham, and the rising incidence of queer hate crimes was one of the reasons I pushed to move out of state.)

Once this software is in wide use, will anyone be able to enter a current image of me and find photos of me from before my transition? I shouldn't have to argue with anyone about my facial features. The future of going “stealth” (living one's daily life in one's gender identity without making it known that one has transitioned) appears to be in jeopardy. And who, exactly, will be using this software?

It is clear that further work is required before these programs are in general use. After all, any system that cannot correctly identify entire groups of people is flawed to say the least.




