A Day Of Ageism, Like Any Other Day

Last month I caught a cold. Remember that phrase? Since the advent of COVID-19, it doesn’t seem like we use it much anymore. Anyway, my cold wasn’t COVID - I tested repeatedly at home and got PCR tests - but I did need over-the-counter meds like cough syrup, decongestants and homeopathic remedies.

One thing all my cold treatment purchases had in common was small print. Really small print. There I was, sick as a dog and getting pissed off because I couldn’t read the directions! How many tablets? How often? One teaspoon, or is that a tablespoon?

Yes, I do wear glasses, and I had just had my eyes checked and my prescription updated. As I age my needs change, which is true for everybody. So why aren’t our changing needs taken into consideration? Why are our changing needs, as we age, invisible? Why are we, as we age, invisible? It’s not as if we’re a negligible market: if Americans 50 and older were a country, we would be the world’s third-largest economy.

One answer is ageism.

On one of the days I was squinting at the cough syrup’s small print, I was also reading an article about ChatGPT, the virally popular Artificial Intelligence (AI) app. The author used the term Digital Ageism.

Digital Ageism refers to discrimination and prejudice against individuals in the digital world based on their age. Maybe some of you have learned about or experienced unconscious bias that is programmed into algorithms, such as facial recognition software that consistently misidentifies Black faces. It’s important to be aware of the potential for bias in AI systems and to actively work toward reducing yet another form of unconscious bias.

AI algorithms can perpetuate and promote ageism in a number of ways:

- Biased training data: If the data used to train AI algorithms is biased, it can perpetuate negative stereotypes about certain age groups. This can result in AI systems that display ageist behaviors, such as recommending products or services that are not appropriate for older individuals.

- Inaccurate facial recognition: AI-powered facial recognition systems can struggle to accurately recognize the faces of older individuals, leading to incorrect identifications or even false arrests.

- Job discrimination: AI algorithms used in the recruitment process can be ageist if they are trained on data that reflects discriminatory hiring practices. This can result in older job applicants being unfairly screened out (see the sketch after this list).

- Insufficient accessibility features: AI-powered systems and interfaces may not include accessibility features that older individuals need, such as large font sizes or audio descriptions, leading to exclusion and further marginalization.
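
To make the points about biased training data and job screening concrete, here is a minimal, purely illustrative sketch in Python (assuming NumPy and scikit-learn are installed). Everything in it is hypothetical: the data is invented and the feature names (years_experience, age) are made up for the example. It simply shows how a model trained on historically biased hiring decisions can quietly learn to score older applicants lower, even when their qualifications are identical.

```python
# Hypothetical sketch: a screening model trained on biased historical hiring data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Invented historical applicant data: years of experience and age.
years_experience = rng.uniform(0, 30, n)
age = 22 + years_experience + rng.normal(0, 5, n)

# Biased historical labels: past recruiters rewarded experience but rejected
# most candidates over ~50, regardless of qualifications.
hired = (years_experience > 5) & ~((age > 50) & (rng.random(n) < 0.8))

X = np.column_stack([years_experience, age])
model = LogisticRegression().fit(X, hired)

# In this synthetic example, two equally experienced candidates end up with
# very different scores based on age alone - the bias in the data carries over.
p_40 = model.predict_proba([[15, 40]])[0, 1]
p_62 = model.predict_proba([[15, 62]])[0, 1]
print(f"P(hire | 15 yrs experience, age 40) = {p_40:.2f}")
print(f"P(hire | 15 yrs experience, age 62) = {p_62:.2f}")
```

The point isn’t the specific numbers; it’s that nobody has to tell the model to discriminate. It simply learns the pattern that was already in the data.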

Later that same day, I attended one of my cardiac rehab sessions at a local hospital. Wired up to a heart monitor as I walked on a treadmill, I could watch my heartbeat on a big screen at the front of the room. The woman on the treadmill next to me was engaged in conversation with a doctor from the hospital, who was wearing a white coat. Given how close the treadmills were, I couldn’t help but hear snippets of their conversation.

The doctor commented, “Well, you still look great for someone your age.” Before I could say anything, she said her goodbyes and took her leave.

I consider "you still look great for someone your age" to be ageist because it reinforces negative stereotypes about aging: it implies that aging is inherently negative, that it’s unusual or surprising for someone to look good at a certain age, and that people’s looks and abilities inevitably deteriorate with time. Our healthcare system is full of systemic and internalized ageism, so I wasn’t surprised, but I was disappointed.

So, this was just one day of ageism. I went to sleep that night and woke up to another day.
