Sep 23, 2025

Memory Without Empathy: A UX Problem

In Technically Wrong, Sara Wachter-Boettcher writes about Facebook’s “Year in Review,” a feature that resurfaced photos of people’s deceased loved ones. It was designed to highlight “special moments,” but the algorithm didn’t understand grief.

That example hits close to home. After experiencing pregnancy loss, I still get ads about children’s cancer. I have marked them as irrelevant, but they keep showing up. Each one is a painful reminder, not only because of my own loss, but also because I know parents who have had to face the unimaginable reality of losing a child to illness.

This is not just about ads. It is about design decisions. When algorithms and AI systems decide what to surface, they shape the user experience. They remember everything we share, but they do not understand context, vulnerability, or pain. That gap between memory and understanding is where harm happens.

The lesson for UX and AI is clear:
• Give people real, lasting control over what they see, not just temporary mute buttons
• Go beyond personalization for clicks and engagement, and account for emotional impact
• Recognize that sometimes the most empathetic choice is not to resurface certain content at all

AI in UX has enormous potential. But unless it is designed with respect for the full range of human experience, it risks repeating the same mistakes we already see in today’s algorithms, only on a larger scale. Designing AI with sensitivity to grief, trauma, and vulnerability is not optional. It is essential.