Tag Archives: AI

AI-powered phone call service offers virtual wellness checks for seniors

I came across a new service for seniors and their family caregivers that is getting ready to launch. I’ll admit it caught my eye because it has my name! Joy Calls is an AI-powered call service that performs virtual wellness checks.

A smart feature of the service is that even though Joy is an AI-powered persona, the high-tech is contained in a familiar package: a telephone call. No special equipment or training is required to use the service. Joy calls your loved one, checking in on things like medication, hydration, diet, and mood. Your loved one’s responses are summarized and provided to the caregiver.

The concept is designed not to replace family caregivers but to virtually augment the monitoring of their loved ones, potentially extending seniors' ability to age in place safely. A price hasn't been set yet, but according to Onscreen, the company producing Joy Calls, the service is expected to cost in the range of $10 to $40 per month.

It’s an intriguing concept, and I will be interested in how well the service works and whether it can meet the needs of seniors and family caregivers. I’m curious to see how the service handles sensitive information and whether it can reduce the risk of AI hallucinations; otherwise, the calls could go off the rails. And thinking of my mother, how will Joy handle a senior who is, to put it politely, verbose? Will she interrupt and try to get the call back on track? I felt helpless at times trying to keep my mother focused on the topic at hand, so I hope Joy Calls is up to the challenge.

Image created by ChatGPT.


Filed under Awareness & Activism

Generative ghosts and the potential impact on the grieving process

AI technology has seemingly immersed itself in every part of our lives, so why not in our afterlives as well?

The concept of “generative ghosts” is outlined in a research paper released in 2024 that includes a Google DeepMind scientist as a contributor. Since then, thanks to a grant from Google, the research continues while, at the same time, enterprising AI companies are swooping in to offer products and services. I’ve been following the trend over the last year.

In the paper, the researchers defined generative ghosts as “AI agents that represent a deceased person.” According to the researchers, this differs from a static “griefbot” program, where chats with, say, your deceased grandmother about her life draw on a fixed information source you provide, such as letters, journals, and audio and video files. With generative ghosts, the program is able to create novel content and evolve over time. An example would be a grandmother offering advice on her granddaughter’s wedding day, years after the grandmother’s death.

While some may find such a concept creepy, I can see its benefits especially for younger generations, who have been raised solely in a digital world and who may not have the same emotional connection that older generations have to low-tech sources of family history such as photo albums and scrapbooks. A griefbot that’s a phone app or an avatar of grandma in a short web video sharing her beloved recipe for chocolate chip cookies might be more impactful for younger relatives. Generative ghosts could be tailored to interact with relatives of a variety of ages, serving as a generational bridge to ancestors.

Of course, there are many ethical and practical considerations to ponder when it comes to such a concept, which the research paper outlines. One question is whether the generative ghost would speak in first person, as if it were actually the deceased loved one, or in third person, representing the loved one. The form the generative ghost assumes is also a question to consider: does it remain in a digital format, exist in a virtual reality world, or take on a physical form like a robot? Another interesting question is whether the generative ghost remains in its own time period or grows in its understanding of current events. One of the most intriguing questions I found in the research paper was whether generative ghosts should be allowed to earn income if, say, your relative was a successful author.

The impact of generative ghosts on society could present a host of benefits and consequences. While they could help some people through the painful grieving process, they could also interfere with a person’s ability to move on with life after the death of a loved one. As with any digital tool, there is the risk of cybercriminals hacking and hijacking personal data.

If you could create a generative ghost of a deceased loved one, who would you choose?

Image created by Google Gemini.


Filed under Awareness & Activism

How creating GenAI prompts reminds me of dementia communication

For the past year, I’ve immersed myself in emerging generative AI technologies, mostly for my job, but also out of personal curiosity. Every industry is being impacted by AI, including caregiving. If you’ve followed any of the AI discussion, you’ll know that while the technology offers great potential in certain areas, it can also produce errors, which are referred to as “hallucinations.”

Users of these gen AI models are given the responsibility of creating suitable prompts for whatever tasks they are asking the model to complete. There are now people being hired as “prompt engineers” solely for that purpose. The reasoning goes that the better the prompt, the better the execution.

Instead of just typing a few keywords into a search engine bar, one has to think about a variety of details. There’s a lot of trial and error in the process, with the accompanying frustration, and a sense of wonder when you get it just right.

This made me think about communicating with someone in the earlier stages of dementia. The person’s communication skills are typically not that impaired early on, but some aspects may be slightly off. The misuse of a word. The incorrect memory recall. The out-of-left-field response. Not understanding a routine request the person has handled many times before.

I remember having conversations with my father during those early stages of the disease and it was disconcerting because our discussion was mostly normal, until it suddenly wasn’t. And that’s how it feels to me working with generative AI technology. It’s accurate a good deal of the time but there’s still something just a bit askew.

Just as gen AI is characterized as almost human, we sometimes feel our loved ones with dementia are not quite the people they once were. Creating AI prompts reminds me to be thoughtful when assembling the building blocks of communication, and how we may be required to reconstruct our typical communication style with our loved ones with dementia, by reframing questions and devising ingenious ways to keep the conversation, and the connection, alive.

The prompt used for the blog post image: “A digital illustration of an adult daughter and her 80-year-old mother with dementia setting at the dining room table, having a conversation with each other, with hearts floating around them.”


Filed under Awareness & Activism

ChatGPT: Does it have uses for caregivers?

If you’ve been online over the last few months, you’ve probably come across discussions about ChatGPT. The conversational AI (artificial intelligence) tool developed by OpenAI is the latest tech fad that some experts claim could take over our jobs in the future. (If you are interested in working with images instead of words, try the related DALL-E.)

You may have seen some of the program’s capabilities: it can write articles, essays, jokes and songs, debug software code, and create resumes with some input from the user. Users can have a conversation of sorts with ChatGPT while refining their requests and the tool can ingest those new points and update its responses in real time.

As someone who enjoys exploring new tools but retains a healthy amount of skepticism about such tools taking over the world, I’ve spent some time testing out ChatGPT, focusing on how the tool could potentially be of aid to caregivers.

My main takeaway is that while ChatGPT can adequately provide information on a vast range of topics, the responses are mainly generic and middling in quality, like someone reciting an encyclopedia entry. Your mileage will vary if you are asking a question on a highly technical topic or asking it to generate code for a website. But when asking for caregiving advice, such as making a caregiver plan for someone with dementia or tips on aging in place, it regurgitates acceptable but basic advice that can be found across the internet. You can see a couple of examples below:

The glaring issue for me is that there is no attribution with ChatGPT responses. That could be important when you are seeking medical advice, such as dementia caregiving tips. Do the pointers it offers come from a dementia expert like Teepa Snow or from a low-quality resource? At this point, the responses could be used as a decent starting point, but the user would need to do additional research outside of the ChatGPT system to verify, augment, and personalize the information. Google and other search engines are seeking to incorporate features of such AI-based tools into their own programs, which would offer a more conversational way to search for information.

I’m going to continue to explore the uses of ChatGPT and how it might be useful for caregivers. If you’ve used the tool, I’d love to hear your feedback.

Photo by Zac Wolff on Unsplash.


Filed under Awareness & Activism