
Why joining the ChatGPT caricature trend could be riskier than you realise

Michael Sherman | Published

Participating in the ChatGPT caricature trend may pose privacy risks as it involves sharing personal information with an AI app that builds user profiles, potentially compromising trust and data security. Picture: ChatGPT

It has taken social media by storm, and just about everybody has jumped on the new ChatGPT caricature trend - but handing the AI app even more details on top of a photo of yourself might not be the harmless bit of fun it seems.

The trend asks you to give ChatGPT a prompt to generate the image, along with details about your hobbies, interests, work, family, and lifestyle.

In isolation, none of this information is particularly sensitive. But when you hand over a collection of these details together with a photograph, you allow the app to tie your name and personal information to your face.

Of course, ChatGPT will tell you it does not share this kind of information - and it says so because it has to.

The implications of logging into ChatGPT: Building user profiles and trusting AI

There’s also the argument that because you have to log in to ChatGPT to access your previous searches or prompts, the app is already building a database of information about its users. That is completely true, but a user’s search history is very different from deliberately sharing normally private, personal details about yourself alongside an image of your face.

It’s one of the many ways we’re welcoming AI into our lives with open arms, extending trust that hasn’t been earned but is given instantly.

Now, with this trend in full swing, we’re effectively allowing AI to build a profile of our information, both professional and personal - and we’re doing it through a tech company that cares less about the individual than about profit. What happens in the future if OpenAI, the company behind ChatGPT, changes its terms and conditions?

That, for me at least, is a scary thought.

@Michael_Sherman

IOL Tech

* The views expressed are not necessarily the views of IOL or Independent Media.

** JOIN THE CONVERSATION: Send us an email with your comments, thoughts or responses to iolletters@inl.co.za. Letters should be a maximum of 500 words, and may be edited for length. Anonymous correspondence will not be published. Submissions should include a contact number and physical address (not for publication).