Seriously, Stop Putting Consumer Data Into ChatGPT
I’ve seen an alarming number of posts on LinkedIn and Mastodon reminding people not to input company and consumer data into ChatGPT, and warning about other ChatGPT privacy issues. It’s an important reminder, but one that also highlights the need for this blog post: there’s no way of knowing what happens to that data. DO NOT PUT IT INTO A GENERATIVE AI MODEL.
Samsung led the bad-headlines charge after employees uploaded proprietary company data into ChatGPT not once, not twice, but three times. Two instances involved asking ChatGPT to debug product code; the third entailed asking it to transcribe a recording of an internal company meeting. Fox Business generously headlined these incidents as an “accident,” but they all seem entirely intentional, not to mention foolhardy.
Before you become the next bad PR headline for using generative AI recklessly, remember that large language models such as ChatGPT, Google’s Bard, and Microsoft’s Bing AI are always learning: any data you share can become training data that shapes future outputs. For consumer data, that is a particularly problematic truth, especially in jurisdictions where consumers have strong data privacy protections and rights. Italy’s data protection authority didn’t ban ChatGPT out of the blue; it did so on the grounds that Italians never opted into their data being used to train ChatGPT and that the platform provided no legal basis for processing their data. (ChatGPT has since restored service in Italy after adding disclaimers and an age requirement.)
Lest you dismiss this as just an Italy problem, Canada has also launched an investigation into ChatGPT and privacy. As for Samsung, it has banned employees from using generative AI altogether. If you’re exploring generative AI or are concerned about ChatGPT privacy issues, tread carefully. It is your responsibility to be a good steward of company and customer data, and that means not throwing it into a black box with limited-to-no privacy protections and hoping for the best.