Sharing sensitive business data with ChatGPT could be risky

The furor surrounding ChatGPT remains at a fever pitch as the ins and outs of the AI chatbot’s potential continue to make headlines. One issue that has caught the attention of many in the security field is whether the technology’s ingestion of sensitive business data puts organizations at risk. The fear is that if someone enters sensitive information (quarterly reports, materials for an internal presentation, sales numbers, and the like) and asks ChatGPT to write text around it, anyone could later obtain details about that company simply by asking ChatGPT about it.

On March 22, OpenAI CEO Sam Altman confirmed reports of a ChatGPT glitch that allowed some users to see the titles of other users’ conversations. On March 20, users began to see conversations appear in their history that they said they hadn’t had with the chatbot. Altman said the company feels “awful” but the “significant” error has now been fixed.

Read the full article at CSO Online.
