News

Samsung employees exposed sensitive data while communicating with ChatGPT

Last updated: 2023/04/07 at 1:51 AM
Security Parrot Editorial Team | Published April 7, 2023

According to media reports, Samsung engineers have been using ChatGPT to quickly fix errors in source code. However, the company has recently encountered three cases of data leaks via the chatbot, including notes from internal meetings and data related to production and profitability. As a result, Samsung is now warning employees about the dangers of using ChatGPT and is considering blocking access to the service altogether.

The Economist reported that in one case, a Samsung developer pasted the source code of a proprietary error-correction program into the chatbot, effectively handing the code to an AI service operated by a third party. In the second case, an employee shared test patterns designed to identify defective chips with ChatGPT and asked it to optimize them. In the third case, an employee used the Naver Clova app to convert a recording of an internal meeting to text and then sent the transcript to ChatGPT to prepare a presentation.

To prevent similar incidents in the future, Samsung is working on safeguards and is also considering developing its own AI service, similar to ChatGPT, for internal use. The company is warning employees that data transmitted to ChatGPT is stored on external servers and cannot be "revoked", increasing the risk of confidential information leakage. Furthermore, ChatGPT learns from the data it receives, meaning confidential information could end up being disclosed to third parties.
