According to a report by the Wall Street Journal, Apple has restricted its employees' use of external artificial intelligence tools such as ChatGPT. The move comes as the company works on developing similar technology of its own. The report, citing a document and sources, says Apple is concerned that employees using these AI programs could leak confidential data.
In addition to ChatGPT, Apple has advised its employees against using Copilot, a tool from Microsoft-owned GitHub that automates the writing of software code. In response to such concerns, OpenAI, the creator of ChatGPT, introduced an "incognito mode" for ChatGPT last month. This mode does not save users' conversation history or use it to train its artificial intelligence.
Scrutiny has increased over how ChatGPT and the chatbots it has inspired manage user data. These chatbots commonly use data from hundreds of millions of users to improve, or "train," their AI.
In related news, OpenAI recently launched the ChatGPT app for Apple's iOS in the US. Apple, OpenAI, and Microsoft have not responded to requests for comment on the matter.