Security Safe Space: ChatGPT-powered productivity apps rising in popularity, but be cautious sharing personal info
Productivity apps touting the promise of “artificial intelligence” are becoming increasingly common. From prioritizing tasks to keeping up with a fitness routine, there’s seemingly a ChatGPT-powered productivity app for just about any New Year’s resolution. Beneath the surface, however, it feels like the beginning of a cautionary tale.
This is Safe Space, a weekly security-focused column on 9to5Mac where, each Sunday, Arin Waichulis discusses the latest in data privacy, vulnerabilities, and emerging threats across Apple’s ecosystem of more than 2 billion active devices.
In March 2023, OpenAI released the ChatGPT API, with immediate adoption from heavy hitters like Snapchat for its My AI chatbot, as well as Instacart, Quizlet, and Shopify for its Shop consumer app. However, thanks to the API’s rather loose terms of service, which can only be described as “if you’ve got a pulse, you’re clear,” it wasn’t long before developers flooded the App Store with productivity-related apps built on the ChatGPT API.
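For context, here’s roughly what that integration looks like under the hood. The sketch below is a minimal backend call to OpenAI’s public Chat Completions endpoint; the model name, prompt, and error handling are illustrative assumptions, not any particular app’s actual code.

```python
# Minimal sketch of how a ChatGPT-powered app might call the API.
# Uses OpenAI's public Chat Completions REST endpoint; the model
# name and prompt below are illustrative assumptions only.
import os
import requests

def ask_chatgpt(user_prompt: str) -> str:
    response = requests.post(
        "https://api.openai.com/v1/chat/completions",
        headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
        json={
            "model": "gpt-3.5-turbo",  # illustrative model choice
            "messages": [{"role": "user", "content": user_prompt}],
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]

# Note: everything passed as user_prompt leaves the device and is
# processed (and potentially retained) on OpenAI's servers.
print(ask_chatgpt("Plan my week around three gym sessions."))
```

The key point for privacy purposes: every prompt a user types into such an app is shipped off-device to a third party, no matter how local and personal the app itself feels.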
A recent investigation by security researchers at Private Internet Access (PIA) into the privacy policies of popular personal productivity apps found “troubling” examples of poor transparency. One example was a popular AI chat assistant that uses the ChatGPT API, along with its existing database, to tailor its answers to the user’s prompt.
Despite the app’s App Store page claiming it only uses messages and device IDs to improve app functionality and manage accounts, “Its true data practices are hidden in its privacy policy, which states it collects your name and email, usage stats, and device information,” states PIA. It’s not uncommon for apps to collect this type of user data and sell it to third parties, or use it to build detailed profiles for personalized ads.
And this is a drop in the ocean compared to the onslaught of ChatGPT-powered food, health, and productivity apps available on the App Store right now. AI coding, personal fitness advice, translation – without having to comb through the data policies and practices of each one, is there a greater takeaway here?
Not long after ChatGPT’s popularity explosion in January 2023, regulators and lawmakers expressed grave concerns over the use of personal information in its training data. Italy even temporarily banned the service last year until better privacy notices were implemented.
There are two main ways ChatGPT gets hold of personal information.
The first is through the bulk data used to train the large language model (LLM). These uploads mainly include vast amounts of permissionless works: articles, books, blog posts, and other text scraped from all over the Internet.
However, the most notable in this case is through direct input to ChatGPT or one of the many apps using its API. Because the chatbot is designed to converse, it can offer a false sense of security, leading users to share sensitive information, such as names, addresses, health data, financial information, and other personal details they usually wouldn’t. Any information shared, private or not, is stored by OpenAI and, as far as we know, never deleted.
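One partial mitigation, sketched below purely as an illustration, is for an app (or a cautious user) to scrub obvious identifiers before a prompt ever leaves the device. The regex patterns here are simplistic assumptions and will miss plenty of real-world personal data.

```python
# Illustrative sketch: redact obvious identifiers before sending a
# prompt to any third-party API. These patterns are deliberately
# simple assumptions, not a complete PII detector.
import re

REDACTIONS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "[EMAIL]"),  # email addresses
    (re.compile(r"\+?\d[\d\s().-]{8,}\d"), "[PHONE]"),    # phone-like numbers
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),      # US SSN format
]

def scrub(prompt: str) -> str:
    for pattern, placeholder in REDACTIONS:
        prompt = pattern.sub(placeholder, prompt)
    return prompt

print(scrub("Email me at jane.doe@example.com or call +1 (555) 010-1234."))
# -> "Email me at [EMAIL] or call [PHONE]."
```

Even with redaction like this in place, the safest assumption is that anything typed into a chatbot may be retained indefinitely.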
While the company claims that stored data is kept free of personal identifiers, aka anonymized, I strongly caution against sharing anything private. After all, signing up for ChatGPT requires a phone number, ostensibly to help the platform prevent spam bots. That alone raises questions about how anonymous users really are.
Should you avoid apps with ChatGPT integration? Not exactly. But it’s important to exercise caution with anything you enter personal or sensitive information into. A false sense of security can lead to unknowingly oversharing, and between cybercriminals and sketchy data policies, what you share with these apps could end up anywhere.