A new stage in AI chatbots: proprietary, “private” models

The need for privacy when communicating with chatbots has given rise to private models that do not “leak” data outside (photo: CC0 Public Domain)

Privacy has become a major concern with AI chatbots that connect to corporate servers. This is where the next stage in the technology’s development is taking shape: bots built with privacy in mind. The first such model is now a reality, and its appearance will likely spur the emergence of other alternatives.

Companies such as Samsung, JPMorgan, Apple and Amazon have banned their employees from using ChatGPT out of fear that confidential company information could leak through conversations with the bot. ChatGPT, owned by OpenAI, continuously learns from the prompts and messages that users enter.

An alternative for everyone

But now there’s an alternative for anyone worried about potentially revealing personal information to an online chatbot. PrivateGPT is an open source AI model that allows users to ask questions based on their own documents without an internet connection.

Created by a developer named Ivan Martinez Toro, PrivateGPT runs locally on the user’s own device. The system first requires downloading an open-source Large Language Model (LLM) called gpt4all. The user is then instructed to place all the files that will be used to train the chatbot into a dedicated directory so that the model can ingest the data.

Once the bot is trained, the user can ask the model any question, and it will respond using the provided documents as context. PrivateGPT can handle over 58,000 words and currently requires significant local computing resources – specifically a good CPU.
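The pattern described above – answering questions against a folder of local documents, with nothing leaving the machine – can be sketched in a few lines. This is an illustrative toy, not PrivateGPT’s actual code: the function names are invented here, and a crude word-overlap score stands in for the embedding search and LLM call a real system would use.

```python
from pathlib import Path

def load_documents(directory: str) -> dict[str, str]:
    """Read every .txt file in the directory into memory (all local, no network)."""
    return {p.name: p.read_text(encoding="utf-8")
            for p in Path(directory).glob("*.txt")}

def most_relevant(question: str, docs: dict[str, str]) -> str:
    """Pick the document sharing the most words with the question.
    A crude stand-in for the vector search a real assistant performs."""
    q_words = set(question.lower().split())
    return max(docs, key=lambda name: len(q_words & set(docs[name].lower().split())))

def build_prompt(question: str, context: str) -> str:
    """Assemble the prompt that would be handed to a locally running LLM."""
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"
```

The key design point is that every step – reading files, scoring relevance, building the prompt – happens on the user’s own hardware, which is exactly why such a tool sidesteps the data-leak concerns discussed below.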

“PrivateGPT in its current state is more of a proof of concept (POC). This is a demonstration that proves the feasibility of creating a completely local version of an AI assistant similar to ChatGPT that can accept documents and answer questions without any data leaving the computer,” says Toro.

The AI model can safely work offline. “It’s easy to imagine the potential of turning this POC into an actual product. For companies, this is an opportunity for a dramatic jump in productivity, as they will have access to their own, customized, secure and private ChatGPT.”

Toro says he created the app after seeing how valuable ChatGPT was in the workplace. “People and legal departments at my current company had access to ChatGPT for a few weeks, and we ended up running out of credits; many colleagues asked me to help them regain access because they didn’t want to go back to the old way of doing things – without it.”

Privacy is a primary motive

However, the same situation got Toro thinking about confidentiality and privacy: the company’s legal department wanted to summarize a private legal document using ChatGPT but could not, because of the privacy risks.

It is privacy that has become the major concern with online AI models that connect to corporate servers.

One scandalous case of data leakage through LLM chats occurred in April, when three Samsung employees in Korea accidentally “leaked” sensitive information to ChatGPT. One employee shared confidential source code and asked the bot to check it for bugs. Another asked ChatGPT to optimize code he had written. A third shared a recording of a meeting and asked the chatbot to convert the conversation into written notes.

A wave of bans

After this series of incidents, many of the big tech companies recognized the imminent danger to their corporate secrets and banned their employees from using chatbots. Bloomberg banned all generative AI while, it emerged, working on a proprietary model of its own.

In addition to corporate information, personal data can also leak through ChatGPT, which has prompted reactions from government regulators.

This is why Italy recently decided to temporarily ban ChatGPT – for about a month – citing concerns about the service’s use of personal information and its conflict with the EU’s General Data Protection Regulation (GDPR). The ban was later lifted after OpenAI met the conditions set by the Italian data protection authority.
