What are Gemini, Claude, and Meta doing with our data?


Privacy concerns are important, he said, “but it doesn’t mean organizations should avoid large language models altogether. If you’re hosting models yourself, on-prem or through secure cloud services like Amazon Bedrock, you can ensure that no data is retained by the model.”

St-Maurice pointed out that, in these scenarios, “the LLM functions strictly as a processor, like your laptop’s CPU. It doesn’t ‘remember’ anything you don’t store and pass back into it yourself. Build your systems so that the LLM does the thinking, while you retain control over memory, data storage, and user history. You don’t need OpenAI or Google to unlock the value of LLMs; host your own internal models, and cut out the risk of third-party data exposure entirely.”
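The pattern St-Maurice describes can be made concrete: keep the model stateless and let the application own all memory. The sketch below illustrates this under stated assumptions; `call_self_hosted_model` is a hypothetical stand-in for whatever inference endpoint you run (on-prem or a secure cloud service such as Amazon Bedrock), not a real library call.

```python
"""Minimal sketch of the stateless-LLM pattern: the application owns
conversation history and user data; the model only sees what it is
explicitly passed on each call and retains nothing afterward."""

from typing import Dict, List


def call_self_hosted_model(prompt: str) -> str:
    """Hypothetical client for a self-hosted model endpoint.

    Replace with your own inference call; the point is that the model
    itself keeps no state between requests."""
    raise NotImplementedError("wire this to your on-prem or Bedrock endpoint")


class ConversationStore:
    """Application-controlled memory: history lives here, never in the model."""

    def __init__(self) -> None:
        self._history: Dict[str, List[str]] = {}

    def ask(self, user_id: str, question: str) -> str:
        # Build the prompt only from history *we* store and choose to include.
        history = self._history.setdefault(user_id, [])
        prompt = "\n".join(history + [f"User: {question}", "Assistant:"])

        answer = call_self_hosted_model(prompt)

        # Persist the exchange in our own store; the model remembers nothing.
        history.append(f"User: {question}")
        history.append(f"Assistant: {answer}")
        return answer
```

In this arrangement the LLM really does act like a CPU: it processes whatever context the application hands it, and any retention, redaction, or deletion policy is enforced entirely in code you control.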

What people don’t understand, added Ironwall’s Zayas, “is that all this information is not only being sucked in, it’s being repurposed, it’s being reused. It’s being publicized out there, and it’s going to be used against you.”

