HACKER Q&A
📣 tygra

Are you comfortable uploading sensitive data to ChatGPT or Gemini?


The idea of asking ChatGPT and sharing my financial or medical data with OpenAI, or asking Gemini and sharing it with Google, or doing the same with any other cloud AI provider, doesn't sit well with me. How about you?


  👤 jonahbenton Accepted Answer ✓
No, absolutely not. I run local models to have those conversations.

Having read the myriad AWS data protection agreements, I would feel comfortable running Bedrock-hosted models. Others may feel differently.
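
For the local route, something like this is all I need, as a rough sketch; it assumes an Ollama server running on its default port with a model already pulled ("llama3" here is just an example name):

    # Minimal sketch: query a locally hosted model instead of a cloud provider.
    # Assumes an Ollama server on its default port (11434); "llama3" is just an
    # example model name.
    import requests

    def ask_local(prompt: str, model: str = "llama3") -> str:
        resp = requests.post(
            "http://localhost:11434/api/generate",
            json={"model": model, "prompt": prompt, "stream": False},
            timeout=120,
        )
        resp.raise_for_status()
        return resp.json()["response"]

    if __name__ == "__main__":
        # Sensitive questions never leave the machine.
        print(ask_local("What deductions should I look at given these medical bills?"))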


👤 doppelgunner
Yes for most things, no for passwords and secret keys. Sometimes I don't even trust myself with those.

👤 croes
No. I didn't want to share it before, and I don't want to now just because they've put an AI UI in front of the data grabber.

👤 WheelsAtLarge
No, but I've done it by mistake. I wanted ChatGPT to proofread a letter and uploaded it without thinking through the consequences. It's very easy to do if you aren't careful. Keep that in mind.

👤 rrmdp
A big NO

👤 kjok
For those who said NO, have you looked into alternative providers like PrivateGPT?

👤 lifestyleguru
10 years ago I was reluctant to use a smartphone browser, because it was obvious everything was tracked and profiled: on mobile there was no ad blocker and no way to edit the hosts file. Now I just use the mobile browser for everything without a second thought. Give this 3-5 years.

👤 Mo3
I usually try to anonymize whatever I send: key details in personal conversations removed or slightly altered, no real person or company names at all, etc.

Code I only run through zero-retention API accounts anyway.
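
The anonymization is nothing fancy; roughly this kind of scrubbing before anything leaves the machine (the patterns and placeholder names below are only illustrative, not a complete PII filter):

    # Rough sketch of pre-submission scrubbing; patterns and placeholder names
    # are illustrative only, not a complete PII filter.
    import re

    REPLACEMENTS = {
        "Acme Corp": "CompanyA",   # real company names -> generic labels
        "Jane Doe": "PersonA",     # real person names -> generic labels
    }

    def scrub(text: str) -> str:
        for real, fake in REPLACEMENTS.items():
            text = text.replace(real, fake)
        text = re.sub(r"[\w.+-]+@[\w-]+\.[\w.]+", "<email>", text)            # email addresses
        text = re.sub(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b", "<phone>", text)  # US-style phone numbers
        return text

    print(scrub("Ask Jane Doe (jane@acme.com, 555-123-4567) about the Acme Corp invoice."))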


👤 kingstnap
I'm kinda meh about it.

The first thing to keep in mind is the illusion of transparency. You might internally know that something is wrong or exploitable or you've made an obvious mistake, but that's generally much less obvious to others.

The second thing to keep in mind is that we are currently in a crisis of attention. There's too much to think about and do nowadays, and there is a gigantic lack of motivated actors to act upon that information. You could consider it the dual of the illusion of transparency: the illusion of motivation. Other people, by and large, just do not give a damn, because they can't and don't have time for it.

Even a nation state, if it wanted to spy on everyone's private information, would immediately find itself with too much nonsense to sift through and not enough time to follow through even on surface-level information, let alone leaks that require some sort of sophisticated synthesis across two or three disparate pieces of info.

Lastly, there's the difficulty of exploitation. You know how projects and code seem easy until you try them, and then it turns out this is taking forever and barely works? The whole devil-in-the-details thing.

Well, that applies to exploits as well. It's easy until you try it, and then you have this Swiss cheese model of success where random stuff doesn't line up correctly and your workflow breaks.

AI surveillance btw barely changes any of this calculus.


👤 kelseyfrog
My sensitive data? nah

Your sensitive data? load it up bruh


👤 riyakhanna1983
If someone were to build a paid, privacy-preserving wrapper around ChatGPT, it might not see much traction, because ChatGPT itself is free.