Yinz shut your boxen down?
I’m not seeing a problem here.
It’s not that. It’s about not getting complacent by eliding the semantics of what you’re doing. It’s about being consciously aware that you’re doing something that could possibly fuck stuff up.
I don’t, because stuff like that is a little too touchy to wrap in a cute shell alias. If I’m going to update a box, I’m going to update a box. If I’m going to reboot a machine, I want to be reminded that I’m going to reboot a machine (which in turn is a reminder that there are other people using stuff there and not to fuck their days up without at least a little warning).
Are they going to unfuck the layer management UI?
Practice safe hex - always wear a write protect chip!
The golden rule: “He who has the gold makes the rules.”
The only way you won’t have to provide PII is to buy it from someone outside of the exchange ecosystem, meaning face to face, with cash or a gift card (note: Local Bitcoin has been gone for about a year now). Exchanges have to comply with KYC (Know Your Customer) laws if they want to operate in the US, which is why they’re asking for PII.
Librewolf on my personal laptop.
Let’s see here…
Potato Chat - This is the first I’ve heard of it so I can’t speak to it one way or another. A cursory glance suggests that it’s had no security reviews.
Enigma - Same. The privacy policy talks about cloud storage, so there’s that. The following is also in their privacy policy:
A super group can hold up to 100,000 people, and it is not technically suitable for end-to-end encryption. You will get this prompt when you set up a group chat. Our global communication with the server is based on TLS encryption, which prevents your chat data from being eavesdropped or tampered with by others… The server will index the chat data of the super large group so that you can use the complete message search function when the local message is incomplete, and it is only valid for chat participants… we will record the ID, mobile phone number, IP location information, login time and other information of the users we have processed.
So, plaintext abounds. Definite OPSEC problem.
nandbox - No idea, but the service offers its users a webapp client as a first-class citizen. This makes me wonder about their security profile.
Telegram - Lol. And I really wish they hadn’t mentioned that hidden API…
Tor - No reason to re-litigate this argument, which happens once a year, every year, and has since the very beginning. Suffice it to say that it has a threat model that defines what it can and cannot defend against, and attacks that deanonymize users are well known, documented, and used by law enforcement.
mega.nz - I don’t use it, I haven’t looked into it, so I’m not going to run my mouth (fingers? keyboard?) about it.
Web-based generative AI tools/chatbots - Depending on which ones, there might be checks and traps for stuff like this that could have twigged him.
This bit is doing a lot of heavy lifting in the article: “…created his own public Telegram group to store his CSAM.”
Stop and think about that for a second.
LEOs using what amount to phishing attacks to grab folks looking for CSAM has a long and storied history behind it.
They remember what happened when they migrated Hotmail to Microsoft Exchange.
That’s pretty well answered here: http://vger.kernel.org/lkml/#s15-3
Ew.
for compliance we’d have to get everything re-vetted yearly
Huge pain in the ass to set up, but from the user’s end of things it was pretty easy to do.
Some years ago, I had a client with a really fucked up set of requirements:
This was during the days when booting into a LUKS encrypted Gentoo install involved copy-and-pasting a shell script out of the Gentoo wiki and adding it to the initrd. I want to say late 2006 or early 2007.
I remember creating a /boot partition, a tiny little LUKS partition (512 megs, at most) after it, and the rest of the drive was the LUKS encrypted root partition. The encrypted root partition had a randomly generated keyfile as its unlocker; it was symmetrically encrypted using gnupg and a passphrase before being stored in the tiny partition. The tiny partition had a passphrase to unlock it. gnupg was in the initrd. I think the workflow went something like this:
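The workflow described above might be sketched roughly like this. This is a reconstruction under stated assumptions, not the original script: the device names, partition layout, mount points, and file names are all hypothetical, and the real thing was a copy-and-pasted wiki script running inside the initrd.

```shell
# Hypothetical sketch of the initrd unlock sequence described above.
# /dev/sda2 = tiny LUKS "keystore" partition, /dev/sda3 = LUKS root.

# 1. Unlock the tiny LUKS partition holding the encrypted keyfile
#    (prompts for its passphrase).
cryptsetup luksOpen /dev/sda2 keystore
mount /dev/mapper/keystore /mnt/keystore

# 2. Decrypt the randomly generated keyfile with gnupg
#    (prompts for the symmetric passphrase).
gpg --decrypt /mnt/keystore/root.key.gpg > /tmp/root.key

# 3. Use the decrypted keyfile to unlock the root partition.
cryptsetup -d /tmp/root.key luksOpen /dev/sda3 root

# 4. Scrub the plaintext keyfile, close the keystore, mount the
#    real root before pivoting out of the initrd.
shred -u /tmp/root.key
umount /mnt/keystore
cryptsetup luksClose keystore
mount /dev/mapper/root /newroot
```

Two passphrases at boot (one for the keystore, one for gnupg) before the root filesystem even appears, which tracks with the "I don’t miss those days" below.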
I don’t miss those days.
Syncthing could do it.
So, it’ll cost them an hour’s worth of revenue in fines.
This is a big nothingburger because it doesn’t have a cute name, a marketing campaign, or a silly logo. /s