• 0 Posts
  • 23 Comments
Joined 1 year ago
Cake day: June 10th, 2023

  • I just kinda vaguely name them after what they do and how big they are:

    smol: my tiny little 2 bay Synology NAS that I’m no longer using
    medium: my R620 with 4x 18TB drives that is my current NAS (medium, because it’s larger than my previous NAS). It’s also a k3s worker and provides NFS PVCs.
    big: my old full-tower gaming rig that’s a k3s worker and runs my Home Assistant VM
    molecule: my current mini-ITX gaming rig and primary computer, also serves as the k3s master node and runs a lot of my home automation stuff. I think I picked molecule because it’s REALLY tiny (it’s in a Dan Cases A4v2, I think?) and it has a bunch of small stuff running on it (containers and pods)
    monolith: my old T440p laptop. It’s a large, black, featureless slab that doesn’t do much
    slab: my new Framework 13 laptop. I just kinda looked at it and said, “that’s a nice slab of metal”

    All of the above are running Linux. I tinkered with Ubuntu for the NAS (because I heard Ubuntu was good at ZFS), but I still absolutely hate Ubuntu, so it’s all Arch Linux.




  • There are definitely a lot of good options out there. What are you using right now for regular old FTP? The odds are actually pretty good that it already supports SFTP. A lot of file management applications do both and lump them together, even though they’re completely different protocols (sftp is from the late nineties).

    If it doesn’t, then I don’t know what OS you’re using, so I’ll just recommend options for the big 3. For Windows, there’s WinSCP. For MacOS there’s Cyberduck. Most file managers on Linux distros let you just type sftp://me@wherever in the navigation bar, meaning you get a totally seamless experience with the rest of your FS.

    EDIT: or, you can use sshfs-win on Windows and have your remote filesystem show up as a regular ol’ drive, just like SMB. MacOS and Linux have sshfs, and I know there are GUIs wrapping sshfs on those platforms. I personally use sshfs at home and it’s great (although no GUI wrapper, I’m a weirdo who doesn’t use a graphical file manager at all).
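
    If you want to try the sshfs route, a minimal sketch looks something like this (hostname and paths are made up, adjust to taste):

    # mount a remote directory at ~/remote (Linux/macOS with sshfs installed)
    mkdir -p ~/remote
    sshfs me@wherever:/home/me ~/remote
    # ...poke at the files like any local directory, then unmount:
    fusermount -u ~/remote    # Linux; on macOS it's plain `umount ~/remote`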



  • PART 4.

    You expect a file transfer program to reliably and faithfully transfer your files, byte-for-byte, from one system to another. FTP spits in your face and shits on your chest. You know how Linux uses LF (i.e. \n) for newlines and Windows uses CRLF (i.e. \r\n) for newlines? Pretty annoying, right? Well, FTP’s ASCII mode will automatically rip off those \r characters for you! Sounds pretty sweet, right? Fuck no it’s not. All of a sudden, your file checksums have changed. If you pass the same file back to a Windows user with a different and more sane file transfer system, then they get a broken file because FTP didn’t mind its own fucking business. If you have a CRLF file and need an LF file, just explicitly use dos2unix. Wanna go the other way? unix2dos. The tool has been around since 1989 and it’s great.
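
    If you want to see the checksum thing for yourself, here’s a quick sketch to run in a shell (filenames made up):

    printf 'one\r\ntwo\r\n' > notes.txt   # a CRLF ("DOS") text file
    sha256sum notes.txt                   # checksum of the original
    dos2unix notes.txt                    # strips the \r characters in place
    sha256sum notes.txt                   # different bytes now, different checksum
    unix2dos notes.txt                    # puts the \r characters back if you need them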

    Now, what if you’re not transferring text, but instead are transferring a picture of a cute cat? What if your binary data happens to have 0x0D0x0A somewhere in it? Well, ASCII mode will happily translate that to 0x0A and fucking ruin your adorable cat picture that you were going to share with your depressed significant other in an attempt to cheer them up. Now the ruined JPEG will remind them of the futility of their situation and they’ll slide even deeper into cold emptiness. Thanks, FTP.

    You can tell your client to use binary mode and this problem goes away! In fact, modern clients do this automatically so your SO gets to see the adorable fuzzy cat picture. But let’s just stop and think about this. Why use a protocol that is dangerous by default? Why use a protocol that supports no form of security (unless you’re using fucking godawful FTPS or FTP over SSH)? Why use a protocol that is so broken by design that small business hardware has been designed to try to unfuck it? Is it faster? I mean, not really. SFTP has encryption/decryption overhead, but your CPU is so fast that you’d need to transfer at 25+ Gb/s to notice it. Is it easier? Fuck no it’s not easier, look at all of the stupid footguns I’ve just mentioned. Is it simpler? The line protocol is simple, but so is HTTP, and HTTP has a much simpler control flow path (merging the data and control planes is objectively the right thing to do in this context). And shit, you want a simple protocol for cases where you don’t have a lot of CPU power? Use fucking TFTP. It’s dogshit, but it was intentionally designed to be dogshit so that a fucking potato could receive data with it.
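
    For reference, the binary/ASCII switch from the top of this wall of text is just one command on the control connection (server responses vary a bit):

    ---> TYPE I
    <--- 200 Type set to I.
    # and the dangerous ASCII default is:
    ---> TYPE A
    <--- 200 Type set to A.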

    There is no task that is currently being done with FTP that couldn’t be done more easily, more securely, and more quickly with some other protocol (like fucking SSH and SFTP, which is now built into fucking Windows for god’s sake). Fuck FTP.


  • PART 3.

    They made their STUPID MODEMS FUCK WITH THE FTP PACKETS. I have personally experienced this with Comcast Business. The stupid piece of shit DOCSIS modem they provide intercepts the FTP packet from your server saying “oh, connect to this address: x.x.x.x:44010” and they rewrite the fucking address to the public IP. There is no way to turn just this horse piss off. Now, for average business customers, this probably saved Comcast a bunch of money in support calls.

    However, if you’re using the so-called bridge mode on that degenerate piece of shit-wrapped-silicon (where rather than allowing the modem to give you a DHCP address, you just configure your system to have one of the addresses in the /29 space and the modem detects that and says oh okay don’t NAT traffic when it’s going to this address, just rewrite the MAC and shunt it over the right interface), then something funny happens. The modem still rewrites the contents of the packet, but it uses the wrong fucking IP address! Because the public IP that your server is running on is no longer available to the modem, the modem just chooses another fucking address.

    Then, the client tries to connect to 1.2.3.5 instead of 1.2.3.4 where your server is listening, the modem says “hey I’m 1.2.3.5 and you can fuck off, I’m dropping your SYN for port 44010”, and I get an angry call from the client asking why they can’t download their files using this worthless protocol. I remember having a conversation like this:

    Me: “Just use SFTP on port 22!”
    Client: “No! FTP is faster/more secure/good enough for my grandfather good enough for me/corporate won’t allow port 22.”
    Me: “Comcast is fucking me right now. What if we lied and served SFTP over port 21?”
    # we try it
    Client: “It’s not working! I can’t even connect!”

    I couldn’t connect either. I couldn’t connect to anything. Trying to do SFTP over port 21 caused the stupid fucking modem to CRASH.
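
    For what it’s worth, the “SFTP over port 21” experiment is only a couple of lines on a sane network (a modem that crashes is not a sane network; hostname made up):

    # /etc/ssh/sshd_config — listen on 21 in addition to 22, then reload sshd
    Port 22
    Port 21

    # client side:
    sftp -P 21 user@ftp.example.com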

    Are you starting to see what the problem is? It’s like Microsoft preserving bugs in Windows APIs so that shitty software doesn’t break, and then they end up doing crazy gymnastics to accommodate old shit like the Windows 8 -> Windows 10 thing where they couldn’t use “Windows 9” because that would confuse software into thinking it was running “Windows 95” or “Windows 98”. FTP has some bugfuck crazy design decisions that we’ve collectively decided to just “work around,” and it leads to fucking gymnastics.

    Speaking of bugfuck crazy design decisions, FTP’s default file transfer mode intentionally mangles data!

    Continued in part 4.


  • PART 2.

    NAT, much like the city of Phoenix, is a monument to man’s arrogance. Fuck NAT and fuck FTP. If your FTP server is listening directly on a public IP address hooked up directly to a proper router, then none of this applies. If you’re anything like me, the last company I worked for (a small startup), or my current company (many many thousands of employees making software you know and may or may not hate, making many billions of dollars a year), then the majority of your servers are living in RFC1918 space. Traffic from the internet is making it to them via NAT (or NAT with extra steps, i.e. L4 load balancers).

    A request comes in for $PUBLIC_IP TCP port 21 and is forwarded to your failure of a boxen at 10.0.54.187. Your FTP server is a big stupid idiot and doesn’t know this. It thinks that it’s king shit and has its own public IP address. Therefore, when it’s deciding what ADDR:PORT it’s going to tell the stupid FTP client to connect to, it just looks at one of the adapters on the box and says “oh, I’ll tell this client on the internet to connect to 10.0.54.187:44007” and then I fucking cry. The FTP client is an idiot, but the IP stack on the client’s home/business router is not and says “oh, that’s an address living in RFC1918 space, I shouldn’t send that out over the internet” and they don’t get the results of their LIST.
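
    On the wire, that failure looks something like this (made-up session, but the shape is right):

    ---> PASV
    <--- 227 Entering Passive Mode (10,0,54,187,171,231)
    # the last two numbers encode the port: 171 * 256 + 231 = 44007,
    # so the client on the internet dutifully tries 10.0.54.187:44007 and gets nowhere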

    So, how do you fix this? Well, you fix it by not using FTP. Use SFTP USE SFTP USE SFTP FOR GOD’S SAKE. But since this world is a shit fucking place, you have two options. The best option is to configure your FTP server to lie about its IP address. Rather than being honest about what a fool it is, you can tell it to send your public IP address to the client rather than the network adapter IP address. Does your public IP address change? Fuck you, you get to write a daemon that checks for that shit, rewrites your FTP server config, and HUPs the bastard (or SIGTERMs it if your server sucks and can’t do a live config reload).
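
    If you do end up writing that daemon, it’s depressingly small. A rough sketch, assuming vsftpd and its pasv_address option (swap in your own server and whatever “what’s my IP” service you trust):

    #!/usr/bin/env bash
    set -euo pipefail
    conf=/etc/vsftpd.conf
    current=$(curl -fsS https://ifconfig.me)
    configured=$(awk -F= '/^pasv_address=/ {print $2}' "$conf")
    if [[ "$current" != "$configured" ]]; then
      sed -i "s/^pasv_address=.*/pasv_address=$current/" "$conf"
      systemctl restart vsftpd   # vsftpd can't live-reload, so it gets the SIGTERM treatment
    fi

    Stick that in a cron job or a systemd timer and cry quietly.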

    Let’s say that you don’t want to do that. Let’s say you work at a small company with a small business internet plan that gives you static IPs but a shitty modem. Let’s say that you don’t know what FTP is or how it works and your boss told you to get it set up ASAP and it’s not working (because the client over in Bendoverville Arkansas is being told to connect to a 10.x.x.x address) and it surely must be your ISP’s fault. So you call up Comcast Business/AT&T/Verizon/Whoeverthefuck and you complain at their technicians for hours and hours, and eventually you get connected to a human that knows what the problem is and tells you how to configure your stupid FTP server to lie like a little sinner. The big telco megacorps don’t like that. They don’t want to waste all those hours, and they don’t want to hire too many people who can figure that shit out because it’s expensive. You wanna know what those fucking asshole companies did?

    Continued in part 3.


  • I’d like to interject for a moment. What you’re referring to as FTP is, in fact, smelly hot garbage.

    For context, I wrote this while waiting for a migraine to pass. I was angry at my brain for ruining my morning, and I like to shit on FTP. It’s fun to be hyperbolic. I don’t intend for this to be an attack on you, I was just bored and decided to write this ridiculous rant to pass the time.

    I must once again rant about FTP. I’ve no idea if you’re serious about liking it or you’re just taking the piss, but seeing those three letters surrounded by whitespace reminds me of all the bad things in the world.

    FTP is, as I’ve said, smelly hot garbage, and the infrastructure built to support FTP is even worse. Why? Well, one reason is that FTP has the most idiotic networking model conceivable. To see how crazy it is, let’s compare to a more sane protocol, like HTTP (for simplicity’s sake, I’ll do HTTP/1.1). First, you get the underlying transport protocol stuff and probably SSL. The HTTP client opens a connection from some local ephemeral port to the destination server on port 80/443/whatever and does all the normal protocol things (so syn->synack->ack and Client Hello -> Server Hello+server cert -> client kex+change cipher -> change cipher -> encrypted data). FTP does TCP too! Same same so far (minus SSL, unless you’re using FTPS). Next, the HTTP client goes like this:

    GET /index.html HTTP/1.1
    Host: www.whatever.the.fuck
    # a bunch of other headers
    
    

    and you know what fucking happens here? The fucking server responds with the data and a response code on the same goddamn TCP connection. You get a big, glorious response over the nice connection you established:

    HTTP/1.1 200 OK
    # a bunch of headers and shit
    
    HERE'S YOUR DAMN DATA NERD
    
    

    So that’s nice, and the client you’re using to read this used that flow (or an evolution of that flow if you’re using HTTP/2 or HTTP/3). So what does FTP do? It does one of two really stupid things depending on whether you’re using active or passive mode. Active mode is the default for the protocol (although not the default for most clients), so let’s analyze that! First, your FTP client initiates a TCP connection to your server on port 21 (by default), and then the server just sends this:

    <--- 220 Rebex FTP Server ready.
    
    

    ok, that kinda came out of nowhere. You’re probably using a modern client that saves you from all of the godawful footguns, so it then asks the server what it supports:

    ---> FEAT
    <--- 211-Supported extensions:
    <---  AUTH TLS;SSL;
    <---  CDUP
    <---  CLNT
    # A whole bunch of other 4 letter acronyms. If I was writing an FTP server, I'd make it swear at the user since there are a lot of fun 4 letter words
    
    

    There’s some other bullshit we don’t care about right now, although highlights include sending the username and password in plain text. There’s also ASCII vs binary mode. WE’LL GET BACK TO THAT. :|
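
    (The plain-text credentials highlight looks like this on the wire, by the way; credentials made up:)

    ---> USER me
    <--- 331 Password required for me.
    ---> PASS hunter2
    <--- 230 User logged in.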

    So then we want to do a LIST. You know what happens in active mode? Your computer opens up some random fucking TCP port. It then instructs the FTP server to CONNECT TO YOUR GODDAMN COMPUTER. Your computer is the server, and the other side is now the client. I would post a more detailed overview of the FTP commands, but most servers on the internet disable active mode because it’s a goddamn liability. All of a sudden, your computer has to be internet facing with open firewall ports, and that’s just a whole heap of shit.
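
    Just so the shape of it is clear, the abridged version of that backwards dance looks like this (addresses made up):

    ---> PORT 192,168,1,10,171,240
    <--- 200 PORT command successful.
    ---> LIST
    <--- 150 Opening ASCII mode data connection.
    # the server now connects *back* to 192.168.1.10:44016 (171 * 256 + 240) to deliver the listing
    <--- 226 Transfer complete.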

    I’m probably not blowing many minds right now because people know about this shit. I just want to mention that this is how FTP was built. The data plane and control plane are separate, and back in 19XX when this shit was invented, you could trust your fellows on ARPANET and NAT didn’t exist and sure HAM radio operators here’s the entire goddamn 44.0.0.0/8 block for you to do packet switched radio. A simple protocol for simple times, back before we knew what was good and what was bad.

    So, active mode sucks! PASV is the future, and is the default on basically all modern clients and servers! Passive mode works exactly the same as the above, except when the client goes to LIST, the server opens some random TCP port (I’ve often seen something like 44000-44010) and tells the client, “hey you, connect to 1.2.3.4:44000 to get you your tasty data.” Sounds great, right? Well, there’s a problem that I actually touched on in my last paragraph. Back when this dogshit was first squeezed out in the 70s, everyone had a public address. There were SO MANY addresses! 4 billion addresses? We’ll never use all of those! That is clearly not the case anymore. We don’t have enough addresses, and now we have this wonderful thing called NAT.
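
    For completeness, that passive-mode LIST looks like this on the wire (again, addresses made up):

    ---> PASV
    <--- 227 Entering Passive Mode (1,2,3,4,171,232)
    # the client opens a *second* TCP connection to 1.2.3.4:44008 (171 * 256 + 232) for the data
    ---> LIST
    <--- 150 Opening ASCII mode data connection.
    <--- 226 Transfer complete.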

    Continued in part 2.



  • There’s always Termux and whatever you can install there. That sounds silly, but when I download from my phone, I do it using aria2c in Termux. It works great, and everything (AFAIK) is FOSS. zsh + fzf history completion/file finding (<c-T> is a godsend) makes it possible to use a CLI on a phone without going crazy. Only really works well if you’re already comfortable with the command line, which is definitely a big if. It works really well for me, but I’m one of those weirdos that doesn’t have a graphical file manager installed on their computers.
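
    For anyone curious, the whole workflow is a couple of commands (URL is obviously a placeholder):

    pkg install aria2 fzf      # Termux's package manager
    aria2c -x 4 'https://example.com/some-big-file.iso'   # up to 4 connections to the server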



  • I’m much less worried about human piloted craft. It’s very difficult to program complex decision making and discernment. The astronauts present in the first landers will have been intensively trained in how to avoid catastrophe and will likely be able to come up with solutions on the fly if unanticipated things happen. Still dangerous, but hopefully less so.

    It will be much easier to land completely automatically once we have landing pads, radar tracking, and other infrastructure present on the surface. It’s just hard to land a robot on an airless moon with a bunch of rocks and hills and shit everywhere.


    I didn’t downvote those posts, but I did feel like the thread was aggressive when it didn’t need to be. I’d guess that a flippant/passive-aggressive remark like “New to US civil law?” was (rightfully) upsetting to the user who clearly has an understanding of the law here. That user responded in kind and defended their original comment. However, they then kept responding to other users in a fairly aggressive fashion, even when those other users were communicating in an alright way.

    I totally get it. I’d be pissed if, after posting a well reasoned and researched comment on Kubernetes, someone responded saying “new to container orchestration?” I try (and sometimes fail) to express the more vulnerable feelings underneath anger online after dealing with my anger in meatspace. I find it results in more productive conversations. It’s hard to do that, so I’m not casting aspersions. I think that’s probably why people downvoted in this case though. People try to suppress and avoid aggression and conflict because those things are uncomfortable and used to be precursors to actual physical danger. It’s just biology and emotions at work.





    I’m not making this comment to disagree with your point, but the failure of the SL-1 reactor strikes me as an engineering and process failure more than anything else. The reactor was not designed in a safe fashion, probably because it was designed as a test bed for reactors that could be deployed via airplanes to the Arctic Circle. The fact that an engineer was even able to fully remove a control rod, and the fact that removing that control rod led to a fatal steam explosion, make me think that they went too far in stripping weight and volume out of the reactor design.

    In well designed safety-critical systems, human error should not be able to cause any form of bodily harm. I don’t think it’s a great idea for a private company to be running nuclear reactors on Earth to power something as trivial as a data center (investing in storage + local solar/wind/geothermal/hamster wheel velodrome seems like a more efficient use of resources for one thing), but I also don’t think that SL-1 is the best example to cite here.

    As an aside, my high school Physics teacher went on a long diatribe about how the three SL-1 casualties were the only humans ever killed as the direct result of nuclear fission in the context of a nuclear reactor. Looking back on it, I think she was splitting hairs a bit, but it is an interesting point to make.


  • I want to offer my perspective on the AI thing from the point of view of a senior individual contributor at a larger company. Management loves the idea, but there will be a lot of developers fixing auto-generated code full of bad practices and mysterious bugs at any company that tries to lean on it instead of good devs. A large language model has no concept of good or bad, and it has no logic. It’ll happily generate string-templated SQL queries that are ripe for SQL injection. I’ve had to fix this myself. Things get even worse when you have to deal with a shit language like Bash that is absolutely full of God awful footguns. Sometimes you have to use that wretched piece of trash language, and the scripts generated are horrific. Remember that time when Steam on Linux was effectively running rm -rf /* on people’s systems? I’ve had to fix that same type of issue multiple times at my workplace.
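
    For the curious, that Steam bug was (roughly) this classic shell footgun, and it’s exactly the kind of thing that keeps getting regenerated:

    # if the cd fails, STEAMROOT ends up empty...
    STEAMROOT="$(cd "${0%/*}" && echo $PWD)"
    # ...and this quietly becomes `rm -rf /*`
    rm -rf "$STEAMROOT/"*
    # a safer pattern: refuse to run with an empty/unset variable
    rm -rf "${STEAMROOT:?refusing to nuke the filesystem}/"*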

    I think LLMs will genuinely transform parts of the software industry, but I absolutely do not think they’re going to stand in for competent developers in the near future. Maybe they can help junior developers who don’t have a good grasp on syntax and patterns and such. I’ve personally felt no need to use them, since I spend about 95% of my time on architecture, testing, and documentation.

    Now, do the higher-ups think the way that I do? Absolutely not. I’ve had senior management ask me about how I’m using AI tooling, and they always seem so disappointed when I explain why I personally don’t feel the need for it and what I feel its weaknesses are. Bossman sees it as a way to magically multiply IC efficiency for nothing, so I absolutely agree that it’s likely playing a part in at least some of these layoffs.


  • People ought to be careful with the going outside thing. Like, if you’re just going out into your yard or apartment complex then it’s fine. If you’re commuting and there’s the possibility that you might end up stranded where there’s no climate control, then please at least stick that extra layer in your backpack or something.

    I had somewhat severe hypothermia once, and it’s an insidious thing. I got colder and colder until I just stopped noticing it, and then I stopped noticing most things. I didn’t realize what was happening to me, and I would have died if I had been alone. I had others who saw my slack, dumb face and my kinda blue lips and helped me, but I’m not going to risk ever going through that again, and I’d encourage everyone to please be careful. Keeping a coat or hat or whatever with you is worth the hassle.