
  • Unfortunately for many, even in this day and age, there is not much choice. I main Linux but also keep Windows on my PC as there are still times when something will only work in Windows. Usually it’s work related or gaming (VR in particular for me), and in fairness it’s increasingly rare.

    Many other users aren’t motivated to change. For Microsoft, it’s a bit like boiling a frog - if you turn up the heat slowly, the frog just puts up with it. That’s what Microsoft is doing to its customers - a slow, constant enshittification, seeing what it can get away with. Try something and it causes outrage? Don’t worry, just undo it and try again in a few years! Many are already so used to having no privacy and being sold as a commodity that they don’t even question it happening on their own personal computer.



  • Unless you’re specifically wanting to play with a different OS, go with Debian again. It makes much more sense to be using the same version of Linux and all the software you use, rather than potentially different versions.

    Also it will be simpler to maintain as everything is the same.

    If you do want to play with / test another distro, then Mint has a low learning curve. FreeBSD is more different, but you could easily try it and switch to something else if you don’t like it. It’s different, but not so much that Linux users would feel totally lost.

    Probably the most confusing thing for a Linux user trying FreeBSD is that Bash is not installed; FreeBSD uses sh by default. Bash can easily be installed and set as the default shell, which will give a lot more familiarity. But otherwise it’ll feel like a familiar, modern, complete system, and you can use the same desktop environments you’re already familiar with from Linux.

    EDIT: You did say “backup” in your title. If that’s the main use case then definitely Debian again. If your laptop breaks or is stolen, it makes sense to have a familiar system to pick up on. It’s also important to sync and back up your data so it can be picked up on the other laptop. If a backup machine is your focus then I’d say use the same OS, look more into data retention and retrieval between the two laptops, and ensure your important data is continuously backed up.


  • Well, all we have in the article are claims from the perpetrator’s family and vague innuendo about what was on the victim’s phone.

    The only facts outlined in the article were that the victim was shot 7 times in his own home, and managed to call for help from the street before dying. The perpetrator was on the run for 2 weeks, and allegedly on drugs during that time.

    It’s trash journalism and a shit article. The allegations may be substantiated or they may not, but at the moment the story as written is the family’s opinion spliced into a few details about the crime.





  • It kind of makes sense, except the vast majority of software in all distros is not being packaged by the developers; it’s being packaged by volunteers in the relevant project. Most software is used on trust that it is built from the original code and not interfered with.

    It’s very difficult for any distro to actually audit all the code of the software they are distributing. I imagine most time is spent making sure the packages work and don’t conflict with each other.

    The verified tick is good for Flatpaks, but “hide anything not verified” seems a little over the top to me. A warning is good, but most software on Linux is used under trust - if you’re not building it yourself, you don’t know you’re getting unadulterated software. And does this apply to all the shared libraries on Flathub? Will they warn you if your software is using shared libraries that are not verified?

    And while Flatpak is a potential vector to a lot of machines if abused, it is also a sandboxed environment, unlike the vast majority of software that comes from distros’ own repos.

    Also, given the nature of Flatpaks, any distro could host its own Flatpaks, but everyone seems to use Flathub. If they’re not going to take on the responsibility of maintaining Flathub and its software, then there probably needs to be some way of “verifying” packages not coming directly from the developers. Otherwise users may lose out on the benefits of a shared, distro-agnostic library of software.

    I get why Mint are doing this, but I think it’s a bit of a false reassurance. Although from Mint’s point of view, they would be able to take direct responsibility for the software they distribute in their own repos (as much as you can in a warranty-less “use at your own risk” system).


  • If you look into the data, SteamOS Holo is listed and it is at 45.3%. Arch separately is second at 7.9%, and then third is the Flatpak installs across all Linux versions at 6%.

    The changes are more difficult to interpret, as Linux is growing overall, so shifts between Linux distros are hard to read - a small decline in relative share may still represent an increase in total numbers. While SteamOS is up another 3 percentage points, other distros combined are up more - Ubuntu and PopOS combined are up 5 percentage points. That suggests the Linux growth is split between Steam Deck and PC users, rather than one purely dominating.
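
    To illustrate the share-versus-absolute-numbers point with a toy calculation (these figures are made up for the example, not taken from the survey):

        # Illustrative numbers only: a distro's slice of Linux can shrink
        # while its absolute share of all Steam users still grows, because
        # the Linux total itself is growing.
        linux_before, linux_after = 0.015, 0.020   # Linux % of all Steam users
        slice_before, slice_after = 0.45, 0.42     # distro % within Linux

        abs_before = linux_before * slice_before   # 0.675% of all Steam users
        abs_after = linux_after * slice_after      # 0.840% of all Steam users
        print(f"{abs_before:.3%} -> {abs_after:.3%}")  # slice down, absolute up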


  • Yeah, wishful thinking, but also a bit reassuring that this is a meaningful if small shift. People are choosing Linux via Steam Decks or personally, and it’s been enabled by Proton and Wine rather than necessarily people fleeing Win 11.

    I do think Win 11 changes contribute to people trying Linux more, but I think it is Linux that is keeping people - that is what has changed. I don’t see some huge move to Linux though - it’s just growing faster as it supports gaming well and is increasingly easy to use and maintain (which has been a long trend). But Win 11 being increasingly anti-user can’t be a bad thing for Linux long term.


  • You can keep Windows and install Linux next to it.

    The best way would be to add a new SSD or M.2 drive to your PC and install Linux on that. Make that the main boot device, and Linux will normally detect Windows and give you a boot menu where you can choose between Linux and Windows each time you boot.

    Alternatively, you can resize the Windows partition and install Linux into the free space on your main drive. This is more fiddly, and things can go wrong if you don’t know what you’re doing.

    You can also boot Linux from an external USB drive, but this will be slower and may give you a false impression of Linux. You can also try Linux in a virtual machine like VirtualBox, but again this will be slower and will give you a false impression of Linux as a daily-driver OS.

    I personally run a dual-boot system - I have two M.2 NVMe drives, one with Windows and one with Linux. I barely use the Windows drive now, but I keep it around for rare work stuff or the rare occasion I have a game I can’t get to run in Linux. And I mean rare - I’ve booted Windows maybe 3 times in the last 6 months.


  • PPAs are flawed and limited to the Debian/Ubuntu ecosystem. They’re a security issue, as you really need to trust the person or group who set up the PPA (yet many people just added PPAs for all sorts of random software based on a Google search). They need to be maintained, which varies with the size of the project, and for developers they’re only a route to supporting part of the entire Linux ecosystem. They can also conflict with the main system-provided packages and repos, which can break entire systems or break upgrades (this happened to me on Mint, and I needed to do a complete system reinstall to remove legacy package conflicts).

    They’ve fallen out of fashion and rightly so.

    There are other ways to get software to users. Arch has the AUR, which is basically a huge open repo. OpenSuSE has the OBS, which is also a huge open repo. These are also not without their risks, as it’s hard to curate everything in such an expansive repo. However, others can take over packages if the original developer stops updating them, and you can see how a package was built rather than just downloading binaries, which allays some security concerns. They are also centralised and integrated into the system, while PPAs are a bit of a free-for-all.

    Flatpaks are a popular alternative now - essentially you download software which runs in a sandbox with its own dependencies. Flatpaks share their sandboxed dependencies, but it does lead to some bloat, as you’ll have system-level libraries and separate Flatpak versions of the same libraries both installed and running at the same time. However, it does mean software can run on different systems without breaking the whole system when library dependencies don’t match. There are issues around signing, though - Flathub allows anyone to maintain software rather than insisting the original devs do so. That allows software to be packaged as a Flatpak that otherwise might not be, but it adds a potential security risk of bad actors packaging software or not keeping it up to date. Flathub does now have a verified tick to show whether a Flatpak is official.

    Snap is the Canonical alternative to Flatpak - it’s controversial as it’s partly proprietary and arguably more cumbersome. The backend is closed source and under Canonical’s control. Snaps are also different in that they go beyond desktop apps and can be used for servers and other software stacks, while Flatpak is focused only on desktop apps. Canonical are also forcing Ubuntu users to use it - for example, Firefox only comes as a Snap on Ubuntu now. It has similar fundamental issues around bloat, and mostly the same benefits and issues as Flatpak, although Flatpaks are faster to start up.

    AppImages are another alternative way to distribute software - they are basically all-in-one images. You are essentially “mounting” the image and running the software inside. An AppImage includes libraries within the image and uses those instead of the local ones; it can still use local libraries too - the idea is to bundle the specific libraries that are unlikely to be on most target systems. So again there is bloat associated with it, and also security risks if the AppImage is running insecure older libraries. An AppImage can be run in a sandbox but doesn’t have to be, unlike Flatpak where sandboxing is mandatory - which is a security concern. Also, AppImages are standalone and need to be manually updated individually, while Flatpaks and Snaps are usually kept up to date via an update system.

    I used to use PPAs when I was still on Ubuntu and Mint. Now I personally use Flatpaks, rarely AppImages, and occasionally apps from the OBS, as I’m on OpenSuSE Tumbleweed. I don’t bother with Snaps at all - that’s not to say they don’t have value, but they’re not for me.

    Edit: in terms of permissions, with Flatpak you can install Flatseal and manage each app’s permissions and access individually. You can give software access to more locations, including system-level folders or all devices, should you need to. I assume you can do the same with Snap, but I don’t know how.

    Also, you can of course build software from source so it runs natively, if you can’t find it in a repo. I’ve done that a few times - it can be fiddly, but it can also be easy.


  • Jellyfin can be configured to use a specific installed version of ffmpeg.

    If you do need jellyfin-ffmpeg (which is required for certain setups), then you can download releases from GitHub or build it yourself. They do have portable releases.

    You do not necessarily need root access to use software on Linux, unless you’re trying to install it to be available to all users. Users can often install their own software, either as binaries or compiled themselves (unless the system has been locked down). It can sit within your /home/username/bin directory instead of system-level folders like /usr/bin that are normally used for executables. Your home bin folder is accessible, and so runnable, only by you, and is viable if you do not have permission to install into /usr/bin.
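
    One caveat the above skips over: launching things from your home bin folder by bare name only works if that directory is on your PATH (some distros add ~/bin automatically via ~/.profile if it exists, but not all). A quick way to check, in plain Python (nothing Jellyfin-specific here):

        import os

        # Is ~/bin on the current PATH? If not, it needs adding in your
        # shell profile before bare command names will resolve there.
        home_bin = os.path.expanduser("~/bin")
        on_path = home_bin in os.environ.get("PATH", "").split(os.pathsep)
        print(f"{home_bin} on PATH: {on_path}")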

    You can configure Jellyfin to run from your home bin folder, or run other software from that folder.

    You can get the jellyfin-ffmpeg source and releases, including portable builds, from their GitHub: https://github.com/jellyfin/jellyfin-ffmpeg



  • As a software developer you should have a bit of a head start - you can read the code. One of the big pluses of open source projects is that it’s all there in the open. Even if you’re not familiar with the specific language used, you can see the source and get a rough idea of the scope and complexity.

    And look at the GitHub details, like the age of the project and the frequency of releases, commits, and forks. Malicious projects don’t stick around for long on a host site like that, and they don’t get thousands of stars or lots of engagement from legitimate users. It’s very difficult to fake that.
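
    If you want to pull those signals quickly, the public GitHub API exposes them all. A minimal sketch in Python using the requests library ("owner/project" is a placeholder, not a real repo):

        import requests

        # Fetch basic health signals for a repo from the public GitHub API.
        # Unauthenticated requests are rate-limited, but fine for a few checks.
        resp = requests.get("https://api.github.com/repos/owner/project", timeout=10)
        resp.raise_for_status()
        repo = resp.json()

        print("created:", repo["created_at"])        # project age
        print("last push:", repo["pushed_at"])       # recent activity
        print("stars:", repo["stargazers_count"])    # community engagement
        print("forks:", repo["forks_count"])
        print("open issues:", repo["open_issues_count"])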

    Look at the project website. Real projects have active forums, detailed wikis, and evidence of user engagement. You’ll see people recommending the project elsewhere on the net if you search, or writing independent tutorials on how to deploy or use it, or reviews on YouTube etc. Look for testimonials and user experiences.

    Also look at where the software is deployed and recommended. If it’s included in the repos of big-name Linux distros, that’s a good sign.

    Look at all the things you’d be looking at for paid software to see it’s actually in use and not a scam.

    And try it out - it’s easy to set up a VM and deploy something in a safe, sandboxed environment to get a feel for whether it does what it claims to do - whether that be a cut-down system with Docker, or an entire OS in the sandbox to stress test the software and put it through its paces.

    There are so many possible elements to doing “due diligence” - ensuring it’s legitimate, but also the right solution for your needs.


  • There is no reason other than greed that tech companies have to have their fingers in so many pies. Regulators could split Google up - search separated from ads and separated from other services.

    It’s not the size so much as the breadth of its influence. We’ve gotten used to the idea that tech companies like Google and Microsoft do everything. But they’re only doing everything so they can get at every byte of our data. An email service doesn’t need to be run alongside a search engine or a news aggregator or an ad company. And it certainly doesn’t need integration between all those things.



  • So is this adjusted for inflation? The word is not mentioned once in the article.

    Using inflation calculators I get the following (used https://www.calculator.net/inflation-calculator.html and https://www.officialdata.org/us/inflation/; getting similar results)

    • 1990s - $124,800 ($298,200 today)
    • 2000s - $165,300 ($299,800 today)
    • 2010s - $219,000 ($313,600 today)
    • 2020s - $327,100 ($394,700 today)
    • Now - $420,800

    Looking at the FRED economic data (https://fred.stlouisfed.org/series/MSPUS), it looks like that’s where they got their figures. As far as I can tell, it is not inflation adjusted. They have picked the Q4 result of each year as the base for the following 1 Jan.

    When adjusted for inflation, the increase in value since the 1990s is much less AND the increase was biggest between 2010-2020.

    Also, on the article’s own figures: between 2020 and now the median price is up 28% without inflation adjustment, and 7% with. Compared to 1990, the inflation-corrected median price is up 40%, but the biggest jump was 2010-2020; by the start of 2020 it was already 32% above the 1990 price.
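
    Those percentages can be reproduced from the figures listed above (a quick sanity check; the 28% and 40% in the text just round slightly differently):

        # Median house prices from the list above: nominal, and in today's dollars.
        nominal_2020, nominal_now = 327_100, 420_800
        adjusted_1990, adjusted_2020 = 298_200, 394_700

        print(f"2020 -> now, nominal:   {nominal_now / nominal_2020 - 1:.0%}")     # ~29%
        print(f"2020 -> now, adjusted:  {nominal_now / adjusted_2020 - 1:.0%}")    # ~7%
        print(f"1990 -> now, adjusted:  {nominal_now / adjusted_1990 - 1:.0%}")    # ~41%
        print(f"1990 -> 2020, adjusted: {adjusted_2020 / adjusted_1990 - 1:.0%}")  # ~32%
        # The compounding point below: 21.54% total over roughly 4.25 years
        # (Jan 2020 to ~Apr 2024, my assumption) works out to ~4.7% a year.
        print(f"annualised inflation:   {1.2154 ** (1 / 4.25) - 1:.1%}")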

    The point? House prices are up, but inflation has been uneven over that period, with a big spike recently - the dramatic figures in the article may not reflect the real story. According to the calculators, total inflation from 2020 to 2024 was 21.54%, equivalent to about 4.7% a year. Inflation accounts for much more of the perceived price rise than the actual rise in real value.

    The problem with inflation is that people only think about today’s inflation rate. Current US inflation is 3.5%, but that compounds on top of last year’s inflation, and the year before that, and so on. So dramatic articles like this are really of dubious value.

    EDIT: The article links to “analysis” by another website, ResiClib, which does not seem to have looked at inflation at all either.


  • I’m not sure how I feel about this news story.

    On the one hand, it’s good to make sure people are aware of the limitations of secure email providers. On the other, the article almost reads as if this should come as a surprise to people.

    I use Proton Mail and pay for my account. I don’t pay for anonymity - I pay for privacy. They are two very different things.

    The article talks about opsec (operational security), and they’re right - if you need anonymity, then don’t use your personal Apple email as a recovery address. That is a flaw in the user’s approach, and in the expectation that unencrypted data held by Proton is also “secure”. Your basic details and your IP address are going to be recorded and available to law enforcement. Use a VPN or Tor to access the service, use another untraceable email for recovery, and pay via crypto if you want true anonymity. And even then there are other methods of anonymous or untraceable secure email that may be better than Proton Mail (such as self-hosting).

    But for most users like myself, if you’re not looking for anonymity, then Proton is fine as is. My email address is my name, and I use it to keep my emails secure and not snooped on by Google etc.

    Proton advertises itself as private, secure and encrypted. It does not claim to offer anonymity.