• 0 Posts
  • 78 Comments
Joined 1 year ago
Cake day: July 28th, 2023

  • Sorry, I misinterpreted what you meant. You said “any AI models,” so I thought you meant the model itself should somehow know where its data came from. Obviously the companies training the models can catalog their data sources.

    But besides that, if you work on AI you should know better than anyone that removing training data is counter to the goal of fixing overfitting. You need more data to make a model generalize better. All you’d be doing is making it more likely to reproduce existing material, because it has less to work from. That’s worse for everyone.


  • What you’re asking for is literally impossible.

    A neural network is basically nothing more than a set of weights. If one word makes a weight go up by 0.0001 and then another word makes it go down by 0.0001, and you do that billions of times for billions of weights, how do you determine what in the data created those weights? Every single thing that’s in the training data had some kind of effect on everything else.
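
    To put the same point in symbols (a toy illustration, not the actual training math): each final weight is just a sum of billions of tiny signed updates,

      w_final = w_initial + δ1 + δ2 + … + δN    (each δ on the order of ±0.0001, N in the billions)

    and once the δs are summed, nothing records which training example produced which δ.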

    It’s like combining billions of buckets of water in one pool, then taking a single cup out of it and trying to figure out which buckets contributed to that cup. It doesn’t make any sense.


  • ayaya@lemdro.id to Linux@lemmy.ml · Flathub has passed 2 billion downloads · 5 months ago

    For me on Arch, Flatpaks are kinda useless. I can see the appeal for other distros, but Arch already has up-to-date versions of everything, and anything missing from the main repos is in the AUR.

    I also don’t like that it’s a separate package manager, that Flatpaks take up more space, and that running things from the CLI is flatpak run com.website.Something instead of just something. It’s super cumbersome compared to normal packages.
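
    If the long invocation is the main annoyance, a shell alias mostly hides it. A minimal sketch, reusing the placeholder app ID from above:

      # ~/.bashrc -- let "something" launch the Flatpak directly
      alias something='flatpak run com.website.Something'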



  • ayaya@lemdro.id to Linux@lemmy.ml · Linux really has come a long way · 5 months ago

    Oh, if you have it set to sRGB mode then they should be accurate enough, which means it’s something else. My previous monitor also had a green tint in HDR, and that was just because that monitor’s HDR was awful. If you want to check whether it’s the monitor itself, you could try it with Windows or attach a Roku/Chromecast/Firestick-type device that can output HDR. If it’s still green it’s the monitor’s fault, and if it looks fine then it’s Plasma’s fault.

    And yeah, plenty of monitors have “some HDR support,” but it’s not real HDR unless it gets bright enough (and dark enough). The whole point of a High Dynamic Range is that the range is, well, high. Black should be black and extremely bright things should be extremely bright. A lot of monitors advertise “HDR400” or “HDR600” but don’t have local dimming and only reach around 450 nits. At that level it barely looks different from SDR, which a lot of people run at 300-400 nits anyway. The overall brightness range ends up around 0.2-450 nits when it should be 0-1000, i.e. a contrast ratio of only about 2250:1 instead of effectively infinite. That 0.2 doesn’t seem like a lot, but if you’ve ever seen a completely black image in a dark room you know how not-black that is. That’s why OLED and local dimming are so important for HDR.


  • ayaya@lemdro.id to Linux@lemmy.ml · Linux really has come a long way · 5 months ago

    > I also tried running a game in gamescope with HDR enabled, experimenting with options and env vars I found online, but it just didn’t work.

    To be fair I don’t play a lot of games so I have only used HDR in Baldur’s Gate 3 and Elden Ring but it worked perfectly in both so I am 2 for 2.

    > An extra annoyance is the fact that the LDR colors are quite off with HDR enabled on Plasma. I suspect this is the fault of the display or configuration, but it’s still something I’d have to spend time researching and fixing, only to barely get any use out of it.

    Plasma is supposed to display SDR content correctly while HDR is enabled (which Windows 10 can’t even do), but I can’t test that properly because my monitor doesn’t let me disable local dimming in HDR mode, so desktop use is unbearable for me anyway. If it doesn’t look right, it’s probably something you can fix in your monitor’s OSD.

    I actually suspect the colors are correct and your normal colors are the incorrect ones. If your monitor has a wider gamut than sRGB, you need to either A) set it to sRGB mode or B) use a calibrated ICC profile. If you aren’t doing one of those, then all of your colors are oversaturated. When you switch into HDR they’re correct, but they look dull in comparison because you’re used to them being wrong. It’s a pretty common thing people experience on Windows as well; not a lot of people realize their colors are horribly inaccurate by default.

    Also, most people only turn HDR on when it’s needed. You can add a keybind for it in Plasma’s shortcut settings. The commands are kscreen-doctor output.1.hdr.enable and kscreen-doctor output.1.hdr.disable; you may need to change the output number to match your display. A toggle script is sketched below.
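
    If you’d rather have a single key that toggles, a small wrapper script works. A rough sketch, assuming output.1 is the HDR display; it tracks state in a file rather than parsing kscreen-doctor’s output:

      #!/bin/sh
      # Hypothetical HDR toggle built on the two kscreen-doctor commands above.
      STATE=/tmp/hdr-on
      if [ -e "$STATE" ]; then
          kscreen-doctor output.1.hdr.disable && rm "$STATE"
      else
          kscreen-doctor output.1.hdr.enable && touch "$STATE"
      fi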

    > I haven’t tried setting up steam itself in gamescope, but wouldn’t it be limited to one window then?

    Yep. I don’t like it honestly. It’s just an option if you want to set it up once rather than on a per-game basis.

    > but I feel like there’s a lot of people who will just pay up for a good screen that includes HDR

    That’s the thing: even if you pay up, there aren’t actually any “good” HDR monitors, at least not in the way there are good HDR TVs. That’s why some people use 48-inch TVs as monitors instead of actual monitors. There are a few monitors that are “good enough,” but I wouldn’t call any of them “good” right now. I’m one of those people who considers anything below HDR1000 to not be real HDR. If you look at the rtings.com monitor table, out of the 317 monitors they’ve reviewed, only TWO actually hit the 1000 nits of real-scene brightness needed for HDR1000. And both are miniLED with local dimming, which have haloing and blooming because there aren’t enough dimming zones.

    I have a feeling that by the time genuinely “good” HDR monitors exist (maybe 2-3 more years) that will be enough time for Linux programs to seamlessly support it instead of requiring launch arguments.


  • ayaya@lemdro.id to Linux@lemmy.ml · Linux really has come a long way · 5 months ago

    I don’t know when you last used Wayland, but in Plasma 6 I wouldn’t say it “breaks other things.” Before Plasma 6 I had plenty of problems and stayed on X11, but now it’s great. So give it another try if you haven’t recently; every issue I used to have with it a year ago is gone.

    As for the obscure parameters: as of Plasma 6.1, all you have to do for games is add gamescope --hdr-enabled to the launch options of the games that need it (example below). I don’t think that’s particularly difficult or obscure. You can also set up Steam itself to run in gamescope with --hdr-enabled, and then every game will have it.
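
    In Steam’s launch options that looks something like this (a sketch assuming a gamescope build with HDR support; %command% is Steam’s placeholder for the game’s own command line):

      gamescope --hdr-enabled -- %command%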

    For HDR movies/TV/YouTube you can copy/paste the necessary options into your mpv.conf and then forget about it. It’s a one-time thing and then it works forever.
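
    The exact options depend on your mpv build, but a commonly used set looks something like this (a sketch assuming mpv with the gpu-next renderer and Vulkan available):

      # ~/.config/mpv/mpv.conf -- hint the video's colorspace to the
      # display so it can switch into HDR mode
      vo=gpu-next
      gpu-api=vulkan
      target-colorspace-hint=yes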

    The biggest place HDR is missing is Firefox, but Firefox doesn’t have HDR on Windows either, so that’s not a Linux thing, it’s a Firefox thing.

    In my opinion, HDR on the desktop isn’t really there yet in general, not just on Linux but on computers as a whole. HDR right now is really only for enthusiasts. The only monitors that properly support HDR1000 are $500+ for the entry-level ones and $800+ for the decent ones. And you have to choose between miniLED with local dimming that doesn’t have enough zones yet, or OLEDs that risk burn-in after a year.