I posted the other day that you can use my AI-based tool to clean CSAM out of your object storage. Many people expressed the wish to use it on their local filesystem-backed pict-rs as well, so I’ve just extended its functionality to allow exactly that.

The new lemmy_safety_local_storage.py will go through your pict-rs volume on the filesystem, scan each image for CSAM, and delete any matches. The requirements are:

  • A linux account with read-write access to the volume files
  • A private key authentication for that account

As my main instance uses object storage, my testing is limited to my dev instance, where it all looks OK to me. But do run it with --dry_run if you’re worried. You can delete lemmy_safety.db and rerun to perform the deletions afterwards (a method to act on the --dry_run results directly is coming soon).
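To make the scan/record/delete flow above concrete, here is a minimal sketch of how such a loop could work. This is an illustration, not the actual lemmy_safety code: the `is_csam` callback stands in for the real AI classifier, and the table layout in the sqlite db is an assumption. It does show the behavior the post describes: results are recorded in a db so a rerun skips already-checked files, which is why deleting the db and rerunning without --dry_run enforces the deletions.

```python
import os
import sqlite3

def scan_volume(root, is_csam, db_path="lemmy_safety.db", dry_run=True):
    """Walk a pict-rs volume, classify each image, and delete flagged ones.

    `is_csam` is a placeholder for the real AI classifier; the db schema
    is an illustrative assumption, not lemmy_safety's actual layout.
    """
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS checked (path TEXT PRIMARY KEY, flagged INTEGER)"
    )
    flagged = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            # Skip files already recorded: deleting the db forces a full
            # re-scan, as described in the post.
            if conn.execute(
                "SELECT 1 FROM checked WHERE path=?", (path,)
            ).fetchone():
                continue
            bad = is_csam(path)
            conn.execute("INSERT INTO checked VALUES (?, ?)", (path, int(bad)))
            if bad:
                flagged.append(path)
                if not dry_run:
                    os.remove(path)  # with --dry_run, nothing is deleted
    conn.commit()
    conn.close()
    return flagged
```

A dry run reports what would be deleted while leaving the files in place; rerunning after deleting the db with dry_run off actually removes them.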

PS: if you were using the object storage cleanup, that script has been renamed to lemmy_safety_object_storage.py

  • webghost0101@sopuli.xyz · 16 points · 1 year ago

    We Evolve!

    There’s something really satisfying about watching a community start talking about a serious issue and, days later, seeing things already improved.

    Lemmy is now the internet, glory to all volunteer devs. Let’s make it the best place we possibly can!

  • XaeroDegreaz@lemmy.world · 12 points · 1 year ago

    I’m curious… How does one even test such a thing before distributing it, without having offending files to test against?

    Like, during the development process of this project, how on earth can you test it properly? 😂

    • Rescuer6394@feddit.nl · 11 points · 1 year ago

      It uses a model that generates a description of each photo, then searches the generated description for certain terms and ranks the image into safety levels.

      To test it, you use a more general filter, for example all NSFW content, and check whether the matches are correct.
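The caption-then-keyword step described above could be sketched roughly as follows. This is a guess at the general shape, not the tool's actual logic: the term list, thresholds, and safety levels are all illustrative assumptions, and in the real tool an image-to-text model (such as clip-interrogator) would produce the caption.

```python
def rank_caption_safety(caption, unsafe_terms=("nsfw", "nude", "explicit")):
    """Rank a model-generated image caption on a crude safety scale.

    The term list and the three levels are illustrative assumptions;
    the real tool's filters and ranking are more involved.
    """
    text = caption.lower()
    hits = sum(term in text for term in unsafe_terms)
    if hits == 0:
        return "safe"
    # One matched term is merely suspicious; multiple matches escalate.
    return "unsafe" if hits > 1 else "questionable"
```

Testing with a broad NSFW filter, as the comment suggests, means checking whether captions of known-safe and known-NSFW images land in the expected buckets.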

      • poVoq@slrpnk.net · 3 points · 1 year ago

        Hmm, thinking out loud… Wouldn’t that also make it easy to remove scat porn and Hitler + Nazi flag images?

        There has been a lot of spam like that on Lemmy and at least the latter is somewhat illegal to host in Germany as well.

        • Rescuer6394@feddit.nl · 3 points · 1 year ago

          Yes… maybe.

          As the dev said, it flags a lot of false positives, so a human should look at them anyway.

          Maybe when this is a bit more evolved, we can use it to preprocess posts: if a post gets flagged for something, a mod/admin needs to approve the post manually.

          Maybe for CSAM, it gets sent to an external service specialized in that stuff, so the mod/admin doesn’t have to look at the images.

  • Decronym@lemmy.decronym.xyz [bot] · 6 points · edited · 1 year ago

    Acronyms, initialisms, abbreviations, contractions, and other phrases which expand to something larger, that I’ve seen in this thread:

    Fewer Letters | More Letters
    HTTP          | Hypertext Transfer Protocol, the Web
    IP            | Internet Protocol
    SSH           | Secure Shell for remote terminal access
    nginx         | Popular HTTP server

    3 acronyms in this thread; the most compressed thread commented on today has 8 acronyms.

    [Thread #93 for this sub, first seen 31st Aug 2023, 04:15] [FAQ] [Full list] [Contact] [Source code]

  • doot · 4 points · 1 year ago

    Seems like clip-interrogator isn’t explicitly supported on Apple silicon; I’ll give it a try later.

    thanks for making this!

  • poVoq@slrpnk.net · 3 points · 1 year ago

    Thanks for adding this. I guess I now have my weekend planned: moving Pict-rs to a server with a fast enough GPU to try this out 🤔

    • db0@lemmy.dbzer0.com [OP] · 3 points · 1 year ago

      You don’t need to move pict-rs to a GPU server. In fact, that would be prohibitively expensive long-term. I suggest you just use your PC to run this against your current pict-rs server, or rent a GPU server just for this task.

      • poVoq@slrpnk.net · 3 points · edited · 1 year ago

        I have a server with a smaller Nvidia GPU available. I hope its 3 GB of VRAM will be sufficient.

        • db0@lemmy.dbzer0.com [OP] · 2 points · 1 year ago

          OK, but just to point out that the script hasn’t been set up to run locally yet, only through SSH. Making it look for the files locally is my next to-do.

          • poVoq@slrpnk.net · 3 points · 1 year ago

            Ah, I was already wondering why it would need SSH keys to scan local files 😅 Well, I only have time to look into this on Sunday, I think, so no rush.

  • BitOneZero @ .world@lemmy.world · 1 point · edited · 1 year ago

    I hope people share the positive hits of CSAM and see how widespread the problem is…

    DRAMATIC EDIT: the records lemmy_safety_local_storage.py identifies, not the images! @bamboo@lemmy.blahaj.zone seems to think it “sounds like” I am ACTIVELY encouraging the spreading of child pornography images… NO! I mean audit data, such as timestamps, the account that uploaded, etc. Once you have the timestamp, the nginx logs from a Lemmy server should help identify the IP address.
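The timestamp-to-nginx-log step could look something like the sketch below: given an upload timestamp, find POST requests to the pict-rs upload path within a small window and collect the client IPs. The log format assumed is nginx's default "combined" format; the `/pictrs/image` path hint and the window size are illustrative guesses, not part of lemmy_safety.

```python
import re
from datetime import datetime

# nginx "combined" format: client IP first, timestamp in [brackets],
# then "METHOD /path PROTO" in quotes.
LINE_RE = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+)')

def ips_near(log_lines, upload_time, window_s=60, path_hint="/pictrs/image"):
    """Return client IPs that POSTed to the pict-rs upload path near a timestamp.

    Assumes the default combined log format; path hint and window are
    illustrative assumptions. Timezone offsets are ignored for brevity.
    """
    hits = []
    for line in log_lines:
        m = LINE_RE.match(line)
        if not m:
            continue
        ip, ts, method, path = m.groups()
        if method != "POST" or path_hint not in path:
            continue
        # Parse "31/Aug/2023:04:15:00 +0000", dropping the offset.
        when = datetime.strptime(ts.split()[0], "%d/%b/%Y:%H:%M:%S")
        if abs((when - upload_time).total_seconds()) <= window_s:
            hits.append(ip)
    return hits
```

In practice you would feed it the relevant access log and the file's creation timestamp, then cross-check the candidate IPs against account activity.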

  • fmstrat@lemmy.nowsci.com · 1 point · 1 year ago

    Is there a reason it needs a private key, vs. just being able to point it at a local folder and running as a user with write access?

  • rentar42@kbin.social · 0 points · 1 year ago

    It's great that there's now a tool, but this kind of issue is why I'm not considering self-hosting a fediverse service: due to the nature of the beast, even a single-user instance effectively becomes a publicly accessible distributor of content that others created/uploaded. I'm sure this could be restricted somewhat (by making the web UI inaccessible to the public), but federation means that other instances need to be able to get content from the server. That's way too much legal risk for me.

    • db0@lemmy.dbzer0.com [OP] · 1 point · 1 year ago

      Actually, the other instances don't pull content; you push it out instead. Basically, if you don't run pict-rs and you don't allow user registrations, you're safe from everything except potential copyright infringement from some text someone might post. If you make your web UI inaccessible to anyone but your IP, you protect against even that.