I noticed a bit of panic around here lately, and since I have had to continuously fight against pedos for the past year, I have developed tools to help me detect and prevent this content.

As luck would have it, we recently published one of our anti-CSAM checker tools as a Python library that anyone can use. So I thought I could use this to help lemmy admins feel a bit safer.

The tool can either go through all the images in your object storage and delete any CSAM it finds, or it can run continuously and scan (and delete) new images as they come in. The suggested approach is to run it with --all once, then run it as a daemon and leave it running.

A better option would be to retrieve the exact images uploaded via the lemmy/pict-rs API, but we're not quite there yet.

Let me know if you have any issues or improvements.

EDIT: Just to clarify, you should run this on your desktop PC with a GPU, not on your lemmy server!
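
For anyone wondering what the scan roughly looks like under the hood, here is a minimal sketch of the idea: walk an S3-compatible bucket (which is what pict-rs object storage uses), decode each image, run it through a classifier, and delete whatever gets flagged. The bucket name and the `looks_like_csam()` function are placeholders for illustration, not the actual library's API or flags.

```python
# Rough illustration only; the real tool's API, options, and model differ.
# Assumes an S3-compatible bucket and a placeholder classifier,
# looks_like_csam(), standing in for the GPU-based checker.
import io

import boto3
from PIL import Image


def looks_like_csam(image: Image.Image) -> bool:
    """Placeholder for the actual GPU-based checker."""
    raise NotImplementedError


def scan_bucket(bucket_name: str, dry_run: bool = True) -> None:
    bucket = boto3.resource("s3").Bucket(bucket_name)
    for obj in bucket.objects.all():
        data = obj.get()["Body"].read()
        try:
            image = Image.open(io.BytesIO(data))
            image.load()  # force a full decode; skip anything PIL can't parse
        except Exception:
            continue
        if looks_like_csam(image):
            print(f"flagged: {obj.key}")
            if not dry_run:
                obj.delete()  # remove the flagged object from storage


if __name__ == "__main__":
    scan_bucket("pictrs-media")  # hypothetical bucket name
```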

  • @Starbuck@lemmy.world

    The fun part is that it’s still a valid JPEG file if you put more data in it. The file should be fully re-encoded to be sure.

    • db0OP

      In that case, PIL should be able to read it, so no worries

      • @Starbuck@lemmy.world

        But I could take ‘flower.jpg’, which is an actual flower, and embed a second image, ‘csam.png’, inside it. Your scanner would scan ‘flower.jpg’, find it acceptable, and the embedded ‘csam.png’ would slip through with it. Not saying this isn’t a great start, but this is the reason a lot of websites that allow uploads re-encode images.

        • db0OP

          My pict-rs already re-encodes everything, so this is already a possibility for lemmy admins (see the sketch below the thread).
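
To illustrate the re-encoding point from the thread above: fully decoding an upload and writing a fresh file means only the pixel data carries over, so anything smuggled in after the image data is dropped. A minimal Pillow sketch (the file names are hypothetical, and this is not pict-rs's actual code):

```python
from PIL import Image


def reencode(src_path: str, dst_path: str) -> None:
    """Fully decode an image and write a fresh JPEG from the pixels.

    Bytes appended after the end-of-image marker (e.g. a second, hidden
    file) are not part of the decoded pixel data, so they don't survive.
    """
    with Image.open(src_path) as im:
        pixels = im.convert("RGB")  # full decode; also flattens alpha
        pixels.save(dst_path, format="JPEG", quality=90)


# Hypothetical file names, matching the example above:
reencode("flower.jpg", "flower_reencoded.jpg")
```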