"Companies will have three months from when the guidance is finalised to carry out risk assessments and make relevant changes to safeguard users…“Platforms are supposed to remove illegal content like promoting or facilitating suicide, self-harm, and child sexual abuse.”

This is already impacting futurology.today: one of the mods is British and, because of this law, doesn’t feel comfortable continuing. Since they have the back-end hosting expertise, if they go we may have to shut down the whole site.

How easy is it to block British IP addresses? Would that be enough to avoid any legal issues, if no one else involved in running the site is British and it is hosted elsewhere in the world?
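For context, a rough sketch of the IP-blocking side is below. It assumes the free MaxMind GeoLite2 country database and the geoip2 Python package; the database path and the web-server hookup are placeholders, not how futurology.today is actually set up.

```python
# Rough sketch: decide whether a request's IP geolocates to the UK.
# Assumes the free MaxMind GeoLite2-Country database and the `geoip2`
# Python package; the path and the server integration are placeholders.
import geoip2.database
import geoip2.errors

BLOCKED_COUNTRIES = {"GB"}  # ISO 3166-1 code for the United Kingdom

reader = geoip2.database.Reader("/path/to/GeoLite2-Country.mmdb")

def is_blocked(ip_address: str) -> bool:
    """Return True if the IP geolocates to a blocked country."""
    try:
        iso_code = reader.country(ip_address).country.iso_code
    except geoip2.errors.AddressNotFoundError:
        return False  # IPs the database doesn't know about stay unblocked here
    return iso_code in BLOCKED_COUNTRIES
```

The same check could also live at the reverse proxy instead of in the application, though either way it only stops casual access, not VPN users.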

  • Scrubbles@poptalk.scrubbles.tech · 4 days ago

    I’m not a Brit (though I worry similar stuff is coming here to America), but it sounds like you have to enforce those rules. Theoretically, from my high-level understanding, would it be enough to have those rules in place and, when reported, actively remove the content as a mod?

    I can’t tell exactly, but it sounds like the biggest thing is that there needs to be a way to remove the content quickly, which we have. Facebook is an obvious offender: they have a “process”, but it takes days and, as we all know, 99% of the time they don’t actually do anything.

    At a higher, more automated level, I’m guessing the automod tooling most of us admins are already using would probably be enough, or, if not, some basic AI models.
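    Roughly what I have in mind is sketched below; every name in it is a made-up placeholder rather than a real Lemmy API, and a real setup would wire this into whatever bot or automod framework you already run.

    ```python
    # Placeholder automod sketch: flag and remove content matching blocked
    # patterns. remove_comment() stands in for whatever mod action your
    # bot framework actually performs; nothing here is a real Lemmy API.
    import re

    BLOCKED_PATTERNS = [
        re.compile(p, re.IGNORECASE)
        for p in (
            r"\bbuy\s+followers\b",         # example spam pattern
            r"\bsome\s+banned\s+phrase\b",  # swap in real patterns
        )
    ]

    def trips_automod(text: str) -> bool:
        """Return True if the text matches any blocked pattern."""
        return any(p.search(text) for p in BLOCKED_PATTERNS)

    def remove_comment(comment_id: int, reason: str) -> None:
        """Stand-in for the actual mod action (API call, bot command, etc.)."""
        print(f"removing comment {comment_id}: {reason}")

    def on_new_comment(comment_id: int, text: str) -> None:
        """Hypothetical hook called for each new comment."""
        if trips_automod(text):
            remove_comment(comment_id, reason="automod: matched blocked pattern")
    ```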

    It doesn’t sound like they expect it to be perfect. Here in the States they don’t expect me to be perfect either, but they damn well expect me to follow the correct process if I do become aware of something. It’s essentially: A) take reasonable preventative measures, like actively moderating, automating what I can, banning bad users, and removing content when needed, and B) act immediately if I do become aware of anything, keeping evidence for the feds.
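    For part B, the core of it is just: archive a copy before removing anything. A bare-bones sketch is below; the evidence directory and the removal step are placeholders for however your instance actually does it.

    ```python
    # Bare-bones "keep evidence, then remove" sketch. The evidence directory
    # and the removal step are placeholders.
    import json
    import time
    from pathlib import Path

    EVIDENCE_DIR = Path("evidence")  # wherever you retain records

    def archive_evidence(content_id: int, content: dict) -> Path:
        """Write a timestamped copy of the reported content to disk."""
        EVIDENCE_DIR.mkdir(parents=True, exist_ok=True)
        record = {
            "content_id": content_id,
            "archived_at": time.time(),
            "content": content,
        }
        path = EVIDENCE_DIR / f"{content_id}-{int(time.time())}.json"
        path.write_text(json.dumps(record, indent=2))
        return path

    def handle_report(content_id: int, content: dict) -> None:
        """Archive first, then remove (removal is left to your normal mod tools)."""
        archive_evidence(content_id, content)
        print(f"content {content_id} archived; now remove it via the mod tools")
    ```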

    • Lugh (OP) · 4 days ago

      “would it be enough to have those rules in place and, when reported, actively remove the content as a mod?”

      We’re pretty good about moderating content daily on futurology.today, so I’d be confident we could cover that aspect.

      However, I’m wondering about federation issues. Are we liable for UK users who use their futurology.today account to access other instances we don’t mod?

      • Scrubbles@poptalk.scrubbles.tech · 4 days ago

        That’s a good point, and I don’t know. My gut says no; it would be on the other instance’s owner. But obviously I’m not a lawyer or anything.