
Posts: 11 · Comments: 225 · Joined: 3 wk. ago

  • That is not remotely what the bill says. It just says that a school can't fail a kid because of their religion or politics. The kid still has to do the academic work.

    https://www.flsenate.gov/Session/Bill/2026/835

    (3)(a) A student may express his or her religious, political, or ideological beliefs in coursework, artwork, and other written and oral assignments free from discrimination or academic penalty. A student's homework and classroom assignments shall be evaluated, regardless of their religious, political, or ideological content, based on expected academic standards relating to the course curriculum and requirements. A student may not be penalized or rewarded based on the religious, political, or ideological content of his or her work if the coursework, artwork, or other written or oral assignments require a student's viewpoint to be expressed.

  • I guess my question is more about what constitutes a "truck SUV". There is no way that SUVs built on a truck frame outnumber SUVs built on unibody car platforms by that much.

  • How are they breaking out car SUVs vs truck SUVs?

  • The only thing you can trust about liars is that they are lying.

  • I meant the El Paso county DA.

    ... But also her too.

  • Of course parents also have a responsibility to keep their children safe. But it can't ONLY be on the parents. The platforms need accountability too.

  • Do I admit that I came here to shill for child safety policies?

    Guilty as charged.

  • BOT! FOUND ONE! INTERNET ATTACK!

  • Where is the DA?! You have an autopsy report that says he was murdered. Charge people! That's your job!

  • I don't care.

    If I can get some people on this site to start thinking about child safety in new ways that will be a win for me today.

  • The only real solution is for the parents/guardians to be engaged and involved in their children's lives.

    I don't agree with that. It's not all on the parents. It can't be all on the parents.

    This is like if the Boy Scouts said "Hey, it's not our responsibility to protect kids. The parents should have been more involved." No, if you are providing the service then it's your responsibility to make sure that service is safe.

    And yes, I believe you should be held accountable for the services you provide.

  • ID and age verification for users.

    That's not the only solution, and I've offered several others. And I'm also not the only one with ideas. But completely frictionless encrypted anonymous one-to-one communication is probably not going to last much longer. And shouldn't.

  • Not exactly, and not for long. Mastodon, for example, is working on end-to-end encryption in messages. Matrix is also private by design.

    And again, it's not that I think end-to-end encrypted one-to-one messaging is bad. But if you are going to offer it then you need to be held responsible for it.

  • Remember, I originally started this chain by asking you if every single site online should be forced to implement age-ID and you said yes.

    Fair. But I really meant that every network should have policies in place, with age verification being one option. Elsewhere in this thread you'll see that I offer alternative solutions, such as simply keeping everything public and not allowing one-to-one messaging.

  • I'm only going to say this one more time.

    They 100% will care.

  • How would they know?

    Well, if, and really when, a predator is caught by the police, that police department will do a full investigation and find all the places where they were communicating with kids. Sooner or later, one will be found to be using Lemmy. On that day, the host is going to need a good lawyer.

    It's not enough to "not allow this". A person that allows anonymous strangers to use their servers to store information in secret is asking for trouble. They need to take much more care than that.

    And I never said that age verification is the only solution to this problem.

  • 100% they would. Yeah.

    If child pornography were found stored on a host's server by one of their 1,000 users, "I didn't think you guys would care about a platform with less than 1000 monthly users" isn't going to be a great argument in court.

  • Well, first I would recommend server hosts that "can't afford to protect children" be much more careful who they let onto their personal network.

    Second, I would recommend the developer community start treating this problem seriously and use the power of the open source development process (which is really good at finding creative solutions to problems) to set this as a development priority.

  • You wouldn't have to treat it like an organization. Go after individual hosts. If a police investigation found that a fediverse host was providing an opportunity for child predators to use their system to blackmail kids into sending nude photos of themselves, then I think the host, the individual, should be held responsible for what happens on their server. Just like they would be held responsible if it happened in their house.