• 0 Posts
  • 24 Comments
Joined 1 year ago
Cake day: June 12th, 2023







  • The point is, laying all these people off citing performance protects Cloudflare from having to pay extra (which would be legally required if the employee was not at fault).

    This is probably not any kind of proof she can use, but it does make people aware of how Cloudflare operates.

    It’s understandable that companies have to fire people, and as an employee you’d probably do best to accept the harsh reality of business. But if they really communicate fake causes for lay-offs (not only hurting the employee mentally, but also financially, by bypassing compensation they are owed by law), the public should know about it.

    To be fair though, we cannot confirm her statements to be true either. But I think it’s an interesting share nonetheless.




  • A bunch of cells in rapid development with the potential to become a human being. Murder is a strong term, but in a broad sense I don’t think your insinuation is wrong per se.

    This might be getting a bit controversial, but for the sake of discussion:

    The important question here is whether you mind that potential for life being taken away. In this case we place priority on the human being who eventually has to dedicate her life to that potential. Or is that new potential more important than the already existing, conscious human being (especially when physical / mental problems are involved)?

    It comes down to why we live, and why we must live. Personally I believe trying to avoid (the potential of) suffering is the more reasonable guiding principle.

    If one gives life to a baby, you give it a potential for suffering which it otherwise would not have. I’d say the ways one can suffer carry greater weight than the ways one can be happy. So if you go the route of creating life, you had better be damn confident that you are in a good position to do so.

    In that philosophy, ‘murdering’ a potential with a large chance of creating more suffering for the collective is not that bad. One might view this differently when the being is conscious and might actively not want to die, as that brings the complexity of individual human choice, and what worth it has, to the table; but I think we can agree that does not apply to the unborn potential human being discussed in this topic.


  • I think it stems from the more difficult cases, and from people failing to recognize the actual suffering that comes with them.

    As with all extremes, a lot of emotion is involved. People who see or experience the hardships don't feel heard. Since the general tendency is that one ought to be alive and that this is good, this hurts people who do not want to live (like this).

    Going through a lot of trouble to conceive, and bringing triple the amount of possible suffering into the world, can feel worse than a death sentence to the people who experience it. Hence people feel the need to be vocal about it.

    But in the end I agree, there is nuance. The question is whether the extreme cases weigh heavier here.


  • In this case it’s the “suspicion” of racism, which I think we all agree should not directly lock you out of your account, but perhaps give you a warning.

    But what if it’s suspected illegal activity or content? Like the system catching your home server participating in a DDoS attack, overhearing signs of child abuse, identifying possible child pornography, or noticing illicit gun ownership. Their AI will determine that there’s a 97% chance of that being the case.

    I wonder if that would change things.

    If such a system is not allowed to block usage, it will probably at least inform the local police. Your home setup will in most cases eventually act as a panopticon.





  • I’m thinking new interfaces/concepts of interaction might be where we lose touch.

    Just like the previous baby-boom generation had people with a lot of technical knowledge about, for example, how punch cards were used to configure computers and how to type on an old typewriter, we may know a lot about more advanced software and touch interfaces, yet many of us might skip the Snapchat/TikTok scene and feel out of place.

    Not to mention upcoming things like a brain-computer interface connected to an AI; perhaps to socialize, or to create tools and content. Some of us, and maybe you as well, will join that scene too, but I already see people giving up and staying away from new stuff.

    We will have a role on the technical side because of our knowledge, but that core knowledge is no longer that important in many fields, just like most developers don’t have to worry about machine code any more.