• 1 Post
  • 21 Comments
Joined 1 year ago
Cake day: August 2nd, 2023

  • If we’re just talking math, triangles can be defined by choosing 3 of their 6 measurements — the 3 (A)ngles and 3 (S)ides — where order marks which parts are adjacent:
    SSS - unique
    SAS - unique
    ASS (usually written SSA) - zero, one, or two solutions depending on the lengths of the sides
    ASA - unique
    SAA - unique
    AAA - infinite solutions

    Maybe someone cleverer than me can figure out how that maps on to love and gender.
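    The ambiguous SSA case above can be checked with the law of sines: sin(B) = b·sin(A)/a either has no valid solution, one, or two. A minimal sketch (function name and the sample measurements are my own, not from the comment):

```python
import math

def ssa_solutions(a, b, angle_a_deg):
    """Return the possible angles B (degrees) for the SSA case:
    side a opposite known angle A, plus a second side b."""
    angle_a = math.radians(angle_a_deg)
    sin_b = b * math.sin(angle_a) / a
    if sin_b > 1:                      # no triangle possible
        return []
    b1 = math.degrees(math.asin(sin_b))
    candidates = {b1, 180 - b1}        # the "ambiguous" pair
    # a valid triangle needs all three angles positive and summing under 180
    return sorted(bb for bb in candidates if bb > 0 and angle_a_deg + bb < 180)

print(ssa_solutions(6, 8, 40))   # two distinct triangles fit
print(ssa_solutions(10, 8, 40))  # only one fits
print(ssa_solutions(4, 8, 40))   # none fits
```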




  • If they actually did this correctly, it would be great. Whether or not it’s possible, or even desirable, to eliminate all hate speech, it should be possible to minimize the harms.

    When somebody mutters some hateful comment to themselves, do we care? Not really. We care that the hateful comment gets repeated and amplified. We care that someone might take harmful actions based on the comment.

    If those algorithms successfully let these comments die in ignominy, they’ve done their job. My fear is that they won’t really do this, though. Instead they’ll mostly hide these comments and “accidentally” let them slip out when the company thinks it needs an “engagement” boost.


  • nednobbins@lemm.ee to 196@lemmy.blahaj.zone · Batterules · 5 months ago

    There are many subcultures around food. It’s not like the world is split between vegans and junk food addicts.

    The Cheeto-and-McDonald’s-eating crowd may have crappy nutrition, but they’re one extreme. The other extreme is meal-preppers. They know exactly how much chicken, rice, and broccoli they’re eating.

    There are huge communities of people who are very health conscious. Some of them focus their consciousness on science, some of them on other methods. Some of those people are vegans. Some aren’t.






  • When “they used to tell us we couldn’t trust Wikipedia,” it wasn’t in contrast to random websites; it was in contrast to primary sources.

    That’s still true today. Wikipedia is generally less reliable than traditional encyclopedias: https://en.wikipedia.org/wiki/Reliability_of_Wikipedia

    The people who tell you not to trust Wikipedia aren’t saying that you shouldn’t use it at all. They’re telling you not to stop there. That’s exactly what they told us about encyclopedias too.

    If you’re researching a new topic, Wikipedia is a great place for an initial overview. If you actually care about the facts, you should double-check claims independently. That means following the cited sources until you get to primary sources. If you’ve ever done this exercise, it becomes obvious why you shouldn’t trust Wikipedia on its own. Some sources are dead links, some are not publicly accessible, and many aren’t primary sources. In egregious cases, the “sources” are just opinion pieces.











  • I think you’re sort of right but it will depend heavily on how radical a shift the new technology is. In order for there to be this kind of divide there needs to be a steep learning curve to the technology. People are only willing to put up with those learning curves if there’s a significant advantage. That means that manufacturers can only successfully market “difficult” technologies if they provide a big advantage.

    I’m not aware of any old people having difficulty transitioning from quills to fountain pens to ballpoint pens. They all did basically the same thing, and you only had to make minor adjustments. Nobody bothered learning how to use the Writer since it didn’t actually let you do anything better. People were willing to climb the significant learning curve of typewriters because, once they did, they could write significantly faster.

    Computers and cell phones are a whole different way of interacting with people and information than “hardcopy” was. You didn’t just swap some objects for different ones that did the same thing. It wasn’t even just a slightly different way of doing the same thing. Those technologies allowed us to interact with the world in a totally new way. It was worth learning a bunch of weird computer stuff that older generations had never heard of because we could do things they never dreamed of. (E.g., I used to get rushed when talking with my grandmother to save on long-distance bills; now I don’t even think about long-distance costs, other than latency.)

    I’m sure that sort of thing will happen again, but it would require a far more disruptive technology than AR. That’s a small iteration we’ve already been primed for. When Terminator 1 came out, nobody was confused when it switched to “terminator vision” and you saw the AR display. That’s why I joke about neural interfaces. In theory, one could give a person significantly higher throughput to their computer, and there are all kinds of potential benefits, too. It would be worth it for people to put up with steep learning curves, unintuitive interfaces, and lots of troubleshooting if it meant they could suddenly “read” at 10,000 words a minute or control complex robots. Not everyone would go through that effort, and it would create the kinds of divides that we saw with computers.

    When I look at current technologies as an old(ish) person, it’s a very different view than my parents and grandparents had. They didn’t understand the new technologies. I have no trouble understanding them, I just think a lot of them are a waste of my time (unlike screwing around on Lemmy, which is totally productive /s).


  • My wife and I regularly joke that one day we’ll harass our kids to help us with our neural interfaces but I don’t think that sort of thing will happen any time soon.

    When I was a kid in the ’80s, a lot of people could already afford computers. They weren’t so cheap that everyone had them, but they were affordable to a fair number of people who really wanted one. A C64 cost $595 at launch; that’s under $2,000 in today’s dollars.

    The biggest barrier to computers was that they weren’t “user friendly.” If you wanted to play a simple video game, you needed to know some basic command-line instructions. When I wanted to set up my first mouse on my 8086, it involved installing drivers and editing CONFIG.SYS and AUTOEXEC.BAT. You couldn’t really do anything with a computer in those days unless you were willing to nerd out.
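    For anyone who never had the pleasure, that mouse setup amounted to a couple of lines like these. This is a sketch from memory of the era’s conventions; the directory and driver file names (MOUSE.SYS, MOUSE.COM) varied by vendor:

```
REM In CONFIG.SYS: load the mouse device driver at boot
DEVICE=C:\MOUSE\MOUSE.SYS

REM Or in AUTOEXEC.BAT: run the TSR version of the driver instead
C:\MOUSE\MOUSE.COM
```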

    At the same time, nerding out on a computer could easily get you deep into the guts of your machine in a functional way. I learned that the only way I could play video games at night was to open up the computer and disconnect the speaker wire so it wouldn’t alert my parents. I also learned that I could “hack” Bard’s Tale by opening up the main file with DEBUG and editing it so that the store would sell an infinite number of “Crystal Swords.”
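    The DEBUG trick was just overwriting bytes in a data file. A modern sketch of the same idea — the file name and item strings here are made up for illustration, not the actual Bard’s Tale layout:

```python
from pathlib import Path

def patch_bytes(path, old, new, out_path):
    """Copy a binary file, replacing every occurrence of `old` with `new`.
    `new` is padded with NULs so record offsets in the file stay intact."""
    data = Path(path).read_bytes()
    if len(new) > len(old):
        raise ValueError("replacement must not be longer than the original")
    new = new.ljust(len(old), b"\x00")
    Path(out_path).write_bytes(data.replace(old, new))

# e.g. make a shop record offer a different item (hypothetical names):
# patch_bytes("GAME.DAT", b"LONG SWORD", b"CRYSTAL SW", "GAME_MOD.DAT")
```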

    Today there are two cell phones for every human on Earth. Kids walk around with supercomputers in their pockets. But phones have become so “user friendly” that you barely even need to be literate to operate one. That’s generally a good thing, but it removes an incentive to figure out how the stuff works. Most people only bother with that if they’re having trouble getting it working in the first place.

    At the same time, it’s gotten much harder to make changes to your computer. The first Apple shipped as a bare circuit board you had to wire up yourself; you can’t even remove the battery on a modern one (without jumping through a lot of hoops). If you edit some of your games, it’s more likely to trigger piracy or cheat protection than to actually let you change anything.

    There are still large communities of computer nerds but your average person today basically treats computers like magic boxes.

    I’d expect that kind of gap in other areas too. Take 3D printing as an example. You can get a printer now for a few hundred bucks. They’re already used in industry but, at this point, they’re still very fiddly. The people who have them at home are comfortable troubleshooting, flashing firmware, wading through bad documentation, and even printing custom upgrades for their printers.