• 0 Posts
  • 10 Comments
Joined 1 year ago
Cake day: June 26th, 2023


  • metiulekm@sh.itjust.works to Programming@programming.dev... · 6 months ago

    I really need to try out Mercury one day. When we did a project in Prolog at uni, it felt cool, but also incredibly dynamic in a bad way. There were a few times when we misspelled some clause, which normally would be an error, but in our case it just meant falsehood. We then spent waaay too much time hunting those down. I can’t help but think that Mercury would be as fun as Prolog, but less annoying.

    I actually use the Bower email client, which is written in Mercury, from time to time.



  • Edit: Actually, I thought about it, and I don’t think clang’s behavior is wrong in the examples he cites. Basically, you’re using an uninitialized variable, and choosing to use compiler settings which make that legal, and the compiler is saying “Okay, you didn’t give me a value for this variable, so I’m just going to pick one that’s convenient for me and do my optimizations according to the value I picked.” Is that the best thing for it to do? Maybe not; it certainly violates the principle of least surprise. But, it’s hard for me to say it’s the compiler’s fault that you constructed a program that does something surprising when uninitialized variables you’re using happen to have certain values.

    You got it correct in this edit. But the important part is that gcc will also do this, and both compilers are kinda expected to do so. The article cites some standards committee discussions: somebody suggested ensuring that signed integer overflow in C++20 would not be UB, and the committee decided against it. Likewise, somebody suggested disallowing the optimizing-out of infinite loops some 13 years ago, and the committee decided it should remain allowed. So these optimisations are clearly seen as features.

    And these are not theoretical issues by any means; there has been, for instance, this vulnerability in the kernel: https://lwn.net/Articles/342330/, which happened because the compiler just removed a null pointer check.



  • I’m super conflicted about this article. The portion on disabilities is great! But then, we see this:

    It’s considered an ‘AI-complete’ problem, something that would require computers that are as fully complex as, and functionally equivalent to, human beings. (Which about five minutes ago was precisely what the term ‘artificial intelligence’ meant, but since tech companies managed to dumb down and rebrand ‘AI’ to mean “anything utilizing a machine-learning algorithm”, the resulting terminology vacuum necessitated a new coinage, so now we have to call machine cognition of human-level complexity ‘AGI’, for ‘artificial general intelligence’.)

    This is honestly the first part that’s outright objectively wrong. A quick look at the Wiki will tell us that the term AGI was already used in 1997, for example. You can’t say that it was made up by tech companies about five minutes ago. And the author returns to this “rebranding” later in the article, so you can’t just brush this away as a misguided aside; it’s just clear that the author does not really know anything about AI, yet is willing to write an article about it. Mix this with the snarky tone, and it just gets very sad.

    It’s not that I don’t agree with what they say about AI, and I definitely agree with the big conclusions; it’s not like there are no people with a similar opinion who know more about AI (Gary Marcus, for instance), and the comparison to disabilities is the novel (to me) part. But I just couldn’t share this article with anyone. As I am writing, the top comment on [email protected] is criticizing the same part of the article, except in less nice words. I don’t think that the person who wrote that comment will learn anything helpful about disabilities from this article…



  • Imagine a soccer ball. The most traditional design consists of white hexagons and black pentagons. If you count them, you will find that there are 12 pentagons and 20 hexagons.

    Now imagine you tried to cover the entire Earth in the same way, using similarly sized hexagons and pentagons (hopefully the rules are intuitive). How many pentagons would there be? Intuitively, you would expect the counts of the two shapes to stay similar, just like on the soccer ball: lots of hexagons and lots of pentagons. But in fact, alongside the many hexagons, you would still have exactly 12 pentagons, not one less, not one more. This follows from Euler’s formula, and there is a nice sketch of the proof here: .