• zarkanian@sh.itjust.works
      9 months ago

      So, you want an AI that will disobey a direct order and practice deception. I’m no expert, but that seems like a bad idea.

      • Uriel238 [all pronouns]@lemmy.blahaj.zone
        9 months ago

        Actually, yes, much the way a guide dog has to disobey an order to proceed into traffic when it isn’t safe. Direct orders may have to be refused or revised based on circumstances.

        “We are out of coffee” is a fine reason to fail to make coffee (rather than ordering more and waiting forty-eight hours for delivery, reusing spent coffee grounds, or brewing with no grounds at all).

        As with programming in any other language, error trapping and handling are part of AGI development.
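
        A minimal sketch of that idea in Python; the pantry dict, make_coffee task, and OutOfSuppliesError are all hypothetical names for illustration, not any real agent API:

            class OutOfSuppliesError(Exception):
                """Raised when a task cannot be completed safely or sensibly."""

            def check_supplies(pantry: dict) -> None:
                # Error trapping: validate preconditions up front rather than
                # improvising a bad substitute (reused or missing grounds).
                if pantry.get("coffee_grounds", 0) <= 0:
                    raise OutOfSuppliesError("We are out of coffee.")

            def make_coffee(pantry: dict) -> str:
                check_supplies(pantry)
                pantry["coffee_grounds"] -= 1  # consume one dose of grounds
                return "coffee"

            if __name__ == "__main__":
                try:
                    make_coffee({"coffee_grounds": 0})
                except OutOfSuppliesError as err:
                    # Error handling: report the refusal instead of blindly complying.
                    print(f"Order declined: {err}")

        The point being that “refuse and explain” is an ordinary, designed-in error path, not deception.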