Running AI is so expensive that Amazon will probably charge you to use Alexa in the future, says outgoing exec::In an interview with Bloomberg, Dave Limp said that he “absolutely” believes that Amazon will soon start charging a subscription fee for Alexa

  • LEX@lemm.ee · 1 year ago

    That’s already here. Anyone can run AI chatbots similar to, but not as intelligent as, ChatGPT or Bard.

    llama.cpp and KoboldCpp let anyone run models locally, even with only a CPU if there’s no dedicated graphics card available (although more slowly). And there are numerous open-source models available that can be fine-tuned for just about any task.
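    (One reason CPU-only inference is even practical: these tools run weights quantized down to ~4 bits per value. Here’s a toy Python sketch of the basic absmax quantization idea — purely illustrative, not llama.cpp’s actual code:)

    ```python
    import numpy as np

    def quantize_q4(block):
        """Toy absmax 4-bit quantization: one shared scale per block of weights."""
        scale = float(np.abs(block).max()) / 7.0
        if scale == 0.0:
            scale = 1.0  # avoid divide-by-zero on an all-zero block
        # Each weight becomes a small integer in [-8, 7] -> fits in 4 bits
        q = np.clip(np.round(block / scale), -8, 7).astype(np.int8)
        return q, scale

    def dequantize_q4(q, scale):
        """Recover approximate float weights from the 4-bit codes."""
        return q.astype(np.float32) * scale

    weights = np.array([0.12, -0.50, 0.33, 0.07], dtype=np.float32)
    q, scale = quantize_q4(weights)
    approx = dequantize_q4(q, scale)
    # approx is close to weights, but storage drops from 32 bits to ~4 bits each
    ```

    Real formats refine this (per-block scales, offsets, mixed precisions), but that trade of a little accuracy for a 6–8x memory cut is what lets 7B-parameter models fit in ordinary RAM.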

    Hell, you can even run llama.cpp on Android phones.

    This has all happened in just the last year or so. In five to ten years, imo, AI will be everywhere and may even replace the need for a mobile Internet connection when it comes to looking up information.

    • Zetta@mander.xyz · 1 year ago

      Yes, and you can run a language model like Pygmalion AI locally on KoboldCpp and have a naughty AI chat as well. Or non-sexual roleplay.

      • LEX@lemm.ee · 1 year ago

        Absolutely, and there are many, many models that have iterated on and surpassed Pygmalion, as well as loads of uncensored models tuned specifically for erotic chat. Steamy roleplay is one of the driving forces behind the rapid development of the technology on lower-powered, local machines.