Compact business desktops, like others have mentioned, are great. Depending on your needs, I also like using older or used laptops. They're still power efficient if you get a recent processor model, people sell them fairly cheap used, and sometimes having an attached keyboard and display is more convenient than hooking up a crash cart.
This was a real bummer for anyone interested in running local LLMs. Memory bandwidth is the limiting factor for inference performance, and the Mac unified memory architecture is one of the cheaper ways to get a lot of fast memory, short of buying a specialist AI GPU for $5-10k. I was planning to spec more memory than usual on my next MBP upgrade in order to experiment with AI, but now I'm questioning whether the Pro chip will be fast enough to be useful.
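To make the bandwidth point concrete: during decoding, every model weight has to be read from memory once per generated token, so bandwidth divided by model size gives a rough ceiling on tokens per second. A quick sketch of that arithmetic (the bandwidth and model-size figures below are illustrative assumptions, not measured specs for any particular chip):

```python
def tokens_per_sec_ceiling(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Rough upper bound on decode speed for a memory-bandwidth-bound LLM:
    each token requires streaming all weights from memory once."""
    return bandwidth_gb_s / model_size_gb

# Hypothetical numbers: a "Pro"-class chip at ~200 GB/s vs. a
# "Max"-class chip at ~400 GB/s, running a model quantized to ~40 GB.
for name, bw in [("~200 GB/s chip", 200.0), ("~400 GB/s chip", 400.0)]:
    ceiling = tokens_per_sec_ceiling(bw, 40.0)
    print(f"{name}: at most ~{ceiling:.1f} tok/s")
```

Halving the bandwidth halves that ceiling, which is why a lower-bandwidth Pro chip could make a large model too slow to be pleasant, even with enough memory to hold it.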