Yea, but Gabe is not going to be around forever, and any successor leadership might have a different philosophy. And it’s never a bad idea to have a backup.
Day 1 of suddenly having the urge to keep offline, DRM-free copies of all my Steam games.
I started with Linux Mint, and if I had to start again I would still go with Mint as my first distro. It was just familiar enough while allowing me to figure out what was different on Linux. I only switched to Arch due to the quality of the wiki and the AUR (after a short trial run of Manjaro).
They are, but training models is hard and inference (actually using them) is (relatively) cheap. If you make a GPT-3-size model, you don’t always need the full H100 with 80+ GB to run it, since things like quantization show you can keep ~99% of its performance at under a quarter of the size.
Hence NVIDIA selling this at $3k as an ‘AI’ card, even though it won’t be as fast. If they need top speed for inference, though, the H100 is still the way they would go.
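Rough napkin math on why quantization shrinks the footprint so much (assuming a 175B-parameter, GPT-3-size model, and ignoring the extra memory needed for activations and the KV cache):

```python
# Approximate VRAM needed just to hold the weights of a 175B-parameter
# model at different precisions. Real inference needs more on top of this.
PARAMS = 175e9  # GPT-3-size parameter count (assumption for illustration)

def weight_vram_gb(bits_per_weight: float) -> float:
    """Gigabytes needed to store the weights at a given precision."""
    return PARAMS * bits_per_weight / 8 / 1e9

print(f"fp16: {weight_vram_gb(16):.1f} GB")  # ~350 GB, multiple H100s
print(f"int8: {weight_vram_gb(8):.1f} GB")   # ~175 GB
print(f"int4: {weight_vram_gb(4):.1f} GB")   # ~88 GB, a quarter of fp16
```

4-bit quantization cuts the weight footprint to a quarter of fp16, which is why smaller or cheaper cards become viable for inference even when they would be hopeless for training.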
If it crashes hard, I look forward to all the cheap server hardware that will hit the secondhand market in a few years. One I’m particularly excited about is the RTX 4000 SFF: single slot, 75 W, 20 GB, and roughly 3070-level performance.
Here is a thread from 7 months ago where more people noticed the video was plagiarized, after a DMCA takedown of a re-upload: https://www.reddit.com/r/youtubedrama/comments/1391d4o/internet_historians_man_in_cave_video_was/
But to actually answer your question: it takes time to prove (or even notice) that a work has been plagiarized, particularly when the person at fault doesn’t mention, or intentionally hides, the original source. The Hbomberguy video explores that in depth, and the IH video is just one example, not the main topic.