Just a stranger trying things.

  • 3 Posts
  • 77 Comments
Joined 1 year ago
Cake day: July 16th, 2023


  • Exactly, this is about compression. Just imagine a full HD image, 1920x1080, with 8 bits of color for each of the 3 RGB channels. That comes to 1920 x 1080 x 8 x 3 = 49,766,400 bits, or roughly 50 Mb (roughly 6 MB). That is uncompressed. Now imagine a video at 24 frames per second (typical for movies): that's almost 1200 Mb per second. For a 1h30 movie, that adds up to an immense amount of storage, just compute it :)
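
    Taking up the invitation, here is the quick back-of-the-envelope calculation in plain Python, using only the numbers from the comment above:

```python
# Uncompressed storage for a 1h30 full-HD movie, using the figures above.
BITS_PER_FRAME = 1920 * 1080 * 3 * 8   # RGB, 8 bits per channel
FPS = 24                               # typical frame rate for movies
DURATION_S = 90 * 60                   # 1h30 in seconds

total_bits = BITS_PER_FRAME * FPS * DURATION_S
total_gb = total_bits / 8 / 1e9        # bits -> bytes -> gigabytes

print(f"{total_gb:.0f} GB")            # roughly 806 GB, uncompressed
```

    So an uncompressed 1h30 movie would need on the order of 800 GB, which is why compression is unavoidable.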

    To solve this, movies are compressed (encoded). There are two types of compression: lossless (where the information is preserved exactly and there is no quality loss) and lossy (where quality is degraded). Lossy compression is the common choice because it yields by far the largest storage savings. For a given compression algorithm, the less bandwidth you allow it, the more video quality it has to sacrifice to meet your target. That target is what the bitrate refers to.
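
    To see what a target bitrate means in practice, here is a small sketch. The 5 Mb/s figure is just an illustrative streaming-quality value I picked, not a number from the comment above:

```python
# File size of a 1h30 movie encoded at a target bitrate.
BITRATE_BPS = 5 * 1e6     # illustrative target: 5 megabits per second
DURATION_S = 90 * 60      # 1h30 in seconds

size_gb = BITRATE_BPS * DURATION_S / 8 / 1e9  # bits -> bytes -> gigabytes
print(f"{size_gb:.2f} GB")                    # 3.38 GB, vs ~806 GB uncompressed
```

    The encoder's job is to throw away whatever quality is needed to squeeze the video into that budget.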

    Of note: different compression algorithms are more or less efficient at the same file size. AV1, for instance, will allow significantly higher video quality than h264 at the same file size (or bitrate).

  • The Witcher 3. It's not far from being 10 years old, but it got a very nice free graphics update and has 2 DLCs. The base game, both DLCs, the free graphics update, and a very recent mod kit, all for around 10-15 USD right now on GOG. It's a steal! I highly recommend it; it very quickly became my favorite game of all time. It offers around 100 hours of content, and replayability on top of that. What is there not to like?

  • The Hobbyist@lemmy.zip to Linux Gaming@lemmy.world: Best Graphic card for Linux Gaming

    Both AMD and Nvidia GPUs work well. The main difference is one of philosophy: AMD GPUs work particularly well with open source drivers, whereas Nvidia still mostly depends on its proprietary drivers (though those work fine on Linux too).

    Phoronix is a reputable website when it comes to benchmarking on Linux. Here is a previous benchmark with Nvidia GPUs, as an example:

    https://www.phoronix.com/review/nvidia-rtx4080-rtx4090-linux/2

    Of note: when people complain about Nvidia on Linux, you need to determine whether they are complaining about the open source or the proprietary drivers.

    I have been running Nvidia GPUs on Linux for years and have had no issues with the proprietary drivers, with both an old and a recent GPU. Of course, YMMV.

    Edit: my personal recommendation, though, would be to stick with AMD, which offers more memory and bandwidth than similarly priced Nvidia GPUs (Nvidia ships 8GB on many of its cards, which is quite disappointing these days). And with open source drivers it may be easier to get issues fixed and to find support.