"Set for a year-end release, AV2 is not only an upgrade to the widely adopted AV1 but also a foundational piece of AOMedia’s future tech stack.

AV2, a generational leap in open video coding and the answer to the world’s growing streaming demands, delivers significantly better compression performance than AV1. AV2 provides enhanced support for AR/VR applications, split-screen delivery of multiple programs, improved handling of screen content, and an ability to operate over a wider visual quality range. AV2 marks a milestone on the path to an open, innovative future of media experiences."

  • utopiah@lemmy.ml · 11 hours ago

    So… a lot more people now have:

    • 4G/5G on the go, proper broadband at home and at the office, and even in remote locations (sadly via MuskSat for now…), i.e. ways to get data just about anywhere
    • very capable devices: mobile phones, (mostly Android) clients such as video projectors or dongles, and of course computers
    • human eyes… which, on average, can’t really appreciate 4K

    … so obviously we should NOT stop looking for more efficient ways and new usages, but I’m also betting that we are basically hitting diminishing returns already. I don’t think many people care much anymore about higher screen resolution or refresh rate for typical video streaming. And because that’s the most popular usage, I imagine everything else, e.g. XR, is niche relative to it and thus has a hard time benefiting as much from the growth in performance we’ve had until now.

    TL;DR: OK cool, but aren’t we already flattening the curve on the most popular need anyway?

    • MrMcGasion@lemmy.world · 6 hours ago

      It’s not for the end user at this point; it’s for YouTube/streaming companies to spend less on bandwidth at existing resolutions. Even a 5% decrease in size at similar quality could save YouTube or Netflix millions in bandwidth costs over a year.
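
      As a rough illustration of why single-digit percentage gains matter at that scale, here’s a back-of-envelope sketch. Every figure in it (egress volume, cost per GB) is a made-up placeholder, not a real YouTube or Netflix number:

      ```typescript
      // Back-of-envelope: yearly savings from a codec that shrinks files
      // by 5% at similar quality. All inputs are hypothetical placeholders.

      const egressPetabytesPerMonth = 1_000; // assumed monthly video egress
      const costPerGB = 0.01;                // assumed blended CDN cost, USD/GB
      const sizeReduction = 0.05;            // 5% smaller files, same quality

      const gbPerYear = egressPetabytesPerMonth * 1_000_000 * 12; // PB -> GB
      const yearlyCost = gbPerYear * costPerGB;
      const yearlySavings = yearlyCost * sizeReduction;

      console.log(`Yearly egress cost: $${yearlyCost.toLocaleString()}`);           // $120,000,000
      console.log(`Saved by 5% smaller files: $${yearlySavings.toLocaleString()}`); // $6,000,000
      ```

      Even with far more conservative placeholders, the saving stays in the millions, which is why operators chase small efficiency wins.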

      • utopiah@lemmy.ml · 6 hours ago

        Thanks for the clarification. It makes me wonder, though: is it a bandwidth saving at no user cost? I.e. is the compression improved without requiring more compute on the user’s end to decompress?

        • MrMcGasion@lemmy.world · 5 hours ago

          Without hardware decoding, it will take more compute to decompress, but sites usually wait to fully roll out new codecs until hardware decoding is more ubiquitous, because of how many people use low-powered streaming sticks and Smart TVs.
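
          For what it’s worth, sites can already ask the browser whether decoding a given codec would be hardware-backed before serving it, via the MediaCapabilities API. A minimal sketch; it queries AV1’s registered codec string, since AV2 doesn’t have one yet, and the powerEfficient flag is only a hint that decoding is likely hardware-accelerated:

          ```typescript
          // Sketch: ask the browser whether it can decode a codec, and whether
          // decoding would be power-efficient (typically: hardware-accelerated).
          // Uses AV1's codec string; AV2 has no registered string yet.

          async function checkDecodeSupport(): Promise<void> {
            const info = await navigator.mediaCapabilities.decodingInfo({
              type: "file",
              video: {
                contentType: 'video/mp4; codecs="av01.0.08M.08"', // AV1 Main, level 4.0, 8-bit
                width: 1920,
                height: 1080,
                bitrate: 4_000_000, // 4 Mbps
                framerate: 30,
              },
            });
            console.log(`supported: ${info.supported}`);
            console.log(`smooth: ${info.smooth}`);
            // false here usually means software decode, i.e. more CPU and energy
            console.log(`powerEfficient: ${info.powerEfficient}`);
          }

          checkDecodeSupport().catch(console.error);
          ```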

          • utopiah@lemmy.ml · 2 hours ago

            Then it’s arguably offloading some of the cost onto the end user: large streaming companies spend a bit less on IXP contracts while viewers need newer hardware that might also draw a bit more energy to run.
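
            The per-viewer energy delta is probably small, though it does add up across millions of viewers. A hedged back-of-envelope, with every number an assumption rather than a measurement:

            ```typescript
            // Back-of-envelope: extra yearly energy cost to one viewer if a new
            // codec has to be software-decoded. All numbers are rough assumptions.

            const extraWattsSoftwareDecode = 10; // assumed extra draw during playback
            const hoursWatchedPerDay = 2;        // assumed viewing time
            const pricePerKWh = 0.30;            // assumed electricity price, USD

            const extraKWhPerYear =
              (extraWattsSoftwareDecode / 1000) * hoursWatchedPerDay * 365;
            const extraCostPerYear = extraKWhPerYear * pricePerKWh;

            console.log(`Extra energy: ${extraKWhPerYear.toFixed(1)} kWh/year`); // 7.3
            console.log(`Extra cost: $${extraCostPerYear.toFixed(2)}/year`);     // $2.19
            ```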

    • Ferk@lemmy.ml · edited 7 hours ago

      This! Also there’s AI upscaling: if good enough, it could (in theory) make a 1080p video look so close to 4K that only a lucky few healthy young people would be able to tell the difference. In the meantime, my eyesight progressively gets worse with age.