Trying it out in Shadows of Doubt right now: it took performance from an unstable 25-31 fps to 61-71 fps with performance mode on and the x2 fps multiplier. I don’t really notice any input lag.

It’s not on the Decky store yet, so you have to download the extension zip manually.

Here’s the extension’s GitHub page with full instructions and details.

Basically you’ll:

  1. Install the plugin. Once it’s on the Decky store you can install it from there, but in the meantime do this:

    • Download the .zip from the release page
    • In Game Mode, go to the settings cog in the top right of the Decky Loader tab
    • Enable Developer Options
    • In the new Developer tab, select “Install from zip”.
    • Choose the “Lossless Scaling.zip” file you downloaded (likely in the Downloads folder)
    • If it does not show up, you may need to restart your device
  2. Purchase and install Lossless Scaling from Steam

  3. Open the plugin from the Decky menu

  4. Click “Install lsfg-vk” to automatically set up the compatibility layer

  5. Configure settings using the plugin’s UI controls:

    • Enable/disable LSFG
    • Set the FPS multiplier (2-4). Note: the higher the multiplier, the greater the input lag
    • Enable performance mode - reduces GPU load, which can sometimes significantly increase FPS gains
    • Adjust flow scale (0.25-1.0)
    • Toggle HDR mode
    • Toggle immediate mode (disable vsync)
  6. Apply launch commands to the game you want to use frame generation with (see the examples just below this list):

    • Option 1 (Recommended): ~/lsfg %COMMAND% - Uses your plugin configuration
    • Option 2: Manual environment variables like ENABLE_LSFG=1 LSFG_MULTIPLIER=2 %COMMAND%
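
To make both options concrete, here’s roughly what would go in the game’s Properties → Launch Options in Steam. The variable names are just the ones mentioned in this post and thread (ENABLE_LSFG, LSFG_MULTIPLIER, LSFG_PERF_MODE); the plugin’s other toggles have their own names in the lsfg-vk README, so treat this as a sketch rather than a complete reference:

    # Option 1: the ~/lsfg wrapper uses whatever you configured in the plugin UI
    ~/lsfg %COMMAND%

    # Option 2: set things by hand, e.g. 2x frame gen with performance mode on
    ENABLE_LSFG=1 LSFG_MULTIPLIER=2 LSFG_PERF_MODE=1 %COMMAND%
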
  • xthexder@l.sw0.com · 3 days ago

    I was confused about what it meant by “Lossless” since it’s frame gen… there’s no compression, or anything to lose, it’s starting from nothing.

    As far as I can tell it means nothing, it’s just the branding for the “Lossless Scaling” tool on Steam. There’s no new lossless algorithm involved here.

    • Fubarberry@sopuli.xyz (OP) · 3 days ago

      My understanding is the tool was originally focused on upscaling, and “lossless upscaling” was the app’s main feature (along with letting you apply other kinds of upscaling).

      So the frame generation part is more accurately “the lossless upscaling app’s unique frame generation”, but it’s shortened to just lossless frame gen even though that’s not really accurate.

  • morgan423@lemmy.world · 3 days ago (edited)

    So I’m 0-1 so far (Samsung-screened SD OLED). Tried Baldur’s Gate 3 with a large variety of settings; it either crashed on boot or booted with no video.

    I know it’s a DX11 game so it rarely agrees with tools like this, but I was hoping, lol. If I try anything else, I’ll edit this same post so as not to take over the thread.

    EDIT: OMG. Make that 1-1. It was user error, I’d accidentally slid the files into the folder next door instead of the plugins folder.

    After fixing it, I booted it up and… WOW. I already had BG3 set to 720p Balanced; after applying the x3 multiplier (because who cares about input lag in this game), I’m now up in the 75 to 90 fps range.

    This is absolutely NUTS, I’ve never consistently gotten more than 30 on the Deck on this game. What a game changer, at least for non-action games. Will have to see how bad the input lag feels on action titles, but just speeding up modern/slower RPGs alone is a big, big thing.

  • vividspecter@aussie.zone · 3 days ago (edited)

    Another useful use case: the tool works on video playback through mpv to interpolate to a higher frame rate. I know that subjectively not everyone likes that for film, but for footage that isn’t shot like cinema, such as sports and YouTube videos, it’s a nice improvement.

    In terms of quality vs performance, I’d say it’s somewhere between the lower-quality SVP default and the higher-quality (but very resource-intensive) RIFE implementation. There’s also LSFG_PERF_MODE=1 and decreasing the flow scale; the former was a pretty obvious decline in quality, but it might be needed on slower GPUs.
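
    For anyone wanting to try the mpv route, a rough sketch: mpv has to be rendering through Vulkan for a Vulkan layer to apply, and the env var names here are just the ones mentioned in this post and thread, so adapt as needed for your setup.

        # force mpv onto a Vulkan renderer, then enable 2x interpolation
        ENABLE_LSFG=1 LSFG_MULTIPLIER=2 mpv --vo=gpu-next --gpu-api=vulkan video.mkv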

    EDIT: Another piece of advice: set PROTON_USE_WOW64=1 if you’re trying to run a 32-bit game, as there isn’t a 32-bit build of lsfg-vk at the moment. That variable lets 32-bit games use 64-bit dependencies, provided it’s a Windows game and you’re on a recent version of Proton (Experimental, and likely Proton GE 10 or greater).
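
    As launch options that would look something like this (assuming it composes with the OP’s wrapper command, which it should, since it’s just an extra environment variable in front):

        # 32-bit Windows game under Proton: use the 64-bit WoW64 path so the
        # 64-bit-only lsfg-vk layer can load
        PROTON_USE_WOW64=1 ~/lsfg %COMMAND%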

  • morgan423@lemmy.world · 3 days ago

    Thanks for the heads up!

    I have a couple of RPGs where I have zero concern on input lag, but they could definitely use a frame boost. I’m going to give this a try today. 😀

    • Fubarberry@sopuli.xyz (OP) · 3 days ago (edited)

      Different framegen techs have different requirements. Some, like DLSS and the newer FSR, require specific GPU hardware; some have to be built into the game specifically. Lossless is great because it works on most hardware and most games.

      My understanding here is that it’s working as part of the Vulkan pipeline, but I don’t have enough knowledge in that area to answer more accurately than that. This article discusses what the dev of lsfg-vk had to do to get lossless framegen working on Linux, and it can give some insight into how it’s working.
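
      If you want to poke at it yourself, Vulkan layers like this are normally registered as JSON manifests in the loader’s standard search paths. The paths and tools below are generic Vulkan loader conventions, not anything confirmed for lsfg-vk specifically, so treat this as a rough way to investigate rather than gospel:

          # typical locations where implicit layer manifests get installed
          ls /usr/share/vulkan/implicit_layer.d/ ~/.local/share/vulkan/implicit_layer.d/ 2>/dev/null

          # run any Vulkan app (vkcube ships with vulkan-tools) with loader
          # debugging to see which layers actually get loaded
          VK_LOADER_DEBUG=layer vkcube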

  • Drasglaf@sh.itjust.works · 3 days ago

    I’ve tried it with 2 games on my Legion Go running CachyOS.

    Heaven’s Vault: it launched without forcing a particular version of Proton, but it did nothing - same framerate with it on or off.

    Tacoma: I had to force GE-Proton for the game to run at all. It did nothing - same framerate with it on or off.

    And yes, I’ve followed the instructions and set “~/lsfg %COMMAND%” as the game’s launch options. Not sure if I’m doing something wrong, or if it just doesn’t work with every game.

    • morgan423@lemmy.world · 3 days ago

      Since it didn’t crash, but didn’t seem to kick in either, check your in-game settings and make sure you’re not running fullscreen. Go windowed or borderless instead.

      I have used it a handful of times on Windows and that was always a prereq to get it to do anything.

    • morgan423@lemmy.world · 3 days ago (edited)

      Hey, I figured out what I had done wrong: I dropped the files in the wrong place. Everything seems to be working now.

      If you’re still struggling, this YT video shows you where everything has to go to get it all working. If CachyOS is Arch-based it should be pretty similar.
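
      For anyone else who ends up in the same spot, the manual install boils down to something like this (~/homebrew/plugins is Decky Loader’s usual plugin directory, but double-check the exact path against the video/README rather than taking my word for it):

          # unpack the release zip into Decky Loader's plugin folder, then restart Decky
          unzip ~/Downloads/"Lossless Scaling.zip" -d ~/homebrew/plugins/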

      • Drasglaf@sh.itjust.works · 1 day ago

        Thank you, I followed the video step by step but it’s still not working. The only thing that was different in my setup is that I wasn’t using the pre-release update channel for Decky Loader.

    • Fubarberry@sopuli.xyz (OP) · 3 days ago

      Make sure games are windowed or borderless and that you don’t have an external frame cap (like one set in the Steam overlay).

      • Drasglaf@sh.itjust.works · 1 day ago

        It’s funny because after rebooting, the performance in Tacoma was halved for no reason. Then I disabled fullscreen and the performance did indeed get better. The funny part? The performance with Lossless Scaling on and fullscreen off is the same as the performance with Lossless Scaling off. Touching the other options in the plugin doesn’t change anything.

        • Fubarberry@sopuli.xyz (OP) · 1 day ago

          Some games don’t get any performance increase, so you’ll have to try it game by game. So far most games I’ve tried have worked, but maybe I’ve just been lucky.

          • Drasglaf@sh.itjust.works · 23 hours ago

            I tried The Invincible too; it gave me a black screen with it activated. Success after success! I’ll keep trying more games over the next few days - some have to work sooner or later.

    • miss phant@lemmy.blahaj.zone · 3 days ago (edited)

      Having it available as a technology is great; what sucks is how Nvidia marketed their new 50 series on the basis of triple fake frames due to a lack of actual hardware improvements. They literally claimed 5070 = 4090 performance.

    • Fubarberry@sopuli.xyz (OP) · 4 days ago

      It depends on the game, the framegen tech, and your base fps.

      It can be a great way to squeeze more performance out of a game in some circumstances, but it’s a big problem when games like MH:Wilds rely on it to meet an acceptable fps at all.

        • MentalEdge@sopuli.xyz · 3 days ago

          I don’t know about anyone else, but the reason I say stuff like “fake frames, real flames” about Nvidia is that they include framegen in their performance stats.

          As if it’s a tech that boosts actual performance. They use it to lie.

          When they say “this card is twice as powerful as last gen”, what they really mean is that it’s exactly the same, but with 2x framegen turned on. Never mind that there’s no reason you couldn’t do the same on the last-gen card.

    • Yttra@lemmy.world · 3 days ago

      No, they’re still all awful, lossless scaling especially, but:

      a) Nvidia simps have fewer excuses to support Corpo #1 over Corpo #2 now and

      b) people using lossless scaling often have either weaker hardware or lower standards in the first place

    • vividspecter@aussie.zone · 3 days ago

      Neither Reddit nor Lemmy is a monolith. Yes, some people are likely being hypocritical, but it’s also likely that there isn’t much overlap between those who were critical of Nvidia’s FG and those who aren’t critical of LSFG. I say this because there are still a lot of people shitting on FG in general, whether it’s justified or not.