I completely understand that supporting Linux right now might not fit into your roadmap, whether due to technical complexity, limited demand, or business priorities. No judgment at all — it’s your project, and you've already created something truly special.
That said, I just wanted to gently share a thought that might be useful for the long-term future of Lossless Scaling.
When a tool as unique and useful as yours isn’t available on Linux — especially with the rise of the Steam Deck and hybrid GPU setups — someone in the community often steps in to replicate or approximate the functionality. It’s not out of competition or malice — it’s just how the open-source world tends to work when a gap is felt deeply enough.
But here’s the thing:
Those community solutions, while functional, usually lack the polish, vision, and care that your software clearly has. They’re often stitched together for power users, and rarely deliver the same smooth experience for everyday gamers.
Without native support, what might happen is:
A rough, free tool gets built for a small niche
Everyone else continues to struggle with inconsistent or clunky workarounds
And your work, which could be a perfect fit, misses out on helping a growing segment of the gaming scene
Please don’t take this as pressure — not at all. I just wanted to share it out of respect for the quality of what you’ve made. If you ever reconsider Linux or even just Proton support down the line, I know many of us would be excited to test, promote, and even contribute where possible. The community is strong and very willing to help.
In the meantime, thanks again for what you’ve already created. It’s a great tool, and it’s made a big difference for a lot of people. We just hope one day we’ll be able to enjoy it too. 😊
Wishing you continued success and all the best in your work!
I'm going to argue that you might not have a ton of experience or knowledge in some of these areas if you are making these 2 claims. I cut out the bulk of the post to save space.
The compositor situation is not even close to being in a good spot for something like Lossless Scaling on Linux.
In regards to Wayland: while the protocol itself is a standard, it has to be implemented individually by each compositor, and they not only all do it slightly differently, they don't even all support the same set of features. There is also the problem of capturing frames on Wayland due to its security-focused design. Currently you need some sort of portals implementation, or you go the gamescope route and run your own compositor, which comes with its own entire set of problems (just look at its issue tracker). Gamescope works best on its own, on the Steam Deck. EDIT: I forgot about vkcapture, which bypasses portals and hooks into the game to capture frames using a Vulkan layer of some sort, I believe, but I don't use it because it causes artifacts in my recordings.
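For anyone curious, the vkcapture route in Steam launch options looks roughly like this (a sketch assuming the obs-vkcapture package and its OBS plugin are installed; the wrapper name comes from that project, not from anything discussed here):
# hook the capture layer into the game
obs-gamecapture %command%
# or, for Vulkan games, enable just the layer
env OBS_VKCAPTURE=1 %command%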
Then there is the issue of X11, which is in maintenance mode and seeing little or no feature work. X11 can't handle variable frame rates or resolutions well, and while basically anything can capture frames, synchronization with the display continues to be a problem on X11, not to mention that compositing on X11 causes major input delay, which Wayland doesn't suffer from when compositing with direct scanout (which GNOME and KDE do these days). And for Lossless Scaling to work well, I don't see it being doable without some sort of compositing.
There are other issues with X11 and Wayland compositors beyond this, but those are the ones that immediately come to mind.
In regards to multi-GPU, it really isn't any better or worse than Windows. My laptop in fact struggles a bit more on Linux to pick the right GPU by default for games, but I can obviously make games see only the single GPU I want them to, and on Windows that wasn't any more difficult to accomplish. I simply don't see any evidence to point to Linux handling multi-GPU in a way that Windows can't also do, unless you're doing things like VMs and GPU passthrough.
I'm lucky in that my system is an all-AMD one, so my Linux experience is generally more favorable (it still has issues). Anecdotally, I've heard of more struggles getting an Nvidia dGPU and an Intel iGPU working well together on Linux than on Windows, because Intel's Windows drivers tend to just be better than their Linux ones, and with Nvidia GPUs still needing proprietary drivers until the newer open-source drivers are actually usable, you see a lot of problems with Nvidia failing to support the same features/standards everyone else agreed upon for multi-GPU systems, and you have to do some shenanigans to get things functioning smoothly.
I really don't think any of these points in the OP are necessarily true, or even any real benefit to Lossless Scaling. I will say Linux does have some advantages over Windows for gaming, but they aren't these, and they don't really have any bearing on Lossless Scaling, so I see no reason to go into them at the moment.
Hello level 0 With Anime profile "MASTER"
Let’s Get the Basics Straight:
Linux handles multi-GPU setups just fine—as long as you understand how Linux expects you to manage them, which is different from Windows. It doesn’t "automagically" pick a GPU based on an obscure driver profile like Windows does with Nvidia Optimus. Instead, you choose which GPU to use, explicitly and predictably. Power users prefer this level of control. If that’s overwhelming, maybe you should stick to a console.
🧠 Dual GPU Use Cases on Linux:
There are typically two GPU configurations on laptops or desktops:
iGPU + dGPU (Hybrid Graphics): e.g. Intel/AMD iGPU + Nvidia/AMD dGPU
Two dGPUs: e.g. AMD + Nvidia, Nvidia + Nvidia (less common and complex)
Let’s tackle the common scenario: an iGPU and a dGPU, with the goal of using the iGPU for power-saving desktop tasks and the dGPU for games or heavy workloads.
🛠️ Step-by-Step Guide: Dual GPU on Linux
✅ 1. Identify Your GPUs
Use this to check that both cards are recognized (on many laptops the dGPU shows up as a "3D controller" rather than "VGA"):
lspci | grep -E "VGA|3D"
Expected output:
00:02.0 VGA compatible controller: Intel Corporation UHD Graphics 770
01:00.0 VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] Navi 31
✅ 2. Make Sure the Drivers Are Installed
For AMD and Intel:
sudo pacman -S mesa vulkan-radeon vulkan-intel libva-mesa-driver
For Nvidia (if needed):
sudo pacman -S nvidia nvidia-utils libva-nvidia-driver
Replace with your distro’s package manager if you're not on Arch.
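For example (rough equivalents only; package names and splits vary between distros, so check your distro's documentation):
# Debian/Ubuntu
sudo apt install mesa-vulkan-drivers
# Fedora
sudo dnf install mesa-vulkan-drivers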
✅ 3. Enable DRI_PRIME (Render Offload)
This is the Linux equivalent of GPU selection. Here's how to run an application on the dGPU and check which renderer it gets:
DRI_PRIME=1 glxinfo | grep "OpenGL renderer"
If that outputs your discrete GPU (e.g. AMD or Nvidia), then render offload is working.
To launch a game on the dGPU:
DRI_PRIME=1 %command%
Or in Steam:
Right click a game > Properties > Launch Options:
DRI_PRIME=1 %command%
✔️ Your game will now run using the powerful GPU instead of the energy-efficient iGPU.
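If you want to double-check in-game which GPU was actually picked, one option (assuming MangoHud is installed; it is not required for any of the steps above) is its overlay, which shows the active GPU name:
mangohud %command%
# or combined with render offload
DRI_PRIME=1 mangohud %command%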
✅ 4. Check Vulkan Support
Vulkan apps (like most modern games) also respect GPU selection. Use:
vulkaninfo | grep "GPU id"
To run a Vulkan game on a specific GPU via Mesa's device-select layer, pass that GPU's vendor and device IDs (hex values from lspci -nn or vulkaninfo):
MESA_VK_DEVICE_SELECT=vendorID:deviceID %command%
Or use vkBasalt to apply post-processing effects (CAS sharpening, FXAA/SMAA, etc.).
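To see which devices and IDs Mesa's device-select layer can choose between (assuming your Mesa build ships the layer, which most distro packages do), list them with:
# prints the selectable Vulkan devices and their vendorID:deviceID pairs
MESA_VK_DEVICE_SELECT=list vulkaninfo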
✅ 5. Optional: Use Gamescope for Scaling + Compositing
If you want Windows-style lossless scaling or custom resolution rendering:
gamescope -f -h 1080 -H 1440 -- %command%
Use DRI_PRIME=1 if needed:
DRI_PRIME=1 gamescope -f -h 1080 -H 2160 -- %command%
This is essentially how the Steam Deck scales games internally to its 800p (or 720p) output. You can run gamescope on any setup.
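For example (a sketch assuming a reasonably recent gamescope build with the -F filter option), rendering at 720p and upscaling to 1440p with FSR looks like:
# render the game at 1280x720, output at 2560x1440, upscale with FSR filtering
gamescope -w 1280 -h 720 -W 2560 -H 1440 -F fsr -f -- %command%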
✅ 6. Desktop Environment Integration (Optional)
GNOME: Uses Mutter, which supports GPU offloading and direct scanout on Wayland.
KDE Plasma: Works with AMD/Nvidia via KWin and supports DRI_PRIME, as well as gamescope-session.
XFCE/LXQt: Lightweight and simple, great for testing setups.
You can run your whole session on the dGPU if you want max performance. For AMD:
DRI_PRIME=1 startx
🧯 Troubleshooting
If the wrong GPU is used by default: force rendering via DRI_PRIME or VK_DEVICE_SELECT.
If Vulkan apps crash: ensure the correct Vulkan ICD is installed (vulkan-radeon for AMD; Nvidia's ships with nvidia-utils).
Mixing Nvidia with an AMD or Intel GPU tends to be rarer and trickier. Stick with AMD + AMD or Intel + AMD for simplicity.
Want OBS to record with dGPU? Use obs-vkcapture, pipewire, or ffmpeg with VA-API acceleration.
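As a rough sketch of the ffmpeg + VA-API route (assuming an X11 session and that the dGPU is the second render node, /dev/dri/renderD129; adjust the node, display, and capture size to your setup):
# capture the X11 desktop and encode H.264 on the dGPU's VA-API engine
ffmpeg -vaapi_device /dev/dri/renderD129 -f x11grab -framerate 60 -video_size 1920x1080 -i :0 \
       -vf 'format=nv12,hwupload' -c:v h264_vaapi output.mp4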
🧠 Final Thoughts:
You said:
“I simply don’t see any evidence to point to Linux handling multi gpu in a way that Windows can’t.”
Right, because you’re not looking. Linux doesn’t need to hide things behind “smart profiles” or vendor-specific black boxes. It gives you the tools. If you don’t know how to use them, that’s not Linux’s fault.
🏁 TL;DR
Linux multi-GPU works just fine. In fact, Linux gives you more control, lower input latency, and tools like gamescope, DRI_PRIME, and Vulkan device selectors to do things you can’t easily do on Windows without closed-source driver hacks. All it asks in return is that you learn how your system works.
If that's too much, no worries. But don’t mistake your own inexperience for a universal limitation.
PLEASE use less spacing, emojis, etc. in your post formatting; it makes it much more difficult to keep all the information on display at one time.
That said, let's address the first thing.
"Hello level 0 With Anime profile "MASTER""
Your very first sentence is an attempt to discredit anything I've said simply on the basis of my profile picture. If this is your means of demeaning people, then you might want to check out a number of the profile pictures of people behind tools you use on Linux, because you'll find more than a couple of anime pictures (theeviltskeleton, TKG, and Georges Stavracas, for instance). That's beyond the fact that what I picked for my PFP has very little bearing on the discussion at hand, so the only purpose of this statement was to attempt to belittle me.
"Linux handles multi-GPU setups just fine"
I never claimed it didn't.
"—as long as you understand how Linux expects you to manage them, which is different from Windows. It doesn’t "automagically" pick a GPU based on an obscure driver profile like Windows does with Nvidia Optimus. Instead, you choose which GPU to use, explicitly and predictably. Power users prefer this level of control. If that’s overwhelming, maybe you should stick to a console."
That's not how Windows works either. Applications on both Windows and Linux will look at which GPUs the system has and attempt to pick what they see as the proper primary GPU to render on. There isn't a ton of magic profiles on Windows that makes multi-GPU work. On the topic of consoles, besides the fact that I haven't owned one in over 20 years, I find it interesting that you attempt to yet again belittle me with that, as if gaming on a console were a negative for anyone who does it.
We're going to skip the attempt to flex pacman install commands (not sure why that's even included), as they have little relevance to the conversation.
DRI_PRIME only works with the open-source drivers; Nvidia requires prime-run or Bumblebee. DRI_PRIME can in fact be ignored by applications and isn't a guaranteed way to force an application to run on a specific GPU; it depends on how GPU selection is handled by the application. EDIT: I forgot to mention that the AMD DDX driver on X11 doesn't support this feature in combination with an Nvidia dGPU.
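For reference, prime-run on Arch is (as far as I remember) just a tiny wrapper that sets Nvidia's render-offload environment variables, roughly:
# approximately what the prime-run script from the nvidia-prime package does
__NV_PRIME_RENDER_OFFLOAD=1 __VK_LAYER_NV_optimus=NVIDIA_only __GLX_VENDOR_LIBRARY_NAME=nvidia "$@"
# usage check: prime-run glxinfo | grep "OpenGL renderer"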
With gamescope you shouldn't use DRI_PRIME, as it has its own GPU selection mechanism for the compositor. That said, running gamescope on Intel GPUs is currently a pretty universally bad idea, as there are a number of corruption issues. Nvidia tends to be fine-ish, but AMD tends to work best for it, for obvious reasons. Gamescope also has problems with stuttering, mouse capture, PipeWire frame capture, running multiple windows, and a number of other issues when running in nested mode. You "can" run it most places, but YMMV considerably, as its main focus is the Steam Deck and not nested use.
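The gamescope-native way to pick a GPU, for anyone who needs it, is I believe the --prefer-vk-device option, which takes a Vulkan vendor:device ID:
# the IDs here are placeholders; take your dGPU's pair from lspci -nn or vulkaninfo
gamescope --prefer-vk-device 1002:744c -f -- %command%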
I'm not sure why you mention these compositors? Also, DRI_PRIME is not how you select the GPU for the session. GNOME and KDE both have their own mechanisms for selecting the primary session GPU. I don't remember for the other DEs, as it's been some time since I used them, but for KDE you would, for instance, set something like this in the session environment (preferred GPU's card node first):
KWIN_DRM_DEVICES="/dev/dri/card1:/dev/dri/card0"
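To figure out which /dev/dri/cardN node belongs to which GPU, something like this works:
ls -l /dev/dri/by-path/
# then match the PCI addresses against
lspci -D | grep -E "VGA|3D"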
Again, on the next point, DRI_PRIME is not a 100% guarantee that the application will actually select that GPU. Vulkan is pretty close, though, as you can add ! to blind the Vulkan application to other GPUs, but with OpenGL you can't do this and it may not in fact work. DXVK and VKD3D also have their own mechanisms for GPU selection, which tend to be a bit more consistent than DRI_PRIME in fussy applications. The OBS part is fine but really has no relevance here; game recording has been pretty well solved by the OBS folks and NoWrep and Georges Stavracas on Linux for some time, so there's nothing to really say here but "ok?"
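Concretely (going from memory here, so double-check the exact spelling; the device IDs and names below are placeholders), the mechanisms I mean look like this:
# Mesa Vulkan: pin to a vendorID:deviceID and append ! to hide the other GPUs
MESA_VK_DEVICE_SELECT=1002:744c! %command%
# DXVK (D3D9/10/11 under Proton): pick the adapter whose name matches
DXVK_FILTER_DEVICE_NAME="Placeholder GPU Name" %command%
# vkd3d-proton (D3D12): same idea
VKD3D_FILTER_DEVICE_NAME="Placeholder GPU Name" %command%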
Overall, I don't see the point in what you posted here. It fails to address most of what I said, doesn't refute anything I said about multi-GPU, and only seems to be a sort of flex of your ability to follow the Arch Wiki and an attempt to make fun of my PFP, as if that matters, lol.
You seem to misunderstand how GPU selection works on both Windows AND Linux, and you don't entirely understand the limitations of PRIME selection, be it prime-run, Bumblebee, or DRI_PRIME. You also seem to think I claimed multi-GPU doesn't function on Linux, which I never claimed, but ok. Windows also allows you to select your render GPU in a very similar manner, and it functions pretty well; I don't see Linux being better or worse in this regard.
The claim about lower input latency, though, is going to depend. If you use a mouse/keyboard and you don't run into the over-500Hz issue that Wine/Proton have had for years, which causes stuttering when polling above 500Hz, then at the very least input latency should be similar, assuming a mostly identical setup between Windows and Linux. More control, yeah, but I don't think there was any argument to the contrary.
The tools argument is kind of funny considering you're posting here to ask for a tool that Windows has, which is obviously better, to be ported to Linux, but then you imply Linux gives you the tools? Does Linux give you the tools or not?
Finally:
"If that's too much, no worries. But don’t mistake your own inexperience for a universal limitation."
Again, you resort to trying to belittle me instead of refuting anything I said in any meaningful way. If you're going to make an entire post just detailing the Arch Wiki/Reddit steps for things, I mean, more power to you, but not a single thing you posted was contrary to my original statement, and it doesn't change the fact that bringing Lossless Scaling to Linux just isn't currently feasible in a way that makes any sort of monetary sense, and it has some serious complications in the current Linux landscape.
Let’s clear up a few things based on actual testing and experience.
First, I use a 500Hz monitor and a mouse with an 8000Hz polling rate — zero issues on Linux. Not "fewer issues," not "some quirks" — zero issues. The only time I've had problems with high polling rates was on Windows, specifically in games like Chivalry 2, which actually freaks out at high polling rates on Windows. On Wayland, Chivalry 2 runs butter smooth. No drama. It just works.
And speaking of performance: I literally just finished testing CS2 on new maps, running via Gamescope, and finished first in multiple Deathmatch rounds — smooth input, no mouse problems, no stutter, just domination. If Linux gaming and input latency were truly flawed, I wouldn’t be sweeping the floor with Windows players using this setup.
Also worth noting: Witcher 3 – Next Gen (Wild Hunt) with full ray tracing, everything maxed out at 4K, runs at a stable 70 FPS on my rig — thanks to mesa-tkg-git 25.2.0_devel.205539.e674823d551-1. I don't see many Windows users getting that kind of performance without major tweaking, driver roulette, or proprietary nonsense.
Now, about the tone of your post:
You started by dismissing someone for their anime profile picture. Not only is that irrelevant, it’s also childish. Some of the top contributors to the Linux gaming ecosystem use anime avatars. Your point doesn’t just fall flat — it shows bias and a lack of respect for community contributors.
You claim to be about facts, yet your post is littered with assumptions, strawman arguments (like pretending someone said multi-GPU doesn't work on Linux), and borderline elitist condescension.
If you have a legitimate point, make it without personal digs.
Let’s keep it real:
Linux absolutely gives you the tools — power and control — but it also requires knowledge. That's not a flaw, it's a strength. And many of us thrive in that.
Gamescope, DRI_PRIME, DXVK, VKD3D, and OBS — they work incredibly well if you know what you're doing.
Arguing endlessly over platform superiority doesn't push Linux gaming forward. Testing, documenting, and helping each other does.
And last thing — if you’re going to make bold statements, have the courage to stand behind them with a public profile. People who dish out misleading info or toxic energy while hiding behind locked profiles are part of the problem — not the solution.
Linux gaming is thriving. If that bothers you, maybe it’s not the OS you should be mad at.
I didn't claim anything about 8000Hz mice or 500Hz monitors having issues; I mentioned that there is a long-standing bug that can happen with Wine/Proton and mice/keyboards polling over 500Hz. It seems to be game/application dependent, as some trigger it and some don't.
I also didn't mention performance, and I didn't say Linux input latency is flawed. CS2 is native and should never encounter the Wine/Proton bug.
"You started by dismissing someone for their anime profile picture. Not only is that irrelevant, it’s also childish. Some of the top contributors to the Linux gaming ecosystem use anime avatars. Your point doesn’t just fall flat — it shows bias and a lack of respect for community contributors."
WHAT!!!????? WTF are you saying? What kind of backwards gaslighting mind games are you playing here? That was the very first sentence of your last post, and I never even did this. WHAT!!!???
The second half of this post is absolute insanity in a way I can't even... You are trolling at this point, you absolutely have to be.
ME TROLLING ? L O L
JUST TELL ME, WHO ARE YOU? AND WHY DO YOU SHOW UP HERE POSTING IRRELEVANT INFO FROM 2022?
Well, you showed some pretty rude and dismissive behaviour here with your "Hello level 0 With Anime profile "MASTER"" comment. The icing on the cake was when you tried to deflect by gaslighting Vornis: you accused him of something he didn't do, something that YOU did. Stop that behaviour. First and final warning.
" Im going to argue that you might not have a ton of experience or knowledge in some of these areas if you are making these 2 claims. I cut out the bulk of the post to save space."
Wishing you good luck and all the best :)