STEAM GROUP
Linux User Group s-lug
Aquila Oct 9, 2013 @ 9:52am
gaming on Linux: AMD vs NVIDIA
What would you choose to play games on Linux? "Steam for Linux" has been out for several months now, so I guess a lot has changed in that time. Which company offers the better drivers now, in October 2013? What is your personal experience?
Thanks in advance :)
Showing 1-15 of 36 comments
Cheeseness Oct 9, 2013 @ 9:11pm 
I suspect the responses here are going to be overwhelmingly geared towards Nvidia with a caveat about Optimus/Bumblebee.

I've not had any good experiences with ATi video cards on Linux, though I haven't actually owned one since 2006, so my experience is out of date :D
KingBeowulf Oct 9, 2013 @ 11:54pm 
The F/OSS AMD/ATI drivers have improved a lot, but they are not quite there yet. The proprietary Catalyst drivers still drop too many older GPUs for my taste. The proprietary Nvidia drivers just work, and I can run older GPUs. Nvidia's Linux support is still way ahead of AMD/ATI. For my money, I'll stick with Nvidia: best OpenGL bang for my buck there is!
Aquila Oct 10, 2013 @ 11:08am 
Thanks a lot for the information. I will stick with Nvidia :) Let's hope AMD keeps working to improve their drivers for Linux.
Cheeseness Oct 10, 2013 @ 7:30pm 
Oh, also, as we touched on in our last SteamLUG Cast episode[steamlug.org], both Nvidia and AMD have been releasing hardware specs which will help their respective open source drivers grow.
Aoi Blue Oct 11, 2013 @ 7:10pm 
I must say I've had good experiences and issues with both.

I'm glad both have decided to release far more complete documentation without a non-disclosure-agreement to the Mesa project. This will certainly help in the future.

My experience with AMD's open source drivers is certainly better. They run about twice as fast as the open source nVidia (nouveau) ones.

Honestly, the graphics cards are hard to compare on performance: under one set of effect options AMD will perform better, and under another the nVidia cards will perform better.

As for the control panels, I like some features of AMD's better, and some features of nVidia's better.

I have issues with both proprietary driver sets. Neither properly has KVM or Wayland support. They also both lack EGL, OpenVG and GL-ES 2/3 support. AMD is closest to releasing EGL, OpenVG and GL-ES 2/3 support, having released special developer versions with those features enabled.

The nVidia driver's install method is a total mess, and honestly it's almost broken. Even the packages suffer horrible dependency hell in most distributions, making a simple driver upgrade a nightmare.

AMD's drivers have numerous bugs, one of which causes nasty crashes in Kerbal Space Program (which is why I am currently using an nVidia card).


Finally, as for embedded options, the A-Series beats the performance of the nVidia Ion and any of the nVidia mobile chips that don't use substantial sideport memory (so basically all the ones that aren't for high-end gaming laptops). I am honestly impressed with the AMD A-Series as an entry-level option for home theater PCs and
white_waluigi Oct 12, 2013 @ 12:11am 
I have had a far better experience with NVIDIA. But you have to use proprietary drivers, even if that makes Tux cry.
Brune Oct 15, 2013 @ 9:46pm 
I have an AMD Radeon 7870. After researching, I've figured out that I might as well wait until some time in 2014 to switch 100% away from Windows. I already have a Linux box and a few Linux VMs, but neither of those lets me game. The AMD 7000 series is really poor on Linux: 60 FPS in CS:GO vs the 200+ I get on Windows. That said, I've heard the AMD Vishera is better on Linux than on Windows.
Last edited by Brune; Oct 15, 2013 @ 9:46pm
Ari El Uno Oct 15, 2013 @ 11:45pm 
I chose Nvidia for the best proprietary display driver; it performs about the same as on Windows (not significantly different).
Aoi Blue Oct 16, 2013 @ 3:22am 
If you are only getting 60 fps, that is simply because you have the anti-tear (vsync) function on.

On Linux the anti-tear function runs on ALL imagery, not just video. This causes your frame rate to be locked to your refresh rate.

However, as a note, there is no benefit to having your frame rate above your monitor's refresh rate. All information to the contrary comes from older games that had their simulation refresh rate tied to the video refresh rate. Newer games use interpolation and/or extrapolation to form the extra frames, so you aren't actually getting any edge.
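
If you want to check whether vsync is really what's capping you, you can force it off for a single game. This is just a sketch from memory and depends on which driver you're running, so treat the exact names as approximate. As a Steam launch option it would look something like:

vblank_mode=0 %command%            (Mesa / open source drivers)
__GL_SYNC_TO_VBLANK=0 %command%    (Nvidia proprietary driver)

On Catalyst I believe the equivalent is the "Wait for vertical refresh" setting in the Catalyst Control Center, set to always off.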
intok Oct 18, 2013 @ 5:23pm 
Yeah, but you can't tell the 133t gamers that; they will go on and on about some hundredth-of-a-second advantage from having the game render at 100+ FPS.

In video you only need 16 FPS for it to not look choppy; in games you need 30 FPS for the same, but faster is better up until the monitor's refresh rate limit is hit. Most monitors cap out at 60 FPS because of the limitations of the connection standard: the available bandwidth goes to the top resolution and color depth rather than higher refresh rates. Take the current revisions of DisplayPort and HDMI; they are geared for 10-bit color @ 3840x2160 @ 60 FPS.

Yes, I know full well that the human eye can detect well over 100 FPS; fighter pilots have been tested to be able to recognise a model of plane that was only on the screen for 1/300th of a second.
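
Rough back-of-the-envelope numbers, ignoring blanking and protocol overhead:

3840 x 2160 pixels x 30 bits per pixel (10 bits per channel) x 60 frames/s ≈ 14.9 Gbit/s

DisplayPort 1.2's usable data rate is about 17.3 Gbit/s if I remember right, so 10-bit 4K at 60 FPS already eats most of the link, which is why there's no headroom left for higher refresh rates at that resolution.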
Aoi Blue Oct 20, 2013 @ 5:53pm 
There is no advantage UNLESS the game is updating at 100 fps, and most games' servers run at 30-66 simulation frames per second (L4D2, Portal and HL2 run at 30; TF2 and HL2-DM run at 66).

It's just filler to smooth out the animation. Those frames don't even matter to the game; ONLY physical frames matter. They aren't even simulated. When you do an action, its timing is rounded to the nearest physical frame.

Additionally, you won't get an advantage unless those frames are displayed and your server is actually updating at that rate.

The real way to get an advantage is to turn down lerp (interpolation) to 2 physical frames. This reduces the delay. For instance, on Team Fortress 2 the default lerp is a whopping 100 ms, but the ideal lerp is 1/33rd to 1/66th of a second.

On all modern Source engine games (all that have an updated codebase) you can set your fixed lerp to zero with "cl_interp 0" and it will default to cl_interp_ratio for the interpolation buffer size, which defaults to 2 frames.
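
As a rough example, this is the sort of thing you'd drop into autoexec.cfg for TF2. The rate values are just assumptions for a standard 66-tick server, so adjust to taste:

cl_interp 0
cl_interp_ratio 2
cl_updaterate 66
cl_cmdrate 66

With 66 updates per second, a ratio of 2 works out to a lerp of roughly 2/66 ≈ 30 ms instead of the default 100 ms.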
Last edited by Aoi Blue; Oct 20, 2013 @ 5:58pm
_ply Oct 23, 2013 @ 6:07pm 
Intel Iris Pro (HD 5200) is usable, especially for less demanding games; it's comparable to some lower-end Nvidia laptop cards. But hardware with it is not that common.
Really, Nvidia + the binary driver is by far the best option. Yes, Optimus laptops are a bit flaky compared to Windows, since there is no support from Nvidia and you have to use a 3rd party tool, but even then it's still better than most other solutions.
intok Oct 23, 2013 @ 6:28pm 
Well yeah, for a laptop that is going to be used for light gaming your best bets are the Iris Pro 5200
http://www.notebookcheck.net/Intel-Iris-Pro-Graphics-5200.90965.0.html

and the Radeon HD8650G
http://www.notebookcheck.net/AMD-Radeon-HD-8650G.87916.0.html

Both have good OSS drivers, and they are the fastest of the AMD and Intel mobile integrated GPUs.

On the high end, you want anything that doesn't have an integrated GPU, anything to stay out of Optimus hell.
Last edited by intok; Oct 23, 2013 @ 6:33pm
edo Oct 23, 2013 @ 9:36pm 
For AMD/ATI HD 2000-6000, the free driver works great, almost at the same level as (or equal to) the proprietary driver (of course, dpm and the sb shader optimizations should be enabled). I use it, and games run as well as they did in Windows; for me TF2 runs better in Linux. For newer cards, go with Nvidia; their proprietary driver is as good as the Windows driver.
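
For anyone wanting to try that: as far as I remember (late-2013 kernel and Mesa, so double-check against your distro's docs), dpm is enabled with the radeon.dpm=1 kernel boot parameter (kernel 3.11 or newer), and the sb shader backend is switched on through an environment variable, e.g. as a Steam launch option:

R600_DEBUG=sb %command%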
white_waluigi Oct 24, 2013 @ 12:03pm 
@ruediix:
Source still runs on Simulation Frames? Lol.