I suspect the same will happen here.
Here are the two versions:
http://www.lg.com/au/it-monitors/lg-34UM95
http://www.lg.com/au/it-monitors/lg-34UM65
What I want is 21:9, IPS, and GPU sync.
Preferably not G-Sync, despite Nvidia's support for it, because the G-Sync monitors are so expensive.
Errr... my monitor is 21:9 and IPS too. As for GPU sync, they are developing a third-party external unit/cable to enable it in the future, instead of built-in tech like G-Sync. So technically speaking, my monitor qualifies for all the features you just mentioned.
Also, on another note: I have tried G-Sync before, and I can tell you it is VASTLY OVERRATED. Don't get me wrong, I was just as excited about G-Sync as anyone when I first heard about it and watched demos of it. But the fact is, in reality, a lot of the games I tried G-Sync with were not even compatible with it! What's the point of new tech if 30% of your games don't run with it, or run worse with it on?
E.g. Borderlands 2 runs great with G-Sync on... but Battlefield 4 stuttered all over the place, worse than with it turned off.
That's why the monitor comes in two resolutions: 3440 x 1440 and 2560 x 1080. I decided to buy the latter, and at only a ~10% frame-rate hit compared to 1080p it's a winner! I avoided the 3440 x 1440 res because of its big hit on fps.
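For a rough sense of why 2560 x 1080 is the gentler option, you can compare raw pixel counts. This is only a ballpark sketch: frame rate does not scale linearly with pixel count, and the actual hit depends on the game and settings.

```python
# Rough pixel-count comparison for the resolutions discussed above.
# More pixels means more GPU work per frame, but the frame-rate cost
# is usually less than proportional, which is why a 1.33x pixel
# increase can translate to only a ~10% fps hit in practice.
resolutions = {
    "1080p":       (1920, 1080),
    "2560 x 1080": (2560, 1080),
    "3440 x 1440": (3440, 1440),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} px ({pixels / base:.2f}x the pixels of 1080p)")
```

The 3440 x 1440 panel pushes about 2.4x the pixels of plain 1080p, versus about 1.33x for 2560 x 1080, which lines up with the much bigger fps hit mentioned above.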
I apologize, I may have been confused and provided wrong info before. DisplayPort 1.2a and 1.3 have native support for variable refresh rates via AMD's FreeSync tech.
I thought this tech would work on any GPU, Nvidia or AMD, but it appears it only works on select AMD cards. So technically there can be no sync of any type with this monitor, as mine only has DisplayPort 1.2, not 1.2a, and I'm using Nvidia cards.
But yeah, like I said before, all these FreeSync and G-Sync techs are highly overrated, man. Demo it in-store, or buy a cheap G-Sync monitor, try it, and return it... you'll realise it's not all it's cracked up to be. The results are fairly inconsistent, to say the least.
The most misleading part is Nvidia's pendulum demo released on their website. I downloaded the official pendulum test and G-Sync worked perfectly on my Asus G-Sync monitor! But as soon as I launched some games with all the correct settings, maxed out so they would run between 40-50 fps (which is where G-Sync should shine), I saw very variable results. Some games would stutter and freeze like mad with the G-Sync settings enabled, while others benefited from smoother, stutter-free gameplay. It was inconsistent enough for me to say it isn't 'revolutionary' or 'reliable' enough tech.
I think Nvidia have misled A LOT of people...
It could be supported by everyone who wanted to. Nvidia state DisplayPort 1.2a support for their graphics cards, but I don't know whether it's a hardware thing or a software thing, or whether they could easily offer it if they wanted to. It seems like they want to be asses about it and only support G-Sync for now: they have invested in it, have partners invested in their hardware solution, and possibly want profits from products using it. I also guess they may want to divide the market, delay a solution for AMD cards, and make people think twice before going that route, so they get the biggest benefit themselves.
Or something like that.
If Adaptive-Sync got widely implemented (which would be easier with Nvidia standing behind it, but I guess may happen anyway), I assume Nvidia will start using it. Maybe they wouldn't enable it on their old cards, though, but rather be ♥♥♥♥♥♥♥ about it and only do it on a new card, to sell yet another card to people. Who knows?
Nvidia also confuse people by having something called "Adaptive V-sync", which sounds very similar and is supported on older cards, but which isn't Adaptive-Sync and doesn't give the same result. It's a different thing.
I don't know how much I'd notice varying frame rates, but I'd say I've got a good idea of what it is and does, and I'd much rather have images shown as soon as they're rendered than at a steady pace. Lower lag, and image output in time with game-state changes, is what interests me the most.
I can't speak for the experience or support since I haven't used it :)
What I dislike in the case of Nvidia is 1) the price of G-Sync products and 2) their unwillingness to support the standard, which could have limited 1.
Also, let's face it: if I were to buy a $1000+ monitor, I would appreciate it being fully supported NO MATTER WHAT VIDEO CARD I'VE GOT!
Because I'll always want to be able to pick the option I think is better, and not be locked into certain choices.
Currently I own 3 x Asus 27" 144 Hz monitors, but only play on one screen. I find that with three screens (albeit beautiful) the field of view is way too much for two eyes to assimilate. I mostly play Far Cry 3 and will play FC4.
I find Ticklemerifle's observations pretty accurate, and thank god I have not pulled the trigger on the 3440 x 1440 LG 34UM95-P yet, because I fear dual Titans may not be enough for smooth gameplay.
I do know the Acer supports G-Sync, which I know very little about. I have ruled out all other 4K screens because of mixed reviews. I wonder if dual Titans could run 2 x 2560 x 1080 LGs?
Or would I use only one 34" and place 2 x 27" on either side for extra browsing and work space?
Money is not an object, thank god, and yet it's so incredibly hard to find a good monitor.
We are very spoiled here; we have a Sony XBR900A 4K TV, which delivers stunning PQ and color. My Asus 27" TN panels have horrible colors compared to the TV.
So I'm trying to find a monitor with stunning color rendition that I can also play games on.
I guess the closest I know of are the Korean monitors with no scalers, which people force a higher refresh rate on, and possibly the Samsung AMOLEDs used in the Oculus Rift, which is supposed to run at higher than 60 Hz at least.
The 34" LG has a nice aspect ratio and likely good colors too. But in speed it's not a gaming monitor.
For gaming/competitive performance rather than visual splendor, I assume the 27" G-Synced one is the better choice.
Speed may not be everything, though, and maybe for less competitive gaming you'd prefer the 34" 21:9, or say three 40" TVs (feel free to skip 4K =P) at some distance (for driving games or whatever). The Rift is likely less than a year away too.
If money is no problem, I kinda wonder if the way to go isn't one 34" 21:9 for desktop, nice-looking gaming, and video clips, plus one 27" G-Sync-equipped one for mostly competitive FPS gaming?
Or something?
Personally I would be pretty OK with a GPU-synced 60 Hz IPS panel with low input lag and a not-horrible response time (flickering?) too. But since the Korean ones can be "overclocked", I assume it may not be impossible to make an IPS panel which is actually designed to run at 75 or 90 Hz?
I corrected my error; I actually said adaptive V-sync by mistake.
Adaptive V-sync is an Nvidia technology which disables V-sync at frame rates lower than the monitor's refresh rate, removing stutter at the cost of accepting tearing. At frame rates higher than the monitor's, V-sync is enabled, so you won't get any tearing but will be locked at 60 FPS.
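The rule described above boils down to a single threshold test. A minimal sketch, assuming a 60 Hz monitor (the function name is hypothetical, not an actual driver API):

```python
def adaptive_vsync_enabled(current_fps: float, refresh_rate_hz: float = 60.0) -> bool:
    """Adaptive V-sync decision rule as described above:
    enable V-sync only when the game renders at or above the
    monitor's refresh rate (preventing tearing, capping fps);
    below it, disable V-sync and accept tearing instead of the
    stutter that V-sync would otherwise introduce."""
    return current_fps >= refresh_rate_hz

# Below the refresh rate: V-sync off (tearing possible, but no stutter).
print(adaptive_vsync_enabled(45.0))
# At or above the refresh rate: V-sync on (no tearing, locked to 60 FPS).
print(adaptive_vsync_enabled(75.0))
```

Contrast this with Adaptive-Sync/G-Sync, which instead makes the monitor wait for each frame, so no fixed-refresh threshold is involved at all.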
The problem for the VESA-standard Adaptive-Sync at the moment is that Nvidia says they don't intend to support it, but rather to focus on their G-Sync. (I think the claim included supporting the scaler manufacturers, which I assume may be making chips to support G-Sync. Nvidia may also make some money on the technology, and if not, at least they know they will sell the graphics cards needed for those monitors. They wouldn't sell them for monitors using Adaptive-Sync instead, so I assume they have to make money on G-Sync itself to make it worth it.)
If Nvidia supported Adaptive-Sync, and it was as good or close enough, and even more so if it was cheaper, I could see it becoming the winner and G-Sync becoming irrelevant. Nvidia may not want that to happen, because it helps AMD sell graphics cards, while Nvidia may earn money on G-Sync (or at least hype, because it's their technology).
But since Nvidia say no, G-Sync has the advantage of being out now and being used more, and the larger of the two vendors (then again, Intel is large too) will only use G-Sync, which will hurt manufacturers' willingness to include Adaptive-Sync support in monitors.
As such, they will delay uptake of Adaptive-Sync, and AMD will be less competitive because there will be fewer monitors to choose from.
Imho it's a nasty tactic which I dislike, because I won't benefit from it and would much rather buy an expensive monitor using a standard rather than Nvidia-exclusive technology. Imho it's also one reason to pick an AMD card instead, which Nvidia should view as bad.
Nvidia have adaptive V-sync on lots of older cards (~GTX 650 and up), but that is only an "intelligent" choice between screen tearing and V-sync, depending on which is viewed as most beneficial for the gamer.
As said, Nvidia so far won't support VESA Adaptive-Sync on their cards.
The new GTX 970 and 980 cards also list only DisplayPort 1.2, not 1.2a or better.
Adaptive-Sync is an optional part of DisplayPort 1.2a and up.