I am still experiencing these crashes, though they seem rarer now. What's different is that they have now happened while I am using two monitors: the TV HDMI signal cuts out (i.e. black screen) but the second monitor on the DVI port keeps working. The TV HDMI output is still active with my programs open on it but inaccessible, and I can seemingly "reboot" it by unplugging the HDMI cable and plugging it back into the TV. It is now apparent that this is what was going on the entire time; it was just harder to see when it would crash to a black screen while I was only using the TV, since the TV treats a black screen as a disconnection (which resembles an OS crash). It seems like an actual computer monitor on DVI can somehow survive this screen crash, while the TV HDMI connection cannot.
Now that I know for sure that it is a black screen crash, I looked it up and apparently this is REALLY common with RTX 3000 series GPUs, and appears to be a driver issue. Nobody seems to know for sure what causes it beyond that. Some people are returning their GPUs because of this, but I don't think that is necessary until it is confirmed that this is a hardware problem instead of just driver incompatibilities.
It's the only reason I can think of for why the HDMI output would act up while the DVI output continues to function at the same time.
I very recently had to change HDMI cables because I was having somewhat similar issues, though at bootup; changing HDMI cables fully resolved it.
I have not tried that yet. I do have a "certified" cable ready to use... but I am just waiting for the new PSU to arrive before I do the final configuration of everything the way I want it.
There was a new Nvidia Game Ready driver that I updated to a couple of days ago, and I haven't experienced this since, so maybe they fixed it with that.
dOBER is right; power draw isn't tied to resolution (or color depth) in that way. If you're letting the GPU run to its limit in either case, you're going to be approaching similar values to the cap of what it can draw.
I'm also not sure why you bring up the practicality of running resolutions in games (I thought we weren't talking about games?) beyond what a GPU is better suited for. I never said to "run it as high as it can just because it can". All dOBER and I were addressing was the approach of running at a lower resolution to try to alleviate a suspected power draw issue. If your issue was with power draw (which you seem to have discovered it's not), this wouldn't have mattered much anyway.
Really...
Interesting
Since when?
I never said anything about the colour palette. And that's not how the colour palette works, at all.
The colour palette scales exponentially with bit depth. An 8-bit colour palette means there can be 2^8 colour values: 256 in total. The "2" is because it's binary; the actual machine code is 0 or 1. That means each pixel would be assigned a value of 0-255, where the number correlates with a specific colour. This is something I have experience with, for instance in processing satellite imagery.
Modern TVs have 3 primary colours, RGB, where each colour has its own 256-value (8-bit) range/palette. This means there are 256^3 possible colour combinations: 16,777,216 in total. Now, what I'm not sure about is how this works per pixel on most TVs, since the idea is that different light emitters work together to produce different colour combinations, yet that doesn't mean the TV itself has dedicated LEDs per pixel. The idea is that it tries to reach the assigned colour in the pixel through various combinations of colours, I guess. Apparently OLED displays actually do have 3 (RGB) emitters per pixel, which is what sets them apart.
The next thing to consider is that HDR10 is not 8-bit at all, but rather 10-bit (hence the "10" in HDR10). That means it has 1024^3 combinations: 1,073,741,824 in total. 12-bit colour goes even further, to 4096^3: 68,719,476,736 in total.
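Just to lay out that arithmetic (a quick sketch of the numbers above, nothing specific to any particular display):

```python
# Colours per channel and total RGB combinations at different bit depths.
for bits in (8, 10, 12):
    per_channel = 2 ** bits      # values each of R, G, B can take
    total = per_channel ** 3     # all R,G,B combinations
    print(f"{bits}-bit: {per_channel} per channel, {total:,} total")

# 8-bit:  256 per channel,      16,777,216 total
# 10-bit: 1024 per channel,  1,073,741,824 total
# 12-bit: 4096 per channel, 68,719,476,736 total
```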
Does this mean that it seriously affects computation speeds and power usage for the end user? I doubt it's that serious, because the colour values are just variables; it's the filtering and processing algorithms that multiply resource demands. I imagine you'd notice more of a difference with an older GPU hooked up to an OLED display, but I'm pretty sure it is the TV itself that is doing the bulk of the processing, NOT the GPU. However, the PC does want to know details about the output (like mine does with HDR), because it wants to match content with the display. Why? Because it does use more data and computation overall to compute and display larger numbers, especially in binary. It is an objective fact that more binary digits take up more storage and computational space, and this matters when working with larger datasets, like 8-bit vs 10-bit.
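To put a rough number on the storage side (a back-of-the-envelope sketch for a single uncompressed frame; real formats add alpha, padding, and compression):

```python
# Raw size of one uncompressed 4K frame (3 channels, no alpha or padding).
W, H = 3840, 2160
for bits in (8, 10):
    megabytes = W * H * 3 * bits / 8 / 1e6
    print(f"{bits}-bit: {megabytes:.1f} MB per frame")

# 8-bit:  24.9 MB per frame
# 10-bit: 31.1 MB per frame
```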
And again, I was talking about resolution, which is about calculating the filtered value per pixel; the colour palette is simply the variable used in the calculation. Now, my TV has two menus for filtering pixels for various effects, but it is certainly the GPU that is outputting the image at the set scale/resolution. The GPU is calculating the area differences between 720p, 1080p, 4K, 8K, etc. If I'm streaming something from Youtube through Chrome, it is unpacking and re-configuring 4K video at 60fps, for instance. Why do you think streaming a video at 4K downloads significantly more data than 1080p or 720p? Because it holds more data detailing pixel values.
If I'm playing a game at 4K, then it has to generate values for every single pixel, every frame (again, frames per second). If you were playing a low-texture game at 4K, like a simple 2D platformer, then the GPU would just be repeating textures, which makes the computations easier; it would be very different with a 4K game running high textures, because the GPU has to generate those virtual environments in order to create a more realistic scene, otherwise you're going to notice a lot of blurry textures on your fancy 4K TV. FPS is the rate at which your GPU refreshes those graphics, and it hits hardest when the GPU needs to produce new environments and objects all the time, like in a AAA game (otherwise it will load the graphical objects and environments and keep them in VRAM until you no longer need them).
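For a sense of those area differences (a quick sketch of raw pixel counts only; actual rendering cost obviously depends on the game):

```python
# Pixels per frame and per second for common resolutions and refresh rates.
resolutions = {"720p": (1280, 720), "1080p": (1920, 1080),
               "1440p": (2560, 1440), "4K": (3840, 2160)}
for name, (w, h) in resolutions.items():
    per_frame = w * h
    print(f"{name}: {per_frame:,} px/frame, "
          f"{per_frame * 60:,} px/s @ 60fps, {per_frame * 120:,} px/s @ 120fps")

# 4K is 8,294,400 pixels per frame -- exactly 4x the 2,073,600 of 1080p,
# so at the same fps the GPU has four times as many pixels to shade.
```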
I was playing games at 4K with my GTX 1060, simply because I had the 4K TV to play on. If what you guys were saying were true, then I should be able to do this without any issues, which begs the question of why the new GPU is even needed. Well, I can tell you right now that there is a significant difference when playing, say, Wolfenstein: The New Order, where 4K cannot run at 60fps but 1440p can. Again, why is this happening? Because the GPU has to calculate pixel display values AND render virtual graphical objects with high textures, and refresh it all 60 times per second. The 1060 would be drawing maximum power with a full VRAM load trying to draw and display what the 3080 can do much more easily.
And I can also play some games at 4K just fine on a GTX 1060, depending on how graphics-intensive they are. Maybe FTL or Into The Breach lol. Even other simpler 3D games can be played at 4K, since all it is really doing is upscaling the graphics instead of generating new ones of higher detail, while also providing an upscaled UI. I could also run the general desktop at 4K too, but not all programs are 4K-friendly, and while it isn't that common these days, programs can crash your computer by getting stuck in logic loops, which can result from resolution issues; this probably has more to do with their UIs not working well at 4K, with various bugs.
Obviously, running my PC at a lower resolution is not just due to concerns about the PSU, but I definitely want to play it safe. Why push my PC when the new, quality PSU is just about to be delivered to my doorstep? There's no reason to chance it. I also already explained that this Strix 3080 is daisy-chained using the only two PCIE cables that came with the PSU into 3 PCIE power slots, which I am very wary of. I also have not had time to do much else besides getting everything operational, because I am busy, and I need this new PSU to install additional hardware. Additional hardware = more power usage.
Except the rules of thermodynamics do not bend in such ways. More calculations and processing require more power. There are more calculations and more processing occurring at higher resolutions and higher frames per second. The ability to do this processing comes from hardware advances, which require more power draw to sustain. That's why this Strix 3080 has 3 PCIE power plugs, why some other 3080s only have two, why older GPUs like my 1060 only have one, and why before that they were just powered by the PCIE slot itself (like my 750ti). What do you think those additional power inputs are for, more fans and pretty lights? They are there to power the GPU processing unit.
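As a rough sketch of what those plugs can supply on paper (using the usual PCIe spec maximums, which I'm assuming here; the card's actual draw is set by its power limit):

```python
# Rough ceiling from the physical power connectors (PCIe spec maximums,
# not what the card actually draws): slot ~75 W, 6-pin ~75 W, 8-pin ~150 W.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

print(SLOT_W)                    #  75 W: slot-powered cards (like a 750 Ti)
print(SLOT_W + SIX_PIN_W)        # 150 W: one 6-pin (a typical 1060)
print(SLOT_W + 2 * EIGHT_PIN_W)  # 375 W: two 8-pin connectors
print(SLOT_W + 3 * EIGHT_PIN_W)  # 525 W: three 8-pin (the Strix 3080)
```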
The other guy was also all mixed up because he assumed that I was going from 4K 60fps to 1440p 120fps. My goal here, in the end, is 4K 120fps. That's why I bought a 4K 120fps TV, and the RTX 3000 series is the first generation of Nvidia cards to feature HDMI 2.1 outputs, which can do 4K at 120fps.
Since always? Based on load and clocks, it can be under the same power draw regardless of resolution, if your CPU can keep up.
If this were determined by the CPU, then what would we need the GPU for, outside of games and graphics-intensive programs? Why don't they just have integrated GPUs inside motherboards? Oh wait, they already do, and they suck... And don't tell me that my GPU is not working while I'm just surfing the web or using simple programs, because it certainly is. My GPU load right now is 14% and I am just using Chrome and Steam. I'm also using two displays, both hooked up to the GPU.
Oh look, I turned off the second display and now GPU is down to 11%. Hmmmm.... could there possibly be some kind of correlation there???
I think you guys are operating on the idea that you can merely view things on your display in 4K resolution that were not meant to be 4K; for instance, watching a 480p YT video on your 4K display. Of course you can do that, even with older GPUs. It's still not just pulling the displayed resolution out of thin air, though: it actually has to run a filter to estimate pixel values to fill in between the supplied pixel values, which is why it will be loaded with graphical artifacts. That is going to be far less computationally intensive than having your GPU process an actual 4K video at high frame rates. The fact that I needed to buy a "certified" HDMI 2.1 cable that can actually transfer data at the rates needed for 4K 120fps is proof that the GPU is already putting out more data than normal HD 1080p. Otherwise I just get a janky video inconsistently bouncing between 60-120fps. It's not the TV dictating 120fps, it is the PC. If 4K 120fps didn't use more processing, then why would I have this problem?
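Here's a back-of-the-envelope sketch of the raw signal numbers (it ignores blanking intervals, encoding overhead, chroma subsampling, and compression, so real link requirements differ):

```python
# Rough uncompressed signal bandwidth, in Gbit/s.
def gbps(width, height, fps, bits_per_channel):
    return width * height * fps * 3 * bits_per_channel / 1e9

print(round(gbps(1920, 1080, 60, 8), 1))    # ~3.0   (1080p 60Hz, 8-bit)
print(round(gbps(3840, 2160, 60, 8), 1))    # ~11.9  (4K 60Hz, 8-bit)
print(round(gbps(3840, 2160, 120, 10), 1))  # ~29.9  (4K 120Hz, 10-bit)

# HDMI 2.0 tops out at 18 Gbit/s; HDMI 2.1 goes to 48 Gbit/s, hence the
# certified Ultra High Speed cable for 4K 120Hz.
```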
The part about the CPU keeping up was that at 1080p a 3090 will be CPU-limited, so it will use less power; however, if the CPU can keep up, it would use the same power at 1080p or 4K. So let's drop down the stack and take an old card that didn't use GPU Boost, to simplify things further.
So, a 980 with a 10900K, as a pure example to remove any CPU bottleneck.
At 1080p or 4K the GPU will be able to be pushed to 100% utilisation.
At 100% it does not care whether it's rendering 4K or 1080p; both are using the GPU to its limit and as such pulling the same amount of power.
This is why clocks and utilisation matter to power draw, not resolution.
Yes, a new top-end GPU will pull less power at 1080p, but that is largely due to the CPU being a bottleneck and nothing to do with 1080p using less power; it's simply that the card isn't utilised as much. If you had a CPU that could push a 3090 to 100% usage at 1080p, it would have the same power draw as at 4K.
I hope this makes sense, as I don't know how else to explain it, but I feel the 980 example should get it across fairly well.
https://nvidia.custhelp.com/app/answers/detail/a_id/5158?linkId=100000031463074
Great, that's what I was hoping for. It looks like they also address some issues with LG OLED TVs, which I don't have (the OLED part, anyway), but that was also a common complaint I came across. This means they are stamping out the main issues that users are experiencing.
Maybe if I feel like it later, I'll do an actual experiment showing the PSU power draw at different resolutions, since my PSU gives me wattage readouts. It would be hard to do without shutting down every program, though, since it is really hard to keep automated processes from skewing the results.
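A sketch of one way to log it from the GPU side as well (just a sketch; it assumes nvidia-smi is on the PATH and uses its standard --query-gpu fields):

```python
# Sketch: log GPU power draw and utilisation once per second while
# switching resolutions, then compare averages between the two runs.
import subprocess
import time

def sample():
    """One reading per installed GPU: (watts, percent utilisation)."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw,utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [tuple(float(v) for v in line.split(", "))
            for line in out.strip().splitlines()]

for _ in range(60):                     # roughly one minute of samples
    for i, (watts, util) in enumerate(sample()):
        print(f"GPU{i}: {watts:.0f} W, {util:.0f}% util")
    time.sleep(1)
```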
I also have both a 1060 and 3080 installed and operational.
These 3 power connectors are there because your card has a 450W power limit while others have only 320W. My point was that 4K or 1440p doesn't matter, because at both resolutions it's very easy to reach the max power limit of 450W. Your GPU can't go higher just because it's 4K. Max is max, unless you power-mod your card with an unlimited power limit.