Look in the NVIDIA Control Panel and make sure power management is set to Adaptive or Prefer Maximum Performance.
Also try a different set of NVIDIA drivers. I played BL3 on 441.87 with my GTX 1070 paired with a Ryzen 5 2600X and 16GB RAM, and it ran fine here.
Hardware Unboxed Video: https://www.youtube.com/watch?v=e6FWiVwq1fI
I have a similar setup to yours, and these are my settings:
BenchmarkResults 2020-05-01_10-34-36
- FramesPerSecondAvg: 106.78
- FrameTimeMsAvg: 9.37
OS: Windows 10
- Version: Build 19619
GraphicsAPI: D3D12
CPU: Intel(R) Core(TM) i7-9750H CPU @ 2.60GHz
- PhysicalCores: 6
- LogicalCores: 12
RAM: 16239.50 MB
GPU: NVIDIA GeForce GTX 1660 Ti
- VRAM: 5991 / 0 / 8120 MB (Dedicated / System / Shared)
- DriverVersion: 445.87 (Internal: 26.21.14.4587, Unified: 445.87)
- DriverDate: 4-3-2020
ScreenResolution: 1920x1080
RenderResolution: 1920x1080
ScreenPercentage: 100
HDR: Off
GameUserSettings
- FullscreenMode: Fullscreen
- UseVSync: 0
- PreferredMonitor: LGD05E8
- bPrimaryIsPreferredMonitor: 1
- PreferredRefreshRate: 0
- StatsLevel: 2
- FPSLimit: Unlimited
- FPSLimitCustom: 144
- GfxQuality-Override: Undefined
- GfxQuality-Recommended: Medium
- GfxQuality: Medium
- TextureStreaming: Medium
- MaterialQuality: Low
- Aniso: SixteenX
- Shadows: Low
- DrawDistance: Low
- EnvironmentDetail: Low
- Terrain: Low
- Foliage: Low
- CharDetail: Low
- CAS: 0
- CameraBlur: 0
- ObjectBlur: 0
- AA: FXAA
- VolumetricFog: Off
- SSR: Off
- AO: Low
The thing is though, I am choosing low settings so that I get high fps, not because I can't play at higher settings. Even if I crank everything to ultra I am pretty sure I'd average higher fps than the results in the opening post.
Going to switch to Ultra to check that right now, but I highly doubt my scores will be as bad as OP's despite my hardware being weaker.
- FramesPerSecondAvg: 50.44
- FrameTimeMsAvg: 19.83
OS: Windows 10
- Version: Build 19619
GraphicsAPI: D3D12
CPU: Intel(R) Core(TM) i7-9750H CPU @ 2.60GHz
- PhysicalCores: 6
- LogicalCores: 12
RAM: 16239.50 MB
GPU: NVIDIA GeForce GTX 1660 Ti
- VRAM: 5991 / 0 / 8120 MB (Dedicated / System / Shared)
- DriverVersion: 445.87 (Internal: 26.21.14.4587, Unified: 445.87)
- DriverDate: 4-3-2020
ScreenResolution: 1920x1080
RenderResolution: 1920x1080
ScreenPercentage: 100
HDR: Off
GameUserSettings
- FullscreenMode: Fullscreen
- UseVSync: 0
- PreferredMonitor: LGD05E8
- bPrimaryIsPreferredMonitor: 1
- PreferredRefreshRate: 0
- StatsLevel: 2
- FPSLimit: Unlimited
- FPSLimitCustom: 144
- GfxQuality-Override: Undefined
- GfxQuality-Recommended: Medium
- GfxQuality: Ultra
- TextureStreaming: Ultra
- MaterialQuality: Ultra
- Aniso: SixteenX
- Shadows: Ultra
- DrawDistance: Ultra
- EnvironmentDetail: Ultra
- Terrain: Ultra
- Foliage: Ultra
- CharDetail: Ultra
- CAS: 1
- CameraBlur: 1
- ObjectBlur: 1
- AA: Temporal
- VolumetricFog: High
- SSR: High
- AO: Ultra
Just realized OP was on Badass, not Ultra, and that he's on 1440P. That makes the results a bit less bizarre, but they still seem really low.
I can try changing my settings to Badass and setting my Screen Percentage to 133%, which should simulate 1440P.
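The screen-percentage trick above can be sketched out quickly. This is my assumption of how the slider works (each axis of the screen resolution gets scaled by the percentage, so pixel count grows with its square); the numbers line up with the RenderResolution shown in the benchmark log below.

```python
# Assumed behavior: ScreenPercentage scales each axis of the screen
# resolution, so the rendered pixel count grows with its square.
def render_resolution(screen_w, screen_h, screen_percentage):
    scale = screen_percentage / 100
    return round(screen_w * scale), round(screen_h * scale)

w, h = render_resolution(1920, 1080, 125)
print(w, h)  # 2400 1350, matching the benchmark log

# How close is that to true 1440P in pixel count?
print(w * h / (2560 * 1440))  # ~0.88, i.e. slightly lighter than real 1440P
```

So 125% of 1080P renders about 88% of the pixels of real 1440P, which is why it's only an approximation of OP's load.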
- FramesPerSecondAvg: 35.67
- FrameTimeMsAvg: 28.03
OS: Windows 10
- Version: Build 19619
GraphicsAPI: D3D12
CPU: Intel(R) Core(TM) i7-9750H CPU @ 2.60GHz
- PhysicalCores: 6
- LogicalCores: 12
RAM: 16239.50 MB
GPU: NVIDIA GeForce GTX 1660 Ti
- VRAM: 5991 / 0 / 8120 MB (Dedicated / System / Shared)
- DriverVersion: 445.87 (Internal: 26.21.14.4587, Unified: 445.87)
- DriverDate: 4-3-2020
ScreenResolution: 1920x1080
RenderResolution: 2400x1350
ScreenPercentage: 125
HDR: Off
GameUserSettings
- FullscreenMode: Fullscreen
- UseVSync: 0
- PreferredMonitor: LGD05E8
- bPrimaryIsPreferredMonitor: 1
- PreferredRefreshRate: 0
- StatsLevel: 2
- FPSLimit: Unlimited
- FPSLimitCustom: 144
- GfxQuality-Override: Undefined
- GfxQuality-Recommended: Medium
- GfxQuality: Badass
- TextureStreaming: Ultra
- MaterialQuality: Ultra
- Aniso: SixteenX
- Shadows: Ultra
- DrawDistance: Ultra
- EnvironmentDetail: Ultra
- Terrain: Ultra
- Foliage: Ultra
- CharDetail: Ultra
- CAS: 1
- CameraBlur: 1
- ObjectBlur: 1
- AA: Temporal
- VolumetricFog: Ultra
- SSR: Ultra
- AO: Ultra
It didn't give me the option to choose 133%, only 125%, but that rendered everything at 2400x1350, which is pretty close to 1440P. I got 36 fps. That's on a laptop with a 1660 Ti. Your 1080 should be about 31% more powerful than my 1660 Ti, which would give an expected 47 fps at 1350P or 45 fps at 1440P if results scale linearly:
36/1440*1350 ≈ 34
34*1.31 ≈ 45
You're getting about 4 fps lower than expectations based on that, which doesn't seem too crazy I suppose.
Your card is performing about 10% worse than I'd expect it to by scaling my own experience, definitely not half the performance that 50% utilization would imply.
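The back-of-envelope estimate above can be written out as a tiny function. The assumptions are mine and very rough: fps is taken to scale linearly with vertical resolution and with relative GPU throughput, which real workloads only loosely follow.

```python
# Assumptions: fps scales linearly with vertical resolution and with
# relative GPU throughput. Real scaling is messier than this.
def estimate_fps(measured_fps, measured_height, target_height, gpu_speed_ratio):
    return measured_fps * (measured_height / target_height) * gpu_speed_ratio

# 36 fps at 1350P on a 1660 Ti; a GTX 1080 is taken as ~31% faster.
est = estimate_fps(36, 1350, 1440, 1.31)
print(round(est))  # 44 (the thread rounds intermediates and lands on 45)
```

Either way, the estimate puts OP's 1080 in the mid-40s, not far off the measured results.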
BenchmarkResults 2020-05-01_18-52-50
- FramesPerSecondAvg: 73.83
- FrameTimeMsAvg: 13.54
OS: Windows 10
- Version: Build 18363
GraphicsAPI: D3D11
CPU: AMD Ryzen 7 3700X 8-Core Processor
- PhysicalCores: 8
- LogicalCores: 16
RAM: 16314.74 MB
GPU: NVIDIA GeForce GTX 1080
- VRAM: 8079 / 0 / 8158 MB (Dedicated / System / Shared)
- DriverVersion: 445.75 (Internal: 26.21.14.4575, Unified: 445.75)
- DriverDate: 3-17-2020
ScreenResolution: 2560x1440
RenderResolution: 2560x1440
ScreenPercentage: 100
HDR: Off
GameUserSettings
- FullscreenMode: Fullscreen
- UseVSync: 0
- PreferredMonitor: BNQ78D6
- bPrimaryIsPreferredMonitor: 1
- PreferredRefreshRate: 0
- StatsLevel: 0
- FPSLimit: Unlimited
- FPSLimitCustom: 144
- GfxQuality-Override: Undefined
- GfxQuality-Recommended: High
- GfxQuality: High
- TextureStreaming: High
- MaterialQuality: High
- Aniso: SixteenX
- Shadows: High
- DrawDistance: High
- EnvironmentDetail: High
- Terrain: High
- Foliage: High
- CharDetail: High
- CAS: 1
- CameraBlur: 0
- ObjectBlur: 0
- AA: Temporal
- VolumetricFog: Medium
- SSR: High
- AO: Medium
I agree, his performance is about what I'd expect him to get. 4 fps lower, but basically the same.
I initially thought it was really poor performance, but it was because I assumed 1080P. 1440P is a lot more demanding.
It's through Task Manager. After more testing it seems to usually be at 87%, but it jumps around drastically.
https://www.evga.com/Products/Specs/GPU.aspx?pn=AB7B6447-6329-4361-8AB7-4AD62EA6A2AF
Thank you, it seems you're right that other people with the same cards are getting similar performance. It's a bit surprising to me, as Borderlands' graphics, even on Badass, don't look like they should be so demanding. I'll try everyone's settings out and see how they work. Thanks everyone!!
BenchmarkResults 2020-05-01_21-23-33
- FramesPerSecondAvg: 64.15
- FrameTimeMsAvg: 15.59
OS: Windows 10
- Version: Build 18363
GraphicsAPI: D3D11
CPU: Intel(R) Core(TM) i7-6800K CPU @ 3.40GHz
- PhysicalCores: 6
- LogicalCores: 12
RAM: 16285.38 MB
GPU: NVIDIA GeForce GTX 1080
- VRAM: 8079 / 0 / 8143 MB (Dedicated / System / Shared)
- DriverVersion: 445.87 (Internal: 26.21.14.4587, Unified: 445.87)
- DriverDate: 4-3-2020
ScreenResolution: 2560x1440
RenderResolution: 2560x1440
ScreenPercentage: 100
HDR: Off
GameUserSettings
- FullscreenMode: Fullscreen
- UseVSync: 0
- PreferredMonitor: AOC2713
- bPrimaryIsPreferredMonitor: 1
- PreferredRefreshRate: 0
- StatsLevel: 2
- FPSLimit: Unlimited
- FPSLimitCustom: 144
- GfxQuality-Override: Undefined
- GfxQuality-Recommended: Low
- GfxQuality: Badass
- TextureStreaming: Ultra
- MaterialQuality: Ultra
- Aniso: SixteenX
- Shadows: Ultra
- DrawDistance: Ultra
- EnvironmentDetail: Ultra
- Terrain: Ultra
- Foliage: Ultra
- CharDetail: Ultra
- CAS: 1
- CameraBlur: 0
- ObjectBlur: 0
- AA: Temporal
- VolumetricFog: Medium
- SSR: Ultra
- AO: Ultra
From the rumors I've seen, the RTX 3000 series is going to be a pretty big jump over the 2000 series, and at a cheaper price point than the 2000 series launched at. So honestly I'd wait: you'll either be able to get a 2000 series card cheaper, or a 3000 series card at hopefully a reasonable price.
If you are playing at 2K resolution or better, then I would wait for next gen.
With the way optimization is on games these days, I don't think it will be long before even the RTX 2000 series will have to run at 1080p to get playable frames at max detail.