Fallout 4

Tune Mar 12, 2016 @ 6:52pm
Crash Fix when OC GPU
All my games besides Fallout 4 run error- and crash-free with my GPU overclocked, and have been doing so for years.

IF you get any of the following crash types and your GPU is manually overclocked beyond factory settings, return your GPU to stock/factory settings. My testing has shown the game then crashes less frequently, if at all.


IF the game crashes straight back to the desktop without any Windows error message.
IF the Nvidia driver failed but recovered.
IF the screen freezes, then blue screens.

IF ANY OF THESE ARE TRUE, STOP OVERCLOCKING THE GPU.

NOTE: The game is run at 1080p on ULTRA.
If your game does not crash and you are not running it on ULTRA, then this may not apply to you -- or it just might. For consistency's sake, run on ULTRA.

My game would crash every time -- sometimes after 30 minutes, sometimes after an hour -- but it would always crash while gaming. I turned off my overclocks and it has been stable for 3 hours straight, which would not be possible otherwise. I remember having to do this before with a new game; there are many possible causes, but hopefully, as with that other game, the driver/game files will mature and I can return to my OC and the 5 to 8 additional FPS it brings.

EDIT:
Today I ran Steam's "verify integrity of game cache" function and got this message: "161 files failed to validate and will be reacquired." Steam re-downloaded the corrupt files in no time, and upon launching, Fallout re-detected my video hardware and set the video options accordingly. I don't understand why it auto-sets my video settings to High quality and not ULTRA, as I meet the recommended requirements. I have an i7-4790K and a GTX 780 -- so WTF! I'm not sure how this happened, nor do I entirely understand what "failed to validate" means (I'm thinking hash codes, hmm...). Anyway, I played the game 3 times and it crashed each time; it's even worse now. I'm so tired of this $hit.
PS: IF ANYONE KNOWS OF OR HAS HAD SOME SUCCESS IN MAKING THIS GAME CRASH-FREE, BY ALL MEANS SHARE YOUR FINDINGS!!!
Last edited by Tune; Jul 18, 2017 @ 6:33pm
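The "hash code" guess in the edit above is on the right track: verification of this kind hashes each local file and compares the digest against a known-good manifest, and any mismatch or missing file "fails to validate." A minimal Python sketch of that idea (the manifest layout and file names here are made up for illustration; Steam's actual depot format is different):

```python
import hashlib

def file_sha1(path, chunk_size=1 << 20):
    """Hash a file in 1 MiB chunks so large game archives aren't loaded into RAM at once."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

def verify(manifest):
    """manifest: {path: expected_sha1}. Returns the list of paths that failed to validate."""
    failed = []
    for path, expected in manifest.items():
        try:
            ok = file_sha1(path) == expected
        except FileNotFoundError:
            ok = False  # a missing file also fails validation
        if not ok:
            failed.append(path)
    return failed
```

Anything `verify` returns would then be re-downloaded, which matches the "will be reacquired" wording in the Steam message.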
Showing 1-15 of 51 comments
Tune Mar 14, 2016 @ 3:25pm
I don't fully understand why this is the case, but it is, and like the last game that had this same problem, as drivers and game software progress it will work itself out.

PS: I'm using an ASUS Poseidon Platinum GTX 780, which when OC'ed performs and benchmarks just like the Ti variant.

CPU: i7-4790K @ OC 4.6GHz < 55°C @ 1.30V -- Custom Water Cooling. Ambient = 22°C
GPU: ASUS Poseidon GTX 780 @ OC 1280MHz core & 7000MHz VRAM < 58°C -- Water Cooled
RAM: 2x4GB = 8GB G.Skill RipjawsX @ OC 1866MHz CL9
Mobo: Gigabyte Z97X-Gaming7 BIOS Ver. F7
SSD: Samsung 840 Pro 256GB + 4TB HDDs
PSU: EVGA Supernova G2 750W 80+ Gold
Hobo Misanthropus Mar 14, 2016 @ 3:32pm
A little manual overclocking shouldn't be bad, but you'll of course want a GPU with adequate headroom (power phases, cooling) in order to stay stable.

I have a modest overclock on the already Turbo-as-♥♥♥♥ EVGA SSC, and my aggressive overclock of +150 core / +500 mem would crash in minutes. But +80 core / +250 mem is stable, with maybe a crash once every 2-3 days, which could be more Windows(lol)10 and less overclock.

Ending clocks are 1505 core and 3750 (7500 effective) mem.

This is on top of a 4.4GHz overclock on an i5 6600K. I didn't feel like pushing it with voltage, though, so it's still at its stock 1.2V (4.5GHz was unstable). Skylakes overclock extremely well, even better than Devil's Canyon CPUs. I've heard 5.0GHz isn't impossible with a 6600K and a good motherboard, and 5.2GHz on the i7 6700K, though non-power users should expect 4.6-4.8 respectively with voltage tweaking. (I may do voltage tweaks if I start to hit a CPU bottleneck.)
Last edited by Hobo Misanthropus; Mar 14, 2016 @ 3:37pm
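The offset arithmetic in these posts is easy to sanity-check. A small sketch (the 1425/3500 base clocks are inferred by working backward from the ending clocks quoted above, and the ×2 "effective" figure reflects how monitoring tools usually label double-data-rate GDDR5, e.g. 3750 MHz shown as 7500):

```python
def apply_offsets(base_core_mhz, base_mem_mhz, core_offset, mem_offset):
    """Return (core, mem, effective mem) in MHz after applying OC offsets.

    The effective rate is 2x the real memory clock, matching how most
    monitoring tools report GDDR5 data rates."""
    core = base_core_mhz + core_offset
    mem = base_mem_mhz + mem_offset
    return core, mem, mem * 2

# Ending clocks of 1505 core / 3750 mem with +80/+250 offsets
# imply a 1425/3500 starting point:
print(apply_offsets(1425, 3500, 80, 250))  # (1505, 3750, 7500)
```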
Tune Mar 14, 2016 @ 6:45pm
Originally posted by Prince Rahl:
Originally posted by KOZ♍:
I don't fully understand why this is the case, but it is, and like the last game that had this same problem, as drivers and game software progress it will work itself out.

PS: I'm using an ASUS Poseidon Platinum GTX 780, which when OC'ed performs and benchmarks just like the Ti variant.

CPU: i7-4790K @ OC 4.6GHz < 55°C @ 1.30V -- Custom Water Cooling. Ambient = 22°C
GPU: ASUS Poseidon GTX 780 @ OC 1280MHz core & 7000MHz VRAM < 58°C -- Water Cooled
RAM: 2x4GB = 8GB G.Skill RipjawsX @ OC 1866MHz CL9
Mobo: Gigabyte Z97X-Gaming7 BIOS Ver. F7
SSD: Samsung 840 Pro 256GB + 4TB HDDs
PSU: EVGA Supernova G2 750W 80+ Gold

Regarding future drivers improving performance or stability: some believe we won't see much, if any, once the new Pascal architecture releases and Nvidia focuses driver support on that. Nvidia is already focusing more on Maxwell, and it would seem they are just making sure Kepler is not borked/nerfed.

Then again, we are both using Kepler chips and are kind of "due" to upgrade -- at least I feel that way ;-)

That is disheartening if true, considering my 780 and even a GTX 690 are more than capable of maxing out ALL current games at 1080p. I would like to think that as development time goes on, driver tweaking will eventually reach a point where it is as good as it can get given architectural limitations. Kepler is only one generation behind, a little over a year, which means the Nvidia engineers involved in the Kepler architecture should continue to support it for several more years! As an engineer myself, the programs I write for industrial automation and my passion for those projects allow me to revisit past implementations while working on current projects to make improvements. However, the last driver release that mentioned an improvement for Kepler chips was version 353.06, which is Win7 only, so... yeah, it would appear this is true.
Tune Mar 14, 2016 @ 6:50pm
But really, how different is Maxwell compared to Kepler? Of the 50+ games I play, none use DX12, so all the game code remains unchanged between the two chips. Any new driver would benefit both chips.
Tune Mar 15, 2016 @ 12:48pm
People who are passionate about their work find the time to make improvements! Any project has an AIL to aid in project management, and as long as you finish your items you are fulfilling your work responsibilities. I don't know what work environment you are used to, but in my field I often find myself with extra work time, typically between projects. I also take my work home with me, and it is under these circumstances that improvements can be made.
Tune Mar 15, 2016 @ 1:06pm
Originally posted by Prince Rahl:
Originally posted by KOZ♍:
But really, how different is Maxwell compared to Kepler? Of the 50+ games I play, none use DX12, so all the game code remains unchanged between the two chips. Any new driver would benefit both chips.
I'm no expert; I can only barely grasp these concepts, so I'll try to answer, but don't take my word as science/fact ;-)

The drivers apparently have some interaction with the hardware itself, and the architecture specifically -- as you mentioned, driver updates did previously mention improvements for specific GPU architectures.

Without a doubt, the people at Nvidia who do the actual driver coding (who are not the engineers; they are programming/coding developers) would continue to support and refine their drivers as much as possible. The question becomes one of time and resource management.

Since you work in a comparable field, you must realize that if you only had the time/resources to focus properly on your current or upcoming projects (as GPUs and their architectures are always moving forward and needing to be adapted to more programs), you would do the same.

What would your employers think if you were tweaking software/drivers (however they are referred to, sorry, I don't know :) ) for equipment that was, if not nearly obsolete, as low a priority as a nearly obsolete design? I think they would see it as a waste of your valuable time not to focus on "tweaking" your current/future designs for maximum potential.

I could be wrong, though, and I mean no offense by any of that -- as I said, I know very little about these subjects and only just grasp them :-)


Also, FYI, the job title for Nvidia driver programmers, or as you put it, those "who do the actual driver coding (who are not the engineers, they are programming/coding developers)," is: SENIOR GRAPHICS SOFTWARE ENGINEER
see this link:
http://jobs.monster.com/c-nvidia-corporation-v-it-q-device-drivers-software-engineer-jobs.aspx

People who work in programming or hardware design often use the "Engineer" title, as it is a general descriptor. In fact, there is very little difference in education between someone with a BS in Computer Science and one in Computer Engineering, as they take the same classes. I know this because in just about all my classes as a Computer Engineering major, many of the students were majoring in Computer Science. In a nutshell, we all know the same stuff, with the exception of several advanced classes/credits leaning toward either programming or hardware. But one can't do the other without knowing both.
Last edited by Tune; Mar 15, 2016 @ 1:08pm
Foe Mar 15, 2016 @ 2:18pm
My GPU is factory OC'ed; never had a crash.
Tune Mar 15, 2016 @ 6:06pm
Originally posted by c3lix:
My GPU is factory OC'ed; never had a crash.

I'm not talking about factory OC! Read the entire post! Go ahead and MANUALLY OC your card and see how far you get. Use another game besides Fallout 4 for the stability test. My card is also clocked higher than the reference design by default. At a minimum, try OCing your GPU core boost clock by +100 and VRAM by +250.

My stable OC is +190MHz core and +1000MHz VRAM, which brings the OC totals to 1280MHz core and 7000MHz VRAM; default reference clocks are 954MHz core and 6000MHz VRAM. You can see the types of games I play on my Steam page -- games like COD, BF4, GRID, Metro... I've never had a problem with crashing or even artifacts at these clock settings!!!

I believe this game relies heavily on VRAM I/O operations because of the large map sizes, and overclocking the VRAM causes internal setup and hold time violations in the main memory ICs. Although I have not tested this theory, it's a moot point because my Poseidon 780 is a beast, and I am able to maintain over 60 FPS at 1080p @ ULTRA...
Last edited by Tune; Mar 15, 2016 @ 6:32pm
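The back-off approach described in this thread, lowering the clock offset until the crashes stop, is essentially a step-down search. A sketch of the idea (the `is_stable` callback is a stand-in for a real soak test, e.g. an hour of gameplay or a looping benchmark at that offset, and the step size is arbitrary):

```python
def find_stable_offset(max_offset, is_stable, step=25):
    """Step an OC offset (MHz) down from max_offset until is_stable(offset)
    passes; return 0 (i.e. stock clocks) if no tested offset is stable.

    is_stable stands in for an actual stress/soak test at that offset."""
    offset = max_offset
    while offset > 0:
        if is_stable(offset):
            return offset
        offset -= step
    return 0

# Example with a fake stability boundary at +190 MHz (the OP's core offset);
# stepping 250, 225, 200, 175 finds 175 as the first passing value:
print(find_stable_offset(250, lambda off: off <= 190))  # 175
```

A finer step (or a binary search) would land closer to the true boundary, at the cost of more soak-test runs.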
Tune Mar 15, 2016 @ 6:14pm
Originally posted by Prince Rahl:
Thanks for the info and corrections! I always like to learn about different fields :-)

As to why the software engineers at Nvidia would no longer be working on improvements to older hardware, there is only one reason I can think of. Pardon my tinfoil hat ;-)

Nvidia appears to be a company that operates on the business model of developing and then selling new hardware. I would assume they would sell less new hardware if they constantly improved the abilities of older hardware to compete with their newer/newest hardware.

That is just speculation, although it makes excellent business sense from a bottom-line point of view, which is often the view held by those who make the final/ultimate decisions in these matters.

I remember reading that the Kepler architecture was refined to (seemingly) its maximum potential with the 700 series, and thus Nvidia moved on to the Maxwell design. Perhaps with the release of these refinements in the Kepler 700-series architecture, there was little performance left to be gained no matter how efficient the driver software.

At this point I'm more interested in the new Pascal GPUs and what they will bring to the table other than a smaller die (more power efficient, thus less heat) and the ability to handle HBM memory. Although that last point is debatable for the first offerings of Pascal, as there are rumors the initial launch will still use GDDR5 -- or possibly GDDR5X.

Either way, I guess we'll be waiting until Nvidia's GTC in April for a formal announcement that may or may not include specifics about the desktop/gaming line.

Yeah... I just think it's ridiculous for Nvidia to expect its customers to spend 400-500 bucks every couple of years. When I spend that much, I expect it to remain competitive for much longer -- at least 5+ years. It's a total pain for me to change out my GPU because it is water cooled, as you can see in my Steam avatar pic!
Tune Mar 15, 2016 @ 6:30pm
Originally posted by Prince Rahl:
Originally posted by KOZ♍:
Yeah... I just think it's ridiculous for Nvidia to expect its customers to spend 400-500 bucks every couple of years. When I spend that much, I expect it to remain competitive for much longer -- at least 5+ years. It's a total pain for me to change out my GPU because it is water cooled, as you can see in my Steam avatar pic!

I agree about expecting more longevity from a GPU, and also about your sweet gaming rig :steamhappy:

I doubt it's very easy to switch out a GPU in a proper WC loop, but you have to admit, putting hardware together (even WC loops) is half the fun of 'putering ;-)

Thanks, but it's not fun, not at all, when I have to take my PC to the kitchen and balance it just right over the sink to drain the dye-staining liquid out of that son of a b*tch! Oh, and checking for leaks...
Tune Mar 15, 2016 @ 6:41pm
Originally posted by Prince Rahl:
Originally posted by KOZ♍:

I'm not talking about factory OC! Read the entire post! Go ahead and MANUALLY OC your card and see how far you get. Use another game besides Fallout 4 for the stability test. My card is also clocked higher than the reference design by default. At a minimum, try OCing your GPU core boost clock by +100 and VRAM by +250.

My stable OC is +190MHz core and +1000MHz VRAM, which brings the OC totals to 1280MHz core and 7000MHz VRAM; default reference clocks are 954MHz core and 6000MHz VRAM.

I believe this game relies heavily on VRAM I/O operations because of the large map sizes, and overclocking the VRAM causes internal setup and hold time violations in the main memory ICs. Although I have not tested this theory, it's a moot point because my Poseidon 780 is a beast, and I am able to maintain over 60 FPS at 1080p @ ULTRA...

My GPU is also clocked higher than reference, but that didn't mean I couldn't push for a little bit more ;-)

1392MHz core and 7260MHz memory, < 70°C, stable on an EVGA GTX 770 ACX 2.0 SC.

It would seem (so far) that after doing a complete driver removal (thank you, DDU) and reinstall, it's also stable in Fallout 4. I haven't been to Diamond City with that clock yet, so it may not hold up -- time will tell.

WOW, that's hardcore! I can't believe you are able to pull that off -- good temps too. Are you serious, 1392 core and 7260 VRAM on a 770??? I can't push mine any higher than 1293MHz or I get artifacts in GRID Autosport! If that is really true, then you really lucked out on that GPU. Hey, what is your Passmark score at those clocks?
Yhwach Mar 15, 2016 @ 6:43pm
You can push that 770 harder; the 750 on my benchmark rig can put out around 1520 at maximum without the temp exceeding 70°C, and at a stable temp it can run at about 1460MHz.
Tune Mar 15, 2016 @ 6:45pm
Are you able to play the game with that OC? No crashing? Hmm, I just updated my driver to 362.00 last week. I didn't consider DDU because I am not having this problem with any other game.
Yhwach Mar 15, 2016 @ 6:45pm
Me?
Tune Mar 15, 2016 @ 6:48pm
Good lord, you guys' computers must sound like a jet engine taking off. See, the benefit of water cooling is big overclocks and silent operation. I mean, when gaming I can't hear my PC at all, it is so quiet! I hate fans buzzing, so annoying, and I'm able to keep both my CPU and GPU, which are on the same loop, under 60°C.

Date posted: Mar 12, 2016 @ 6:52pm
Posts: 51