LEGO® Harry Potter: Years 1-4

Fatal error: it requires a graphic card which supports shader model 2
Hi, I was trying to run Years 1-4 and got this error. I checked my DirectX version and it is 12, which supports shader model 5, and my graphics card driver is up to date. The system is Windows 10.
Does anyone have an idea how to resolve it? Thanks
Showing 1-10 of 10 comments
Quick Wesh Nov 8, 2017 @ 6:25am 
Hi.
Basti Nov 11, 2017 @ 4:33am 
Got the same problem, but I had already played the game for 22 hours without any issues.
Now it crashes when starting the game. I didn't change anything.
jparkerrandall Apr 2, 2018 @ 1:06pm 
Were you able to resolve it? Same issue here.
Praiwron [J.P.T.] Sep 27, 2018 @ 1:46pm 
This is a problem that I just encountered as well. I managed to play for 3 hours, then shut the game off and wanted to play the next day, but now I get the same message on launch.
Praiwron [J.P.T.] Sep 27, 2018 @ 1:53pm 
I found a Russian post describing the same issue. It had a solution; it didn't work for me, but maybe it will work for someone else:

Solution:
1. Open the file C:\Users\UserName*\AppData\Roaming\WBGames\LEGO HarryPotter\pcconfig.txt (Windows Vista/7/8/8.1)
2. Set the parameters:
DesiredDynamicLightQuality 0
UseHires 0
UseHiresPending 0
ForceShaderModel 0
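
For anyone who doesn't want to edit the file by hand, here is a small Python sketch that applies the four settings above. It is just my own sketch, not an official fix, and it assumes pcconfig.txt keeps one "Name Value" pair per line, the way the parameters above are written. Make a backup of the file before running it:

import os

# Path from the steps above; %APPDATA% resolves to C:\Users\<you>\AppData\Roaming
path = os.path.expandvars(r"%APPDATA%\WBGames\LEGO HarryPotter\pcconfig.txt")

# The four parameters from the Russian post.
overrides = {
    "DesiredDynamicLightQuality": "0",
    "UseHires": "0",
    "UseHiresPending": "0",
    "ForceShaderModel": "0",
}

with open(path) as f:
    lines = f.read().splitlines()

out = []
for line in lines:
    parts = line.split()
    key = parts[0] if parts else ""
    if key in overrides:
        # Overwrite the existing value for this parameter.
        out.append(key + " " + overrides.pop(key))
    else:
        out.append(line)

# Append any parameter the file did not already contain.
for key, value in overrides.items():
    out.append(key + " " + value)

with open(path, "w") as f:
    f.write("\n".join(out) + "\n")

If only ForceShaderModel matters for you (see the last reply in this thread), you can trim overrides down to that single entry.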
andreasaspenberg575
Sounds like you might have installed the Spring Creators Update. The only solution is to roll back Windows to a previous version, as the Spring Creators Update removed gaming support.
Praiwron [J.P.T.] Oct 11, 2018 @ 2:11am 
I found out the issue.
My computer had apparently uninstalled the NVIDIA Control Panel, rendering my graphics card useless.
andreasaspenberg575
Sounds like the Spring Creators Update might be responsible. The Control Panel, however, is not the most important part; the drivers are.
Praiwron [J.P.T.] Oct 19, 2018 @ 8:48am 
Originally posted by andreasaspenberg575:
Sounds like the Spring Creators Update might be responsible. The Control Panel, however, is not the most important part; the drivers are.
I may be wrong, but my computer wouldn't recognize my graphics card as the primary display adapter, and was using the integrated one instead, which couldn't run much.
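
If you suspect the same thing, one quick check (assuming Windows 10, as in this thread) is to list the display adapters Windows currently sees from a Command Prompt; if only the integrated GPU shows up, reinstalling the NVIDIA driver is the usual fix:

wmic path win32_VideoController get name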
Originally posted by Praiwron J.P.T.:
ForceShaderModel 0
Old post (I found it through Google), but thanks anyway; this worked for me. I changed only this line, and it was enough.

Date Posted: Nov 4, 2017 @ 8:56pm
Posts: 10