https://www.youtube.com/watch?v=cmgC2YhqbWU
https://www.youtube.com/watch?v=gSZkhyZ6x5o&t=140s
Unless Hogwarts Legacy is TWICE as GPU-demanding as Cyberpunk or RDR2, it will do a lot more than 20 fps.
In the above videos, the Iris Xe is comfortably doing 30+ fps in both games at 720p, which is playable and about what you'd expect from an iGPU.
@OP,
Now that all the disturbances are gone, we are back to square one. Now read comment #4.
No, you had it right, as they will be crap.
And then turn around and refer to a post that states "I don't know how it performs in this title" in a thread asking how it will perform in that very title?
Huh?
Generalized statements of "it's X times better than Y, and Y can do Z in titles A and B" aren't a good way to figure this out. Sure, extrapolating is fine to do at times. I do it... but I won't turn around and call others' input disturbances and then point the OP to my own guess as the only thing to pay attention to. That's a bit rude.
However, another post (#14), while still not the right IGP, is maybe a closer example to extrapolate from, because at least the game is the same. It's still not enough to firmly say "it'll be X times better because Y GPU is X times faster than Z GPU", because it doesn't ALWAYS work out that way (especially with this title, where strange results are showing up, such as 12 GB RTX 3060s situationally outperforming not only RTX 3070s but also 10 GB RTX 3080s due to VRAM demands). But it's maybe at least enough to say "it should perform as well as this, and then some". So if you really want to extrapolate, post #14 is a better one to point to as a basis.
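To spell out the kind of extrapolation being argued about here, below is a minimal sketch with made-up placeholder numbers (nothing in it is a measurement of Hogwarts Legacy or the Iris Xe). It shows why the naive "X times faster, so X times the fps" math can fall apart once something like VRAM becomes the bottleneck:

```python
# Rough sketch of naive fps extrapolation; all numbers are placeholders,
# not benchmark data for any specific game or GPU.

def estimate_fps(known_fps: float, relative_performance: float,
                 vram_gb: float, vram_needed_gb: float) -> float:
    """Assume fps scales linearly with raw GPU performance, but treat the
    estimate as unreliable if the card can't hold the game's assets."""
    if vram_gb < vram_needed_gb:
        # VRAM shortfalls don't scale linearly; the linear estimate breaks down.
        return float("nan")
    return known_fps * relative_performance

# Assumed example: a reference GPU doing 60 fps, a target GPU rated at half
# its performance, and a title that wants ~6 GB of VRAM.
print(estimate_fps(known_fps=60, relative_performance=0.5,
                   vram_gb=8, vram_needed_gb=6))   # -> 30.0
```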
They are not only creating "disturbances", but also giving totally FALSE information that misguides the OP.
Saying that the "general rule is don't even think about running games if all you have is an Intel onboard GPU" (comment #9) is misleading and far from the truth, and despite me showing them all the evidence that their statement is not true, nothing got through to them.
And when I showed benchmarks (comment #16) proving that the Intel Iris Xe can play games just fine, even the most modern and demanding ones, they realized (obviously) that they were wrong. But to save their previous stand, they lied and said the Iris Xe can only do 20 fps, which is AGAIN far from the truth (as shown in the benchmarks).
But I still think calling others' input "disturbances", while pointing the OP to your own post, which was at best an extrapolation, was the wrong move. The OP asked how the Iris Xe performs in Hogwarts Legacy, and I didn't see anyone provide that.
Technically they didn't. They asked whether their device would be able to run it, nothing more.
The minimum requirements for 720p, all low settings, and 30 fps call for a GPU more than 2x what the OP has, so at best 15 fps at all low and 720p, and likely lower if the OP wants to tweak it some. Running is one thing; whether it's playable is another.
The Intel HD can run games that use RT and other features that bring the RTX cards to their knees, but at nowhere near playable frame rates on the Intel GPU.
Either way, I was presuming the OP had some expectation of "playable", even if that is 20 FPS or something, rather than wanting to know if it only "technically" ran. After all, something that takes twenty minutes to start and renders a frame every few seconds is still "running", no? I doubt the OP was asking from that angle though, so yes, I did presume there was a degree of "how will it run" to the question.
They would be better off asking in the game's forum or looking up YouTube videos with similar hardware.
I have already shown the benchmarks here in comment #16, and there are plenty more you can find on YouTube, so I'm confused about where this playable-versus-unplayable debate is coming from.
Every game is PLAYABLE with the Iris Xe.
But this game at 720p, low settings, and 30 fps is listed for a GPU more than twice the power of that iGPU. So less than 15 fps, and that's assuming it even loads; nothing you've provided shows that. It's guesswork based on other games. Find proof of THIS game running on that iGPU; until then, the GPU is below the minimum requirements by more than half, so it's possible it won't even load, let alone be playable.
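For what it's worth, the arithmetic behind that "less than 15 fps" figure is just linear scaling from the listed minimum spec, under the assumption (not a benchmark of this game on this iGPU) that the Iris Xe delivers roughly half the performance of the minimum-spec GPU:

```python
# Back-of-the-envelope estimate only; the 0.5 factor is an assumption,
# not measured data for Hogwarts Legacy on the Iris Xe.
min_spec_fps = 30        # 720p, all low, per the listed minimum requirements
iris_xe_relative = 0.5   # assumed: Iris Xe at roughly half the min-spec GPU
print(min_spec_fps * iris_xe_relative)  # -> 15.0, before any VRAM penalties
```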
I have the same 1135G7 in my laptop. Had I purchased the game on PC, I'd install it on that to get a definitive answer, but I'm not buying a game just to test it and then refund it.
I wouldn't even dare to spend the money on a game I am quite confident would not be playable, such as Hogwarts Legacy. I'm not a Harry Potter fan either,...
I understand your point, but you also have to consider that game developers usually list only dedicated Nvidia and AMD GPUs that are within the past few generations and that meet the VRAM amount required by the game.
By those criteria, the minimum GPU for Hogwarts Legacy is the GTX 960; below that is the GTX 950, which has only 2 GB of VRAM, so it's not applicable. The GTX 960 is the least they can mention.
If the Iris Xe can do 30+ fps in Cyberpunk, do you really think Hogwarts Legacy is 2x more demanding than that? Not possible.