Let's say you're playing a game in 4k resolution. That means that your graphics card has to calculate 3840 x 2160 = 8,294,400 pixels for each frame. If, however, you play at 1080p resolution, your card only has to calculate 1920 x 1080 = 2,073,600 pixels - which can be done much faster, obviously.
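Just to make those numbers concrete, here's the same arithmetic as a few lines of Python (nothing game-specific, just the pixel counts from above):

```python
# Pixels the GPU has to shade per frame at each resolution.
def pixels(width, height):
    return width * height

native_4k = pixels(3840, 2160)   # 8,294,400 pixels per frame
p1080     = pixels(1920, 1080)   # 2,073,600 pixels per frame

print(f"4K:    {native_4k:,} pixels/frame")
print(f"1080p: {p1080:,} pixels/frame")
print(f"4K is {native_4k / p1080:.0f}x the work per frame")   # 4x
```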
What "upscaling" does, is that it calculates only a lower-resolution image (for better speed), but then applies scaling techniques that turn it into higher-resolution image. (for better looks). Due to the scaling algorithm used, the output looks better, and more fine-grained, then e.g. putting a 1080p image directly on a 4k monitor.
Upscaling isn't perfect - it can produce artifacts, and sometimes it makes very thin objects vanish (like power lines against the sky). On the plus side, it applies a sharpening filter that many gamers really like. It's definitely worth a shot if you'd like to get some more FPS.
In Hogwarts Legacy the upscaled image looks basically identical to native, so it's essentially a free performance gain if you have an Nvidia RTX 20, 30 or 40 series GPU. You do get a tiny bit of ghosting on the player's legs when running, if they drift off screen for a moment, but I doubt you'd be able to spot it.