You're essentially running two games at once; of course it taxes your system a bit.
Nothing is being sent/received except the Unity core and the code that runs the app, and caching does a fairly decent job of making sure you only do that once. All the processing is done on your local machine; there's no need to send anything to the site except the initial "gimme the webpage" request. You can click the star icon to "favorite" a specific seed, so it will store the data it generates in the local "userdata" storage instead of having to recreate it every time you "visit the site".
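The "favorite a seed" behavior described above is essentially a cache keyed by seed: generate once, store locally, reuse on later visits. A minimal Python sketch of that idea (the class and method names are illustrative, not the site's actual code):

```python
import random

class SeedCache:
    """Cache generated map data per seed, like 'favoriting' a seed."""

    def __init__(self):
        self._store = {}  # seed -> previously generated data

    def get_map(self, seed):
        # Regenerate only on a cache miss; a favorited seed's data
        # stays in the store instead of being rebuilt every visit.
        if seed not in self._store:
            self._store[seed] = self._generate(seed)
        return self._store[seed]

    def _generate(self, seed):
        # Stand-in for the real map generator: deterministic per seed,
        # so the same seed always yields the same map.
        rng = random.Random(seed)
        return [rng.randint(0, 255) for _ in range(16)]
```

The point is that all of this lives on your machine (here, in memory; on the site, in local "userdata" storage), so nothing goes over the network after the initial page load.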
As for monitoring the internet traffic, you can do that with Task Manager (assuming you're running Windows). Right-click your taskbar, choose "Task Manager", then select the "Performance" tab at the top and click on "Ethernet" on the left side.
If you want to verify that the page actually isn't doing anything on the network, you can click the "Open Resource Monitor" link at the bottom of the Performance view in Task Manager, then click on the "Network" tab in Resource Monitor and watch the send/receive stats for individual applications in real time... once the page has loaded, the stats for that Chrome instance (or whatever browser you use) should drop to zero, even if you input a new seed and have it generate a new map.
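If you'd rather script the same check than watch Resource Monitor, here's a rough Python sketch that polls the same per-interface byte counters. It assumes the third-party psutil package is installed (`pip install psutil`); interface names vary by machine:

```python
# Rough sketch: poll per-interface network counters, the way Task
# Manager's Performance tab and Resource Monitor do.
# Assumes the third-party psutil package (pip install psutil).
import time
import psutil

def counter_deltas(prev, cur):
    """Bytes sent/received per interface between two snapshots."""
    return {
        nic: (cur[nic].bytes_sent - prev[nic].bytes_sent,
              cur[nic].bytes_recv - prev[nic].bytes_recv)
        for nic in cur if nic in prev
    }

def watch(interval=1.0, samples=5):
    """Print traffic per interval; an idle page should show ~0 B."""
    prev = psutil.net_io_counters(pernic=True)
    for _ in range(samples):
        time.sleep(interval)
        cur = psutil.net_io_counters(pernic=True)
        for nic, (sent, recv) in counter_deltas(prev, cur).items():
            print(f"{nic}: {sent} B out, {recv} B in")
        prev = cur
```

Run `watch()` with the page open: after the initial load, the deltas should sit near zero even while you generate new maps, which is the same thing the Resource Monitor stats would show.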
I ask because when I have both running (game + website) the temp doesn't increase any more than when I have the site running by itself (both scenarios push to 80C) - shouldn't it run hotter with both going?
Will fully admit that my knowledge in this area is severely limited.
Thanks for the replies, btw. ;)
Yes, running WebGL in a browser pushes your system harder than running a GL, DX, or Vulkan renderer in a dedicated process; for starters, the browser has to jump through a whole series of hoops just to get "live" graphics on screen in the first place, whereas the dedicated executable can (in effect) just do it, because that's what it's designed for.
On top of that, the browser inserts its own compositing layers between the page and the GPU, where the game sidesteps all of that and talks to the graphics driver directly. The browser-based app also ends up doing more of the work on the CPU, because the sandbox doesn't let page code touch the hardware directly (for system security reasons).