Depending on the libraries they use for compilation, those libraries could be dropping support for 32-bit as well. It also requires more work to manage which DLL goes where and to test the built product. Sometimes these libraries even require 16-bit code to run, which isn't going to happen on a 64-bit processor.
So whatever it is, they decided to drop it for some valid reason. I believe the original reason was multiplayer desyncs, but it has clearly evolved into more than that.
Relevant FFF: https://www.factorio.com/blog/post/fff-158
You should be able to get a decent 64-bit PC from a university surplus store or Craigslist (or local equivalent) for less than $100. Not a thrift store; they only get the trash of computers.
http://imgur.com/a/uL2K6
64-bit operating systems have been around for a little over 10 years now. It makes sense to abandon the 32-bit platform at this point; most developers already did so years ago.
This is actually quite interesting. The last time I checked how many Steam users there were, it came out to roughly 11.83 million [www.statista.com]. Now, I know this isn't exact, but suppose 100% of Steam users did the system survey. Going off the statistics given, developers would be cutting off 807,989 users.
That's a drop in the bucket compared to the full 11,830,000, but still quite a bit. I can see and understand why some, if not most, developers drop 32-bit. Especially the Factorio devs, as I can imagine it would take two separate teams to maintain 32- and 64-bit variants.
As for Apple and Linux users, I don't think they offer 32-bit systems anymore. Last time I checked, you can't change the OS on a Mac. Linux is an oddball.
Thanks for the link to the Factorio Friday Facts post; I understand more thoroughly now. Are you talking in USD or AUD or something else entirely when talking cash?
They are already steadily going past the "safe" RAM limitations of 32-bit for the minimum requirements (anything past 2 GB starts to behave strangely, and if you breach the 3-3.5 GB limit you get into the territory of heavy crashes and potentially even OS-related issues).
While it sucks for the people who don't have the means to get a somewhat recent machine, there comes a point where programs have to move on with the vast majority, or they will just end up fighting those limitations for an increasingly small portion of their users.
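To put rough numbers on that, here is a back-of-the-envelope Python sketch. The 2 GB default user-mode split and the 3 GB large-address-aware figure are the usual 32-bit Windows values, assumed here for illustration rather than stated anywhere above.
```python
GIB = 1024 ** 3

address_space_32bit = 2 ** 32   # every byte a 32-bit pointer can address: 4 GiB total
default_user_space  = 2 * GIB   # typical user-mode share on 32-bit Windows (assumed)
large_address_aware = 3 * GIB   # with the /3GB boot option and a large-address-aware binary

for label, limit in [("total 32-bit address space", address_space_32bit),
                     ("default user-mode limit", default_user_space),
                     ("large-address-aware limit", large_address_aware)]:
    print(f"{label:28s}: {limit / GIB:.1f} GiB")

# A game that wants ~2 GiB is already at the default ceiling, and anything
# past ~3-3.5 GiB simply cannot be mapped by a 32-bit process at all.
```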
The blame is pretty one-sided here, and it doesn't belong to the developers. They're not supporting MS-DOS either.
At some point, things get old and phased out. They're not "perfectly good" -- they're obsolete.
I heard the same arguments from people resisting getting a DVD player, angry that the stores weren't carrying VHS tapes anymore, as if somehow THEY were being victimized by an obsolete technology being phased out.
The 7% are wrong.
And in a decade, the 64-bitters will be going out of fashion. The right thing to do will be to upgrade to the 128 bit operating system, not to whine that games need to stay simple enough for us to run.
I wouldn't be so sure about that. The real driving factor for the move from 32-bit to 64-bit systems hasn't been the need to operate on larger numbers, but the need to address more memory. Most software still works mostly on numbers 32 bits long, often smaller. Even new games still only use 32-bit floating-point values, because that's really all the precision they need. Their memory systems, though, use 64-bit integers because they need to be able to address more than a couple of gigabytes.
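A small Python sketch to illustrate the distinction (nothing game-specific is assumed): on a 64-bit build, pointers are 8 bytes wide, while a single-precision float still only carries about 24 bits of precision.
```python
import ctypes
import struct

# On a 64-bit build this prints 8: addresses (pointers) are now 64 bits wide.
print("pointer size:", ctypes.sizeof(ctypes.c_void_p), "bytes")

# A 32-bit float, the workhorse of game math, only has ~24 bits of mantissa:
# 2**24 + 1 cannot be represented exactly and rounds back down.
value = 2 ** 24 + 1                                    # 16777217
as_float32 = struct.unpack("<f", struct.pack("<f", value))[0]
print(value, "->", as_float32)                         # 16777217 -> 16777216.0
```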
64-bit will probably last longer than 32-bit did, given it has a far larger limit. Twenty years ago, your typical PC had 16MB of memory. Today it's 16GB. That's a thousand-fold (technically 1024) increase over that time. The 64-bit limit is 16EB. That's a billion times current usage. If you have a thousand-fold growth in memory usage every 20 years, we'll hit the 64-bit limit sometime around 2075.
The 128-bit limit is about 3.4*10^38 bytes. At the same rate of a thousand-fold growth every 20 years, it would take roughly another 130 years after passing the 64-bit limit to hit that.
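For anyone who wants to check the arithmetic, here is the same projection worked out in Python. The 16 GB starting point and the 1024x-per-20-years growth rate are taken straight from the post above; nothing else is assumed.
```python
import math

GIB = 1024 ** 3
current   = 16 * GIB        # typical PC today, per the post above
limit_64  = 2 ** 64         # 16 EiB
limit_128 = 2 ** 128        # ~3.4e38 bytes

yearly_growth = 1024 ** (1 / 20)   # "a thousand-fold (technically 1024) every 20 years"

years_to_64     = math.log(limit_64 / current)   / math.log(yearly_growth)
years_64_to_128 = math.log(limit_128 / limit_64) / math.log(yearly_growth)

print(f"years until 2^64 bytes:    {years_to_64:.0f}")      # 60, i.e. around 2075
print(f"further years until 2^128: {years_64_to_128:.0f}")  # ~130
```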
The user would have to re-install a 64-bit OS, and usually they don't have a recovery DVD for 64-bit, would need to ask for one, etc.
It's finally changing now and you mostly get a 64-bit OS with your 64-bit machine, but shipping 32-bit on 64-bit hardware went on for a lot of years for no good reason.
Which is pretty amazing when you think about it. We only have terabyte hard drives for storage at the moment. After that come petabytes, and only after petabytes do we finally reach exabytes. It currently takes huge teams of programmers, 150-300 people at the AAA studios, 5-6 years or so to code something that's around 60-80 GB in size. Most of that is unoptimized code, since we now have so much storage and memory to work with that condensing and optimizing things has been knocked down to a super low priority these days. Hell, even colleges aren't teaching proper optimization methods anymore because of how much space is available.
I cannot honestly imagine anything being so large that it fills even a single petabyte, let alone an exabyte.
Remember the infamous quote by Bill Gates saying nobody would ever need more than 640k memory? ^^
What takes up most of the space in a game is the HD textures and audio and so on. The code itself is virtually negligible in comparison. As monitors get bigger and graphics cards get better, games will keep getting larger and larger to keep them looking pretty.
Also, file size is not directly related to optimization. If your PC can load uncompressed textures faster than it can decompress them on the fly, you can achieve better performance by using up more disk space.
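A toy way to see that trade-off in Python (zlib on repetitive dummy data, not real texture assets): the blob is far smaller on disk, but you pay CPU time to expand it again.
```python
import time
import zlib

# ~23 MiB of very repetitive dummy data standing in for "assets".
raw  = b"texel texel texel texel " * 1_000_000
blob = zlib.compress(raw)

start = time.perf_counter()
restored = zlib.decompress(blob)
elapsed = time.perf_counter() - start

assert restored == raw
print(f"raw size:        {len(raw)  / 2**20:.1f} MiB")
print(f"compressed size: {len(blob) / 2**20:.1f} MiB")
print(f"decompress time: {elapsed * 1000:.1f} ms")
# Whether that time is cheaper than just reading the extra megabytes off the
# disk depends entirely on how fast your storage is.
```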
Your imagination is lacking. You have to think big. Or capture a few hours of uncompressed 60fps 4k videos ;-)
I could easily fill a petabyte disk if you give me one. But you are right, the amount of data needed for this is impressive.
I have not found any proof that he actually ever said this. It seems to be a simple (but very entertaining) myth...
Here is a better (and confirmed) quote from him:
But it seems like we went a bit offtopic here...
USD. But it should be comparable anywhere that computers are in large supply. I have seen some pretty decent computers with DDR3 RAM and an i5 CPU for $50 at my local university surplus store (I'd probably add a little more RAM; PowerPoint doesn't use much).
If you are going to go this route, I recommend doing some brief research on the numbering scheme of CPUs. Intel numbers the i series as {generation number}{three more digits}{letter}; avoid U and Y, as they are low-power parts. There has been little improvement from generation 4 to generation 7, so a 4570 is only about 14% slower than a 7500. The CPU is hard to upgrade, so make sure it starts as high as you can get; it's usually trivial to add more RAM, though. Look for one with DDR3 or DDR4 RAM. If it isn't specified, you can google the CPU model or the model number on the motherboard to find more information.
If it comes with a video card, all the better (it seems to be quite hard to find ultra-budget cards nowadays; Radeon is cheaper). You can salvage your hard drive, disc drive, and video/sound card from your old computer if it's not a laptop.
Alternatively, you can investigate whether the computer you have is 64-bit compatible and either upgrade your Windows installation or dual-boot a Linux install. If you right-click your Start menu and go to System, it should list your CPU; google it and see if it's 64-bit. If it doesn't list it, go to Device Manager in the Control Panel and look under Processors.
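If you'd rather check from a script than click through Device Manager, here's a small Python sketch. The platform calls only report what the currently installed OS and interpreter are, and the check for the "lm" (long mode) flag in /proc/cpuinfo works on Linux only.
```python
import platform

# What the OS you're running right now reports. A 32-bit Windows installed on
# a 64-bit CPU will still say 32-bit here, so treat this only as a first hint.
print("machine:    ", platform.machine())
print("interpreter:", platform.architecture()[0])

# On Linux the CPU itself advertises 64-bit capability via the "lm" (long mode)
# flag in /proc/cpuinfo, regardless of which OS happens to be installed.
try:
    with open("/proc/cpuinfo") as f:
        flag_lines = [line for line in f if line.startswith("flags")]
    if flag_lines and "lm" in flag_lines[0].split():
        print("CPU supports 64-bit (lm flag present)")
    else:
        print("no lm flag found: likely a 32-bit-only CPU")
except FileNotFoundError:
    print("/proc/cpuinfo not available (not Linux)")
```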
I hate to run the risk of making that same mistake, but as much as I think terabyte RAM may become a thing, I can't imagine it being needed for the average user. The problem I see is that, unless we discover something greater than VR, our pixel density is already getting denser than the human eye can resolve.
The code base can only get as big as a team/person/company can write, and large data (which is easy to make) isn't bottlenecked by the amount of RAM; it's bottlenecked by the fact that we have been struggling for half a decade to make CPU/board traces any smaller than 14 nm (which is a materials-science problem involving silicon), and by whether that even helps anymore. The decade-long journey from 45 nm to 14 nm (Intel 750 vs 7500) only got us a 50% improvement in benchmarks, not even close to the 300% you would expect. Factorio itself, a perfect example of making infinite data, is bottlenecked by the memory bus long before the CPU or RAM is maxed out.
So we are gradually coming upon a point where we need a breakthrough in what a computer physically is, because we are reaching the limits of physics. The CPU is literally going to burn a hole in the motherboard if we make it any faster, and no amount of liquid cooling is going to pull the heat away fast enough. Most of the improvements in the past decade have been increasing cache size, raising memory bandwidth, and cutting cycles short on instructions that finish quicker. How much more of this can we do? I don't know. But we definitely aren't following Moore's law anymore.
Interesting article I found while running about:
https://www.theverge.com/2015/7/9/8919091/ibm-7nm-transistor-processor
Quantum physics dictates that we could go much faster than what we have (https://en.wikipedia.org/wiki/Bremermann%27s_limit), but this is what I meant by a "breakthrough in what a computer physically is."
Code is tiny. Textures and pre-recorded audio are what eats up all the space in your typical big-budget game, even with compression.
Check out the demoscene's productions. They regularly make impressive visual productions, several minutes long, which are rendered in real time and fit into 64 KB, sometimes even less. One demoscene group made a one-level FPS with graphics on par with Doom 3 that fits into just 96 KB. The game is known as ".kkrieger".
We already have supercomputers which have petabytes of disk space and hundreds of terabytes of RAM (granted it's divided between thousands of CPUs). If you consider worldwide data, we're already past the exabyte age. Current estimates for total worldwide data are into the low zettabytes (10^21 bytes).