Starship Simulator

DirectX/Vulkan
I have no idea what DirectX and Vulkan are (I'll go Google), but whichever I choose, it doesn't work :(

If I try DirectX I get an error message about dropping directx from the command line, and if I try to use Vulkan it tells me I need to have Vulkan.

Does this mean that my PC can't play this?
Last edited by LuckyVanDine; Apr 8, 2024 @ 2:37pm
Nekonoaijin Apr 15, 2024 @ 2:50pm 
Originally posted by kandnil:
Originally posted by LuckyVanDine:
Thank you for this, but I confess I find quite a lot of what you wrote very hard to follow.
Sorry about that. What I meant was, with this particular card, the RTX 2080 (old by today's standards), I can set the resolution to 1920x1080, which looks pretty nice here.
The main processor is an AMD Ryzen 7 3700X, and although this game is more graphics heavy than main processor heavy, a decent processor helps.
Memory (which is really nice to have) is 32 GB. That's a fair amount even today, but probably overkill for this game.
FOV or Field Of View (mistyped that as POV, sorry) is essentially how much the camera zooms in. Greater zoom = less to draw, so less graphics intensive. You can set this in the game's settings, under the display tab. I recommend you try decreasing this number if you're experiencing low framerate.

Anti-Aliasing is another setting that can have an effect on framerate, and I've set it to TAA in the graphics settings.

I have the framerate limiter (v-sync) set to 60 frames per second and only sometimes does the framerate drop as low as around 40. Most of the time, it's stable at 60 FPS.

I tried Vulcan but the hit on FPS was too much while DirectX works.

I guess the entire point of my post was: you don't need a state-of-the-art gaming computer to play this game, and bare-bones necessities on the hardware front don't have to cost a packet.


Being slightly nitpicky, it is Vulkan, not VulCan, and V-synch is not a frame limiter; it is the vertical refresh frequency. Monitors are designed (until recently) with a specific refresh rate at a given resolution, directly tied to how quickly the monitor can prepare and display a full frame (in other words, the entire screen, for our purposes). A monitor with a vertical refresh rate of 60 Hz can prepare and display a frame 60 times every second.

If you disable V-synch, you are ignoring this limitation and feeding frame data to the buffer regardless of where the monitor is in the process of updating and displaying a frame. That can let the GPU produce many more frames every second, though the monitor can still only draw them at the same rate, and you introduce the possibility of "tearing", where a displayed image has a visible shift between the top and bottom parts at a certain point (because the part above the tear comes from one rendered frame and the part below from the next).

Your GPU actually still tends to draw more frames than the monitor can display even when you have V-synch enabled (as you probably should, but it's your PC - do you); it just only ever presents a completed frame buffer to the monitor.
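
If a toy model helps, here's a little Python sketch of the timing - no real graphics API calls, just made-up numbers for illustration:

REFRESH_HZ = 60
SCANOUT_MS = 1000 / REFRESH_HZ    # the monitor draws one full frame every ~16.7 ms
GPU_FRAME_MS = 4                  # pretend the GPU finishes a new frame every 4 ms

def tears_without_vsync(duration_ms=100):
    """Count buffer swaps that land mid-scanout. Each one is a
    potential visible 'tear' (part of the screen shows one frame,
    the rest shows another)."""
    tears = 0
    swap = GPU_FRAME_MS                   # time of the next buffer swap
    scan_start = 0.0
    while scan_start < duration_ms:
        scan_end = scan_start + SCANOUT_MS
        while swap < scan_end:            # swaps during this scanout
            if swap > scan_start:         # mid-scanout swap = tear
                tears += 1
            swap += GPU_FRAME_MS
        scan_start = scan_end
    return tears

print(tears_without_vsync())   # several tears per scanout at these rates
# With V-synch enabled, swaps are held until the gap between scanouts,
# so this count is zero - at the cost of capping output at 60 fps.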
LuckyVanDine Apr 15, 2024 @ 3:04pm 
Originally posted by Nekonoaijin:
An RTX 2080 Ti wasn't a grand even when new. It's two full generations and about six years old, currently retailing for as little as $500. Even if we ignore currency conversion and call that 500 quid, it's nowhere near what you are claiming. You certainly *can* buy one for as much as $1,000, but you can buy overpriced things in any market. It'd be a foolish purchase, in my opinion, because although it's a very capable card if you already own it, as I indicated and as has been pointed out several times already, it is *old* - two full generations behind the current crop of cards, and we are on the cusp of the next generation.

Oh. I just saw what the developers recommended for this game and looked it up on PCPartPicker, which linked to this: https://www.amazon.co.uk/dp/B07HWMDDMK?tag=pcp0f-21&linkCode=ogi&th=1&psc=1

So, how does one tell that this is "two generations behind"? And why would the developers recommend this if you can't buy it or it is out of date?
LuckyVanDine Apr 15, 2024 @ 3:09pm 
Originally posted by kandnil:
try decreasing this number if you're experiencing low framerate.

Anti-Aliasing is another setting that can have an effect on framerate, and I've set it to TAA in the graphics settings.

I have the framerate limiter (v-sync) set to 60 frames per second and only sometimes does the framerate drop as low as around 40. Most of the time, it's stable at 60 FPS.

I tried Vulcan but the hit on FPS was too much while DirectX works.

I guess the entire point of my post was: you don't need a state-of-the-art gaming computer to play this game, and bare-bones necessities on the hardware front don't have to cost a packet.
Thanks again. There is so much here that I have never heard of before, and I need to find resources to study it. I hadn't realised that it was necessary to be so much of a technician just to be able to play a game.
How would I tell if I was "experiencing low framerate"? I am guessing that FPS means "frames per second" here, like in cinema and TV. But I thought that the human cognitive system perceived anything beyond 24fps as smooth motion, and that that's what TV and cinema produce.
kandnil Apr 15, 2024 @ 3:15pm 
You can tell the framerate is low when the game freezes momentarily when you try to move, or the movement is "choppy". At 60 FPS (yes, frames per second) or more, the movement will appear smooth. Much depends on the graphics card here.
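
If raw numbers help, the arithmetic is trivial (Python here just to print a table, nothing game-specific):

# Frame time is just 1000 ms divided by frames per second.
for fps in (24, 30, 40, 60):
    print(f"{fps:>2} fps = {1000 / fps:.1f} ms per frame")
# 24 fps = 41.7 ms per frame
# 30 fps = 33.3 ms per frame
# 40 fps = 25.0 ms per frame
# 60 fps = 16.7 ms per frame
# At 60 fps a hiccup is gone in a blink; at 24, every frame hangs
# around almost three times as long, which you feel the moment you
# move the mouse.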

As for being a technician, I'm over the moon I no longer have to write batch and config files in DOS just to get sound :D It was so much easier on the Commodore 64 back when!
Last edited by kandnil; Apr 15, 2024 @ 3:18pm
Nekonoaijin Apr 15, 2024 @ 3:20pm 
Originally posted by LuckyVanDine:
Originally posted by Nekonoaijin:
An RTX 2080 Ti wasn't a grand even when new. It's two full generations and about six years old, currently retailing for as little as $500. Even if we ignore currency conversion and call that 500 quid, it's nowhere near what you are claiming. You certainly *can* buy one for as much as $1,000, but you can buy overpriced things in any market. It'd be a foolish purchase, in my opinion, because although it's a very capable card if you already own it, as I indicated and as has been pointed out several times already, it is *old* - two full generations behind the current crop of cards, and we are on the cusp of the next generation.

Oh. I just saw what the developers recommended for this game and looked it up on PCPartPicker, which linked to this: https://www.amazon.co.uk/dp/B07HWMDDMK?tag=pcp0f-21&linkCode=ogi&th=1&psc=1

So, how does one tell that this is "two generations behind"?

By knowing anything about the topic at all? ;) (I realize you don't, and you know you don't, but that really is the answer.) For nVidia, it is relatively easy - there were '10 series' cards in the fairly distant past (GTX 1060, GTX 1080, etc). That was followed by the '20 series' cards, which were also the first "realtime raytracing" capable cards and got the new prefix 'RTX' (RTX 2050, RTX 2060, RTX 2080, etc). Then we have the '30 series' cards (I have one of these, an RTX 3090 Ti) with similar naming, and the current generation is the '40 series' cards like the RTX 4060, RTX 4080, and so on.

And why would the developers recommend this if you can't buy it or it is out of date?

A whole bunch of reasons - why not? They are not telling you to buy that card; they are telling you the minimum performance level and feature set they recommend. In this case, that particular card is what the developer uses (or at least did use) on his own machine. It tells everyone where the bar is for "plays the way the developer intended": if your card is equivalent or better, you're golden; if not, you may have to make some compromises, but as long as your card is better than the minimum required you'll still get to play the game acceptably. Your system is neither of those; in fact, you don't even actually have a video card, you have what is known as "integrated graphics". Intel has worked hard, with good reason, to shed that label on their latest implementations, because the performance and features are far better than the old "Intel Integrated Graphics" solutions of the past, but it is still an on-die integrated graphics solution with basically all the limitations inherent to that.
Nekonoaijin Apr 15, 2024 @ 3:28pm 
Originally posted by LuckyVanDine:
But I thought that the human cognitive system perceived anything beyond 24fps as smooth motion, and that that's what TV and cinema produce.

Originally posted by kandnil:
At 60 FPS (yes, frames per second) or more, the movement will appear smooth.

Actually, no. There is no magic number. It depends on the person, the images displayed, the lighting and even if you are tired, drugged, etc. With that said ... 60 Hz is a good baseline for the majority of the population, in the majority of situations. Nothing special about it being sixty, just that that was technically feasible and practical for reasons that get pointlessly complex and are not all that relevant. Just understand that it could as easily be 56 or 64.

30 Hz is acceptable for many people in a lot of situations, but very noticeable to many others (myself included) in almost all situations. When I was younger I used displays capable of at least 72 Hz, ideally 85 Hz or more, at the resolutions I was using. The takeaway is "it depends, to a large degree, on the person".

As for cinema, nothing magical about 24 fps, but film and games are not even slightly comparable. Traditional cinematography "gets away" with so few frames per second because of several technical details, the most obvious of which is that all the images are blurred where movement occurs. If you look at film stills this will stand out to you immediately, I guarantee it.
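
To put a rough number on the blur (assuming the classic 180-degree shutter most films use; individual films vary):

# At 24 fps with a 180-degree shutter, each film frame is exposed
# for half the frame interval, so moving objects smear across it.
# A GPU frame, by contrast, is a perfectly sharp instant in time.
FPS = 24
frame_interval_ms = 1000 / FPS        # ~41.7 ms between frames
exposure_ms = frame_interval_ms / 2   # ~20.8 ms of motion blur baked in
print(f"exposure per frame: {exposure_ms:.1f} ms")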
LuckyVanDine Apr 15, 2024 @ 3:38pm 
Originally posted by Nekonoaijin:
For nVidia, it is relatively easy - there were '10 series' cards in the fairly distant past (GTX 1060, GTX 1080, etc). That was followed by the '20 series' cards, which were also the first "realtime raytracing" capable cards and got the new prefix 'RTX' (RTX 2050, RTX 2060, RTX 2080, etc). Then we have the '30 series' cards (I have one of these, an RTX 3090 Ti) with similar naming, and the current generation is the '40 series' cards like the RTX 4060, RTX 4080, and so on.
Oh, I see - thank you. Do the numbers actually have any semantics, or are they just sequence numbers?

They are not telling you to buy that card; they are telling you the minimum performance level and feature set they recommend. In this case, that particular card is what the developer uses (or at least did use) on his own machine. It tells everyone where the bar is for "plays the way the developer intended": if your card is equivalent or better, you're golden; if not, you may have to make some compromises, but as long as your card is better than the minimum required you'll still get to play the game acceptably.
Well, TBF, the "store page" for this game does say
RECOMMENDED:
Requires a 64-bit processor and operating system
OS: Windows 10
Processor: Intel i7
Memory: 16 GB RAM
Graphics: Nvidia 2080Ti at 1920x1080
DirectX: Version 12
which, after I found the demo wouldn't play and saw the discussions here, I took to be recommending that particular card.
So, how does one go about determining what the key performance attributes of that card are, and then looking at currently available cards to see which match them?
Frames per second has already been mentioned as important, so I assume if I find a tech spec of that card it will somewhere give me a number for that (and I assume that'll be something over 60), but is that all there is to it? I'm guessing there are a load of other performance attributes that need to be compared too?
Nekonoaijin Apr 15, 2024 @ 4:14pm 
I won't quote you, just take it as read that I, err, read your post :P

The numbers *sort* of have some logic ... but as with all branding, only to the extent that it suits the manufacturer and they break it when it suits them.

Broadly speaking though, for nVidia the xx50 part is entry level (long ago there used to be cards at the 40, 30, 20 and even 10 mark ... but we digress), and not every generation has even had a desktop model at this point, so you will probably only see it on laptops.

The xx60 part is 'mid-range'; this should be the default choice for most people, especially if on a budget, to get good performance without spending more than necessary. The specifics vary with the actual model and the marketing, though.

The xx80 part is 'high end'; the best performance you can get before you start paying very high prices, but most certainly not a budget option. Depending on the specifics, this can actually be the "best bang for your buck" model in the range, even though the xx60 part is really supposed to be that, historically. Again, see what I said about "when it suits them".

The xx90 part is the 'enthusiast' tier. This is for people like me who make enough use of their PC with demanding graphical apps (okay, games, let's be honest here) and have the budget to get the best card money can buy that doesn't actually need a bank loan (or a bank job).

To complicate it some more, there are 'Super' and 'Ti' variants. Super is usually a "refresh" where they take the original, improve it across the board in terms of gaming performance, and release it at a competitive price (ideally). Ti is sometimes released simultaneously with the main part, sometimes shortly after, and is usually only applied to the higher end cards. It basically signifies "the best we can do with this card without changing the hardware to the next tier". In practice, often the Ti variant is so much better that you would be unwise to buy the base model; other times it's a marginal improvement for a much higher price. Again, "depends what the manufacturer wants to do" applies here.

The basic rule, obviously, is 'bigger number is better'. RTX means the card is 'realtime raytracing' capable, which all the latest cards are, but I mention it because if you pick something up which is a GTX you should understand that you will not be able to use 'RT'. Whether that matters to you or not is something you'd need to research a little (it did to me, which is why I bought a 20xx card even though I had a very capable earlier card).
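
If it helps to see the pattern spelled out, here's a rough sketch (purely illustrative - the tier labels are just my descriptions from above, and the real naming breaks these rules whenever it suits nVidia):

TIERS = {"50": "entry level", "60": "mid-range",
         "70": "upper mid-range", "80": "high end",
         "90": "enthusiast"}

def decode(name):
    # e.g. "RTX 3090 Ti" -> prefix "RTX", model "3090", suffix "Ti"
    parts = name.split()
    prefix, model = parts[0], parts[1]
    generation = model[:-2] + " series"       # "30 series" from "3090"
    tier = TIERS.get(model[-2:], "unknown tier")
    suffix = " ".join(parts[2:])              # "Ti", "Super", or nothing
    rt = "RT capable" if prefix == "RTX" else "no RT"
    extras = f", {suffix} variant" if suffix else ""
    return f"{name}: {generation}, {tier}{extras}, {rt}"

for card in ("GTX 1060", "RTX 2080 Ti", "RTX 3090 Ti", "RTX 4060"):
    print(decode(card))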

'Recommended Spec' just means you need to meet or exceed that specification with your current machine to get the best experience. They date, because unless the demands of the application change (as does happen) they should not be altered from the original posted specs. This also means you can get a good idea of the time when the game was released based on the targets, and specifically that you can answer questions like "will it require a card that has <this feature> or not?"

You will not find fps listed anywhere, and to a large extent it actually doesn't matter. If your card is adequate for running that piece of software it will be able to output frames quickly enough to match the monitor refresh rate. If you are coming in at less than that, your card is not adequate with those settings and resolution. That said, if you are usually getting 40 - 60 fps, you'll likely be fine with a game like this and indeed many others. If you were trying to play competitive shooting games that would likely be unacceptable.
LuckyVanDine Apr 16, 2024 @ 11:18am 
Thanks, again. One final question before I try to direct my currently random reading a bit better.

Originally posted by Nekonoaijin:
'Recommended Spec' just means you need to meet or exceed that specification with your current machine to get the best experience. They date, because unless the demands of the application change (as does happen) they should not be altered from the original posted specs. This also means you can get a good idea of the time when the game was released based on the targets, and specifically that you can answer questions like "will it require a card that has <this feature> or not?"

How do I translate that card "name" into a specification? When I look at web pages for an Nvidia 2080 Ti I see a gazillion different attributes - how do I know which of those are important? As you say, the "frames per second" capability of the card doesn't seem to get mentioned, which I find odd given how important it seems to be to people elsewhere in this thread. I do see stuff about CUDA cores, GB of GDDR6 memory, operating frequency, TMUs, ROPs, bus width, nm process, die area, millions of transistors, ray tracing cores, number of shaders, bus interface, streaming multiprocessors, etc.

Or is it as simple as looking at the generation/model number in the card name that you described? Does every 30xx card "exceed" every 20xx card, so I could buy a 3050 card knowing it will outperform the 2080 Ti?
Last edited by LuckyVanDine; Apr 17, 2024 @ 9:31am
Nekonoaijin Apr 16, 2024 @ 12:11pm 
So, cards don't have a "frames per second" specification. The card potentially will produce as many frames as it can in the shortest time possible. (Reality is more complex, precisely because that's also kind of stupid: it would mean running the card full tilt to produce tens of thousands of frames each second while only displaying perhaps sixty of them. This is actually why you should generally not disable V-Synch, vertical refresh synchronization.) Exactly how many frames per second the card can produce is going to depend on exactly what is required to produce each frame. If it's a simple 2D image, with no 3D component, the number possible is very large (though actually they don't even engage the 3D units for that, because it's 2D). If it is a very complex 3D scene with a lot of geometry and effects applied, it could be well below your current refresh rate; this is what happens when you turn all the details up, turn on all the 'chrome', and then get a "slideshow" of 15 fps. The framerate you get therefore depends on your hardware, on what software you are running, and within that, on what options you have selected, starting with the resolution.
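
Incidentally, this is conceptually all a software frame limiter does (a hypothetical sketch, not how any particular engine implements it):

import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS      # ~16.7 ms per frame

def limited_loop(render_one_frame, keep_running):
    # Render a frame, then sleep away whatever is left of the
    # budget. V-Synch achieves a similar cap by blocking on the
    # monitor's refresh instead of a timer.
    while keep_running():
        start = time.monotonic()
        render_one_frame()
        spare = FRAME_BUDGET - (time.monotonic() - start)
        if spare > 0:
            time.sleep(spare)
        # If spare is negative, the frame took too long: that is
        # your framerate dropping below the target.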

How do you get the specs? Wikipedia, basically. How do you get rankings? There are many possibilities here, but pick an online source and look at their "graphics card rankings". "Tom's Hardware" is an excellent place to start and to educate yourself somewhat on the topic.

Or is it as simple as looking at the generation/model number in the card name that you described? Does every 30xx card "exceed" every 20xx card, so I could buy a 3050 card knowing it will outperform the 2080 Ti?

In theory, ideally, aye. In practice, not necessarily, no. However, every 30 series card has hardware advantages that a 20 series card does not. Whether those matter to you or not is something you have to decide, but this is where getting knowledge matters. For your specific situation, my actual advice is "buy an RTX 4060 if you can afford it". I realize not everyone has the disposable income for purchases like that, but it is genuinely the best choice for current gaming at a reasonable level, with the expectation that you won't replace it for perhaps 5 years, maybe even 10, depending on what you are playing at that time. That card can currently be purchased for about $300 in the US, maybe less with the right deal. I think you're in the UK, so you may end up paying the same or not much less in pounds (it sucks, but they have got away with that for years); regardless, that is a more capable card overall, by some margin, than the 2080 you have been discussing. It's perfectly capable of running this tech demo and the upcoming game that it will become.
Nekonoaijin Apr 16, 2024 @ 12:36pm 
For a really quick reference, here is a link to a page on Tom's Hardware where they ranked recent GPUs.

https://www.tomshardware.com/reviews/gpu-hierarchy,4388.html

As you can see when you look at that image, the AMD RX 7600 is also a solid choice and is available for less, generally. Nothing wrong with AMD but they have historically been a little less compatible at times, and sometimes have had driver issues. I doubt you would have any issues but a majority of gamers still use nVidia for a combination of reasons, including the above. One of either the RX 7600 or RTX 4060 is where you should be looking, though.
TerraMimic Apr 16, 2024 @ 4:14pm 
Originally posted by LuckyVanDine:
Or is it as simple as looking at the generation/model number in the card name that you described? Does every 30xx card "exceed" every 20xx card, so I could buy a 3050 card knowing it will outperform the 2080 Ti?

The comparison between, say, 20xx and 30xx series cards is better made at the individual "xx" level. In general, you can infer that a 3060 is better than a 2060. But, depending on the game, a 2070 may match the performance of a 3060.

A site like https://gpu.userbenchmark.com/Compare/Nvidia-RTX-3060-vs-Nvidia-RTX-2070/4105vs4029 is sometimes a useful comparison tool. In this particular example, the overall score gives a slight edge to the 2070 over the 3060. But you can also use it to compare between brands, so you can check an AMD card vs an Nvidia card.

The newer generation cards usually have newer, more capable feature sets for software developers to take advantage of. The 20xx cards have DLSS (Deep Learning Super Sampling), while the 30xx cards have DLSS 2 and the 40xx series have DLSS 3 (2nd and 3rd generation DLSS). And while all 3 series are capable of RT, each successive generation is more efficient at it.
Nekonoaijin Apr 16, 2024 @ 5:02pm 
Originally posted by TerraMimic:
Originally posted by LuckyVanDine:
Or is it as simple as looking at the generation/model number in the card name that you described? Does every 30xx card "exceed" every 20xx card, so I could buy a 3050 card knowing it will outperform the 2080 Ti?

The comparison between, say, 20xx and 30xx series cards is better made at the individual "xx" level. In general, you can infer that a 3060 is better than a 2060. But, depending on the game, a 2070 may match the performance of a 3060.

A site like https://gpu.userbenchmark.com/Compare/Nvidia-RTX-3060-vs-Nvidia-RTX-2070/4105vs4029 is sometimes a useful comparison tool. In this particular example, the overall score gives a slight edge to the 2070 over the 3060. But you can also use it to compare between brands, so you can check an AMD card vs an Nvidia card.

The newer generation cards usually have newer, more capable feature sets for software developers to take advantage of. The 20xx cards have DLSS (Deep Learning Super Sampling), while the 30xx cards have DLSS 2 and the 40xx series have DLSS 3 (2nd and 3rd generation DLSS). And while all 3 series are capable of RT, each successive generation is more efficient at it.


Not picking on you, or taking issue with what you say, but "User Benchmark" is absolutely the last place anyone should go, and certainly not where we should send newcomers. It's utter trash for reasons that don't belong on a Steam thread but can be found with a quick google. Avoid it like the plague.

EDIT: It's not actually bad as benchmarking software; it can give you useful data about your system (like noticing that a part is underperforming because you forgot to properly adjust firmware settings). It is worse than useless for making purchasing decisions, particularly with regards to CPUs. It's less bad for GPUs, but there are simply better resources; I'd recommend Tom's Hardware, which has been at this game since I was first paying attention to this stuff and building my own systems.
Last edited by Nekonoaijin; Apr 16, 2024 @ 5:09pm
TerraMimic Apr 17, 2024 @ 10:45am 
Originally posted by Nekonoaijin:
Not picking on you, or taking issue with what you say, but "User Benchmark" is absolutely the last place anyone should go, and certainly not where we should send newcomers. It's utter trash for reasons that don't belong on a Steam thread but can be found with a quick google. Avoid it like the plague.

EDIT: It's not actually bad as benchmarking software; it can give you useful data about your system (like noticing that a part is underperforming because you forgot to properly adjust firmware settings). It is worse than useless for making purchasing decisions, particularly with regards to CPUs. It's less bad for GPUs, but there are simply better resources; I'd recommend Tom's Hardware, which has been at this game since I was first paying attention to this stuff and building my own systems.

Fair enough. It was just the first site I pulled up that does direct comparisons between cards. It also has a breakdown of a bunch of different data points for the cards. I probably could have found a better site if I'd spent a little more time looking.
Nekonoaijin Apr 17, 2024 @ 11:54am 
Originally posted by TerraMimic:
Originally posted by Nekonoaijin:
Not picking on you, or taking issue with what you say, but "User Benchmark" is absolutely the last place anyone should go, and certainly not where we should send newcomers. It's utter trash for reasons that don't belong on a Steam thread but can be found with a quick google. Avoid it like the plague.

EDIT: It's not actually bad as benchmarking software; it can give you useful data about your system (like noticing that a part is underperforming because you forgot to properly adjust firmware settings). It is worse than useless for making purchasing decisions, particularly with regards to CPUs. It's less bad for GPUs, but there are simply better resources; I'd recommend Tom's Hardware, which has been at this game since I was first paying attention to this stuff and building my own systems.

Fair enough. It was just the first site I pulled up that does direct comparisons between cards. It also has a breakdown of a bunch of different data points for the cards. I probably could have found a better site if I'd spent a little more time looking.

Aye, I don't assume any malice or anything; it just so happens that site is both notoriously bad and incredibly high in search results.