ΔV: Rings of Saturn

Cannon Fodder 3 Sep 2023 at 8:44
Does anyone actually USE lidar?
I have seen zero use for it except spotting other ships that are broadcasting their ID, and I've only ever been killed by people not transmitting an ID. Telling a ship sneaking up on me apart from the rocks on the LIDAR is either an art I haven't figured out, or nigh impossible.

So what is this instrument's actual use?
Showing 31-45 of 78 comments
Bagpipe 21 Feb 2024 at 3:49 
I think what people are really looking for here is some form of radar warning receiver. In a modern military aircraft, the radar does not tell you whether something is hostile or inert; the RWR tells you if something is illuminating you, and the IFF system tells you whether it is friendly or hostile, and often what it is.
These use complex systems which would not function in space, I imagine, at least not the same way.
So the complaint that LIDAR is unrealistically represented is not really barking up the right tree on this occasion. It's another system entirely which you are looking to have implemented. Right?
Nellvan 21 Feb 2024 at 5:51 
Originally posted by Badstormer:
Honestly, it does feel a bit unrealistic to be in possession of technology two centuries more advanced than ours
I get that mankind gotta be at least a bit advanced in the game's scenarios, but: We're miners that use nuclear powered steam rockets to get around. What exactly is that advanced tech you're talking about?
Baylock 21 Feb 2024 at 7:13 
Originally posted by Badstormer:
Honestly, it does feel a bit unrealistic to be in possession of technology two centuries more advanced than ours but not have any automated system to warn us if something unexpected is on LIDAR. While I wouldn't usually point this out in a sci-fi game, realism is one of the main selling points here.

Let's put it like this: do you know how you would go about making a LIDAR automatically warn you about something unexpected? Here's a starting point: what information can a returned reflection of light actually carry? Your LIDAR is already doing all it possibly can: it can tell you whether the reflections are coming back sooner or later (redder or bluer on the LIDAR), and that's all it can tell you. Using only that information, can you find a way to make it differentiate 'normal' from 'unexpected' without using AI?

With that in mind, it's very realistic! :)

Remember, LIDAR is not RADAR. It is significantly less capable of identifying things itself, but much simpler and more robust. Your human brain can discern what size a series of LIDAR returns probably represents, because your human brain is built for pattern recognition and is capable of independent, deductive reasoning. The LIDAR system is not; it is built to pulse beams of light out, generally in a circular sweep, to provide data, and nothing more. (LIDAR today is also primarily and almost solely used for terrain mapping, but that's done over the course of hours using drones. It is not capable of automatically warning you of jack diddly, and it's got a ridiculous degree of depth accuracy.)
Supraluminal 21 Feb 2024 at 8:25 
I'm pretty sure I could write a simple program in JavaScript to do additional useful processing on the data coming back from the LIDAR and other sensors in this game. For example, if you see multiple high-albedo pings adjacent in space/time, maybe with certain shape characteristics (e.g. right angles) when you connect the dots, and especially if their velocity appears to differ between frames (i.e. they are accelerating), you can have a high degree of confidence that what you are pinging is not just a rock. Not even a shiny, moving rock, but probably a ship.

Obviously you have to develop some basic heuristics to decide whether or not pings are in fact coming back from the same object between frames, but I'm fairly confident it could be done in a way that would at least be usefully accurate. Enough to provide a "hey something weird over here" alarm that didn't give constant false positives.
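A toy version of that heuristic fits in a few lines (this is a sketch of the idea, not anything from the game's code): cluster pings per sweep, associate centroids across sweeps, and flag any contact whose velocity changes. The frame format and both thresholds are invented:

```python
# Toy ship-detection heuristic: cluster pings per frame, track cluster
# centroids across frames, and flag anything whose velocity changes between
# frames (rocks coast; ships can thrust). Thresholds are invented.
from math import dist

CLUSTER_RADIUS = 50.0   # m: pings closer than this count as one object
ACCEL_THRESHOLD = 5.0   # m/s per frame: velocity change worth an alarm

def cluster(pings):
    """Greedy single-pass clustering; returns one centroid per cluster."""
    groups = []
    for p in pings:
        for g in groups:
            if dist(p, g[0]) < CLUSTER_RADIUS:
                g.append(p)
                break
        else:
            groups.append([p])
    return [tuple(sum(a) / len(g) for a in zip(*g)) for g in groups]

def flag_accelerating(frames, dt=1.0):
    """frames: list of ping lists, one per LIDAR sweep.
    Returns centroids whose frame-to-frame velocity changed."""
    cents = [cluster(f) for f in frames]
    alarms = []
    for a, b, c in zip(cents, cents[1:], cents[2:]):
        for p2 in c:  # naive nearest-neighbour association backwards in time
            p1 = min(b, key=lambda q: dist(q, p2))
            p0 = min(a, key=lambda q: dist(q, p1))
            v1 = (p1[0] - p0[0], p1[1] - p0[1])
            v2 = (p2[0] - p1[0], p2[1] - p1[1])
            if dist(v1, v2) / dt > ACCEL_THRESHOLD:
                alarms.append(p2)
    return alarms

# One rock coasting at constant velocity, one contact that starts thrusting:
frames = [[(0, 0), (1000, 0)], [(10, 0), (1000, 30)], [(20, 0), (1000, 90)]]
print(flag_accelerating(frames))  # only the thruster: [(1000.0, 90.0)]
```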

And remember we have other sensors, including optical via ROV and the ship, and the visual feeds from Enceladus. I would assume there's at least some basic radar on the ship as well, but I don't know if that's intended to be the case or not. Also not specified but very reasonable assumptions given 2024 technology - a directional EM receiver (i.e. slightly fancy radio antenna) and an infrared camera. If you're allowed to correlate data from all of these sensors, you can make even more inferences about what it all represents.

So sure, if you were limited to pure LIDAR data and no processing beyond what it takes to get it into a visual form, you'd get more or less what's in the game now. The question I'm interested in is, why impose this limitation? Why not assume integrated sensor data, machine-processed and machine-analyzed to at least a basic degree?

It's deeply immersion-breaking and obstinately unrealistic in my opinion, for all the reasons people have given. (Most compelling to my mind, in the hyper-capitalist world of dV, mining is a profit-seeking venture and there would be a significant demand for better mapping and situational awareness software.) We already have a million concessions to reality and convenience in other areas of the game (this is a game, remember), so why is the LIDAR/minimap so sacred?

The concept of "expert systems" dates back to the 1970s. Seems reasonable to assume that in another couple of centuries people will have gotten pretty good at this even without employing true general AI capabilities. I don't think this is controversial, right? Given that we already have a ton of systems like this in our daily lives? We have cars that can automatically watch blind spots and adjust suspension to smooth out bumpy roads. You can buy a toaster oven that automatically roasts a chicken for a few hundred dollars. Surely something as complex and valuable as an industrial spacecraft is going to get some basic attention in this regard.

(Consider the autopilot options available for just a second, for example. Collision avoidance and route planning require the ability to discriminate separate objects and situate them in space.)

In short, there is a world of possibility between "raw data from LIDAR instrument" and "analysis and integration of disparate sensor feeds inside your human brain" that could easily provide a more informative and useful picture of the space around us, based in part on the LIDAR data.
Supraluminal 21 Feb 2024 at 8:47 
Originally posted by Nellvan:
Originally posted by Badstormer:
Honestly, it does feel a bit unrealistic to be in possession of technology two centuries more advanced than ours
I get that mankind gotta be at least a bit advanced in the game's scenarios, but: We're miners that use nuclear powered steam rockets to get around. What exactly is that advanced tech you're talking about?
Nuclear powered steam rockets are quite advanced already compared to anything we've got now, but to name a few more present in this game:

Miniaturized, net-energy-positive, efficient fusion reactors and thrusters
Droplet radiators
Full-on artificial general intelligence
Whatever "singularity cores" are
Robust, portable, efficient ISRU systems (the MPUs)
Successful long-term, wide-ranging human colonization of the solar system

Also, "nuclear powered steam rockets" is a bit of a misnomer. It's a normal nuclear reactor that uses steam to spin a turbine to generate electrical power. The "rocket" part is high-temperature hydrogen plasma heated by the reactor, not "steam" as such. So more like nuclear thermal rockets that also happen to use steam to generate electricity as needed. This is opposed to other propulsion systems like nuclear pulse (Orion) drives, electrical thrusters, or fusion rockets that accelerate propellant via somewhat different means.

I know it's romantic to think of these ships as steampunk nuclear teakettles, but it's selling the technology and engineering involved short.

e: Actually, I'm not so sure the turbines use steam (as in gaseous water) to generate power either, now that I think about it. They must also just use heated hydrogen, since they consume propellant. I think that's done to accommodate open-cycle cooling of the generator loop? Which, I dunno, the ship has radiators, but maybe it makes a measurable difference in what size those radiators need to be.

e2: Booted the game to check, and in fact most of the thrusters aren't even basic nuclear thermal rockets, they're magnetoplasmadynamic. That means that after heating the propellant, it's further accelerated by applying a magnetic field, making them a type of electrical thruster primarily. A lot of them also use oxygen-hydrogen plasma, not pure hydrogen, which I guess is arguably steam-like, so that's fun. MPD thrusters exist as prototypes now but absolutely count as advanced technology if they're in mass production!
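Incidentally, the "hydrogen, not steam" point has a quick back-of-envelope check: the ideal-gas upper bound on thermal-rocket exhaust velocity scales as the square root of temperature over molar mass, so light hydrogen beats heavy water vapour at the same reactor temperature. The chamber temperature and heat-capacity ratios below are rough textbook-style guesses, not game values:

```python
# Why you want hot hydrogen rather than steam out the nozzle: ideal-gas
# upper bound on exhaust velocity, v_e = sqrt(2 * g/(g-1) * (R/M) * T),
# with nozzle losses ignored. Gammas and temperature are illustrative.
from math import sqrt

R = 8.314  # J/(mol*K), universal gas constant

def exhaust_velocity(gamma: float, molar_mass_kg: float, temp_k: float) -> float:
    return sqrt(2 * gamma / (gamma - 1) * (R / molar_mass_kg) * temp_k)

h2 = exhaust_velocity(1.40, 0.002016, 3000)   # hydrogen at 3000 K
h2o = exhaust_velocity(1.33, 0.018015, 3000)  # water vapour at 3000 K
print(round(h2), round(h2o))  # hydrogen wins roughly 3x on the same heat
```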
Last edited by Supraluminal; 21 Feb 2024 at 9:21
Koder  [developer] 21 Feb 2024 at 10:28 
Originally posted by Supraluminal:
I'm pretty sure I could write a simple program in JavaScript to do additional useful processing on the data coming back from the LIDAR and other sensors in this game. For example, if you see multiple high-albedo pings adjacent in space/time, maybe with certain shape characteristics (e.g. right angles) when you connect the dots, and especially if their velocity appears to differ between frames (i.e. they are accelerating), you can have a high degree of confidence that what you are pinging is not just a rock. Not even a shiny, moving rock, but probably a ship.
This argument would be much more impactful if the sensor systems in the game did not already do that. And they do - in fact, you have three identification systems bound to your ship sensors: the transponder readout, the heuristic for closely grouped high-albedo contacts that you described, and the visual tracking of your IFF system - which shows you, with great precision, the position and orientation of a vehicle you encounter.
Badstormer 21 Feb 2024 at 10:38 
Originally posted by Koder:
This argument would be much more impactful if the sensor systems in the game did not already do that. And they do - in fact, you have three identification systems bound to your ship sensors: the transponder readout, the heuristic for closely grouped high-albedo contacts that you described, and the visual tracking of your IFF system - which shows you, with great precision, the position and orientation of a vehicle you encounter.

I think the complaint is more that this involves a lot of manual analysis that could be realistically automated to yell at the pilot if something unusual is picked up. Especially with how far machine learning has come just in 2024, I see this as being a cakewalk to implement in a vehicle manufactured over 200 years in the future.

You're absolutely correct that a human can manually interpret these tools if they take the time to learn them; the issue is more that this sort of thing involves enough consistent pattern recognition to make it very plausible to automate even by today's standards, especially in an environment where one is much more pressured than usual to make sure threats are detected early.
Supraluminal 21 Feb 2024 at 10:47 
What heuristic currently exists for closely-grouped high-albedo pings, and how does it manifest in game differently from scattered high-albedo pings? I don't think I've ever noticed that. All I can think of is a heavier audio blip when you get clustered pings, but that's probably not what you mean?

If you read further down in the post, you'll see I mention integration of IFF info in the LIDAR display as an argument for potentially further minimap-ifying that display and generally increasing the level of detail and nuance you get out of your sensor readouts. (More indications of potential anomalies short of positive ship ID to start, but could show asteroids and etc. with more persistence and detail, stuff like that).

As for transponders, that of course only works if the other ship has theirs enabled. In cases where it isn't, you can still extract warnings about possible nearby ships from just the LIDAR returns. In fact, a "ship-like" set of LIDAR pings WITHOUT a matching transponder hit seems like it should be a huge warning sign, but right now unless you're specifically watching the LIDAR display that will go unremarked upon from what I can tell. (Until astrogator ID, of course.)

Personally I don't engage with combat that much in this game and haven't had a lot of trouble getting ambushed by active hostiles, so that specific aspect of the LIDAR discussion doesn't affect me greatly (though I can imagine being annoyed if it did). For me, it's more about how striking it is that the LIDAR display, which is already kind of a minimap, isn't more fleshed out in ways that seem to me to be not only useful in the fictional context of the game but highly plausible from a technical perspective.

The "hard sci-fi" aesthetic can be taken so far that it actually hurts believability, basically. If I ask "why doesn't the game do this/have this?" and the answer seems to be "to make it feel more gritty" rather than "because that's not technically possible/feasible" then it's going to feel weird. And for me, and I gather other people in here, the currently somewhat rudimentary nature of the sensor readout(s) runs into this issue.

e to add: I know people have been trying to argue that LIDAR can't do x, y, or z, but I'm not very convinced. Most of those arguments seem to be based on what raw data LIDAR can produce rather than what inferences can be made from that data with even fairly simple automated analysis.
Last edited by Supraluminal; 21 Feb 2024 at 10:49
Badstormer 21 Feb 2024 at 10:57 
Originally posted by Supraluminal:
For me, it's more about how striking it is that the LIDAR display, which is already kind of a minimap, isn't more fleshed out in ways that seem to me to be not only useful in the fictional context of the game but highly plausible from a technical perspective...

...The "hard sci-fi" aesthetic can be taken so far that it actually hurts believability, basically. If I ask "why doesn't the game do this/have this?" and the answer seems to be "to make it feel more gritty" rather than "because that's not technically possible/feasible" then it's going to feel weird. And for me, and I gather other people in here, the currently somewhat rudimentary nature of the sensor readout(s) runs into this issue.
This, right here, seems like the crux of the issue. It just doesn't make sense to me that we'd still be manually reading LIDAR/visual feeds 200 years in the future without any form of automation whatsoever.

To be clear, this isn't a game-breaking issue and I am happy to live with the system as it is - but I'd be hard-pressed to say this is how it would look in reality.

I'm not familiar enough with the mechanics of LIDAR to suggest an evidence-backed alternative, but I can tell you that the whole point of modern machine learning is pattern recognition, and reading LIDAR/visual feeds seems like exactly that.

Personally, I propose having the ability to discover hardware not initially available that offers an AI replacement for crew members whose tasks could be automated, as a similar mechanic is already in place for racing equipment.

edit: to put this all another way, for something to be realistic does not necessarily imply it needs to be difficult and/or manual, especially in an environment where computational power would be easy to come by and where energy is produced in exorbitant excess relative to today's power generation. The same quantity/quality of LIDAR data gathered could be processed much more thoroughly in a context where several orders of magnitude more computing power would be available.

Exactly how this might look is up to interpretation, but I strongly doubt we'd be manually interpreting sensor input when this function can already be automated to a great extent today.
Last edited by Badstormer; 21 Feb 2024 at 14:53
Badstormer 21 Feb 2024 at 15:11 
Originally posted by Alzeid Baylock:
Originally posted by Badstormer:
Honestly, it does feel a bit unrealistic to be in possession of technology two centuries more advanced than ours but not have any automated system to warn us if something unexpected is on LIDAR. While I wouldn't usually point this out in a sci-fi game, realism is one of the main selling points here.

Let's put it like this: do you know how you would go about making a LIDAR automatically warn you about something unexpected? Here's a starting point: what information can a returned reflection of light actually carry? Your LIDAR is already doing all it possibly can: it can tell you whether the reflections are coming back sooner or later (redder or bluer on the LIDAR), and that's all it can tell you. Using only that information, can you find a way to make it differentiate 'normal' from 'unexpected' without using AI?

With that in mind, it's very realistic! :)

Remember, LIDAR is not RADAR. It is significantly less capable of identifying things itself, but much simpler and more robust. Your human brain can discern what size a series of LIDAR returns probably represents, because your human brain is built for pattern recognition and is capable of independent, deductive reasoning. The LIDAR system is not; it is built to pulse beams of light out, generally in a circular sweep, to provide data, and nothing more. (LIDAR today is also primarily and almost solely used for terrain mapping, but that's done over the course of hours using drones. It is not capable of automatically warning you of jack diddly, and it's got a ridiculous degree of depth accuracy.)

The question is less "can LIDAR detect patterns" and more "in an era where we have the computational power for sentient artificial intelligence, why can't LIDAR input be automatically processed to any extent?"

It doesn't take artificial intelligence of any form to automatically measure how dense a given cluster of pings is on LIDAR and flag anything exceeding a given density, or to track the relative position of a given reflection over a period of time and see whether it is changing in a manner a rock couldn't. The calculations involved in the autopilot are more complex than either of these functions.
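For what it's worth, the density check really is trivial; a sketch with no AI anywhere (cell size and threshold are made up, nothing here is from the game):

```python
# Ping-density alarm: bin pings into coarse grid cells and flag any cell
# holding more returns than loose rock usually produces. Cell size and the
# density threshold are invented for illustration.
from collections import Counter

CELL = 100.0   # m, grid resolution
DENSE = 4      # returns in one cell that count as "suspiciously dense"

def dense_cells(pings):
    counts = Counter((int(x // CELL), int(y // CELL)) for x, y in pings)
    return [cell for cell, n in counts.items() if n >= DENSE]

scatter = [(i * 300.0, i * 170.0) for i in range(10)]            # loose rocks
hull = [(520.0 + dx, 410.0 + dy) for dx in (0, 5, 10) for dy in (0, 6)]
print(dense_cells(scatter + hull))  # only the hull's cell: [(5, 4)]
```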

edit: honestly, the fact that RADAR can be used instead does justify not devoting development time to improving LIDAR, given there is an alternative - my concern is more that LIDAR apparently sees zero functional improvement in automated readout interpretation over the next two centuries
Last edited by Badstormer; 21 Feb 2024 at 21:50
Baylock 22 Feb 2024 at 0:18 
Originally posted by Supraluminal:
I'm pretty sure I could write a simple program in JavaScript to do additional useful processing on the data coming back from the LIDAR and other sensors in this game. For example, if you see multiple high-albedo pings adjacent in space/time, maybe with certain shape characteristics (e.g. right angles) when you connect the dots, and especially if their velocity appears to differ between frames (i.e. they are accelerating), you can have a high degree of confidence that what you are pinging is not just a rock. Not even a shiny, moving rock, but probably a ship.

Cool, but I don't think that'll work the way you think it will, because you're using information the LIDAR does not have. The LIDAR only sees individual returns. It displays those returns to you, but cannot process them at all. It does not see 'shape characteristics'; it just sees individual pinpricks. This is WHY it only displays colour information for doppler shift: that's all it can do. All the other information you 'see' is your own mind finding patterns, an emergent property.

I would, however, encourage you to come to the Discord and put the idea forth if you'd like it to have a chance at becoming a feature! It's not out of the question - and having more minds that might be able to put a fitting spin on it would definitely give it a better shot. (One thing to keep in mind is that every feature has to have an in-universe justification.)

Originally posted by Supraluminal:
Obviously you have to develop some basic heuristics to decide whether or not pings are in fact coming back from the same object between frames, but I'm fairly confident it could be done in a way that would at least be usefully accurate. Enough to provide a "hey something weird over here" alarm that didn't give constant false positives.

That sounds like the sort of bleeding-edge, high-quality tech that you might see as part of the Model E - which was purposefully designed to be quite good, but with the crippling disadvantage of being very, very expensive to operate. A key design goal of Koder's is the avoidance of upgrades: if something does one thing better, it had better do something else worse, so as to keep a healthy variety of options and avoid metabuilding. Even 'upgrading' core rods for more thermal output has a downside: increased mass. (For another example, I use a twin reactor and MPD5035 instead of a military turbine. This masses only 3500 combined, as opposed to 5000, and still nets me 500 MW, with a penalty to thermal draw; I can either give back some of the mass savings by installing a heavier reactor core, or just deal with the faster thermal drop by managing my thruster firings better.)


Originally posted by Supraluminal:
And remember we have other sensors, including optical via ROV and the ship, and the visual feeds from Enceladus. I would assume there's at least some basic radar on the ship as well, but I don't know if that's intended to be the case or not. Also not specified but very reasonable assumptions given 2024 technology - a directional EM receiver (i.e. slightly fancy radio antenna) and an infrared camera. If you're allowed to correlate data from all of these sensors, you can make even more inferences about what it all represents.

Reminder that you're flying some economy-class mining rig, and generally have access to industrial-grade commercial gear, not military hardware. If you've ever met a special ship - the Big Bad Wolf - then know one thing: that's a demilitarized destroyer. And it's using three mining lasers instead of the big ♥♥♥♥♥♥♥ XASER the actual thing uses. We just don't have access to fancy gear out here in the boonies!

Originally posted by Supraluminal:
So sure, if you were limited to pure LIDAR data and no processing beyond what it takes to get it into a visual form, you'd get more or less what's in the game now. The question I'm interested in is, why impose this limitation? Why not assume integrated sensor data, machine-processed and machine-analyzed to at least a basic degree?

DV was started as a game that Koder wanted and nobody else made. He opened it up to a community to help shape it going forward, as a means of financing it. But he's stuck to its core values: the sidegrade mentality, and almost total adherence to the hard-SF mentality. (The only handwavium I know of is the Z-pinch engine - materials science has not yet found a material that can withstand 5000 K of sustained input energy, so the back of the ship should rightfully just MELT after even a little prolonged thrusting. Also, it's got no shadow shield afaik, so it would be irradiating everyone in the astronomical vicinity. You learn a lot of things in the Discord's science chat! :) )

Originally posted by Supraluminal:
It's deeply immersion-breaking and obstinately unrealistic in my opinion, for all the reasons people have given. (Most compelling to my mind, in the hyper-capitalist world of dV, mining is a profit-seeking venture and there would be a significant demand for better mapping and situational awareness software.) We already have a million concessions to reality and convenience in other areas of the game (this is a game, remember), so why is the LIDAR/minimap so sacred?

There's also an advantage to not having bleeding-edge hardware out in the rings. Y'know the anarchists? They're annoying to EP, but they also don't have bleeding-edge gear, so they're 'merely' annoying, and there's no reason to just vaporize them with EP's XASER array. Sure, having heuristic LIDAR alone wouldn't justify that, but if they had it, it stands to reason they'd probably have OTHER bleeding-edge stuff, which would. And then we wouldn't have the nice anarchists :(

Seriously I'd invite you to come to the discord and pitch the idea- or even just the question. Koder's much more active there afaik, and it'd be more visible- and everyone there could see the answer too!

Originally posted by Supraluminal:
Surely something as complex and valuable as an industrial spacecraft is going to get some basic attention in this regard.

Relativity. The Colthon is complex compared to a toaster, but relative to other spacecraft it's brutally simple. Your control cabin is a bloody escape pod! (Also, I think that one's like 200 years old? Hell, I think the glorious K37 is older?)

Originally posted by Supraluminal:
(Consider the autopilot options available for just a second, for example. Collision avoidance and route planning require the ability to discriminate separate objects and situate them in space.)

It doesn't require that it determine what the object is. It just needs to know that a LIDAR beam is reflecting in the direction of travel. Just one will do, if it's consistent. That keeps the system very simple, very compact, and thus very economical. This is corporate territory after all, and you're a small business. I mean, hell, you pilot the ship yoursel- wait a second. What are you doing, when you have a pilot?

Originally posted by Supraluminal:
In short, there is a world of possibility between "raw data from LIDAR instrument" and "analysis and integration of disparate sensor feeds inside your human brain" that could easily provide a more informative and useful picture of the space around us, based in part on the LIDAR data.

Again, I'd encourage you to join the Discord and pop that question - it may have been asked before, so it's possible there's an answer floating around; I don't think most of us on the Discord frequent the Steam forums. And if not, Koder would probably pop in to answer it, or perhaps Lurkily, the writer.
Badstormer 22 Feb 2024 at 18:37 
I don't have the energy to quote the colossal post above me and reply to specific points, but I will say that I think the amount of technology required for basic heuristics is being dramatically overestimated here. This stuff isn't "bleeding edge" technology today, so why would it be 200 years in the future?

Also, LIDAR is clearly capable of estimating distance between any given objects, otherwise it wouldn't be able to display the ship's position relative to a reflected object on the HUD. This data can be taken and processed by a script independent of LIDAR to glean further information from it, specifically by taking multiple LIDAR cycles and comparing the positions of individual reflections between those cycles, looking for anomalies where an object appears to have shifted its own trajectory in a manner not possible without external input.

All one would need for this input processing script is two to three LIDAR pings to estimate velocity (more or less depending on the LIDAR's sweep frequency which affects scan resolution), followed by comparing it across further pings for rapid changes a rock couldn't make. Naturally, false positives would be inevitable at some point, especially if the LIDAR is cranked high enough to cause less accurate positional data, but it's better than no warning at all.
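That two-to-three-fix scheme is only a few lines once a tracker has associated fixes with one contact; the sweep period and the noise tolerance below are invented for illustration:

```python
# Three position fixes per contact: two give velocity, three give the change
# in velocity, and a tolerance absorbs positional noise from a fast sweep.
# Values are illustrative; assumes fixes are already associated per contact.
def velocity(p0, p1, dt):
    return tuple((b - a) / dt for a, b in zip(p0, p1))

def non_ballistic(fixes, dt, tol=2.0):
    """fixes: three or more (x, y) positions, one per sweep.
    True if velocity changed more than noise allows - i.e. not a rock."""
    v = [velocity(a, b, dt) for a, b in zip(fixes, fixes[1:])]
    return any(
        max(abs(u - w) for u, w in zip(v1, v2)) > tol
        for v1, v2 in zip(v, v[1:])
    )

rock = [(0, 0), (40, 10), (80, 20)]    # coasting: velocity is constant
burn = [(0, 0), (40, 10), (120, 60)]   # velocity jumped: under thrust
print(non_ballistic(rock, dt=1.0), non_ballistic(burn, dt=1.0))  # False True
```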

...that being said, personally, I've found it way easier to just use RADAR, as it can already detect rocks anyway if you don't mind them being dimmer.
Last edited by Badstormer; 22 Feb 2024 at 18:47
Baylock 22 Feb 2024 at 21:35 
Originally posted by Badstormer:
I don't have the energy to quote the colossal post above me and reply to specific points, but I will say that I think the amount of technology required for basic heuristics is being dramatically overestimated here. This stuff isn't "bleeding edge" technology today, so why would it be 200 years in the future?

The point I'm making isn't that it's particularly high tech. It's that it's higher tech than what a low budget company would be using, if not higher tech than E Corp *allows* to be sold near the rings. You'd be surprised what might touch off a technology arms race!

Originally posted by Badstormer:
Also, LIDAR is clearly capable of estimating distance between any given objects, otherwise it wouldn't be able to display the ship's position relative to a reflected object on the HUD.

It is capable of estimating distance to anything it reflects off of. It is unaware of what it is reflecting off of, which is the crux of the problem. All it knows is whether something is present and whether it is moving closer or farther, and you can determine range from that information by measuring the return signal. None of that will tell you what an object is, nor help you determine that at all. The data is overlaid on the visual feed - but the LIDAR system is not seeing any of what you see. It sees returns of light. It does know where a given return originated, though, so it is relatively simple to project that return onto the visual feed - the visual feed is at a constant distance from the ship, so overlaying it based on estimated range is trivial. (You'll note, however, that there seems to be a slight delay when rapidly zooming in and out - it's not perfect!)

Originally posted by Badstormer:
This data can be taken and processed by a script independent of LIDAR to glean further information from it, specifically by taking multiple LIDAR cycles and comparing the positions of individual reflections between those cycles, looking for anomalies where an object appears to have shifted its own trajectory in a manner not possible without external input.

Run the expensive non-rotating LIDAR and spin rapidly, then look at your LIDAR display: it's completely ghosted out. Nothing is going to be gleaned from that data, as it's meaningless - everything looks like it's moving unnaturally to such a system.

Secondarily, you're a civilian ship operating in civilian territory that is nominally full of ships with mandated transponders. The LIDAR system is not required to discern a ship, because the transponder receiver knows where it is. The fact that pirates are a thing is out of scope for you, a civilian, as the Vilcy are a thing and the rings are nominally 'safe enough' - and if you go wandering out where it's known not to be safe, that's 'kind of a you problem'. That, I would imagine, is the in-universe reason we don't have this functionality, even if it demonstrably worked.

Which brings me back around to: pitch it in the Discord!!! I'm willing to bet that if you could demonstrate to Koder how it would work, and maybe find a way to ensure it could be implemented in, say, a new LIDAR system that has downsides to compensate, it could be a thing!

Originally posted by Badstormer:
All one would need for this input processing script is two to three LIDAR pings to estimate velocity (more or less depending on the LIDAR's sweep frequency which affects scan resolution), followed by comparing it across further pings for rapid changes a rock couldn't make. Naturally, false positives would be inevitable at some point, especially if the LIDAR is cranked high enough to cause less accurate positional data, but it's better than no warning at all.

A high-speed iceroid slamming into another and coming away with a radically different vector could be mistaken for a ship - but then, you could conceivably crank up your proximity warning to trigger beyond visual range and assign it to fire only on contacts moving fast enough, with the obvious downside that if a ship happens to fly a course where it's not moving relative to you, only looming, then the system will not fire. (This is an example of how to 'kludge it': it has an upside and a downside, it requires Tuning work where no one setting is optimal, and in general it is not a clear upgrade. It functionally accomplishes what you want - automatic warning of beyond-vision targets - with two imperfections: it's not specific to ships, since a sufficiently weirdly moving iceroid will trigger it (which isn't a problem gameplay-wise, but does introduce emergent quirks), and a ship on a non-pure-pursuit course will not trigger it - which even mirrors how real animals that hunt by watching for movement won't notice a slowly looming object coming towards them.)
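That speed-gated proximity alarm, constant-bearing blind spot included, comes down to a two-term test; both thresholds below are invented for illustration:

```python
# Speed-gated long-range proximity alarm: fire only on contacts inside range
# that are also closing fast enough. A ship on a pure looming (constant
# bearing, slow closure) course stays below the gate - the trade-off noted
# above. Both thresholds are invented numbers.
def proximity_alarm(range_m, closing_ms, max_range=30_000.0, min_closing=15.0):
    return range_m < max_range and closing_ms > min_closing

print(proximity_alarm(25_000, 40.0))  # fast mover inside range: True
print(proximity_alarm(25_000, 5.0))   # slow loomer: False (the blind spot)
```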

Originally posted by Badstormer:
...that being said, personally, I've found it way easier to just use RADAR, as it can already detect rocks anyway if you don't mind them being dimmer.

And this is probably why that exists- it makes it pretty obvious to discern iceroid from ship, while being a touch annoying to use in a mining context: quite useful in one situation, not useful in another, and not meta. (And then there's me, who puts it on all my miners anyway, because I can work with its weak returns and massive display clutter- I just work weirdly. I think it also has a lower initial sweep speed, so you can jack it up and it will retain decent resolution; I tend to run the default at 75%.)
Badstormer 22 Feb 2024 at 22:05
Honestly, I'm not versed enough in the lore of this game to argue whether this feature makes sense in this context, but I've pitched a much more straightforward idea here that I'm happy to take feedback on. Just integrating a RADAR and a LIDAR system on one ship, at the expense of having to pay for and maintain both, seems like a reasonable proposition, both gameplay-wise and technology-wise.
OverThere 24 Feb 2024 at 8:00
I respect and fully agree with the developer's argument that the display output is exactly the kind of raw, hard-to-understand data one would expect from those instruments, and that with experience, proper care, and attention a player can figure out what it means. I do not yet have enough game experience to do this, so I find the displays frustrating.

The problem is that this is too much to realistically expect of a player who is juggling multiple things at once.

In the military, whether on a submarine, a ship, or a multi-crewed aircraft, there would be a person whose entire job was to look at the raw, hard-to-interpret data coming from the instruments, interpret it, and in real time put it into a form that the other crew members could quickly understand.

This role, which would very much exist in real life, is missing from the game. It should be there, and its presence would go a long way toward resolving the frustration with the displays repeatedly expressed on these discussion boards.

My proposal would be to redefine the geologist role as a sensor operator, combining the geologist's duties with LIDAR/radar/visual data interpretation. When the sensor operator decided a signal was a single rock with multiple minerals, they would mark it as such and it would appear that way on the map. When they decided a signal was a ship operating without a transponder and heading in a certain direction, they would mark it. And so on.

The player mini-map would show either the raw data, as is now the case, or the raw data with the sensor operator's markings. The geologist map, which you open from the J menu, would show the raw data with the sensor operator's interpretations included.

The sensor operator's interpretations need not be accurate 100% of the time. Mistakes do happen- tragically, for instance, in 1988, when Iran Air Flight 655 was misidentified as an attacking aircraft by a sensor operator (Flight 655 was climbing, but the sensor operator indicated it was diving). On the basis of that mistaken interpretation, Flight 655 was shot down with the loss of all 290 people on board.