Stormworks: Build and Rescue

485 ratings
Synthetic Vision (Augmented Reality)
File Size: 498.805 KB
Posted: Oct 6, 2022 @ 6:41am
Updated: Jan 19, 2024 @ 9:41am
6 Change Notes


Description
Creates a triangle mesh with 2.5D Delaunay triangulation and shows it on the HUD with augmented reality and flat shading.

Code [github.com]
Microcontroller
A simpler bare-bones setup.


QnA
How does it work?
I'm using a laser distance sensor to get points, which are added to a point cloud. For every new point I try to add, I check the distance to the nearest point already stored in the data set. If that distance is not bigger than a set minimum threshold, I ignore the point. That just makes sure points don't end up in the same place or clumped too closely together.
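A minimal sketch of that acceptance test, assuming points are plain {x, y, z} tables (plain Lua; the names points, MIN_DIST and the threshold value are illustrative, not taken from the workshop code):

    -- Accept a new sample only if it is farther than MIN_DIST from every stored point.
    -- Naive O(n) scan; the workshop code replaces this with a k-d tree lookup (see below).
    local MIN_DIST = 4                 -- illustrative threshold, not the real value
    local points = {}                  -- stored point cloud, each entry {x=..., y=..., z=...}

    local function tryAddPoint(p)
        for i = 1, #points do
            local q = points[i]
            local dx, dy, dz = p.x - q.x, p.y - q.y, p.z - q.z
            if dx*dx + dy*dy + dz*dz <= MIN_DIST * MIN_DIST then
                return false           -- too close to an existing point, ignore it
            end
        end
        points[#points + 1] = p
        return true
    end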

I use a k-d tree [en.wikipedia.org] structure to efficiently search for the nearest stored point; otherwise I would have to search through the whole set while keeping track of the best candidate. With the tree it takes about O(log n) on average to find the nearest point, but if the tree is imbalanced the search time degrades towards O(n).
The 'O()' notation is big O notation; it describes how an algorithm's running time grows with its input size (time complexity), so you ignore the 'O' and look at what is inside the parentheses. https://www.bigocheatsheet.com/
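For illustration, a nearest-neighbor query on a 2D k-d tree (over x/y) can look roughly like this; the node layout and names are assumptions for the sketch, not the workshop implementation:

    -- Minimal 2D k-d tree over x/y: nodes alternate splitting on x and y by depth.
    local function kdInsert(node, p, depth)
        if node == nil then return { point = p } end
        local axis = (depth % 2 == 0) and "x" or "y"
        if p[axis] < node.point[axis] then
            node.left = kdInsert(node.left, p, depth + 1)
        else
            node.right = kdInsert(node.right, p, depth + 1)
        end
        return node
    end

    -- Returns the squared distance from p to its nearest stored neighbor.
    local function kdNearest(node, p, depth, best)
        if node == nil then return best end
        local dx, dy = p.x - node.point.x, p.y - node.point.y
        local d2 = dx*dx + dy*dy
        if d2 < best then best = d2 end
        local axis = (depth % 2 == 0) and "x" or "y"
        local diff = p[axis] - node.point[axis]
        local near, far = node.right, node.left
        if diff < 0 then near, far = node.left, node.right end
        best = kdNearest(near, p, depth + 1, best)
        -- only descend into the far side if the splitting plane is closer than the best hit
        if diff * diff < best then
            best = kdNearest(far, p, depth + 1, best)
        end
        return best
    end

    -- Usage, combined with the threshold test from the earlier sketch:
    -- root = kdInsert(root, {x = 10, y = 20}, 0)
    -- local farEnough = kdNearest(root, newPoint, 0, math.huge) > MIN_DIST * MIN_DIST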

Then, after getting a point, instead of directly drawing the point cloud like a LIDAR display, I triangulate it with 2.5D Delaunay triangulation, [en.wikipedia.org] this algorithm [en.wikipedia.org] specifically.

The algorithm works in n dimensions, but it's not that simple to use in 3D, because you get the dimensional simplex [en.wikipedia.org], i.e. tetrahedra, instead of triangles. Therefore it's done in 2D: the height is ignored during triangulation, but used again when drawing, hence 2.5D.
The algorithm is incremental, which is to say that I add one point at a time to the mesh and reuse the intermediate calculations, so I'm not starting the triangulation over from scratch every time a new point is added.
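Assuming the linked algorithm is a Bowyer-Watson style incremental insertion (which matches the super-triangle mentioned in the comments below), one insertion step can be sketched roughly like this; all names are illustrative and no floating-point robustness handling is included:

    -- Signed area test: > 0 when a, b, c are counter-clockwise.
    local function orient(a, b, c)
        return (b.x - a.x) * (c.y - a.y) - (b.y - a.y) * (c.x - a.x)
    end

    -- True if p lies inside the circumcircle of triangle a, b, c (any winding).
    local function inCircumcircle(a, b, c, p)
        local ax, ay = a.x - p.x, a.y - p.y
        local bx, by = b.x - p.x, b.y - p.y
        local cx, cy = c.x - p.x, c.y - p.y
        local det = (ax*ax + ay*ay) * (bx*cy - cx*by)
                  - (bx*bx + by*by) * (ax*cy - cx*ay)
                  + (cx*cx + cy*cy) * (ax*by - bx*ay)
        return det * orient(a, b, c) > 0
    end

    -- One incremental insertion. triangles is a list of {a=..., b=..., c=...} whose
    -- vertices are shared point tables (so edges can be compared by identity); the list
    -- starts out seeded with a super-triangle covering the whole area of interest.
    local function insertPoint(triangles, p)
        local bad, edges = {}, {}
        for i, t in ipairs(triangles) do
            if inCircumcircle(t.a, t.b, t.c, p) then
                bad[i] = true
                edges[#edges+1] = {t.a, t.b}
                edges[#edges+1] = {t.b, t.c}
                edges[#edges+1] = {t.c, t.a}
            end
        end
        local kept = {}
        for i, t in ipairs(triangles) do
            if not bad[i] then kept[#kept+1] = t end
        end
        -- the cavity boundary is every edge not shared by two invalidated triangles;
        -- re-triangulate the cavity by connecting each boundary edge to the new point
        for i, e in ipairs(edges) do
            local shared = false
            for j, f in ipairs(edges) do
                if i ~= j and ((e[1] == f[1] and e[2] == f[2]) or (e[1] == f[2] and e[2] == f[1])) then
                    shared = true
                    break
                end
            end
            if not shared then kept[#kept+1] = { a = e[1], b = e[2], c = p } end
        end
        return kept
    end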

The color/shading of a triangle is calculated when the triangle is created. First it is decided whether it should get a water or a ground color: if the triangle centroid (the average altitude of its 3 vertices) is below 0 altitude, it is colored as water, otherwise as ground.
Then the shading is computed from the dot product of the triangle's normal and a fixed directional light vector, with some biased lerp curves to better show steepness.
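A hedged sketch of that coloring step (plain Lua, altitude stored in .z; the base colors, light direction and bias curve are made-up placeholders, not the workshop values):

    local LIGHT = { x = 0.4, y = 0.5, z = -0.77 }     -- fixed directional light (made up)

    -- Returns an {r, g, b} color for a triangle with vertices a, b, c (altitude in .z).
    local function colorTriangle(a, b, c)
        -- water or ground, decided by the centroid altitude (average of the 3 vertices)
        local centroidAlt = (a.z + b.z + c.z) / 3
        local base = centroidAlt < 0 and {30, 80, 160} or {60, 130, 60}

        -- face normal from the cross product of two edges (assumes consistent winding)
        local ux, uy, uz = b.x - a.x, b.y - a.y, b.z - a.z
        local vx, vy, vz = c.x - a.x, c.y - a.y, c.z - a.z
        local nx, ny, nz = uy*vz - uz*vy, uz*vx - ux*vz, ux*vy - uy*vx
        local len = math.sqrt(nx*nx + ny*ny + nz*nz)
        nx, ny, nz = nx/len, ny/len, nz/len

        -- dot product with the light direction, remapped with a biased curve
        local d = math.max(0, -(nx*LIGHT.x + ny*LIGHT.y + nz*LIGHT.z))
        local shade = 0.3 + 0.7 * d^1.5               -- exaggerates steepness a bit
        return { base[1]*shade, base[2]*shade, base[3]*shade }
    end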

Then the triangle is added to a quadtree [en.wikipedia.org] data structure. The quadtree spatially partitions the triangles into chunks, so it is fast to frustum cull a chunk and either accept the triangles inside it or throw the whole chunk away, without needing to test every triangle individually. (The quadtree does not store duplicates.)
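A rough sketch of how chunk-level culling against such a quadtree can look; the node fields and the isChunkVisible callback are placeholders for whatever the actual frustum test is:

    -- A node covers a square region (minX..maxX, minY..maxY) and either has 4 children
    -- or is a leaf holding triangles. Whole chunks are accepted or rejected at once.
    local function collectVisible(node, isChunkVisible, out)
        if node == nil then return end
        if not isChunkVisible(node.minX, node.minY, node.maxX, node.maxY) then
            return                                    -- reject every triangle in this chunk
        end
        if node.children then
            for i = 1, 4 do
                collectVisible(node.children[i], isChunkVisible, out)
            end
        else
            for _, tri in ipairs(node.triangles) do
                out[#out + 1] = tri                   -- accept the chunk's triangles
            end
        end
    end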

In short (optimizations not explained): when it gets to drawing the triangles, it is first calculated which triangles are in view, and that set is then sorted with the painter's algorithm. [en.wikipedia.org]
Then the mesh is finally drawn with augmented reality.
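A sketch of that last step; cam and drawTriangle stand in for the actual camera state and Stormworks draw call, which are not shown here:

    -- Painter's algorithm: draw far triangles first so nearer triangles overdraw them.
    local function drawMesh(visible, cam, drawTriangle)
        for _, t in ipairs(visible) do
            -- depth key: squared distance from the camera to the triangle centroid
            local dx = (t.a.x + t.b.x + t.c.x) / 3 - cam.x
            local dy = (t.a.y + t.b.y + t.c.y) / 3 - cam.y
            local dz = (t.a.z + t.b.z + t.c.z) / 3 - cam.z
            t.depth = dx*dx + dy*dy + dz*dz
        end
        table.sort(visible, function(p, q) return p.depth > q.depth end)   -- far to near
        for _, t in ipairs(visible) do
            drawTriangle(t)   -- project to screen space and draw with the triangle's color
        end
    end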


TL;DR:
A laser distance sensor provides points; a new point is only kept if its distance to the closest saved point is bigger than a minimum threshold, otherwise it is ignored.
The points are then triangulated into a mesh with Delaunay triangulation, and each triangle's color/shading is decided when it is created. Triangles are spatially partitioned in a quadtree, which is frustum culled to quickly get the triangles in view.
Before drawing, the triangles are tested for actually being in view, sorted with the painter's algorithm (depth sort), and finally drawn with augmented reality.


Why a button for isFemale?
There is a character height difference depending on gender, so the button is needed for the augmented reality projection to line up correctly.

How does it differentiate water?
It checks whether the average altitude/height of the 3 vertices that make up a triangle is under 0 altitude; if so it's colored as water, otherwise it's above 0 and therefore colored as ground.

Works underwater? Scan the seabed?
Lasers in SW have no issue scanning underwater, so it works on the seabed too. The colors might not be optimal though.

It is primarily intended for flying vehicles, and due to how the frustum-culling bounding volumes are implemented (quadtree; frustum culling uses 6 points per quad node: each quad corner at 0 altitude plus 2 points at the quad center at different fixed altitudes), it might have issues deeper underwater, or not...
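For illustration, building those 6 test points for one quad node could look like this (the two center altitudes are placeholders; the real values are whatever the workshop code sets):

    -- 6 points that stand in for one quad node when testing it against the view frustum:
    -- the 4 corners at 0 altitude plus 2 points at the node center at fixed altitudes.
    local CENTER_ALT_LOW, CENTER_ALT_HIGH = -50, 400   -- placeholder altitudes
    local function nodeTestPoints(minX, minY, maxX, maxY)
        local cx, cy = (minX + maxX) / 2, (minY + maxY) / 2
        return {
            { x = minX, y = minY, z = 0 },
            { x = maxX, y = minY, z = 0 },
            { x = minX, y = maxY, z = 0 },
            { x = maxX, y = maxY, z = 0 },
            { x = cx,   y = cy,   z = CENTER_ALT_LOW  },
            { x = cx,   y = cy,   z = CENTER_ALT_HIGH },
        }
    end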

Works in caves?
Nope. It is 2.5D, meaning the triangulation happens in 2D with the vertices' heights ignored, which results in a connected mesh; when the heights are used again after triangulation, it effectively becomes a heightmap mesh. If you scan something with a roof, or inside a cave, you get spiky terrain stretching from the ground up to the roof, which makes it unusable.
250 Comments
Richman Feb 25 @ 7:05am 
Yup, me too. With the LookX and LookY info, it should be possible to adjust the projection accordingly.
HEINZ-MUSTARD Feb 19 @ 7:31pm 
man i wish this could be used with the new hmd
Jumper  [author] Nov 1, 2024 @ 6:55am 
Tho the vehicle file name is Walrus_Delaunay_Triangulation <version>
Jumper  [author] Nov 1, 2024 @ 6:53am 
The vehicle is something I put together in a day or two for debugging this, so it doesn't have a name; the vehicle in the video is just this workshop vehicle.
iru75 Nov 1, 2024 @ 4:15am 
what is the name of the plane in the video? Thanks
Jumper  [author] Aug 16, 2024 @ 2:10pm 
[2/2]
Or it's just another general bug, but I'm leaning more towards the edge cases of floating-point robustness in the triangulation implementation.

At this point I don't really wanna bother trying to fix it, as it would be annoyingly hard for me to figure out. It's more the stupid char limit: if a (~800 char) interpreter reading string data existed, then maybe I'd rewrite it to fit in 1 or 2 scripts and try a more robust algorithm implementation.
Still, thx for pointing out that it crashes. Might fiddle with it again in the future.
Jumper  [author] Aug 16, 2024 @ 2:10pm 
[1/2]
Okay thx. Makes more sense that there is a general bug and not only a multiplayer bug (although maybe that too, but it could just be that more people are testing in multiplayer).

I do some hacky stuff of course to fit the code in the 4096 char limit, so it is split into 4 scripts to work, with some slight bit manipulation involving floats to get more bandwidth between the scripts, which I'm not entirely sure is 100% correct, so maybe there are some edge cases there.

The triangulation algorithm implementation I know is not 100% robust due to floating-point arithmetic/comparison errors, so there are some very unlikely edge cases there too.
(And the chosen algorithm needs a "super-triangle" that encapsulates all sampled points, which is just chosen to roughly cover the playable map, but that would fail too if sampling data far outside the map.)
NoNameIdk1 Aug 16, 2024 @ 12:15pm 
For me it crashes in singleplayer and multiplayer after a few minutes from "Tried to index nil" or something
Jumper  [author] Aug 14, 2024 @ 5:07pm 
Okay, thanks for the notice. I guess I was ignorant in thinking that if it works in single player then it would work in multiplayer, as long as it only needs client-sided functionality. Probably just a bug on my part due to not understanding some multiplayer logic. I only know that scripts are client sided and loaded fresh when the vehicle is loaded, and that depending on the use case the code can desynchronize if the script changes some state that you want synchronized with other clients.

I won't try to fix this in the near future as my focus lies elsewhere and I'm not really interested in SW right now; I'm also about to start Computer Science at Uni.
Mainly it's the stupid char limit that I wanna try to fix. I've looked slightly into it and figured out an approach that takes ~800 chars for an interpreter reading custom bytecode from a string, but the compiler for Lua -> custom interpreter/bytecode is the hard part, so it halted; might try again in the future.
The Burber (she/her) Aug 6, 2024 @ 7:26pm 
it was something about trying to find a null value "a" or smth