Surface data and Survex

Moose

New member
We've been converting raster grid data to XYZ and then to Survex topographic point format using a Python script. What I'd really like to know is exactly how many points Survex can handle in this format, as the topo grids are very large with many points!
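For concreteness, a minimal sketch of the XYZ-to-Survex step might look like this. It assumes one "easting northing altitude" triple per input line; the station names (srf_0, srf_1, ...) are invented here purely for illustration, not anything the original script necessarily uses:

```python
# Sketch: turn whitespace-separated XYZ rows into Survex "*fix" surface points.
# Assumes one "easting northing altitude" triple per line; the srf_N station
# names are made up for this example.

def xyz_to_svx(xyz_lines):
    out = ["*flags surface"]          # mark the following stations as surface data
    for i, line in enumerate(xyz_lines):
        x, y, z = (float(v) for v in line.split())
        out.append(f"*fix srf_{i} {x:.2f} {y:.2f} {z:.2f}")
    return "\n".join(out)

svx = xyz_to_svx(["451200 5129800 1840.5", "451210 5129800 1842.1"])
# each input row becomes one "*fix" line under a "*flags surface" header
```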

It would be possible to break the terrain model up into smaller grids, which Survex could then handle individually, but it would be useful to have a limit to work to.
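If it came to splitting the model up, one simple approach is to bucket the points into fixed-size tiles by coordinate, then write each tile to its own .svx file (which a master file could pull in with *include). The tile size and the bucketing scheme here are assumptions for illustration, not anything Survex requires:

```python
# Sketch: bucket XYZ points into square tiles so each tile can be written
# to its own .svx file. Tile size is an arbitrary example value.
from collections import defaultdict

def tile_points(points, tile_size=1000.0):
    tiles = defaultdict(list)
    for x, y, z in points:
        # integer tile index from the coordinates
        key = (int(x // tile_size), int(y // tile_size))
        tiles[key].append((x, y, z))
    return tiles

pts = [(100.0, 100.0, 5.0), (1500.0, 200.0, 6.0), (120.0, 130.0, 7.0)]
tiles = tile_points(pts)
# points 1 and 3 share tile (0, 0); point 2 lands in tile (1, 0)
```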

Even better would be if Survex was able to increase the array so that it can handle bigger grids / more surface points. Is this possible?
 

graham

New member
Moose

I don't think Olly ever comes here, and Wookey does only infrequently. You'd be better off joining the Survex mailing list.

Check out this page for details.
 

jarvist

New member
Cavern or Aven? How big are the topo grids?

All the arrays I've ever seen in the code are dynamically allocated (i.e. there is no fixed-length array to increase), and as long as Cavern isn't trying to distribute errors or build a big graph over your connected surface data (i.e. running any algorithm that isn't O(n) in memory), you shouldn't get into the realm of memory exhaustion on a modern machine.

Displaying with Aven is a different matter - getting your graphics card to render >millions of lines in a timely manner is a challenge.
 

jarvist

New member
Just did some quick testing of this with valgrind, tracking memory usage of cavern / aven 1.1.13 on 64-bit Linux against the (3765 leg) Migovec survey.

Processing an extra 20,000 DEM datapoints (a 2 MB .svx file) added 8 MB of memory consumption to cavern (10 MB vs. 2 MB), added 1.5 MB to the resulting .3d file, and took nearly 20 more CPU instructions to compile.

Displaying the larger 3D file took aven from using 12 MB of memory to 33 MB.

So I imagine that 'aven' will become a problem before 'cavern', but both of these figures are pretty minimal compared to the memory consumption of e.g. a modern web browser.
If your data sets really are that 'big' (as in, too large to reside in memory at once), you might have to start preprocessing or using variable fidelity to get it working sensibly.
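As a sketch of what "variable fidelity" could mean in practice: for a regular DEM grid, keeping only every Nth row and column cuts the point count by a factor of N squared before export. The grid layout and stride here are assumptions for illustration:

```python
# Sketch: crude variable fidelity by decimating a regular elevation grid.
# "grid" is a list of rows, each row a list of elevation values; keeping
# every 4th row and column shrinks the point count by a factor of 16.

def decimate_grid(grid, stride=4):
    return [row[::stride] for row in grid[::stride]]

grid = [[float(r * 10 + c) for c in range(8)] for r in range(8)]
small = decimate_grid(grid, stride=4)
# an 8x8 grid becomes a 2x2 grid of the same surface at lower resolution
```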
 

wookey

Active member
As jarvist has said, there are no fixed limits in Survex. It was explicitly designed that way: one of the reasons we wrote it back in 1990 was annoyance at various fixed limits in other bits of software. So the question is simply how much data you can process before you run out of memory or it gets too slow to be usable.

Do tell us when you find out :)

Some better surface handling is likely to be along sometime, as we'd all like to be able to put real geodata in these days. I believe Ol is working on it (very slowly; help is always welcome).
 