Tuesday, September 30, 2014

Under the Dome mania!

   If you thought spaceship planet Earth couldn't be more insane, this series, in microcosm, encapsulates pure Cold War sci-fi futurism at its best!

Sort of like the soaps but better!

Under the Dome Season 2

I know I've seen negative reviews on Amazon, but I think some of the reviewers may miss the point here!

Monday, September 29, 2014

Blender height map importer

One of these days I may get around to writing a batch height map import script.  If it hasn't already been written, it would spare you, in the case of bigger batch loads, from manually setting up a mesh grid, subdividing it appropriately, putting a displace modifier on the mesh (which generally looks crappy at low resolutions), and then loading a low-resolution 8-bit grayscale height map for further loss of detail.  Couple this with trying to manually align everything and the added tedium of hand-interpolating points for alignment.  With a nice batch script, you could add a lot of terrain in fairly short order, auto-aligned and everything.  I think I could do it.  Python is easy enough to work with!

Okay, one more reason I may get around to doing it: Cycles rendering, which seems sort of nice and fancy to me.
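If I ever do write it, the bookkeeping is mostly just computing where each tile lands. Here's a minimal sketch (plain Python, no Blender API) of the auto-alignment part, assuming a hypothetical tile naming scheme with row/column suffixes like terrain_r2_c3.png:

```python
import re

TILE_SIZE = 100.0  # world units per tile; an assumed value

def tile_offset(filename, tile_size=TILE_SIZE):
    """Parse a hypothetical 'name_r<row>_c<col>' filename and return
    the (x, y) world offset where that tile's grid mesh belongs."""
    m = re.search(r'_r(\d+)_c(\d+)', filename)
    if not m:
        raise ValueError('no row/column suffix in %s' % filename)
    row, col = int(m.group(1)), int(m.group(2))
    return (col * tile_size, row * tile_size)

# Each tile would then get a grid mesh created at its offset, with a
# displace modifier pointing at that tile's own height map image.
print(tile_offset('terrain_r2_c3.png'))  # (300.0, 200.0)
```

With the offsets computed up front, neighboring tiles butt against each other exactly and the hand-interpolation step goes away.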


Sunday, September 28, 2014

Ten centimeter friends!

   I was recently inspired by feedback on a previous post of mine, namely, that it sounded a bit deranged.  So I wanted to share this program that I've found useful for building sentences: everything from random name generation to creative writing aids.  I am hoping someday to design an A.I. writer to take over my blog writing!  This will be a sequel to Asimov's Gold!


import os
import random

class Readcsvdata:
    def readfile(self):
        # Read each file into a dict keyed by the first comma-separated field.
        datareads = []
        for filex in [self.file] + self.files:
            dataread = {}
            with open(filex, 'r') as f:
                for line in f:
                    csvlinedata = line.strip().split(',')
                    dataread[csvlinedata[0]] = csvlinedata[1:]
            datareads.append(dataread)
        return datareads
        
    def __init__(self):
        self.dirpath = 'C:\\Users\\StrangeCharmQuark\\Downloads\\mthes10\\'
        self.dirpath2 = 'C:\\Users\\StrangeCharmQuark\\Downloads\\'
        self.filename = 'mthesaur.txt'
        self.filelist = ['census-dist-male-first.csv','census-dist-female-first.csv',
                         'census-dist-2500-last.csv','steampunk.txt',
                         'clothing.txt','colorterms.txt','fabric.txt', 'ore.txt',
                         'gemstone.txt']
        self.file = self.dirpath+self.filename
        self.files = []
        for filen in self.filelist:
            self.files.append(self.dirpath2+filen)
            
class Readhsvdata:
    def readfile(self):
        # Same as Readcsvdata.readfile, but for hyphen-separated values.
        datareads = []
        for filex in self.files:
            dataread = {}
            with open(filex, 'r') as f:
                for line in f:
                    csvlinedata = line.strip().split('-')
                    dataread[csvlinedata[0]] = csvlinedata[1:]
            datareads.append(dataread)
        return datareads
        
    def __init__(self):
        self.dirpath2 = 'C:\\Users\\StrangeCharmQuark\\Downloads\\'
        self.filelist = ['geology.txt']
        self.files = []
        for filen in self.filelist:
            self.files.append(self.dirpath2 + filen)
            
        
class Namegenerator:
    def pickrandom(self):
        # Build ten random two-part names from the two word dicts.
        nlist = []
        for i in range(10):
            n = random.choice(list(self.adict)) + ' ' + random.choice(list(self.ldict))
            nlist.append(n)
        return nlist

    def __init__(self, adict, ldict):
        self.adict = adict
        self.ldict = ldict

class Namegenerator2:
    def pickrandom(self):
        # Three-part names, e.g. color + fabric + clothing.
        nlist = []
        for i in range(10):
            n = ' '.join([random.choice(list(self.adict)),
                          random.choice(list(self.ldict)),
                          random.choice(list(self.kdict))])
            nlist.append(n)
        return nlist

    def __init__(self, adict, ldict, kdict):
        self.adict = adict
        self.ldict = ldict
        self.kdict = kdict

class Namegenerator3:
    def pickrandom(self):
        # Single-word picks.
        nlist = []
        for i in range(10):
            nlist.append(random.choice(list(self.adict)))
        return nlist

    def __init__(self, adict):
        self.adict = adict


a = Readcsvdata()
tdict = a.readfile()
thlist = tdict[0]
mnlist = tdict[1]
fnlist = tdict[2]
lstlist = tdict[3]
steamlist = tdict[4]
clothinglist = tdict[5]
colorlist = tdict[6]
fabriclist = tdict[7]
orelist = tdict[8]
gemlist = tdict[9]

hsvdat = Readhsvdata()
hsvdatlist = hsvdat.readfile()
geologylist = hsvdatlist[0]

b = Namegenerator(mnlist,lstlist)
fullnames = b.pickrandom()

c = Namegenerator(fnlist,lstlist)
ffullnames = c.pickrandom()

d = Namegenerator(steamlist,steamlist)
steamnames = d.pickrandom()

e = Namegenerator(thlist,steamlist)
thsteam = e.pickrandom()

f = Namegenerator2(colorlist,fabriclist,clothinglist)
clothing = f.pickrandom()

g = Namegenerator3(geologylist)
geo = g.pickrandom()

print(fullnames)
print(ffullnames)
print(steamnames)
print(thsteam)
print(clothing)
print(geo)
Obviously, if you wanted to make use of this word list reader, you'd simply need to build your own word lists.  You can easily modify the script to work with hyphen-separated values instead of comma-separated ones, or any delimiter, or whatever.  Fun project, actually!
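As a sketch of that generalization, here's a hypothetical version of the reader that takes the delimiter as a parameter (shown on in-memory lines rather than files, so the directory paths above aren't needed):

```python
def read_wordlist(lines, delimiter=','):
    """Split each line on the given delimiter and key the entry
    by its first field, as the classes above do per file."""
    dataread = {}
    for line in lines:
        fields = line.strip().split(delimiter)
        dataread[fields[0]] = fields[1:]
    return dataread

# Works the same for comma- or hyphen-separated word lists:
print(read_wordlist(['JAMES,3.318,1', 'JOHN,3.271,2']))
print(read_wordlist(['granite-igneous', 'shale-sedimentary'], delimiter='-'))
```

Both Readcsvdata and Readhsvdata then collapse into one class with a delimiter argument.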

The only people that one directly hears from

     Well, let's see: the people that you directly hear from these days are people that you knew some decades past, that you don't bother to talk to at all really, and generally could care less to hear from.  Included in that list are the potential jerks (polite language) that are likely drafted to see what you are up to, or in promotion of historical revisionism, rewording as opposed to an altered past.  In that history, the children lived happier; the school on paper looked more integrated, at least urging in that day that multiculturalism wasn't dead.  In those days, if you were lucky enough to be one of the rare sops to be struck down and beheaded by a meteorite, red scare tactics just didn't amount to a hill of beans, nut jobs aside.  But the school, unlike in the revisionist version, wasn't really a good school, nor well administrated; parishioners likely made it difficult, and the school as a microcosm wasn't really Catholic, but more the politics of its generation: a hodge-podge mix of fascist children taught in that day and age to be the sick sorts of sops they grew up to be, really, in a sad white 'rule' city that carefully positions itself not to look too 'southern' sympathetic.  You sort of wonder how many urban Lebowski achievers grew up deserving of the token handouts.  Couture?!  How's the business, buddy?!  I know it's all about the entrepreneurs; these days Kansas City talks about its, in retrospect, likely maligned vision of Silicon Prairie, or in so many words an investment boondoggle, white elephants if you dared attend one of those funny-mustache-wearing meetups somewhere along the line, or really someone else's startup.  But the good news for Kansas City is that it sits among the nation's top list of most dangerous cities (not Southern enough), even if formal crime statistics may not have made it as obvious.  Or if you thought Turkmenistan was beautiful with your head really up your ass, by all means.
But fortunately for us all, a Kansas-style bridge-gate public works scandal is something for a Roberts re-election.  I always wonder about declines in commercial traffic for any length of time?!

Saturday, September 27, 2014

On the upside...

     Autumn is here!  

People that you knew...

    I don't know if it's just me or whether someone else can relate: another person that you knew in grade school shows up unannounced at the doorstep.  Generally I sort of tire of it, not least because it is obviously a bit rude to do so.  I mean, how f'ing hard is it to pick up the phone and say, 'Hey, man, not sure what's up?  How about a meeting or something?'  Then you might ask yourself, 'How many people from grade school show up at your doorstep?'  All this expectation, like social politeness, yes, supposed to cater to a person that generally is manipulative enough, and then leaves little back-handed social drops in the persistent exchange that he goes out of his way to initiate.  Sure, nice successful career supposedly, and you are supposed to cater to someone else's terms, being polite.  It would be more obviously a near violation of personal space and privacy, though, when these sorts of people come to the doorstep.  I guess social boundaries weren't so obvious when the exchange was generally one-sided, like, 'Man, you always go out of your way to contact me.  I don't say squat to you.  Isn't that a clue?'  Until maybe a sort of exchange crops up that definitely sends someone packing out the door and away for a good number of years.
     Over the years, though, there's something of a social relaxation toward the persistent foot-in-the-door, used-car-salesman types.  I don't know if it's just me in dealing with these sorts of people, the sort of individual that screams something all over in mind, if it weren't using a friendship for some purpose.  At least scripted fish bait these days seems as much.  Sudden interest in another's personal psychology, sudden interest in maintaining a relationship, despite all the sorts of indicators that should have an exchange going south in a hurry: maintaining that relationship at higher costs for what?  You see this sort of social exchange popping up for all sorts of reasons, maybe economic, social, political, and/or any other b.s. reason that had nothing to do with friendship.

   Zeb Fortman, please take the hint and f'off man.  Lots of laughter.

I give a chance to another to figure it all out, but fortunately I don't have to deal with calling cops and restraining order sorts of turf...I'm sure enough on the script are well versed enough in law?

By the way, another person in mind at the moment.  I'll be polite for now and not do another name drop.  Someone's business sinking?!

Friday, September 26, 2014

You might hear...

    If it weren't amusing enough to hear an acting FBI director complaining about any potential issue pertaining to the recent increases in encryption standards on the part of Apple or Google.  I wonder, with respect to the level and ease with which a given argument is actually listened to or believed, whether, as one might have it, the FBI might actually have a difficult time engaging in broad surveillance-gathering activity, like targeting a particular cell tower, because under loose enough claims to present warranting procedure, a suspect's communication might have passed through it at any given time in the past.  But never mind any possibilities of the sort, which should entail not going directly to the 'supposed source' of the problem here.  Higher levels of encryption really could be useless if your operating system were infected regardless, as long as vulnerabilities exist in the pre-encryption processing pipeline.  But really, as to removing a component of the data informatics capabilities in intelligence gathering: given that the ability to collect sourced data is far more extreme in this day and age relative to any other, the necessity for broad and indiscriminate surveillance methods simply shouldn't be there, and in light of notorious and glaring examples of failures in intelligence, where they might have stuck out like a sore thumb, this speaks perhaps greater volumes to the problems of information gathering in general.  Consider the basic problem of 10 interconnected nodes.  Indiscriminate algorithms, for instance, would have 10^10 points of data, and on data that weren't temporal in nature.
The biggest problem is that connecting seemingly random points of data in this day and age often avoids brute-force combination computations, and inherent data-mining biases likely should be applied; but when more obvious lapses with respect to broad intelligence-gathering methods are so glaring, it points, often as not, to why broad surveillance in general is likely to fail as a methodology alone.  Of course, you might be led to the argument that targeted surveillance more broadly applied yields data, such as in a given network or regionally speaking, something of a constructed social theory meaning higher probabilities of intercepting something of greater interest; but when the coarse nature of intelligence gathering rises to nation-state levels, I'd offer this should be head-scratching turf, especially when the problem itself is unwieldy, if not outright improbable to any degree of usefulness, given coarseness at such a level.

   I've recently heard another big-brother-sympathy-apology opinion solicitation on the so-called Fox News civil libertarian 'independents' hosting network.  Here the contributor describes exactly the argument one might expect if having concerns about issues of broad surveillance: namely, yes, if the FBI or a federal agency of similar power capacity had a desire to obtain data, they'd likely be able to get at such data through a myriad of means extending beyond a warrantless process.  But then the opinion, on the nuance, describes supposed ineptitude on the part of local law enforcement in terms of information-gathering tactics, and then the added problem to this latter argument.  Here, in recent days, it's not just that teens, immature adults, or other persons in no position of qualification handle sensitive personal information; it's that local law enforcement tools somehow wind up online and privacy invasions are inevitable.  Here, I think we are supposed to sympathize not with starlets that thought it better to use cloud storage, maybe knowing better or not; it's just that one more tool for significant privacy invasions exists, aside from the sorts of nationalized village-beating social crises that exist in this day and age.  Of course, Russia gets blamed for these sorts of tools, but if you studied which countries are the biggest offenders on privacy breaches, you'd find the United States and China topping those lists.  I think it's personally offensive, however, to see the more lowly of individuals telegraphing knowledge, given presumably no formal capacity necessitating having any information of a personal nature, merely staging knowledge in power-tripping and socially abusive ways that had nothing to do with protecting society or law enforcement in general.
The problem is precisely that it is a big maligned lie to state that even local law enforcement don't have capabilities similar to the methods given at federal levels.  It's that the methods available are so indiscriminately disseminated that any grandmother down the street might be able to gain access into her next-door neighbor's system with a little effort that hadn't required a warrant.  Consequently, I don't know if there really is much of a revelation when anonymous users, in recent times, abuse the notions of personal privacy, making death threats and publishing another's private data.  These sorts of individuals, unfortunately, range in larger numbers these days, and all too often, I think, an effective 'green light' is given to use and abuse surveillance tools that were supposedly designed to protect people in society.

Some more notes on creating/importing world terrain mosaics in Unity

Okay, so I managed to script load/import terrain assets in batch and compile 16-bit heightmap images, but a couple of things: apparently, for the mosaic loading pattern, things don't quite mesh well between terrains, even using the terrain edge boundary algorithm mentioned in my previous post.  Namely, the corners still have discontinuities between edges, and the reason for this in my program is related to sequencing the measurement of points on a given edge boundary while not accounting for any change made when measuring another edge boundary.  It would appear I also needed a fixed point for the corners of the world mosaic maps.  Even so, visually the results appear precisely matched except inside the terrain edge matching threshold regions at the corners.

   What I've managed as a solution is creating a fixed point on a given corner and projecting this fixed point's edge difference into the interpolated region on a neighboring map, which in turn leads to a new projection on the next neighboring map, and so forth.

For instance, at a given intersection I have 4 maps.  I start with map 4 and choose the corner point of this map.  Then I compute the cspline interpolated translation curve on neighboring map 3's edge boundary.  I use the old edge difference points from map 3, as related to its map 1 neighbor, and add the differences computed with the cspline interpolated difference on map 3's edge along our defined threshold region.  Again, following the process of using a fixed point in map 4, I do the same for map 3 and compute a cspline interpolated region on map 1, and then repeat this procedure to readjust values on map 2, until I lead back into map 4.

I can reverse this process likewise.  We are still left with the problem, as I've mentioned, that the cspline translation of points in a given threshold region for terrain boundary matching, namely going from map 2 back into map 4, would lead to a readjustment of points back in map 3.  So we can diagonalize, or linearly vary, the threshold length of points on the threshold boundary region (this would be like creating a custom diagonal miter for a picture frame), so that folding a translation of points into the given mesh region does not affect edge boundary differences of points as originally measured, say, from map 4 to map 3 (hence mitering the threshold region).
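As a rough sketch of the blending idea (not the full four-map corner procedure), here's how an edge difference might be folded into a threshold region with a smooth falloff, assuming a hypothetical 1-D row of heights running inward from a seam:

```python
def blend_edge_difference(heights, edge_delta, threshold):
    """Fold 'edge_delta' (the mismatch measured at the seam) into the
    first 'threshold' samples of a row, with a smoothstep-style cubic
    falloff so the correction vanishes smoothly toward the interior."""
    out = list(heights)
    for i in range(threshold):
        t = i / float(threshold)            # 0 at the seam, 1 at the interior
        falloff = 1.0 - (3*t*t - 2*t*t*t)   # cubic smoothstep, 1 -> 0
        out[i] += edge_delta * falloff
    return out

row = [0.0] * 8
print(blend_edge_difference(row, edge_delta=1.0, threshold=4))
```

The cubic falloff keeps the slope flat at both ends of the threshold region, which is what stops the correction itself from introducing a visible crease.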

Here are some results shown visually after correction:






Some added improvements.  We can still slightly see a seam on a given edge.  Added approximation methods could smooth these over, I imagine, but it's definitely smoother.



Example with texture loading.  I've again batch automated texture loading.




Something to keep in mind when importing a height map: you'd definitely want to know the orientation of the map itself.  Unfortunately, I've had issues reading raw 16-bit grayscale height maps, which is why I created a height map binary reader; but in terms of image formatting there's another issue, which includes, for instance, rotating an image.  I actually created a fairly simple algorithm to rotate bitmap image coordinates.  Pretty simple really, using your favorite linear algebra math package.  You'd first convert to local terrain coordinates, where the local coordinate position sets the terrain's local point of origin at the center of its map; this is terrain.width/2 for x and terrain.height/2 for y.  Then you'd convert any (x, y) position (from the previous post; see the 16-bit binary map loading script) to local prime coordinates.  To do this I used a linear algebra package and created a 2D vector for the original (x, y) coordinate position and a 2D vector for the new local coordinate's point of origin; I'll label the latter V2.  Subtract these two vectors, and the resultant vector is the coordinate position on the prime axis; call this V3.  Now generate a 2x2 rotation matrix (see http://en.wikipedia.org/wiki/Rotation_matrix ).  You need a supplied angle; if you have it in degrees, convert it to radians (Math.PI/180.0 * degrees).  Build the matrix coefficients using the rotation matrix formulation provided at the link above.  Next, take the supplied (x', y') local coordinate position and multiply: M*vec(x', y').  This resultant vector is the (x', y') coordinate after rotation; call it V3'.  Finally, reconvert from prime coordinates back to the original global point of origin: take V2 + V3', which returns the new coordinate position (x, y) post rotation.
This rotates the terrain grid around its centroid, which is what we want to ensure that we have proper indexing values for our terrain.  If you did it like me, to avoid fretting over problems of trig, you'll need to cast from integer coordinate positions (for Unity) to doubles or floats or whatever your linear algebra package runs on, and then convert back to integer positions.  This is fine, by the way, for square rotations (90, 180, 270, and 360 degrees).  Technically we can't set up terrain grids rotated in other ways (without truncating the map on the square), since Unity only allows rectangular terrain grids that are not skew.  Although I am sure if you really wanted to hobby around, you could find a way of overlapping terrain grids to create skew tilings of terrain grids.  The math, though, gets tedious with these non-standard rotations, since you'd need to figure out how to preserve all integer-based coordinate positions under a rotation computed on a double and converted back to an integer (Unity forces this integer coordinate type for terrain building), a problem likely involving scaling or interpolating point positions and heights.
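The square-rotation case above can be sketched without a linear algebra package, since for multiples of 90 degrees the sine and cosine are exact and integer coordinates survive the round trip. A minimal sketch, assuming a width x height index grid whose centroid sits at (N-1)/2 along each axis:

```python
import math

def rotate_index(x, y, width, height, degrees):
    """Rotate an integer grid coordinate about the grid centroid.
    Exact for multiples of 90 degrees; rounding keeps integers."""
    cx, cy = (width - 1) / 2.0, (height - 1) / 2.0  # centroid of 0..N-1 indices
    rad = math.radians(degrees)
    c, s = math.cos(rad), math.sin(rad)
    # translate to centroid-local ("prime") coordinates: V3 = V1 - V2
    px, py = x - cx, y - cy
    # apply the 2x2 rotation matrix, then translate back: V2 + V3'
    rx = c * px - s * py
    ry = s * px + c * py
    return (int(round(rx + cx)), int(round(ry + cy)))

print(rotate_index(0, 0, 4, 4, 90))
```

For non-square angles the rounding step is exactly where the integer-preservation headache described above begins, so this sketch only claims correctness at 90-degree steps.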

Using the binary height map importer algorithm (a previous post a few back on my blog), I've found, when using the World Terrain generator program (which can produce both height map and terrain texture images) with a north-south world terrain row position and west-to-east world terrain column position, that I needed to rotate height map images by -90 degrees and texture images by +90 degrees.  This generally resolved alignment issues without the need to modify my overall importation build algorithms, such as translating terrain tiles differently relative to the build indexing using the north/south/west/east formulation, but changing the indexing of the maps so that they align properly, because the binary reader builds the maps with local north pointed, for instance, west or east instead of true north in the local coordinate build.

Saturday, September 20, 2014

Unity 16-bit height map custom loading script

While Unity furnishes a raw image importer for heightmaps, out of curiosity and a desire to develop a batch loading script (where I could import as many terrain heightmaps at a time as desired), I've investigated the potential for loading high-resolution 16-bit grayscale heightmaps.

In this case, it actually appears to be very simple.  I've used a reference script, in this case found at


The problem is I wanted a section of this script ported from JavaScript over to C#, so basically the key is porting the main excerpt of code below, where heightmapimage.size is assumed to have equal width and height dimensions.

using System.IO;
...

var bytes = File.ReadAllBytes(pathtofile + filename);
m_terrain.terrainData.heightmapResolution = heightmapimage.size;
m_heightValues = new float[heightmapimage.size, heightmapimage.size];
int i = 0;

for (int x = 0; x < heightmapimage.size; x++) {
    for (int y = 0; y < heightmapimage.size; y++) {
        // two bytes per sample: low byte + high byte * 256, normalized to [0,1]
        m_heightValues[heightmapimage.size - 1 - x, y] =
            ((float)bytes[i++] + (float)bytes[i++] * 256.0f) / 65535.0f;
    }
}

A 16-bit grayscale heightmap sample is actually represented as 2 bytes (1 byte = 8 bits, twice).
The apparent algorithm for representing one of 65,536 grayscale values is given above, with normalization provided (this is for little-endian PC byte order); you'd simply follow the same procedure, reversing the byte order, for big-endian conversions.

While I couldn't find any information explicitly on the encoding process for RAW16 (at least with a Google search on any number of variant search strings, other than generalized information), mathematically the encoding appears to be straightforward modulus remainder arithmetic.  The high byte is the multiplier (how many times the value wraps around 256), which is multiplied by the factor 256; the low byte is the remainder mod 256.  If you think of this as a two-dimensional matrix with 256 rows of 256 columns each, the row picks the multiple of 256 and the column the remainder, which is the amount a value climbs along a row before incrementing to the next row.  Thus by analogy, reading the row and column positions fills any one of the values ranging from 0 to 65,535, which in turn represents the grayscale pixel value in the heightmap image.
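That byte arithmetic can be sanity-checked in a few lines. A minimal sketch of the decode used in the C# excerpt above, plus its inverse (low byte first, PC order):

```python
def decode16(lo, hi):
    """Two bytes -> normalized height in [0, 1], low byte first."""
    return (lo + hi * 256) / 65535.0

def encode16(value16):
    """A raw 16-bit value -> (low, high) byte pair via divmod."""
    hi, lo = divmod(value16, 256)
    return lo, hi

lo, hi = encode16(40000)
print(lo, hi)   # 64 156, since 40000 = 156*256 + 64
print(round(decode16(lo, hi) * 65535))  # back to 40000
```

The divmod is exactly the "row and column" reading described above: the quotient is the row (high byte), the remainder the column (low byte).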

Thursday, September 18, 2014

Translating height map data for Unity (gray scaled heightmaps) 8 bit images

I saw the following code for sampling heightmap data from another site

Loading terrain height map data in Unity

    private void LoadHeightmap( string filename )
    {
        // Load heightmap.
        Texture2D heightmap = ( Texture2D ) Resources.Load( "Heightmaps/" + filename );
 
        // Acquire an array of colour values.
        Color[] values = heightmap.GetPixels();
 
        // Run through array and read height values.
        int index = 0;
        for ( int z = 0; z < heightmap.height; z++ )
        {
            for ( int x = 0; x < heightmap.width; x++ )
            {
                m_heightValues[ z, x ] = values[ index ].r;
                index++;
            }
        }
 
        // Now set terrain heights.
        m_terrain.terrainData.SetHeights( 0, 0, m_heightValues );
    }
Generally I'd say this is about right, since a height map is grayscale (hopefully grayscale; for alternate false-color height map images, I imagine you'd need to find some adequate height sampling method, or run whatever algorithm produced the height map in reverse).  One technically need only sample one channel of an RGB tuple, since in grayscale all three channel values are equal.  One caveat on normalization: Unity's GetPixels returns Color values whose channels are floats already normalized to the 0-to-1 range, so the code above needs no further scaling.  It's the byte-based Color32 (from GetPixels32) whose channels range from 0 to 255 and need dividing by 255.

Thus, if you'd rather read the raw byte values via GetPixels32, I believe the code in Unity should read

    private void LoadHeightmap( string filename )
    {
        // Load heightmap.
        Texture2D heightmap = ( Texture2D ) Resources.Load( "Heightmaps/" + filename );

        // Acquire an array of byte-valued colours (0-255 per channel).
        Color32[] values = heightmap.GetPixels32();

        // Run through array and read height values, normalizing to 0-1.
        int index = 0;
        for ( int z = 0; z < heightmap.height; z++ )
        {
            for ( int x = 0; x < heightmap.width; x++ )
            {
                m_heightValues[ z, x ] = values[ index ].r / 255f;
                index++;
            }
        }

        // Now set terrain heights.
        m_terrain.terrainData.SetHeights( 0, 0, m_heightValues );
    }

Pretty easy really.  This script is limited to 8-bit images; generally Unity's image (texture) importers are restricted outside of 16-bit options (at least at the moment I hadn't seen import options).  I've also tried using the Windows System.Drawing libraries for C#, but all I could find were bitmap conversions of something like PNG or 8-bit bitmaps.  It works, but there are definitely noticeable graphical step-value artifacts, visually speaking, on a given mesh terrain.  If you can produce high-resolution maps this may not be a problem, but if you are restricted in terrain resolution, then you may want to consider using 16-bit importation scripts.

Thursday, September 11, 2014

Cubic Spline example

For those who aren't sure of the method in producing a cubic spline curve between given node points on a sample set of points, I've furnished a curve generating example on a given node interval.  The link below provides further information.

Cubic spline example
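As a quick companion sketch (not the linked worked example itself): on a single node interval, the per-interval building block of a spline can be written in Hermite form, pinning the endpoint values and slopes directly. A minimal sketch on the interval [0, 1]:

```python
def hermite_segment(p0, p1, m0, m1):
    """Return a cubic on [0,1] hitting values p0, p1 with slopes m0, m1
    at the endpoints; chaining segments with shared endpoint values and
    slopes gives the C1 continuity a spline needs."""
    def h(t):
        t2, t3 = t * t, t * t * t
        return ((2*t3 - 3*t2 + 1) * p0 + (t3 - 2*t2 + t) * m0
                + (-2*t3 + 3*t2) * p1 + (t3 - t2) * m1)
    return h

# a smooth rise from 0 to 1 with flat slopes at both ends:
curve = hermite_segment(p0=0.0, p1=1.0, m0=0.0, m1=0.0)
print(curve(0.0), curve(0.5), curve(1.0))  # 0.0 0.5 1.0
```

With m0 = m1 = 0 this is the classic smoothstep falloff shape, which is also handy for the terrain threshold-region blending in the posts above.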

Wednesday, September 10, 2014

Recent curve interpolating process works for graphics related work (and other stuff)

    Not sure if there's anything of potential application here in a way, albeit it provides some measure of control for curve generation.

    So the idea that I had goes along the lines of how to build a smooth curve outside of using Bezier curves.  Okay, so it seems there are a number of things that can be done here.  For instance, maybe I could generate a random sample of points; I don't need to be perfect in sketching out a curve by hand, and can then guesstimate coordinate values for a given domain and range.  From this I could apply a higher-order least squares approximation, at least higher order than linear, which merely graphs a best-fit line between the random points drawn, and maybe provides something nice for drawn curvature.  Interpolation over small intervals might work if you kept your sample space small; the problem, however, is curve instability.  Interpolating with higher-order polynomials may make the points fit, but you may have all sorts of squiggling and jostling, with the curve moving wildly between sample points.  Which then brings up the alternate method of choice: cubic spline approximation.  Here, in theory, over large interval sets you could sketch out a curve and then, piecewise from sample point set to set, guesstimate slope values for the drawn curve.  Ensuring the parameters agree from one cubic spline piece to the next, that is, the sample position and first (or second, or third) derivative conditions remain the same at each shared sample point, ensures pointwise continuity for the generated curve.  Otherwise, this should look pretty smooth.

Another method that I've recently employed, albeit surrendering a bit of control on the curve, employs scaling of points on the curve: basically scaling the domain and range of the curve while controlling curve generation over a shorter interval, say a domain [0,1] and range [0,1].  Granted, if you were working over larger domain sets like [0,20] and something like [0,1] for a given normalized z range, rescaled curves would look similar in general to a 20-point cubic spline sample set.  You'd have a lot more precision in curve sampling and control in the latter case, but if you needed to generate a curve really fast and have something roughly in the ballpark of your needs, you could simply rescale points from the original domain and range, which is what I did for producing satisfactory-looking falloff (threshold) curves.  I'd mention that under curve-range rescaling methods, you'd generally want as many points as needed to resolve the geometry of the curve.  For instance, if we were limited to 20 points on the target domain, I'd use 20 sample points on the original domain [0,1], which is to say a step increment of 1/20 between neighboring sample points, so that under the scaling transformation the domain 0, 1/20, 2/20, 3/20, ..., 1 transforms to 1, 2, 3, ..., 20.  In theory, what's nice here is that the approximating function(s) let you provide as many points as necessary on the original non-scaled domain for curve approximation on a given rescaled domain and range (depending on your 3D modeling software), meaning we shouldn't lose as much resolution of the curve's geometry if we sample enough points.  In other words, with an approximating function we can build a larger sample set of points, so that we can rescale the geometry while reducing the loss in geometric resolution under scaling.
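The rescaling trick above amounts to one multiplication per coordinate. A minimal sketch, assuming points sampled on a normalized [0,1] domain being stretched to the working domain [0,20]:

```python
def rescale_points(points, xscale, yscale):
    """Scale curve samples from a normalized domain/range
    (e.g. [0,1] x [0,1]) up to the working domain/range."""
    return [(x * xscale, y * yscale) for (x, y) in points]

# 21 samples on [0,1] at step 1/20, stretched to the domain [0,20]:
samples = [(i / 20.0, (i / 20.0) ** 2) for i in range(21)]
scaled = rescale_points(samples, xscale=20.0, yscale=1.0)
print(scaled[0], scaled[20])  # (0.0, 0.0) (20.0, 1.0)
```

Sampling the approximating function densely before scaling is what keeps the rescaled curve from looking faceted, per the resolution discussion that follows.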

    What happens with geometric resolution when there is loss, you might ask?  Well, if you have something like linear interpolation employed in your 3D modeling software, typically what happens is that as you scale the geometry larger, it appears less piecewise smooth from vertex to vertex on the surface geometry (you might have smoothing modifiers for shading that help get around the lower poly-resolution problems of scaling transformations here).  In other words, you start to see more of the straight lines running from vertex to vertex on the represented geometry, with more visibly present piecewise higher-order derivative discontinuities (non-matching slope values, for instance, for left and right approaches to a given vertex); that is, a curve appearing a lot less smooth.  Obviously, having more sample points also leads to potentially more graphics processing load, since there is that much more data to be tracked when rescaling geometry with more points, but it seems you might be able to find some solution balancing this?!

 Generally, while the geometry of the curve is not invariant under this type of rescaling method, it still produces smooth, stable curvature.

Here's some added notes on this stuff

World Terrain Mosaics Mind mapping thoughts working with Unity game engines

