Thursday, August 2, 2012

Translating image pixel data into vertex data: Python to Blender

Okay so this project seems a bit more ambitious relative to other things that I've done, but it's actually not as hard as it seems.

One library I've used, from the unofficial Python modules and packages collection (which in many cases supports 64-bit Windows), is PIL (Python Imaging Library).

Fancier imaging libraries, such as SciPy's ndimage, offer greater support for image processing.

The process in this case is to transfer something like an outline selection of an image, picked out of the background pixel data, into vertex data.  The motivation here was taking the profiles of crown molding and transferring this outline pixel data into vertex data for three-dimensional imaging.  Having the profile of the crown molding image allows one to build further three-dimensional structure of the molding by merely extruding, say, along z from an xy-planar image of the profile.
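A minimal sketch of the pixel-to-vertex step.  A small hard-coded grayscale grid stands in for real image data here (with PIL you would get equivalent rows from Image.open("profile.jpg").convert("L") and getdata()); the threshold value and the file name are assumptions, not the post's actual settings:

```python
# Sketch: turn outline pixels into (x, y, 0) vertex data.
# The 5x5 grayscale grid below is a hypothetical stand-in for
# pixel data that PIL would supply from a real profile image.
THRESHOLD = 128  # pixels darker than this count as part of the profile

pixels = [
    [255, 255,   0, 255, 255],
    [255,   0, 255,   0, 255],
    [  0, 255, 255, 255,   0],
    [255, 255, 255, 255, 255],
    [255, 255, 255, 255, 255],
]

def pixels_to_vertices(grid, threshold=THRESHOLD):
    """Collect (x, y, z) tuples for every 'dark' pixel.

    Image rows run top to bottom, so y is flipped to get conventional
    mesh coordinates; z is 0 since the profile lies in the xy plane.
    """
    height = len(grid)
    verts = []
    for row_index, row in enumerate(grid):
        for x, value in enumerate(row):
            if value < threshold:
                verts.append((x, height - 1 - row_index, 0))
    return verts

verts = pixels_to_vertices(pixels)
print(verts)
```

The same loop works unchanged on PIL data once the rows come from a real image.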

Compare photos of the original jpg to the vertex outline.






I've posted the first four mold profiles from the image above to the image below.



Updates 8/6/12:

I've managed to fill the vertices with corresponding face data, so that the results appear as follows below:
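A sketch of how face data can be filled in when extruding the profile along z.  The profile points here are hypothetical placeholders; the resulting (verts, faces) lists are the shape that Blender's mesh.from_pydata() accepts:

```python
# Sketch: extrude a 2D profile polyline along z and build quad faces.
# The profile coordinates are hypothetical sample values.
profile = [(0.0, 0.0), (1.0, 0.0), (1.0, 0.5), (2.0, 0.5)]
DEPTH = 3.0  # extrusion length along z (an assumed value)

def extrude_profile(points, depth):
    n = len(points)
    # front ring (z = 0) followed by back ring (z = depth)
    verts = [(x, y, 0.0) for x, y in points] + \
            [(x, y, depth) for x, y in points]
    # one quad per profile segment, indexing into front and back rings
    faces = [(i, i + 1, n + i + 1, n + i) for i in range(n - 1)]
    return verts, faces

verts, faces = extrude_profile(profile, DEPTH)
print(len(verts), len(faces))
```

In Blender these lists would be handed to a new mesh via from_pydata(verts, [], faces).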



One can see at very close scaling the clear presence of bitmap sampling in a lower-resolution bitmap photo.  Luckily, while this could seem disconcerting at large scaling values, the presence of the bitmap becomes less noticeable at smaller scaling values.  This is to say, when we shrink the mesh, the stepwise jumps from point to point (which make the presence of the bitmap clearer) become less noticeable.

Reading a bit on smoothing here.  Of course, Blender provides native functions to aid in this, but suppose you wanted to create your own method for smoothing the curvature.  This would in part require analysis of the tangent lines from point to point, since in this case we are approximating a curvature with straight lines from point to point on the sides of the molding.  In our method, we'd probably want to check the point-to-point similarity of tangent lines, so that one array of points (e.g., the left side of the molding) could instead be redrawn as a line with a single tangent, without stepwise discontinuity in the tangent line as a given point is approached from one side relative to the other.  This analysis should likely be constructed recursively: scan for points with similar tangent lines, record these up to a defined set of endpoints, construct a new line with a slope defined by that set of endpoints, then reconfigure the positions of all points in between.  Again, this routine identifies a set of points sharing a linear relationship; points that are nonlinear would require some different routine.  Likely the approach there is to smooth the tangent lines approaching a point from its 'right' and 'left'.  For example, if I approached a point x1 from the left and its slope were nearly vertical, I would have a high slope value at that point, but if I approached from the right and it were nearly horizontal, it would be nearly the inverse of the approach from the left.  So I would perhaps want to take some average of slopes between such values, or have some method of smoothing the tangent lines so that they match when approached from the right or left of a given point.  Doing this, however, requires the repositioning (or translation) of the points neighboring such a point.
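As a minimal stand-in for the repositioning described above, here is a neighbor-averaging pass over a polyline, roughly what Laplacian smoothing (and Blender's own Smooth Vertex tool) does: each interior point is pulled toward the midpoint of its neighbors, which evens out abrupt slope changes.  The stair-step points are hypothetical:

```python
def smooth_polyline(points, passes=1, factor=0.5):
    """Move each interior point toward the midpoint of its neighbors.

    This evens out abrupt slope changes between consecutive segments
    while pinning the endpoints in place.
    """
    pts = [list(p) for p in points]
    for _ in range(passes):
        new = [pts[0]] + [
            [(1 - factor) * pts[i][k] + factor * (pts[i - 1][k] + pts[i + 1][k]) / 2
             for k in range(len(pts[i]))]
            for i in range(1, len(pts) - 1)
        ] + [pts[-1]]
        pts = new
    return [tuple(p) for p in pts]

# a stair-step outline, as produced by bitmap sampling
steps = [(0, 0), (1, 0), (1, 1), (2, 1), (2, 2)]
print(smooth_polyline(steps))
```

Note that, like Blender's native smoothing, this translates vertex positions rather than preserving them, which is exactly the drawback discussed next.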

Updates 8/7/12:
More discussion.


At the moment much of this discussion is more math curiosity than something that need be defined here.  Likely I might not really need to do all this if I were planning on scaling the mesh model down, such that any given user were less likely to scan a given mesh object with a fine microscope, and Blender provides vertex smoothing functions that do much the same.  But if I were looking to amplify the model relative to the original scaling of the translated bitmap-to-mesh object, I might have more need for custom smoothing functions.  If I use Blender's native smoothing function on the left side of the shown image, I get a significant translation in the positions of the vertices, while smoothing out the overall edge of the vertices.


The problem is I am not looking for this sort of smoothing here.  I'd like to preserve the positions of selected vertices, while sampling lower-order polynomials through small intervals (a given selection of vertices yielding a given polynomial), and then constructing a spline graph for the positions to be replaced.  But I am wondering whether or not this will fully work.  What does this mean?  We toss out certain vertex positions while maintaining others, especially when the relationship of a given set of positions is determined to be linear in nature.  How do we tell if a series of positions shares a linear relationship?  You could go back to your old statistics routines and see if the samples approximate a line, or you could simply measure the slopes between line intervals and see that they match closely enough.  If they do, then all such points share a linear relationship, and it suffices to take the endpoints, interpolate a line between them, and then recalculate the positions of the interior points on the interval of such endpoints.  For points that don't share a linear relationship in terms of slopes, we could interpolate a higher-order polynomial that fits the curvature along the graph for a given interval.
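The slope-matching test can be sketched like so.  This is a hypothetical helper, not the post's actual routine; it uses a cross-product collinearity check rather than raw slope division, since that also handles vertical runs without dividing by zero:

```python
def collinear(p, q, r, tol=1e-9):
    """Cross-product test: zero 'area' means the three points lie on a line."""
    return abs((q[0] - p[0]) * (r[1] - p[1])
               - (q[1] - p[1]) * (r[0] - p[0])) <= tol

def drop_linear_interior(points):
    """Keep only the endpoints of each straight run of vertices."""
    if len(points) < 3:
        return list(points)
    kept = [points[0]]
    for i in range(1, len(points) - 1):
        if not collinear(kept[-1], points[i], points[i + 1]):
            kept.append(points[i])
    kept.append(points[-1])
    return kept

path = [(0, 0), (1, 1), (2, 2), (3, 2), (4, 2)]
print(drop_linear_interior(path))
```

Points that fail the collinearity test are the nonlinear case, which the interpolation discussion below handles.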

How do we do this?

We resort to some techniques in linear algebra to do this.  Firstly, if you have read anything of linear algebra, we can describe not just one but a set of many linear equations (remember, a linear equation is of the form bx + c = 0, or it could also be of the form bx + cy = d, where the coefficients b, c, and d are real numbers if we are working in the real number system).  Thus we could have a set of linear equations described as

a1x1 + a2x2 = a3
a4x1 + a5x2 = a6

As long as certain criteria apply to the coefficients, we can solve such a system of linear equations, for instance, by solving for variable x1 in the first equation and then substituting this solution into the second equation.  You could also express these equations as a matrix, and then 'diagonalize' the matrix, which solves for all variables x1 and x2.
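The substitution step can be sketched for the two-equation system above; the coefficients here are hypothetical, and the "certain criteria" is a nonzero determinant:

```python
def solve_2x2(a, b, c, d, e, f):
    """Solve  a*x1 + b*x2 = c  and  d*x1 + e*x2 = f  by elimination.

    Requires a nonzero determinant a*e - b*d (the 'certain criteria'
    on the coefficients).
    """
    det = a * e - b * d
    if det == 0:
        raise ValueError("system is singular: no unique solution")
    # Cramer-style elimination for the 2x2 case
    x1 = (c * e - b * f) / det
    x2 = (a * f - c * d) / det
    return x1, x2

# e.g. x1 + x2 = 3 and x1 - x2 = 1  ->  x1 = 2, x2 = 1
print(solve_2x2(1, 1, 3, 1, -1, 1))
```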

So, getting back to our problem: what does interpolation mean in terms of the techniques we know from linear algebra?

As it turns out, the unknown polynomial that we need to sample through such points should, for n+1 sample points, have the form:

pn(x) = a1 + a2x + a3x^2 + ... + a(n+1)x^n

If our polynomial is in this form, then we can construct a series of linear equations by directly substituting the values of our sample points (x, y) into the equation.

For instance, a set of three sample points would indicate an interpolating (or sampled) polynomial of 2nd degree (since for n+1 sample points there is at most a term of degree n).  Let the three sample points be (1,0), (0,1), and (2,-2).  Now let's substitute the values of each sample into the polynomial p2(x) = a1 + a2x + a3x^2.
So for the first value, (1,0), we substitute and have

0 = a1 + a2(1) + a3(1)^2

which is

0 = a1 + a2 + a3      (1)

Substitute the second value, which is (0,1); this gives

1 = a1

(remember, the value x = 0 means the higher-order terms in x vanish)

Finally substitute the value of (2, -2),

which is

-2 = a1 + a2(2) + a3(2)^2

or

-2 = a1 + 2a2 + 4a3      (2)

Let's substitute a1 = 1 into equation (1)

Then

0 =  1+ a2 + a3

Solve for a3.  Remember, to solve for a variable you want that variable's term isolated on one side of your equation.  The equation needs to be simplified to do this, meaning that no other like terms of that variable remain and all terms unlike it are transposed to the opposite side of the equation.  In this case, we need to transpose the term a2 and the term 1 to the opposite side of this equation.

so

adding -1 to both sides of this equation

-1 + 0 = -1 + 1 +  a2 + a3

means

-1 = a2 + a3

adding -a2 to both sides of this equation

-1 -a2 = -a2 + a2 + a3

means

-(1 + a2) = a3  (3)

Now let's substitute equation (3) into equation (2), and let's substitute a1 = 1 into equation (2)

so we have

-2 = 1 +  2a2 -4(1 + a2)

applying the distributive rule to the term -4(1 + a2) we have

-2 = 1 + 2a2 - 4 - 4a2

Next, since we are solving for a2, let's simplify the right side of our equation.  Remember,
we can transpose (shift) terms around however we like as long as we preserve their signs (meaning the signs are included as coefficients, so that we are applying the commutative property of addition, a + b = b + a).  So grouping like terms on the right side of our equation we have,

-2 = 1 - 4 + 2a2 - 4a2

Now simplify the right side by combining like terms (adding or subtracting as necessary), so

-2 = -3 -2a2

Now let's solve for a2.  We need to move the unlike term -3 to the left side of our equation, so transposing means adding 3 to both sides

3 -2 = 3 -3 -2a2

and simplifying gives

1 = -2a2

Now to complete the solving process we need to multiply by the inverse of the coefficient of the term -2a2, so we multiply both sides of our equation by -1/2

(-1/2) * 1 = (-1/2)*(-2)a2

means

-1/2 = a2  (4) 
Now back-substitute equation (4) into equation (3); this yields
-(1 - 1/2) = a3, and simplification means

-1/2 = a3     (5)
So with our coefficients solved, the interpolating polynomial is

p2(x) = 1 - (1/2)x - (1/2)x^2.

This polynomial passes exactly through the sample points, but in between the samples it shouldn't be assumed to match the curve we are actually seeking, and this could be problematic when applying many samples: a single polynomial fit through all such points may not provide reasonable curvature for the points in between.
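The hand derivation above can be double-checked by building the system of equations from the sample points and eliminating, as a sketch:

```python
def interpolate_coeffs(samples):
    """Solve the Vandermonde system for a1 + a2*x + ... + a(n+1)*x^n.

    Plain Gaussian elimination with back substitution; fine for the
    handful of sample points used here.
    """
    n = len(samples)
    rows = [[x ** j for j in range(n)] + [y] for x, y in samples]
    for col in range(n):                      # forward elimination
        pivot = max(range(col, n), key=lambda r: abs(rows[r][col]))
        rows[col], rows[pivot] = rows[pivot], rows[col]
        for r in range(col + 1, n):
            m = rows[r][col] / rows[col][col]
            for c in range(col, n + 1):
                rows[r][c] -= m * rows[col][c]
    coeffs = [0.0] * n
    for r in range(n - 1, -1, -1):            # back substitution
        s = rows[r][n] - sum(rows[r][c] * coeffs[c] for c in range(r + 1, n))
        coeffs[r] = s / rows[r][r]
    return coeffs

print(interpolate_coeffs([(1, 0), (0, 1), (2, -2)]))  # [1.0, -0.5, -0.5]
```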

As it turns out, with spline interpolation we deal with the wilder oscillatory nature of higher-order polynomials by limiting our samples to smaller intervals, producing less of the mountainous, hill-like curvature that might deviate from the type of curvature we are approximating for the in-between points, and this leads us to likely do the same for our problem above.  The trick is that we want enough points to make the sampled curves reasonably fit our given set of sample points, while not appearing too linear in the case of nonlinearly related points.

Yet another method exists outside of interpolation itself, but still rests on the premise of identifying the curvature of a given shape over various partition sets, where curvature is defined by the point-by-point behavior of the slope (since the bitmap-to-vertex method above gives us a piecewise linear approximation of the curvature).  Over a given curve we could compute the average rate of change of the slope values.  Given a small enough partition, it seems we should be able to avoid problems with approximation differences (e.g., a curve which over a given partition set might resemble a third-degree polynomial as opposed to a second-degree one; this is to say that the rate of change of the rate of change of the slopes is not constant in a third-degree or higher polynomial.  Remember from calculus, a third-degree term ax^3 has second derivative 6ax, which still varies with x).  Like spline interpolation, we simply compute what the slope at the next neighboring point should be at a given coordinate position, and then recompute the neighbor point's position so it fits the expected slope value there.  Ideally we use second-degree polynomial spline fits, which ensure a constant rate of change of the slope over a given partition of the curve set.
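The "rate of change of the rate of change of the slopes" test can be sketched with finite differences.  Points sampled from x^2 and x^3 (hypothetical data) illustrate constant versus varying slope change:

```python
def slopes(points):
    """Segment-by-segment slopes of a polyline (assumes increasing x)."""
    return [(y2 - y1) / (x2 - x1)
            for (x1, y1), (x2, y2) in zip(points, points[1:])]

def slope_change_is_constant(points, tol=1e-9):
    """True when the slope changes by the same amount from segment to
    segment, i.e. the partition behaves like a second-degree polynomial."""
    s = slopes(points)
    deltas = [b - a for a, b in zip(s, s[1:])]
    return all(abs(d - deltas[0]) <= tol for d in deltas)

quadratic = [(x, x * x) for x in range(5)]   # y = x^2, evenly spaced
cubic = [(x, x ** 3) for x in range(5)]      # y = x^3
print(slope_change_is_constant(quadratic), slope_change_is_constant(cubic))
```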

Update 8/8/12:  So a bit more.  We can use a method for second-degree polynomial approximation, extending an idea from Newton's method.  Here we go through the process of computing secant lines; this is the same formulation as computing the slope of a line.  However, the secant need not mean that the slope at such a point exists; we can think of it as the slope that would be computed between two points if a line existed between them.  The idea more formally goes this way:

1.  For two endpoints, pick an interior midpoint.  Compute the left and right approach secant lines to this midpoint.  For two endpoints x1 and x2, compare the left approach slope value of x1 to such a secant, and compare likewise the right approach slope value of x2.  We compare the slope values to the secant to see whether the secant needs to increase or decrease in approaching either slope value.  This may be either or both, depending on the results.

2.  We compute a subdivision on the midpoint and the endpoint to determine a new midpoint, and create a subdivision of the secant value, added to or subtracted from the originating secant value, which is placed between such endpoint's slope value and the previous iteration's secant value.

3.  Reiterate step 2 until reaching an adequately spaced subdivision.

A fairly straightforward and simple process.  The idea is that for a given curve I may toss all but one desired interior point relative to two endpoints on a given curvature, and then go about the process of creating a new curvature between such points.
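A small illustration of the secant idea above: on a known curve such as y = x^2 (a hypothetical example), halving the interval at each subdivision makes the computed secant slope converge toward the true tangent slope at the endpoint, which is what the iteration relies on:

```python
def secant_slope(f, x1, x2):
    """Slope of the line through (x1, f(x1)) and (x2, f(x2))."""
    return (f(x2) - f(x1)) / (x2 - x1)

f = lambda x: x * x  # tangent slope at x = 1 is exactly 2

# Halving the interval at each step, the secant slope converges
# toward the tangent slope: 2 + h for interval width h.
width = 1.0
slopes_seen = []
for _ in range(5):
    slopes_seen.append(secant_slope(f, 1.0, 1.0 + width))
    width /= 2
print(slopes_seen)  # [3.0, 2.5, 2.25, 2.125, 2.0625]
```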

updates 8/15/12:

Okay, so working the rounds building a routine which determines, for the crude bitmap form, a curve division routine: for a given curve partition, a routine which determines linearity versus nonlinearity in the fit of the point data.  Linearity is the easiest case for curve approximation.  For nonlinearity, however, I am working with a cubic spline method.  The problem relating to this is that the method I am using works from a symmetric matrix form (easiest to compute) alongside second derivatives from point to point.  I have attempted to use a quadratic form, which means second derivatives are constant in producing curvature (which is what I would like, since it means less point-to-point variability in the data, or translates to smoother plotting).  So I am solving sequentially for coordinate values.  The problem with quadratic fittings is that these may not be best fits for nonlinear data that would be better fit by higher-order polynomial approximations.  This basically means that with the cubic spline one would have variable second derivatives, and interpolating the slope from the left-side and right-side endpoint approaches could translate into too much slope on one side, leading to larger cumulative approximation errors in the point-to-point computations.  I've, however, been able to preserve curve smoothness generally at all but two points when rescaling data.  This means curvature is potentially not as smooth at, say, the first endpoint, the second point, and the point previous to the first endpoint on a given segment.  This could be amended by further partitioning the curve segment at such points and then re-interpolating with another cubic spline segment that further smooths the curvature, depending on how far one wants to go in building higher-resolution curvature, I would suppose.
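A sketch of the symmetric (tridiagonal) system mentioned above: the natural cubic spline's point-to-point second derivatives, solved with the Thomas algorithm.  The sample points are hypothetical, and the natural boundary condition (zero second derivative at the ends) is one assumption among several possible:

```python
def natural_spline_second_derivs(xs, ys):
    """Second derivatives M_i of the natural cubic spline through the
    points, from the symmetric tridiagonal system solved by the Thomas
    algorithm.  Natural boundary: M at both endpoints is 0.
    """
    n = len(xs)
    h = [xs[i + 1] - xs[i] for i in range(n - 1)]
    # interior equations: h[i-1]*M[i-1] + 2(h[i-1]+h[i])*M[i] + h[i]*M[i+1] = rhs
    diag = [2 * (h[i - 1] + h[i]) for i in range(1, n - 1)]
    rhs = [6 * ((ys[i + 1] - ys[i]) / h[i] - (ys[i] - ys[i - 1]) / h[i - 1])
           for i in range(1, n - 1)]
    sub = [h[i] for i in range(1, n - 2)]      # sub-/super-diagonal (symmetric)
    # forward sweep (Thomas algorithm)
    for i in range(1, len(diag)):
        m = sub[i - 1] / diag[i - 1]
        diag[i] -= m * sub[i - 1]
        rhs[i] -= m * rhs[i - 1]
    # back substitution into the full M vector (endpoints stay 0)
    M = [0.0] * n
    for i in range(len(diag) - 1, -1, -1):
        top = rhs[i] - (sub[i] * M[i + 2] if i < len(sub) else 0.0)
        M[i + 1] = top / diag[i]
    return M

# points on a straight line should yield zero second derivatives
print(natural_spline_second_derivs([0, 1, 2, 3], [0, 2, 4, 6]))
```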

Updates same day:

So, having worked with a few self-generated examples, I'm finding the limitations of interpolation for curve building.  It seems that if one creates positions while generating additional points in sequence, without first-order polynomial constraints in place to control the approximation of points, scaling problems seem inevitable; the problem of scaling emerges when curve building between two points where the slope differential is significant, especially with second-order derivatives fixed as constant.  I've decided to leave this method alone unless I have decent sets of points, and instead to use the cubic spline method for generating solutions to the second-order derivatives, which in turn could help to generate sub-points.  The problem being dealt with here, it seems, is that the spatial separation of the points themselves can pose issues.  Currently I'm investigating a self-generated technique which seems similar to Bezier curve building.  The idea seems simple enough.  It's a geometric/graphical formulation which takes three separate tangents: the interior point tangent, and the left and right endpoint tangents.  Here we compute the intersection of the left and right endpoint tangents at a point J.  We can then create a subdivision m as desired on the segment sets from the left endpoint le to J, segment(le, J), and from J to the right endpoint re, segment(J, re).
Using the angles between the tangent lines relative to the point J at either endpoint allows one to determine, from a given subdivision on seg(le, J) and seg(J, re), a corresponding subdivision of the angle.  I could resort in this case to a mid-slope formulation on either such angle, and then control the curvature by creating subdivisions of the mid-slope angle in varying degrees (quadratic, cubic, and so forth) to approximate the corresponding point positions defining the curvature, and the rate of convergence onto the endpoint left and right slope tangents and, vice versa, onto the mid-slope intersection (a segment whose normal line intersects J).  Note that these points form normal line segments relative to the interior tangent line.  This seems to have basic advantages, such as constraining the placement of points so that they never exceed the tangent slope angles, and simplifying the spatial placement of points by relegating a given point to constraint boundaries, so that it always lies on a normal line segment relative to the interior tangent line for a given subdivision.  Be mindful that a fixed angle from a subdivision point on the segments (le, J) and (J, re) produces a linear curve, while linear rates of change of that angle define quadratic rates of convergence, and so forth.  One could also build a family of curves defining any degree of J-point angles respectively, and then build by way of convergence from both the left and right endpoint tangents to such a fixed angle.  This could be pretty much the Bezier curve construction method, but I hadn't formally studied it from a mathematical standpoint aside from using these curves for graphical purposes.
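The construction described here does line up with the quadratic Bezier curve: J is the intersection of the endpoint tangents, and de Casteljau's repeated linear subdivision between the endpoints and J gives the convergence onto both tangents.  A sketch, with hypothetical endpoints and slopes:

```python
def tangent_intersection(p0, m0, p1, m1):
    """Intersect the endpoint tangent lines y - y_i = m_i (x - x_i);
    the intersection is the control point J."""
    x = ((m0 * p0[0] - p0[1]) - (m1 * p1[0] - p1[1])) / (m0 - m1)
    y = m0 * (x - p0[0]) + p0[1]
    return (x, y)

def quadratic_bezier(p0, j, p1, t):
    """de Casteljau evaluation: repeated linear subdivision between
    the endpoints and J, the convergence scheme sketched above."""
    lerp = lambda a, b, t: ((1 - t) * a[0] + t * b[0], (1 - t) * a[1] + t * b[1])
    return lerp(lerp(p0, j, t), lerp(j, p1, t), t)

# hypothetical endpoints with tangent slopes 0 (flat) and 2 (steep)
p0, p1 = (0.0, 0.0), (2.0, 2.0)
J = tangent_intersection(p0, 0.0, p1, 2.0)
print(J, quadratic_bezier(p0, J, p1, 0.5))
```

The resulting curve starts along the left tangent, ends along the right tangent, and never leaves the triangle formed by p0, J, and p1, which matches the constraint-boundary property described above.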
