Friday, October 13, 2017

Angular backend integration problem...custom parser

I found any number of back-end parsing solutions that didn't work with my Angular app, and in the long run I spent more time researching this problem than solving it. It took me 30 minutes (no kidding) to write a functioning string parser, versus approximately several hours of non-solutions for Angular TypeScript component interfacing.

Example parser

  
class ParseObject{
    toParseObj: string;
    pitems: PostItem[];
    constructor(objstr: string){
        this.toParseObj = objstr;
        this.pitems = [];
    }
    // Re-joins fragments that were split on an escaped delimiter:
    // if a fragment ends with a backslash, merge it with the next one.
    catchEscape(rs: string[]){
        var i = 0;
        var rtnrs: string[] = [];
        var setcontinue: boolean = false;
        for (let r of rs){
            if (setcontinue){
                i+=1;
                setcontinue = false;
                continue;
            }
            if (r.length > 0 && r[r.length-1] == "\\"){
                if ((i+1) < rs.length){
                    let fr: string = rs[i+1];
                    rtnrs.push(r+fr);
                    setcontinue = true;
                }
                else{
                    // trailing backslash with no following fragment: keep as-is
                    rtnrs.push(r);
                }
            }
            else{
                rtnrs.push(r);
            }
            i+=1;
        }
        return rtnrs;
    }

    // Splits each string in strset on the separator strP and re-joins
    // escaped fragments, collecting the results.
    iterateParse(strP: string, strset: string[]){
        var results: string[][] = [];
        for (let strp of strset){ // "of" iterates values; "in" would give indices
            var splitobj = strp.split(strP);
            results.push(this.catchEscape(splitobj));
        }
        return results;
    }

    parseString(){
        /*{"records":[{"name":"JimBob","message":"Hello katz!"},{"name":"Dr IQ","message":"Hello hello my name is Jim Bobby.
 Been an old friend of shorty there!
 Gratz!"}]}*/
        //presumes user properly input string for formatting
        var rs = this.toParseObj.split(']}');
        console.log(rs);
        //catch escapes
        var nrs = this.catchEscape(rs);
        console.log(nrs);
        let nrs0 = nrs[0];
        var nrs1 = nrs0.split('{"records":[');
        var nrs2 = this.catchEscape(nrs1);
        console.log(nrs2);
        let nrs3 = nrs2[1];
        var nrs4 = nrs3.split("},{");
        var nrs5 = this.catchEscape(nrs4);
        console.log(nrs5);
        for (let nr of nrs5){
            //split "," for key,value s
            var nrs6 = nr.split(",");
            var nrs7 = this.catchEscape(nrs6);
            console.log(nrs7);
            let nrs8 = nrs7[0].split("\"name\":");
            let namevalue = nrs8[1].split("\"")[1];
            console.log(nrs8);
            let nrs9 = nrs7[1].split("\"message\":");
            let messagevalue = nrs9[1].split("\"")[1];
            this.pitems.push(new PostItem(namevalue,messagevalue));
        }
        return this.pitems;
    }
}
class PostItem {
    constructor(public name: string,
                public message: string) {
    }
  }
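For comparison, when the backend emits strictly valid JSON, the same extraction collapses to a few lines with JSON.parse. This is a sketch against a single-line version of the sample payload (PostItem as defined above):

```typescript
// PostItem as defined above.
class PostItem {
  constructor(public name: string, public message: string) {}
}

// Single-line sample payload (values from the post's sample data).
const raw = '{"records":[{"name":"JimBob","message":"Hello katz!"},'
          + '{"name":"Dr IQ","message":"Gratz!"}]}';
const parsed = JSON.parse(raw);
const pitems: PostItem[] = parsed.records.map(
  (r: { name: string; message: string }) => new PostItem(r.name, r.message)
);
console.log(pitems[0].name);    // "JimBob"
console.log(pitems[1].message); // "Gratz!"
```

The hand-rolled parser earns its keep when the payload contains raw newlines inside string values (as in the multi-line message of the sample comment), which strict JSON.parse rejects.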

You can build a custom service to be used with a promise.

Here's my component service:
import { Component, Injectable, OnInit } from '@angular/core';
import {PVector} from './lineData';
import {frameResizer} from './frameResizer';
import { routerTransition } from './router.animations';
import {Guest } from './guest';
import {Http, Headers, RequestOptions, Response} from '@angular/http';
import 'rxjs/add/operator/map';
import 'rxjs/Rx';
/*{"records":[{"name":"JimBob","message":"Hello katz!"},{"name":"Dr IQ","message":"Hello hello my name is Jim Bobby.
 Been an old friend of shorty there!
 Gratz!"}]}*/

// ParseObject and PostItem classes as defined above

@Injectable()
export class SearchService {
  apiRoot: string = 'url/path/to/your/fetch.php';
  results: PostItem[];
  loading: boolean;

  constructor(private http: Http) {
    this.results = [];
    this.loading = false;
  }

  search() {
    let promise = new Promise((resolve, reject) => {
      let apiURL = this.apiRoot;
      
      this.http.get(apiURL)
          .toPromise()
          .then(
              res => { // Success
                console.log(res);
                // The old @angular/http Response keeps the raw body in _body;
                // res.text() is the supported accessor for it.
                var body = res.text();
                console.log(body);
                var parseobj = new ParseObject(body);
                this.results = parseobj.parseString();
                this.results.reverse();
                console.log(this.results);
                resolve();
              },
              msg => { // Error
                console.log("hit error");
                reject(msg);
              }
          );
    });
    return promise;
  }
}
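Consuming the service from a component then boils down to chaining on the promise. Here is the pattern stripped of Angular specifics, as a sketch with a stubbed search() standing in for the HTTP call:

```typescript
// Minimal stand-in for the service's promise contract: search()
// populates `results` and then resolves, mirroring SearchService above.
class StubSearchService {
  results: string[] = [];
  search(): Promise<void> {
    return new Promise(resolve => {
      this.results = ["JimBob", "Dr IQ"]; // pretend HTTP response
      resolve();
    });
  }
}

const svc = new StubSearchService();
svc.search().then(() => {
  console.log(svc.results); // ["JimBob", "Dr IQ"]
});
```

In a real component you would call this from ngOnInit and inject the service through the constructor.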
Also make sure to make Http available globally via app.module.ts, adding HttpModule to the imports and your service as a provider:
import { HttpModule } from '@angular/http';

...

@NgModule({
  imports: [
    ..., HttpModule
  ],
  declarations: [
   ...
  ],
 
  providers: [ SearchService ],
  ...
})

Deployment of your Angular app

You may know the basics of running your Angular app locally from the command line with ng serve, yada yada yada, but do you know how to build and bundle your app for distribution/deployment?


ng build --prod

This command will create a dist folder (a distribution folder with the JavaScript bundles).

Then, with the generated files in the dist folder, you simply upload to your deployment target (site host) and your site should be up and running. You can search for guides on deploying to AWS (Amazon) or any other site-hosting solution.

Saturday, September 30, 2017

Metal Swift usage of Command Buffer with complex Kernel(s) computations and more

Advice for working with complex kernel encoding computations:

-Account for all commandBuffer instances and their .commit()s. This is important since any commandBuffer instance created in a draw loop without a .commit() is a memory leak. It may not show on the Xcode Instruments stack trace, but it will show on the Debug Memory Graph while running in Debug mode (see https://medium.com/@xcadaverx/locating-the-source-of-a-memory-leak-712667bf8cd5)

It is important to enable malloc stack tracing to see the code source of memory leaks. Inevitably, an instanced commandBuffer that isn't committed in the loop will trace back to its point of creation as the culprit. The remedy is simply committing this instanced commandBuffer in the loop.

Why re-instance the commandBuffer?

As it turns out, if you are doing complex kernel computation work with multiple pass encodings run sequentially, you can instance once any device buffers that need to be passed to the GPU (usually outside the draw loop, in your view controller's initialization) and then rewrite to these buffers with memcpy().

for instance:

        let hillbufferptr = hilllockbuffer?.contents()
        memcpy(hillbufferptr, &(hilllockLabelMap!),hilllockByteLength)
        let relabelbufferptr = relabelbuffer?.contents()
        memcpy(relabelbufferptr, &(relabelMap!), relabelByteLength)
        let maxhbufferptr = maxhbuffer?.contents()
        memcpy(maxhbufferptr, &(maxHMap!), maxhByteLength)
        plist = [Float32](repeating: -1.0, count: (hillockmatrixtexture?.width)!*(hillockmatrixtexture?.height)!)
        let plistbufferptr = plistbuffer?.contents()
        memcpy(plistbufferptr, &(plist!), plistByteLength)

Then the buffers need not be instanced in the loop. It is also important to instance textures outside of the draw loop; otherwise these can translate into memory leaks if not properly deallocated.

Any time data retrieval occurs from a kernel encoding, it requires a commandBuffer.commit() and a waitUntilCompleted() call...this translates into a new instancing of a commandBuffer (as far as I can tell). The memory on the old commandBuffer is otherwise freed.

A strategy for complex kernel pass encoding, with multiple kernels passing a given data buffer from one kernel to the next: my advice (as per Apple's direct advice) is to avoid running a series of commandBuffer.commit() and waitUntilCompleted() calls just for buffer-to-array callbacks, only to write the data back to the buffers in turn. Instead use a single buffer and pass that same buffer pointer from one encoding kernel to the next. Instancing memory in callbacks with byte copies to an array is slow and will likely cause "GPU hang" error messages...or, as an engineer would describe it, this merely serializes data flow between the CPU and the GPU. It is slow and cumbersome. I personally found only one instance where CPU-side processing of the data was warranted: sorting an array and eliminating duplicate values, since that data in turn determined the iterative structure of the additional encodings the algorithm needed.

I haven't found (nor been able to construct) an adequate recipe for passing something like float array data into structs and pointing to the struct with such objects, since instanced struct data on the GPU side (unless it is in pointer form) has a discretely defined array container size...instead I've just passed array pointers directly to the kernel functions as needed. There are plenty of code recipes for this.

Monday, September 25, 2017

Guild Wars 2 Path of Fire -The Departing Eater of Souls strategy

Early spoiler beta. The Eater of Souls will regenerate health after its leap attack with a tractor-beam health siphon. The strategy for this fight is to run opposite the beam and double evade at the moment he launches the health drain. When you are far enough away, the siphon does no damage and the Eater of Souls regenerates no health.

Thursday, September 14, 2017

Dynamic map storage on a texture map in Metal 2(Apple)

   The idea is pretty straightforward here. Namely, to implement this form of storage, you will probably want a texture that has at least 2 channels, though this can also work with 4 channels where 2 channels are reserved for the key/value map storage.

It should be noted that the 'dynamic' portion of this storage is actually reserved storage space containing texture.width*texture.height possible indices. Keying through an added layer of lexical enumeration between keys and indices would be an implementation beyond the scope of what I'd like to cover in this article. So it is assumed that the indices given by the total range of the set serve directly as the key values, and thus a mapped key has that index's value. Thus the key 1 is mapped to index 1 in the texture storage set.

Mapping these hash values then means having some storage value, likely a boolean: 1 indicating the hash bucket is reserved, 0 that it isn't.

The formulation for converting a 1D index into 2D texture coordinates is pretty simple:

y = indexkey / texture.width
x = indexkey % texture.width
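The index arithmetic is language-agnostic; here is a sketch of both directions in TypeScript, assuming a hypothetical texture width of 4:

```typescript
// Convert a flat index into (x, y) texel coordinates and back,
// for a hypothetical texture width of 4.
const width = 4;

function toXY(indexkey: number): [number, number] {
  const y = Math.floor(indexkey / width); // integer division
  const x = indexkey % width;
  return [x, y];
}

function toIndex(x: number, y: number): number {
  return y * width + x;
}

console.log(toXY(6));       // [2, 1]
console.log(toIndex(2, 1)); // 6
```

The reverse direction, index = y*width + x, is the usual row-major flattening.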

Thus in Metal, if your texture point is set with a half4 vector, you can reserve at such an index by accessing (via read or write) the position:

uint2 indexval = uint2(indexkey % texture.get_width(), indexkey / texture.get_width());
// read() takes integer texel coordinates directly; sample() would require
// a sampler and normalized float coordinates, so read() fits exact lookups
half4 color = currenttexture.read(indexval);
color.zw = half2(1.0h, value);
desttexture.write(color, indexval);

This maps the key/value pair to the latter two channels on the destination texture while carrying over the first two channel values from the original texture.

Lookups then are pretty straightforward: if the key is known, just plug it into the x,y formulation above, check for a boolean positive (storage allocated), and read its corresponding value. Note this is for half-format numbers.
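The reserve-and-lookup scheme can be modeled off-GPU with a flat array standing in for the texture's storage channels. This is a sketch only, with a hypothetical 8x8 texture and the channel layout z = reserved flag, w = value:

```typescript
// Model the texture as a flat array of [z, w] channel pairs:
// z = 1 means the bucket at that index is reserved, w holds the value.
const width = 8, height = 8;
const storage: [number, number][] = Array.from(
  { length: width * height }, () => [0, 0] as [number, number]
);

function reserve(indexkey: number, value: number): void {
  storage[indexkey] = [1, value];
}

function lookup(indexkey: number): number | undefined {
  const [reserved, value] = storage[indexkey];
  return reserved === 1 ? value : undefined;
}

reserve(5, 42);
console.log(lookup(5)); // 42
console.log(lookup(6)); // undefined
```

On the GPU the same flag-then-value check happens per texel, with the index arithmetic above locating the bucket.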

Thursday, July 27, 2017

The finished photo and format. Why it matters?

    Some time ago, while working with gray height map images, I noticed that 16 bit grayscale images were often preferable to 8 bit format images. The reason is that when rendering terrain height maps in 8 bit format, I would often have obvious terrain artifacts appearing in the image. A three dimensional model would often appear less resolved relative to the 16 bit format of the same terrain image. In photography, this lack of resolution would translate into pixelation and blurriness. The deeper reason, however, lies embedded in the math of the grayscale format. Namely, in 8 bit grayscale format an image merely has 256 gray light variations, versus 256*256 = 65536 variations of gray light color. 256 gray light variations may not seem, on immediate inspection, a source for complaint, depending on terrain topography, but consider height maps whose elevations differ by, say, 10,000 ft. An 8 bit grayscale format provides a step of 39 ft per gray light variation, while a 16 bit channel furnishes .15 ft per gray light variation. The difference is immediately striking in the math, and it is the same for terrain elevation rendering in terms of smoothness and continuity. That is, the 8 bit format is likely to appear as having step discontinuities, or 'bumps' and 'artifacts', in the image, making it appear synthetic or artificial.
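The step-size arithmetic above can be checked directly. A sketch, using the 10,000 ft elevation range from the text:

```typescript
// Elevation step per gray level for 8-bit vs 16-bit grayscale,
// over a hypothetical 10,000 ft elevation range.
const rangeFt = 10000;
const step8 = rangeFt / 256;    // ~39 ft per gray level
const step16 = rangeFt / 65536; // ~0.15 ft per gray level
console.log(step8.toFixed(1));  // "39.1"
console.log(step16.toFixed(2)); // "0.15"
```

The ratio between the two step sizes is exactly 256, the extra factor of levels a 16-bit channel provides.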
    The problem of 8 bit versus 16 bit color format applies notably to processing digital images in a similar manner, and can at times, especially in landscape photography, be appreciated in a similar way. Notably, the 8 bit RGB format has a similar restriction of 256 light color variations per channel. This problem especially relates to pure gray light color variations, say as given in clouds, which especially lack variation across the RGB channels. A pure gray color, for instance, could be represented mathematically as (120,120,120) in terms of its color value; equally, any pure gray light could be represented by (n,n,n) where n is some value between 0 and 255. It is important to note that pure gray light is equal on all channels, and thus to see the inherent problem mentioned above: pure gray light variations (where an image has more monochromatic attributes) become immediately observable.
    While cloud variations may have many different shades and tones that definitely extend outside a monochrome, perhaps you have at times encountered photos where light pixels were definitely constrained inside this range? I feel that in recent times I have, in certain circumstances, encountered it. The effect in processing such images was residual visual artifact that needed to be resolved, or at least was overlooked in the finishing process; left unresolved, it would translate into imaging artifacts like synthetic gradient lines furnishing step discontinuities...these are obvious to the naked eye as artificial lines going beyond the distinction of a cloud and its surrounding background.
   Moral: while decent entry level cameras furnish raw (uncompressed) format photos, it helps to know what color bit representation/resolution occurs there as well. Optimally, if cameras were situated at 16 bits per RGB color channel, digital representation of color would likely not suffer from the problems found in capturing an image at 8 bits per channel, or at least in finished renderings of the same type. Most web publishing of photos today sticks with 8 bit formats. The 16 bit image, on the other hand, while visually optimal, is not optimal in the way of storage, since 16 bit images are many times the size of an 8 bit per channel image. But maybe someday the push for a higher quality imaging standard will win out, if culture changes and it isn't received the way industry pushes for 24 bit audio formats have been received.

Thursday, April 6, 2017

Singles Canada site

Canada Singles

Review: I think all but one or two of those inquiring on a given profile had illegitimate profiles. Quite a predatory site, with any number of profiles running the most obvious scams. Conversations inevitably wind up migrating to:

1. A dead father, or dead parents.
2. A profile candidate living in Africa, or more specifically Ghana, while the profile indicates none of the above.
3. A profile candidate waiting on an inheritance.
4. Most eventually migrating to requests for money.

After extensive conversation with one suspected fake profile candidate, the individual offered that most profiles on the site were fakes, likely located anywhere but where the profiles supposedly originated.

Probably one of the worst places for encountering fraud. Not a good site, especially if you are wanting to find someone from Canada! :)

Most of the profile fraud, however, has been quite easy to spot. Most fakes don't bother lingering around once questioned.

This raises an important question in my mind: why pay $40.00 a month for that?

It all brings to mind the massive and rampant level of consumer fraud that exists on an online dating site that is a top search result under "Canada Dating".
