Sunday, December 31, 2017

DIY Audio self publication starting out


   I recommend having a modest exploratory budget in mind if you are a newcomer intending to seek out a niche in the self-publishing world of music. 

   Gear requirements for a modest (non-smartphone) setup: 
   -Microphone(s)
   -DAW (Digital Audio Workstation) software (e.g. Ableton Live 9 or Pro Tools), or a hardware DAW platform (e.g. Tascam and a number of other manufacturers provide such gear through places like sweetwater.com).
   -USB interface (if using the DAW software option).
   -Laptop, desktop, or possibly tablet (check your DAW software's hardware requirements to make sure both the USB interface and the computer meet software specifications).
 
That's technically about it to get started.

A little primer on mics:

Shure microphones have long been nice, versatile gig and studio recording microphones, while home studio recording budgets are now more easily fashioned using any number of condenser mics on the market.  A basic setup will likely require a laptop/desktop/tablet if you want decent audio recording, though smartphone recordings are equally here to stay; native smartphone recordings are likely to be very rudimentary for room recordings relative to home studio setups.  Unless you know what you are looking for in a microphone...I highly recommend entry level stuff if you are new to the business.

DAW software...I recommend going with the larger scale suite options that provide unlimited tracks and studio effects packages.  At the least, this allows you to handle much of your track mastering and studio effects needs in house for audio production purposes.

USB interface...what are your purposes and needs?  If you are a solo musician who need only record yourself and maybe a single instrument at a time, the simplest interfaces (e.g. 2 mic inputs and 2 1/4" line inputs) will probably work fine.  If you need to do live session recording with more than one musician, you may want to investigate larger scale interfaces that furnish more simultaneous inputs. 

Laptop and desktop.  I've used mid-line processors for laptops and desktops alike.  Your audio interface will do the analog-to-digital work, so whatever computation remains for audio is generally well handled for many home studio applications.  On the software side, your DAW's audio driver setting will need to be set to ASIO so that the audio interface directly handles the audio-based computational work (much like a video card has a GPU for graphics work). 

Learning to record:
Learning to record:
I'll omit particular guides on recording and mastering (beyond the scope of what I wanted to cover right now).  If you make the time and effort to record, you'll likely be recording something and learning to use your gear.  Otherwise, not.  Any time spent recording is better than no time, or time that must be structured in highly choreographed ways.  More people to be recorded will likely mean tighter and more restrictive scheduling unless you work around it...in other words, do you have to have everybody in a music group present at any given time in order to record, and does the same go for mastering?  Spend your time doing scratch work to learn.  Plenty of scratch is how you get better at what you are doing.  The best time learning about audio engineering may come in your own company, doing this work whenever you can...on your time, not others'.

Publishing to video sites.  Many musicians actually use either slideshows or a single photo still for, for example, YouTube publication.  Here are some things to keep in mind:

Windows, Mac, and iOS furnish native movie making programs that allow you to import your audio file.  The simplest self publication here uses a photo still of your choosing, creating a video of that still whose length fills out the time span of the audio file interlaced with it.  You can, of course, get more creative and build slideshows or integrate video of your choosing.

More premium video production services:

You can invest in something like Adobe Premiere or an Adobe Creative Cloud subscription, and build more complex choreography of video work there.  There is likely a bit more of a learning curve in understanding how to work with animated transitions that aren't template ones, though Adobe provides templates as well if you don't want to learn the ins and outs of editing timeline curves for animating effects and transitions.  You may also want to learn audio/video synchronization when using separate audio and video recordings (non-interlaced sources).   

At the moment, H.264 is a common high definition standard for video publication.  Your video editor will likely give you mastering format options for the video type that you want to produce for a given social media platform (at least it should). 

Self publications to sites like Google Play, Amazon, Apple Store and Spotify:

Sites like DistroKid allow you to publish your music by handling the legal work necessary to get your music copyright recognized...you pay a yearly subscription fee (entry level is a small annual fee) and you keep the royalties on your music sales.

Self marketing: 

Uploading your music to YouTube, or getting yourself set up in any given market, online or otherwise, is no guarantee of music sales, and unfortunately, even marketing online may not be either.  Recordings far from quality audio can actually be quite successful, and mostly this relates to how the recording artist(s) sell themselves socially.  If you are niched in a collector's market, you may have decent odds depending on the media type that you are selling (e.g., vinyl and CD may provide better returns than mp3).  Though as I've read, you likely want to avoid excessive Twitter, Facebook, or other social media spam when engaging with people.  Positives include getting reviewed and featured in zines, music journals and so forth, and potentially higher levels of exposure through sites with larger scale traffic.  Word of mouth and festivals are likely a bigger positive for social exposure.  Having someone's ear is different from hoping they stumble upon you in a vast sea of audio. 

 Home studios have been liberated by higher quality, more affordable audio production tools, but the pool of people interested in using them has also grown.  If the acceptance of a culture of rougher recordings is a testament that successful social exposure need not correlate with higher quality gear, then if you are entering this market at entry level, I wouldn't bother trying to raise significant money to pay either for a recording or for the gear to make one.  It is a waste unless you know what you are doing. 


Wednesday, December 27, 2017

New Economy part 3 in series

  Given the ever increasing rise of machine intelligence,  humans will increasingly not be able to compete across greater spectrums of machine automated production modes.  This isn't merely in manufacturing or transportation sectors but increasingly affecting service sector economies.  Nonetheless I think we are entering into a new chapter in social and economic history...could we be ready to affirm in our own soon to be social history the birth of the Machine Intelligence Revolution?  This has broad impact to civilization and the ways that we will live in the future.

Questions to consider:  why are there mostly scant thoughts with respect to existing impacts?  And why consideration of the technology without as much consideration of social and economic displacements?  Is it sufficient and fair to argue that the luxuries of such a revolution should only increase worker productivity while having marginal impact on worker population numbers, for instance? 

New Economy part 2 in series

   Certainly if less permanence were true and anything were paid by task and template, the notion of employment as worthwhile comes measurably by one's effective response and analysis.  Income diversification were arguably present to the degree that a measure of self worth were given less by the perception of self worth than by the measure of worth.  How many tasks have you completed (for metrics)?  What is your rating (for metrics)?  An answer to be presented...or rather a range of task submissions for a task presented before a contract or even necessary details should be a given.  AI chatbots find ways of abstracting project goals such that proprietary concerns are not compromised (or that anything of the NDA need ever be signed).  The creative mind is valued, yes, for its supposedly intangible alchemy, but ever increasingly the template world and all approximation of the creative mind is encroached upon.  Wherever the creative mind?

Left at the base of the artful mind.  How our lives are an artful process has some intrinsic meaning...as it were to the design of self engineering.  What happens to works when so much definition and value is placed on the abstraction of lives relative to works...works transcend lives, but in the future?  

And while we cling to what we can touch and feel for its value, that alone, drives the future of gold...or that artificial scarcity is replicated by way of the lesser cosmic events creating such elements in the first place or that atoms elsewhere could be so diffuse as to maintain even higher levels of scarcity...future makes arbitrary the alchemical mind which has spun gold from ideas alone!  There is only delusional consensus that anything has value in the future...and that is the future economy.  The notion that we must believe much has value when scarcity no longer exists.  What is the future of desire?  What is the future of wanting?   



Monday, December 25, 2017

Dreams, Desires, and Aspirations

    Venturing into the foray of self publication, self broadcast, or anything else. 

    You have tried the self publishing world briefly.  You found ways to do epub formats and then having your e-reader handy tested the format to verify that your little collection of short stories could be read through a few times.  When to do:  A few weekends out of the year or when in a forced insta hire layoff!

    You ventured into self publication of music.  You generated more spotify notice than anything and paid at fractions of a penny for each listen.  Who actually buys an mp3?  You bought your own album!  When to do:  See time permits above...or otherwise when the unheated garage is tenable to practice sessions!

    You whizzed through another country at a blistering pace with camera firing in tow and attempted to market your natural landscape photography.  Having below entry level camera gear and nothing more would make far from justice the prodigious residence of imagery that should exist on market.  You might actually buy your own stuff on 500px.  If only having placed yourself on the NYTimes best seller's list, a marketing trend might suffice in giving you notice.  You pay yourself for zero sum net loss/gain and something out of nothing makes something!   When to do:  Any scant deal for government subsidized flight trips abroad with enough accumulated savings makes this two week sojourn possible! 

    You'll market your prowess in scripting languages, anything erudite making COBOL sexy again!  You find yourself fluent in the way of knowing lookups...mastery being no longer necessary, your mind is being read and understood better than you might have been able to express.  The vast language of machines could do much interpretive reading on the sequencing of natural language.  The designer you worked from one system into another, leaping on dev trends.    When to do:  boredom sets in.

    You'll sublet a room in your house.  Unless city or state regulators say otherwise.  In that case, it were likely utilizing some obscure 19th century law relating to people with hats walking where they shouldn't be...or pertinently the obscurum of commercial versus residential zoning.  When to do: Several weekends out of the year!

    You'll invest pocket change in the new stock.  You lost a sum in bitcoin.  IRA/401k account!  When to do:  the last time you drained the retirement account.

    You'll taxi people.  Everything to become as tiny as a rickshaw navigating the maze like corridors of a city sprawling into the sky.  You are at the base of everything.  When to do: this is your job on Wednesday!
 
     You'll leave an instant hire job for another opportunity.  Career transience is one's intelligence in knowing that office is a shared space in the tropics for a few afternoons with billowing storm clouds ominously overhead.  When to do:  This is your job on Friday!

     Your mind maps will be commerce.  So too your thinking processes, which could be a commodity.  You will teach people how to live again.  Personal philosophy coaches and lifestyle administrators abound.  When to do:  This is your job on Tuesday!  

   The rest of the days of the week are given to the resilience of repurposing your life and the things around you!  A new mall has opened up down the street with explicitly that purpose even if stores exist in providing rehabitation of old habits.  You move like a modulus on the branch of time, always who you are and you should be!

     

Saturday, December 23, 2017

Call this redefinition

   Call this redefinition.
   Measure is by analytics.
   You have resurfaced the you that remembered a fragment of a line
   Educate oneself to know what could be remembered
   What is to learn? 
   No more marketing tomorrow!
   All of this will be automated in time! 
   No more sales! 
   No more mentoring!
   No more law! 
   

Eventful and uneventful

    Half a day and a year and a half later.  An eventful milestone passes nearest to the holidays.

    What can one say about that time?  The shroud of secrecy that claims any that knows without claiming to know, pretends without pretending.  Corporate schedules are routed like circuitous liabilities and all the passive journals in writing with digital eyes.  Rotating through and through, stacking through the heaps, that class that asks for much.  Another aims her scanner at you and feigns to sleep, that is a place where much substance is made of anything beyond its due.  All hyperbolic in presence is the world of suspicion there in store.

Stack the heaps.
What remains seems endless with false corporate cheer.

Wednesday, December 13, 2017

React Firebase Deployments

   Some helpful hints on syntax errors, notably as related to 'npm run build' and 'firebase deploy', where the deployed site's 'build/static/js/mainxxxxx.js' build is either an older version, or is current but the site seems to be referencing an older version of it.

1.  Update the 'index.html' file on each new deployment.  This is important to do.  You can, for instance, add a space to the file (or some character that leaves the file functionally unchanged).

2.  In the Firebase console under the Hosting tab, you may need to roll back to the current deployment, or at least ensure this has happened.  I have found that a new 'firebase deploy' command hasn't always ensured all deployment files update to the latest deployment batch.

I have also experienced old cached site data remaining even upon refreshing Firebase server data.  I am not certain why this seems to be the case with both 'firebase deploy'ments and 'firebase serve' (local hosting), but with the Chrome browser you can do a hard refresh with ctrl+'reload'.  That seems to remedy old data in the browser's cache. 

My current experience as of 12/13/2017

  

Sunday, November 5, 2017

React App recommendations for inlining svg art

Recently I've been playing around with Adobe Illustrator in conjunction with Animate CC for export into ReactJS.  However, I've encountered choppy motion with Animate CC when animating svg group elements, coupled with all the hassle of inlining svg plus similar code (a workflow that is allegedly trending down in popularity).  One particular bright spot that I've found for rapidly inlining lengthy pieces of svg code comes by way of transpiling (in a CLI environment) an svg into ReactJS components.  It is done easily at the console.  Here's a link that helps with doing such...

https://medium.com/@scbarrus/transform-raw-svg-files-into-react-components-in-seconds-25faf56a6f07

I've found that it does what it says in a second for large svg files.  Just follow instructions for installing globally via npm.

Of course, aside from searching the generated component for an element and then modifying it inside the ReactJS component object for animation with a loop call inside that component (in other words, creating JS-side animation), CSS animation is an alternative as well.  React, by the way, is nice for creating iOS and Android apps while avoiding the hassles of learning to code in Swift and using Xcode, or in Java for Android in general.  React apps can be provisioned as web apps as well. 


Friday, October 13, 2017

Angular backend integration problem...custom parser

I found any number of back end parsing solutions that didn't work with my Angular app, and I spent more time researching this problem than solving it.  It took me 30 minutes (no kidding) to write a functioning string parser, after approximately several hours of non-solutions to Angular TypeScript component interfacing.

Example parser

  
class ParseObject{
    toParseObj: string;
    pitems: PostItem[];
    constructor(objstr: string){
        this.toParseObj = objstr;
        this.pitems = [];
    }
    // merges elements that were split apart on an escaped (backslash-terminated) delimiter
    catchEscape(rs: string[]){
        var i = 0;
        var rtnrs: string[] = [];
        var setcontinue: boolean = false;
        for (let r of rs){
            if (setcontinue){
                // this element was already merged into the previous one
                i += 1;
                setcontinue = false;
                continue;
            }
            if (r.length > 0 && r[r.length-1] == "\\"){
                // merge with the following element, if any
                if ((i+1) < rs.length){
                    let fr: string = rs[i+1];
                    rtnrs.push(r+fr);
                    setcontinue = true;
                }
                else{
                    rtnrs.push(r);
                }
            }
            else{
                rtnrs.push(r);
            }
            i += 1;
        }
        return rtnrs;
    }

    // generic split-then-unescape helper (unused in this example; note for...of, not for...in)
    iterateParse(strP: string, strset: string[]){
        for (let strp of strset){
            var splitobj = strp.split(strP);
            var nsplitobj = this.catchEscape(splitobj);
        }
    }

    parseString(){
        /*{"records":[{"name":"JimBob","message":"Hello katz!"},{"name":"Dr IQ","message":"Hello hello my name is Jim Bobby.
 Been an old friend of shorty there!
 Gratz!"}]}*/
        //presumes user properly input string for formatting
        var rs = this.toParseObj.split(']}');
        console.log(rs);
        //catch escapes
        var nrs = this.catchEscape(rs);
        console.log(nrs);
        let nrs0 = nrs[0];
        var nrs1 = nrs0.split('{"records":[');
        var nrs2 = this.catchEscape(nrs1);
        console.log(nrs2);
        let nrs3 = nrs2[1];
        var nrs4 = nrs3.split("},{");
        var nrs5 = this.catchEscape(nrs4);
        console.log(nrs5);
        for (let nr of nrs5){
            //split "," for key,value s
            var nrs6 = nr.split(",");
            var nrs7 = this.catchEscape(nrs6);
            console.log(nrs7);
            let nrs8 = nrs7[0].split("\"name\":");
            let namevalue = nrs8[1].split("\"")[1];
            console.log(nrs8);
            let nrs9 = nrs7[1].split("\"message\":");
            let messagevalue = nrs9[1].split("\"")[1];
            this.pitems.push(new PostItem(namevalue,messagevalue));
        }
        return this.pitems;
    }
}
class PostItem {
    constructor(public name: string,
                public message: string) {
    }
  }
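As a sanity check on what the parser should produce: when the response body is actually valid JSON (as the sample payload in the comments above is), the built-in JSON.parse yields the same name/message pairs. This is a minimal sketch, assuming a payload shaped like that sample; the custom parser above exists for responses where the built-in route failed.

```typescript
// Hypothetical sample payload, shaped like the one in the comments above.
const body = '{"records":[{"name":"JimBob","message":"Hello katz!"},' +
             '{"name":"Dr IQ","message":"Hello hello my name is Jim Bobby."}]}';

interface RecordItem {
  name: string;
  message: string;
}

// Parse the body and pull out the records array.
const parsed: { records: RecordItem[] } = JSON.parse(body);

// Map each record into plain name/message pairs, as parseString() does.
const items: RecordItem[] = parsed.records.map(r => ({
  name: r.name,
  message: r.message,
}));

console.log(items[0].name);     // "JimBob"
console.log(items[1].message);  // "Hello hello my name is Jim Bobby."
```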

You can build a custom service to be used with a promise


Here's my component service
import { Component, Injectable, OnInit } from '@angular/core';
import {PVector} from './lineData';
import {frameResizer} from './frameResizer';
import { routerTransition } from './router.animations';
import {Guest } from './guest';
import {Http, Headers, RequestOptions, Response} from '@angular/http';
import 'rxjs/add/operator/map';
import 'rxjs/Rx';
/*{"records":[{"name":"JimBob","message":"Hello katz!"},{"name":"Dr IQ","message":"Hello hello my name is Jim Bobby.
 Been an old friend of shorty there!
 Gratz!"}]}*/

class ParseObject{
    toParseObj: string;
    pitems: PostItem[];
    constructor(objstr: string){
        this.toParseObj = objstr;
        this.pitems = [];
    }
    catchEscape(rs: string[]){
        var i = 0;
        var rtnrs: string[] = [];
        var setcontinue: boolean = false;
        for (let r of rs){
            if (setcontinue){
                i+=1;
                setcontinue = false;
                continue;
            }
            if (r[r.length-1] == "\\" ){
                if ((i+1) < (rs.length-1)){
                    let fr: string = rs[i+1];
                    rtnrs.push(r+fr);
                    setcontinue = true;
                }
            }
            else{
                rtnrs.push(r);
            }
            i+=1 
        }
        return rtnrs;
    }

    iterateParse(strP: string, strset: string[]){
        for (let strp in strset){
            var splitobj = strp.split(strP);
            var nsplitobj = this.catchEscape(splitobj);
        }
    }

    parseString(){
        /*{"records":[{"name":"JimBob","message":"Hello katz!"},{"name":"Dr IQ","message":"Hello hello my name is Jim Bobby.
 Been an old friend of shorty there!
 Gratz!"}]}*/
        //presumes user properly input string for formatting
        var rs = this.toParseObj.split(']}');
        console.log(rs);
        //catch escapes
        var nrs = this.catchEscape(rs);
        console.log(nrs);
        let nrs0 = nrs[0];
        var nrs1 = nrs0.split('{"records":[');
        var nrs2 = this.catchEscape(nrs1);
        console.log(nrs2);
        let nrs3 = nrs2[1];
        var nrs4 = nrs3.split("},{");
        var nrs5 = this.catchEscape(nrs4);
        console.log(nrs5);
        for (let nr of nrs5){
            //split "," for key,value s
            var nrs6 = nr.split(",");
            var nrs7 = this.catchEscape(nrs6);
            console.log(nrs7);
            let nrs8 = nrs7[0].split("\"name\":");
            let namevalue = nrs8[1].split("\"")[1];
            console.log(nrs8);
            let nrs9 = nrs7[1].split("\"message\":");
            let messagevalue = nrs9[1].split("\"")[1];
            this.pitems.push(new PostItem(namevalue,messagevalue));
        }
        return this.pitems;
    }
}
class PostItem {
    constructor(public name: string,
                public message: string) {
    }
  }

@Injectable()
export class SearchService {
  apiRoot: string = 'url/path/to/your/fetch.php';
  results: PostItem[];
  loading: boolean;

  constructor(private http: Http) {
    this.results = [];
    this.loading = false;
  }

  search() {
    let promise = new Promise((resolve, reject) => {
      let apiURL = this.apiRoot;
      
      this.http.get(apiURL)
          .toPromise()
          .then(
              res => { // Success
                console.log(res);
                // pull the raw response body text off the Response object
                var json = JSON.stringify(res);
                var jobj = JSON.parse(json);
                var jobj2 = jobj["_body"];
                console.log(jobj2);
                // hand the raw body string to the custom parser
                var parseobj = new ParseObject(jobj2);
                this.results = parseobj.parseString();
                this.results.reverse();
                console.log(this.results);
                resolve();
              },
              msg => { // Error
                console.log("hit error");
                reject(msg);
              }
          );
    });
    return promise;
  }
}
Also make sure to make Http available app-wide via app.module.ts, importing HttpModule and adding your service as a provider:
import { HttpModule } from '@angular/http';

...

@NgModule({
  imports: [
    ..., HttpModule
  ],
  declarations: [
   ...
  ],
 
  providers: [ SearchService ],
  ...
})

Deployment of your Angular app

You may know the basics of instancing your Angular app via the command line environment with ng serve, yada yada yada, but do you know how to build and bundle your app for distribution/deployment?


ng build --prod

 command will create dist (a distribution folder with JavaScript bundles). 

Then, with the generated files in the dist folder, you need simply upload them to your deployment (site host) and your site should be up and running.  You can Google search how to deploy to AWS (Amazon) or any other site hosting solution.  

Saturday, September 30, 2017

Metal Swift usage of Command Buffer with complex Kernel(s) computations and more

Advice for working with complex kernel encoding computations:

-Account for all commandBuffer instances and their .commit()s.  This is important, since any instance without a .commit() on the commandBuffer in a draw loop is a memory leak.  It may not show on the Xcode Instruments stack trace but will show on the Debug Memory Graph while running in Debug mode (see https://medium.com/@xcadaverx/locating-the-source-of-a-memory-leak-712667bf8cd5)

It is important to enable malloc stack tracing to see the code source of memory leaks.  Inevitably, an instanced commandBuffer that isn't committed in the loop will trace back to its instance creation as the culprit.  The remedy is simply committing this instanced commandBuffer in the loop.

Why re-instance a commandBuffer?

As it turns out, if you are doing complex kernel computation work with multiple sequential pass encodings, you can instance once any device buffers that need to be passed to the GPU (usually outside your view controller's draw loop, at initialization) and then rewrite to these buffers with memcpy()

for instance:

        let hillbufferptr = hilllockbuffer?.contents()
        memcpy(hillbufferptr, &(hilllockLabelMap!),hilllockByteLength)
        let relabelbufferptr = relabelbuffer?.contents()
        memcpy(relabelbufferptr, &(relabelMap!), relabelByteLength)
        let maxhbufferptr = maxhbuffer?.contents()
        memcpy(maxhbufferptr, &(maxHMap!), maxhByteLength)
        plist = [Float32](repeating: -1.0, count: (hillockmatrixtexture?.width)!*(hillockmatrixtexture?.height)!)
        let plistbufferptr = plistbuffer?.contents()
        memcpy(plistbufferptr, &(plist!), plistByteLength)

Then buffers need not be instanced in the loop.  It is also important to instance textures outside the draw loop; otherwise these can translate into memory leaks if not properly deallocated.

Any time data retrieval occurs from a kernel encoding, it requires a commandBuffer.commit() and waitUntilCompleted() call...which translates (as far as I can tell) into instancing a new commandBuffer afterward, since the memory on the old commandBuffer is then freed.

Strategy for complex kernel pass encoding with multiple kernels passing a given data buffer from one kernel to the next: my advice (per Apple's direct advice) is to avoid running a series of commandBuffer.commit() and waitUntilCompleted() calls for buffer-to-array callbacks only to in turn rewrite the data back to the buffers.  Instead use a single buffer and pass that same buffer pointer from one encoding kernel to the next.  Callback instancing of memory with byte copies to an array is slow and will likely cause GPU hang error messages...or, as an engineer describes it, this is merely serializing data flow between the CPU and the GPU.  It is slow and cumbersome.  I personally found only one instance where CPU-side processing was needed: sorting array data and eliminating duplicate values to create a set, which in turn determined the iterative structure of the additional encodings the algorithm required.

I haven't found, nor been able to construct, an adequate recipe for passing something like float array data into structs and pointing to the struct with such objects, since instanced struct data on the GPU side (unless it is in pointer form) has a discretely defined array container size...instead I've just passed array pointers directly to the kernel functions as needed.  There are plenty of code recipes for this.

Monday, September 25, 2017

Guild Wars 2 Path of Fire -The Departing Eater of Souls strategy

Early spoiler beta.  The Eater of Souls will regenerate health after its leap attack with a tractor-beam health siphon.  The strategy for this fight is to run opposite the beam and double evade at the moment he launches the health drain.  When you are far enough away, the siphon does no damage and the Eater of Souls regenerates no health. 

Thursday, September 14, 2017

Dynamic map storage on a texture map in Metal 2(Apple)

   The idea is pretty straightforward here.  Namely, to implement this form of storage, you will probably want a texture that has at least 2 channels, though this can also work with 4 channels, with 2 channels reserved for the key/value map storage.  

It should be noted that the 'dynamic' storage portion is in fact reserved storage space containing texture.width*texture.height possible indices.  Keying with some layer of lexical enumeration between indices would be an added implementation beyond the scope of what I'd like to cover in this article.  So it is assumed that the indices given by the total range of the set serve directly as the key values, and thus each mapped key has such a value: the key 1 is mapped to index 1 on the texture storage set.  

To map these hash values then means having some storage value, likely a boolean: 1 indicating the hash bucket is reserved, 0 that it isn't.

The formulation for converting a 1d index into 2d texture coordinates is pretty simple (integer division for y):  

y = indexkey / texture.width
x = indexkey % texture.width

Thus in Metal, if your texture point is set with a half4 vector, you can reserve at such an index by reading and writing at that position (texture read() takes integer coordinates, so no sampler is needed):

uint w = currenttexture.get_width();
uint2 indexval = uint2(indexkey % w, indexkey / w);
half4 color = currenttexture.read(indexval);
color.zw = half2(1.0h, value);
desttexture.write(color, indexval);

This maps the key's flag and value to the latter two channels of the destination texture while carrying over the first two channel values from the original texture.  

Lookups are then pretty straightforward: if the key is known, just plug it into the x,y formulation above, check for the boolean positive (storage allocated), and read its corresponding value pair.  Note this is for half format numbers.
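The same bookkeeping can be sketched CPU-side in TypeScript. This is an illustrative model only, not Metal code: a plain array stands in for the texture, and [reserved, value] pairs stand in for the .zw channels described above.

```typescript
// Illustrative stand-in for a width x height texture: each texel keeps
// [reservedFlag, value], mimicking the .zw channels described above.
const width = 8;
const height = 8;
const texels: [number, number][] =
  Array.from({ length: width * height }, () => [0, 0] as [number, number]);

// Convert a 1d index key into 2d texture coordinates.
function toXY(indexkey: number): [number, number] {
  const x = indexkey % width;
  const y = Math.floor(indexkey / width);
  return [x, y];
}

// Reserve a bucket: set the flag to 1 and store the value.
function reserve(indexkey: number, value: number): void {
  const [x, y] = toXY(indexkey);
  texels[y * width + x] = [1, value];
}

// Look up a key: returns the value if the bucket is reserved, else undefined.
function lookup(indexkey: number): number | undefined {
  const [x, y] = toXY(indexkey);
  const [flag, value] = texels[y * width + x];
  return flag === 1 ? value : undefined;
}

reserve(11, 0.5);        // key 11 -> coordinates (3, 1)
console.log(toXY(11));   // [3, 1]
console.log(lookup(11)); // 0.5
console.log(lookup(12)); // undefined (bucket not reserved)
```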

Thursday, July 27, 2017

The finished photo and format. Why it matters?

    Some time ago, while working with gray height map images, I noticed that 16 bit grayscale images were often preferable to 8 bit format images.  The reason is that when rendering terrain height maps in 8 bit format, I would often have obvious terrain artifacts appearing in the image: a three dimensional model would appear less resolved relative to the 16 bit format of the same terrain image.  In photography this lack of resolution would translate into pixelation and blurriness.  The deeper reason, however, lies embedded in the math of the grayscale format.  Namely, an image in 8 bit grayscale format merely has 256 gray level variations, relative to 256*256 = 65536 variations in 16 bit.  256 gray levels may not seem on immediate inspection a source for complaint, depending on terrain topography, but consider a height map spanning, say, 10,000 ft: an 8 bit grayscale format provides a step of about 39 ft per gray level, while a 16 bit channel furnishes about .15 ft per gray level.  The difference is immediately striking in the math, and it is the same for terrain elevation rendering in terms of smoothness and continuity.  That is, the 8 bit format is likely to appear as having step discontinuities or 'bumps' and 'artifacts' in the image, making it appear synthetic or artificial.
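The step sizes above fall straight out of the arithmetic; a quick sketch (the 10,000 ft elevation range is the same hypothetical as in the text):

```typescript
// Hypothetical elevation range for a height map, as in the text.
const elevationRangeFt = 10000;

// Number of distinct gray levels per bit depth.
const levels8 = Math.pow(2, 8);    // 256
const levels16 = Math.pow(2, 16);  // 65536

// Smallest elevation difference each format can represent.
const stepFt8 = elevationRangeFt / levels8;    // ~39 ft per gray level
const stepFt16 = elevationRangeFt / levels16;  // ~0.15 ft per gray level

console.log(stepFt8.toFixed(2));   // "39.06"
console.log(stepFt16.toFixed(2));  // "0.15"
```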
    The problem of 8 bit versus 16 bit color format has applications in processing digital images in a similar manner, and can at times be appreciated especially in landscape photography.  Notably, the 8 bit RGB format has a similar restriction of 256 light color variations per channel.  This problem especially relates to pure gray light color variations, say as given in clouds lacking variation across the RGB channels.  A pure gray color, for instance, could be represented mathematically as (120,120,120) in terms of its given color value; equally, any pure gray light could be represented by (n,n,n) where n is some value between 0 and 255.  It is important to note that pure gray light is equal on all channels, and thus the inherent problem mentioned above becomes immediately observable wherever an image has more monochromatic attributes.
    While cloud variations may have many different shades and tones that definitely extend outside a monochrome, maybe you have at times encountered photos where light pixels were definitely constrained inside this range?  I feel that in recent times I have encountered it in certain circumstances, and the effect in processing images was residual visual artifact that needed to be resolved, or at least was overlooked in the finishing process.  Left unresolved, it would translate into imaging artifacts like synthetic gradient lines furnishing step discontinuities...these are obvious to the naked eye as artificial lines going beyond the distinction of a cloud and its surrounding background.
   Moral:  While decent entry level cameras furnish raw (uncompressed) format photos, it may help to know what color bit representation/resolution also occurs here.  Optimally, if cameras captured 16 bits per RGB color channel, digital representation of color would likely not suffer from the problems found in capturing, or at least finishing, an image at 8 bits per channel.  Most web publishing of photos today sticks with 8 bit formats.  The 16 bit image, on the other hand, while visually optimal, is not optimal in the way of storage, since 16 bit images are many times the size of an 8 bit per channel image.  But maybe someday the push for higher quality imaging will win out, if culture changes and this isn't received the way industry pushes for 24 bit audio formats have been received.

Thursday, April 6, 2017

Singles Canada site

Canada Singles

Review:  I think all but one or two of those inquiring on a given profile had illegitimate profiles.  Quite a predatory site, with any number running the most obvious scams.  Conversations inevitably wind up migrating to:

1.  A dead father, or dead parents.
2.  A profile candidate living in Africa, or more specifically Ghana, while the profile indicates none of the above.
3.  A profile candidate waiting on inheritance.
4.  Most eventually migrating to requests for money.

After extensive conversation with one suspected fake profile candidate, the individual offered that most profiles on the site were fakes, likely located anywhere but where the profiles supposedly originated.

Probably one of the worst places to encounter fraud.  Not a good site.  Especially if you are wanting to find someone from Canada!  :)

Most of the profile fraud, however, has been quite easy to spot and obvious.  Most don't bother lingering around under questioning.

Raises an important question in mind.  Why pay $40.00 a month for that?

This all brings to mind the massive and rampant level of consumer fraud that exists on an online dating site with a top search result under "Canada Dating".

Friday, March 17, 2017

Typescript/JavaScript Angular Problem: Hashmap a dictionary of rescaled one dimensional coordinate values to their corresponding indices for a position based array mapping and more

Okay, so the title is rather a big one, but the problem goes like this:

You've had to squeeze your data down in such a way that it fits into your canvas graphic...that is, it makes little sense to have big numbers that can't be viewed because their plot values are outside the dimensions of your screen.  This problem of 'squeezing' the data is otherwise rescaling it so that it fits into visual dimensions that can be read.  It's really not a big deal doing that, since you can refactor a plot without changing its shape/form attributes by using a global scale factor.  But let's say you wanted a quick and easy search algorithm that matches a screen based coordinate that a user has supplied, say from a mouseover position, to a position on the graph, or at least as directly related to your big data.  How to do this?

As it turns out, you may have stored your big data in some sort of collection type that contains points with something like x, y in 2 dimensions, and this collection type may be structured in the form of an array.  The problem is that on any mouseover, your first inclination may be to search the set of points and then break at a match, but this is slow!  I say this since you are constantly iterating over points to find the match for the screen position that is supplied.  Why not create an inverse map on the set of points and hashmap these?

As it turns out, you can do it in Typescript/Javascript, but as I found out, keys on the {} type are string values.  If your data is, on the other hand, given by positions which are 'number' types, you'd first likely have to Math.round(), Math.floor(), or Math.ceil() your numbers.  Then you'd convert the result to a string using position.x.toString() ('number' types have a .toString() method).  Then you'd key on this string while storing its array index in the value position.

A sketch:

    public setScreenXCoords(){
        //called after screen coordinates have been set
        //and this.screenpoints is a screen coordinate set.
        let i = 0;
        for (let pvec of this.screenpoints){
            let x = Math.round(pvec.x);
            this.screenToIndex[x.toString()] = i;
            i += 1;
        }
    }

In a given data range set of points, it is important to keep in mind that keying position data this way means truncation (by flooring, rounding, or ceiling a number) may supply more than one position for a given key.  For this you may need to bucket the values in an array and provide an additional search to refine a lookup, but at least this method provides nice fast position addressing.
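A minimal sketch of that bucketing idea, with hypothetical names (`screenToIndices` and `addPoint` are illustrative, not from the code above):

```typescript
// Bucketed inverse map: multiple point indices can share one rounded key.
const screenToIndices: { [key: string]: number[] } = {};

function addPoint(x: number, index: number): void {
  const key = Math.round(x).toString();
  if (!(key in screenToIndices)) {
    screenToIndices[key] = [];
  }
  screenToIndices[key].push(index);
}

addPoint(10.3, 0);
addPoint(10.4, 1); // rounds to the same key "10" as the point above
addPoint(12.0, 2);

console.log(screenToIndices["10"]); // [0, 1]
```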

So when you search, you need to account for undefined key values.  These are screen positions that do not have a corresponding key in, for instance, inverse2dVectorsMap.  For these, you simply ignore the user supplied position until they have toggled a position that corresponds to a point in the data set.

An example sketch:

    public getScreenPoint(xpos: number){
        let xpostr = xpos.toString();
        if (xpostr in this.screenToIndex){
            return this.screenToIndex[xpostr];
        }
        return undefined;
    }


Then filter 'undefined' return values relative to non 'undefined' ones.

For example:

    private mouseMoved(e: MouseEvent){
        //(mousemove)="mouseMoved($event)"
        console.log(e.offsetX);

        console.log("screen index: ");

        if (e.offsetX != undefined){
            let ind = this.dataCharts[0].getScreenPoint(e.offsetX - 80.0);
            console.log(ind);
            if (ind != undefined){
                this.pointindex[0] = ind;
                this.pointindex2 = ind;
            }
        }
    }

Thursday, March 16, 2017

Tip for determining an angular dom event

Dom events in html5 do have a translation to Angular, but it may not be immediately obvious.  For instance, in Angular the reference call (click) is equivalent to onclick for the corresponding dom event in Html5 outside of Angular.  As it turns out, the conversion process may follow the rule:
add parentheses () and drop the prefix "on", so "onclick" becomes "(click)" and "onmousemove" becomes "(mousemove)".  This seems to be a general translation pattern between Angular and non Angular usage in Html.
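The rule is mechanical enough to sketch as a one-liner (the function name here is illustrative):

```typescript
// Translate a DOM "onX" event name to Angular's "(X)" binding syntax:
// drop the "on" prefix and wrap the remainder in parentheses.
function toAngularEvent(domEvent: string): string {
  return "(" + domEvent.replace(/^on/, "") + ")";
}

console.log(toAngularEvent("onclick"));     // "(click)"
console.log(toAngularEvent("onmousemove")); // "(mousemove)"
```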

Sunday, March 12, 2017

Programming complications and work around in Angular

   In a previous post, I provided a simple recipe for passing simple type arrays to avoid more formal data binding in Angular.

Recently a problem emerged where in Typescript (translated to JS) I was instancing randomly generated class/struct data called lineData, working with a visual graph/charts data interface, and this data was instanced inside the class of two canvas selector directives from a parent component controller.   My intent was to pass an array object of this class, namely lineData[].

The problem:
When instancing a class array of this data and passing the array object lineData[], I was repeatedly generating errors in the constructor and ngOnInit() or ngOnChanges() methods when attempting to write data to either directive instance.  The error generated was an 'undefined' when accessing the lineData[] object, even though it was supposedly populated with data in the constructor call.  Strangely, when not passing from the parent controller component to the individual child directives, lineData[] instanced at the child level without injection would not generate the persistence problems seen in the parent to child injection case.

Solution:
I still needed to pass and coordinate data between two child directives on the lineData[] array object, so instead I opted to generate the random lineData[] at the parent component controller level and then pass it to each child directive, where data persistence would suffice.  This in retrospect still seems a bit troubling to me, since evidently manipulating and instancing data in the child directive fails as an instancing source, but not if such origins are in the parent component controller class.  Technically it should work; my suspicion is that the ngOnChanges() method is cleaning the lineData[] object (beyond the constructor call) with the changes in the parent level controller binding of this object to its child directives.  Technically, I haven't set anything at the parent level to reset lineData[] (i.e., removing data or splicing or whatever), but the mystery remains as to what was occurring here.

Friday, March 3, 2017

Cool Trick in Angular to avoid formal data bindings

   How should I be passing parameter data, and how exactly do I do it without defining a directive or component?

  Let's say you wanted to define a child class structure that is merely an object process that does something, but you weren't as interested in angularizing syntax to pass parameter data.  There is a way to do this.  The secret goes back to C and C++ and other non scripting high level languages that make use of pointers for referencing the memory of object data.  As it turns out, scripting languages commonly don't make use of pointers or call by reference exactly, but you can still do it!


Languages such as Java, Javascript, and Python are locked out from call by reference on:

-number, float, integer, boolean, and so forth object types.  


But not on:

-array, list, hashlist, map, dictionary, class, struct, string, and so forth.  


Thus if you wrap a locked object type in one of the non locked out types, such as an array, you can pass parameter data by modification.

The beauty of doing a 'binding' this way is that it is quite simple in terms of syntax.  Literally you use an object wrapper that does the same thing.
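A minimal sketch of the wrapper trick (the names here are illustrative):

```typescript
// A number passed directly cannot be mutated by the callee,
// but a number wrapped in an array can be: the callee writes
// through the shared array reference.
function increment(counter: number[]): void {
  counter[0] += 1; // mutates the caller's wrapped value
}

const wrapped = [0];
increment(wrapped);
increment(wrapped);
console.log(wrapped[0]); // 2
```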

I have avoided defining event listeners for parameter data changes.  Namely, in my particular case, synchronizing animation loop starts and stops between two angular directives, the loop itself would act as an event listener, thus omitting the need to explicitly define an event detection structure in angular.  As it turns out, though, implicit changes to object wrapped types, I believe, are not detected, say modification of the array at an already assigned index.  Thus defining event listeners in angular will likely need a work around if you plan on constructing event listeners in conjunction with the object wrapping method that I have described.

You can construct event listeners (without defining these formally) also by using recursion calls:

method(){
   requestAnimationFrame(()=>this.method());
}

Technically it's an animation timer, but it should be safe for event handling processes.  You'd just need to define breaks if you want to pause or stop this event handler (e.g. a boolean flag that prevents reiteration of the recursion call).

Friday, February 17, 2017

Recipe for creating parallax effect of background image in Angular 2.0 Typescript

import { HostListener, Component, OnChanges, Directive, Input, Output, EventEmitter, SimpleChange, ElementRef} from '@angular/core';

Here is the directive:

@Directive({
    selector: '[track-scroll]',
    //host: {'(window:scroll)': 'track($event)'},
 
})

export class TrackScrollComponent {

  @Output() notify: EventEmitter<number> = new EventEmitter<number>();
  @HostListener('scroll')
    track() {
        console.debug("Scroll Event");
        //console.debug(this._elementRef.nativeElement.scrollTop);
        this.notify.emit(this._elementRef.nativeElement.scrollTop);
    }
   constructor(private _elementRef: ElementRef) {}
}

It is worth noting that you'd want an event emitter which communicates changes in track-scroll directive back to the parent component.

Now, as indicated in the previous posting on implementation, in the parent component html you'd modify the tag with a (notify) event handler:

<nav track-scroll (notify)='onNotify($event)'>
   <nav class='backgroundimageclass' [style.background-position-y]="scrollpos + 'px'">
   </nav>
</nav>


Now in your parent component you'd need to make sure scrollpos is set as a class instance object of type number, and ensure that the onNotify event function is defined with updates to scrollpos:

@Component({
    moduleId: module.id,
  selector: 'parentcomponent',
  templateUrl: './parent.component.html',
  styleUrls: ['./parent.component.css'],

})

export class ParentComponent{
   scrollpos: number = 0.0;

   onNotify(message:number):void {
        this.scrollpos = message*0.2;
        console.log("hitting!");
        console.log(message);
    }
}

Thus the child directive transmits back, through "message", the scroll position, which is assigned to the parent class object scrollpos, and this class object is bound back into the parent html style.background-position-y value.  Note we are scaling the amount of change by a fractional value, 0.2, resembling parallax, so that the background image shift is not equal to the shift in text.
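The scaling itself is just a multiply; a tiny sketch (the factor 0.2 is the value used in the post, the function name is illustrative):

```typescript
// Background shifts a fraction of the scroll delta,
// which produces the parallax depth cue.
const PARALLAX_FACTOR = 0.2;

function backgroundOffset(scrollTop: number): number {
  return scrollTop * PARALLAX_FACTOR;
}

console.log(backgroundOffset(100)); // 20
```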

Here is an example parent.component.css sheet for adding the background image:

.backgroundimageclass{
     background-image: url(../assets/MainNavHeader_V01.png);
    /*background: rgba(76, 175, 80, 1.0);*/
    /*background-color: black;*/
    /*background-size: 1200px 780px;*/
    padding-top: 10px;
    background-blend-mode: overlay;
    /*min-width: 1236px;*/
    /*min-height: 298px;*/
    /*width: 1236px;*/
    /*height: 298px;*/
    position: relative;
    left: 50px;
    -webkit-mask-image: -webkit-gradient(linear, left 70%, left bottom, from(rgba(0,0,0,1)), to(rgba(0,0,0,0)));
    /*-webkit-mask-image: -webkit-gradient(linear, left 30%, left top, from(rgba(0,0,0,1)), to(rgba(0,0,0,0)));*/
    /*background-position-y: 100px;*/
    /*background-repeat: no-repeat;*/
    /*overflow: auto;
    clear: both;*/
}

I've included in this case a little webkit alpha mask also on the background image which feather fades the bottom border for instance.

Note my import statement above is done so assuming the parent and child directive are in the same file.  If you wanted to separate these, just make sure to update your import statements.

Please read my previous post for additional information on implementing the child directive in the parent.  Namely, you'll want to ensure that the directive is actually declared through the app.module.ts declarations for most current versions of Angular 2.0.

A little bit of GIS information: Acquiring DEM heightmap renderings and colorization



First, it should be mentioned that if you are doing terrain colorization of your choosing, you may or may not already be familiar with http://viewer.nationalmap.gov/ for obtaining both satellite terrain imagery and/or heightmap data.  The big challenge, however, comes in how to repurpose this data in the context of, say, a 3D modeler like Blender.  To that end I would recommend a program called QGIS (http://www.qgis.org/en/site/), which provides GIS rasterization (rendering) services specifically tailored to things like heightmap mosaicing and a lot more.  Mosaicing is particularly useful because heightmaps often tend to be parsed by coordinate range addresses.  Without a GIS program, you'd otherwise have to have some alternate graphics program properly align and stitch these mosaics together in their proper coordinate range addresses.  Secondly, QGIS equalizes heightmap mosaics when they are merged into one heightmap, handling overall differences between the two maps (which may not be absolute in the context of luminosity and light values over the range of grayscales, for instance, in grayscale formats).

Other prerequisites:  
In Ubuntu installation goes as follows:
https://trac.osgeo.org/ubuntugis/wiki/QuickStartGuide
apt-get install cgi-mapserver
Because a given heightmap is actually given by measurements which include the surface of the Earth, which is curved, we will need to project a given map flat...in this case the GDAL warp packages (doing this task) will need flat projection coordinates under a lambert conformal conic transformation.   Don't worry, there is a JavaScript utility to handle this task...though you will need access to a JavaScript runtime.  I like node.js, which uses a package manager called npm to handle these sorts of packages for easy per-directory installation.

I recommend utilizing a JavaScript library called Proj4js; if you use node.js the node package is proj4 (see https://github.com/proj4js/proj4js), which can be installed via npm (node package manager) at your terminal command line.
The following command will install the node package in the given root directory at the prompt:

npm install proj4

Once installed, you can run node at the terminal by typing:
$ node

once node is running in terminal then:

> var proj4 = require("proj4");

You may also find it handy for input and output coordinate parameters using information given from
spatialreference.org

Next we define the destination projection given from spatialreference.org:

>proj4.defs('EPSG:2228',"+proj=lcc +lat_1=37.25 +lat_2=36 +lat_0=35.33333333333334 +lon_0=-119 +x_0=2000000.0001016 +y_0=500000.0001016001 +ellps=GRS80 +datum=NAD83 +to_meter=0.3048006096012192 +no_defs");

If you then type 'proj4' and hit enter, you should see the definition given,
with key value 'EPSG:2228'.

The source rule for longitude-latitude coordinates is already defined:
it is 'EPSG:4326'.

so conversion at prompt is as follows:
> proj4('EPSG:4326','EPSG:2228',[-118,32.0170]);

// yields [ 6872583.317630408, 433558.88221227354 ]

You can use this to find projection coordinates from one coordinate system to another.  That being said, QGIS also has handlers for Relief, Hillshade, Slope, Roughness, TRI, and TPI colorization, and plugins for provisioning custom color gradients, alongside providing users the ability to customize color gradient sets.

So the short workflow synopsis goes as follows:

With a DEM dataset downloaded, you can load, for instance, .img sets in a new QGIS project session as follows.
Assuming that you have downloaded multiple filesets for a given geographic selection range, you may want to merge your heightmap files (merging these will equalize the files relative to one another...otherwise jump discontinuities may likely occur between them).

Merging Mosaics
1.  Choose 'Raster>Miscellaneous>Merge...'.
2.  Pressing the 'ctrl' button, select all necessary source files to be merged.
3.  Assign a destination file.  You may likely need to make sure the destination file type matches the source (this is not always given by default).
4.  Press 'Ok'.

If all has gone well, you should see a mosaic composition of all merged images shown in the QGIS visual browser window.

Selection cropping by latitude/longitude coordinates
1. Choose 'Raster>Extraction>Clipper'
2. At the dialog, longitudinal coordinates are X and latitude coordinates are Y...or alternately you can select coordinate boundaries visually on the map.
3.  Assign destination file. You may likely need to make sure the destination file type matches the source (this is not always given by default).
4. Press 'Ok'.

Projection warp flattening the data set

1.  'Raster>Projection>Warp...'
2.   Default lat/long coordinate source SRS is usually provisioned depending on your file type, but you will likely need to indicate the target SRS (e.g.,  'EPSG:2228' above projection flattens coordinates for the state of California).
3.  As above source (e.g. merged file above) is selected and destination name assigned.
4.  Press 'Ok'.

Generating Hill Shade

Reading the scroll position of an html element in Angular 2.0 Typescript

I've seen some implement a component, which you can do, but this expects (to my knowledge) an html template.  I've instead used a directive (which a component is, but without the requirement of having an associated html template).

Here is my directive:
import { HostListener, Component, OnChanges, Directive, Input, Output, EventEmitter, SimpleChange, ElementRef} from '@angular/core';

@Directive({
    selector: '[track-scroll]',
    //host: {'(window:scroll)': 'track($event)'},
 
})

export class TrackScrollComponent {
  @HostListener('scroll')
    track() {
        console.debug("Scroll Event");
        console.debug(this._elementRef.nativeElement.scrollTop);
    }
   constructor(private _elementRef: ElementRef) {}
}

I did use injection of the ElementRef, or as it is instanced from the 'div' tag, the div's ElementRef.

All directives in RC 6.0 (?) and later go in the ngModule declarations, or are otherwise assigned in the parent component.

Thus in my app.module.ts

@NgModule({
  imports: [ BrowserModule, AppRouting ],
  declarations: [TrackScrollComponent],
  bootstrap: [ App ]
})

Implementing in your component then goes to yourcomponent.html or template as follows:


<div track-scroll> </div>

 Sorry for some of the imports; I hadn't tested which ones were exclusively needed for this howTo.

There are mentions about not working directly with ElementRef, but at the moment I wanted to provide clarity of solution as opposed to creating added confusion for something that really should be simple.
