
November 05, 2007

Comments


As usual, Jeff, you find some really interesting work to post about. I'm definitely interested in learning more about the Triadic Continuum.

Dan Linstedt

Hi Jeff,

I heard about this too, and at first glance it sounds promising. However, when I looked at the patent application, what's really interesting is the actual functionality they've built.

What I find is that the Data Vault architecture I've constructed over the years shares the same mathematical modeling basis as this "Triadic Continuum." There are three parts in the Data Vault: the Hub and Satellite establish context and business keys for a local component, and the Link is left to be "discovered, assigned, weighed" and acts as a vector between the contextual, time-based elements.
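As a rough sketch of the three-part shape Dan describes (this is an illustration only; the class names, fields, and values are my own, not the actual Data Vault implementation):

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class Hub:
    """Holds a business key: the stable identity of an entity."""
    business_key: str

@dataclass
class Satellite:
    """Time-based descriptive context attached to a hub."""
    hub: Hub
    load_time: datetime
    attributes: dict

@dataclass(frozen=True)
class Link:
    """A discovered/assigned relationship between two hubs,
    optionally weighed, acting as a vector between them."""
    left: Hub
    right: Hub
    weight: float = 1.0

# Hypothetical usage: identity and context are established up front,
# while the relationship is discovered and weighed later.
customer = Hub("customer-42")
order = Hub("order-7")
ctx = Satellite(customer, datetime(2007, 11, 5), {"name": "Acme"})
rel = Link(customer, order, weight=0.9)
```

The key structural point is the separation: keys (Hub) and time-based context (Satellite) are fixed per component, while Links between them remain open to later discovery and weighting.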

I'm wondering if you would share your thoughts about the Data Vault modeling architecture as well?

I'm really interested in talking to these folks to see just what their data modeling looks like... They have some great applications on top of it, but I bet that we could use the Data Vault model with their applications and arrive at similar results.

As you know, the Data Vault model is also based on an interpretation of the neural cells in the brain.

Cheers,
Dan L

Jane Mazzagatti

Actually, we have thought about many of these issues and more. The Peirce cognitive model covers everything from the rudimentary data structure we're currently experimenting with to self-awareness, but so far we have only been able to implement a rudimentary K (knowledge) structure (a matter of lack of resources, not understanding) and prototype a data analyzer to begin to demo the potential of the data structure. John's book gives the details of how the rudimentary structure is created and some of its attributes.

To answer some of your questions: the K structure (patented as the interlocking trees data store) creates a structure from data that contains everything you would have if you created a set of tables with indexes and cubes, so every context is captured in one structure as soon as you record the data. There are no calculated probabilities stored in the structure (only counts), so as each new record is added every probability changes, but no calculations are done until a particular probability is needed. Structure is created only when some new experience is encountered, such as a new field variable (otherwise the old structure is reused), so the structure 'knows' immediately that something new has been encountered. Also, the resulting structure for large data sets is smaller than the original data.

Jane
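A minimal sketch of the counts-only idea Jane describes: store nothing but occurrence counts, derive probabilities only on demand, and create new structure only when a genuinely new value arrives. All names here are my own illustration, not the patented interlocking trees implementation.

```python
from collections import Counter

class CountStore:
    """Toy counts-only store: probabilities are never stored,
    only computed when asked for."""

    def __init__(self):
        self.counts = Counter()
        self.total = 0

    def record(self, value):
        # New structure (a new key) is created only when a new field
        # variable is encountered; otherwise the old entry is reused.
        is_new = value not in self.counts
        self.counts[value] += 1
        self.total += 1
        return is_new  # the structure "knows" immediately it saw something new

    def probability(self, value):
        # Computed only when needed; adding any record implicitly
        # changed every probability, with no recalculation stored.
        return self.counts[value] / self.total if self.total else 0.0

store = CountStore()
store.record("red")
store.record("red")
store.record("blue")
p = store.probability("red")  # 2/3
```

This mirrors the property Jane highlights: every probability shifts with each new record, yet the only work done at ingest time is incrementing counts.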

Michael Atlass

You suggested that "The principle being: at the exact moment new observations (data) are ingested into a persistent context data store … also happens to be the cheapest moment computationally to detect relevance (insight)."

I think you could profitably rephrase that in the context of the Triadic Continuum. The moment the data becomes part of the TC, it is connected to all past events to which it is related, in a structured relationship that can lead back to that data at any time in the future, whenever it is determined that the data is again needed to answer an inquiry. Further, as future events are linked to that data via their relationships, those events also become part of the answer to the query, according to their relevance.
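A toy illustration of the point Michael makes (my own sketch, not the Triadic Continuum itself): when a new event is ingested it is immediately linked to the related past events, so relevance is visible at ingest time and later queries can simply walk those links.

```python
from collections import defaultdict

links = defaultdict(set)   # key -> indices of events sharing that key
events = []                # append-only event log

def ingest(event, keys):
    """Record an event and link it to all past events sharing a key.
    Returns the set of related past-event indices, i.e. the relevance
    detected at the cheapest possible moment: ingest."""
    idx = len(events)
    events.append(event)
    related = set()
    for k in keys:
        related |= links[k]   # past events already tied to this key
        links[k].add(idx)     # future events will find this one too
    return related

first = ingest("login", {"user:alice"})      # no prior related events
later = ingest("purchase", {"user:alice"})   # linked back to the login
```

Because the links are created symmetrically at ingest, a future query about "user:alice" reaches both events without any retroactive scan.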

Ray Garcia

Wouldn't the concept of Perpetual Analytics require that a valid simulation model exist for the system dynamics, in order for it to receive new information and accurately produce a new system state that is useful for analysis and subsequent validation?

How would this be possible without a unifying theory on the rational application of evolutionary/genetic/neural programming, properly balanced with human judgment?

The comments to this entry are closed.