
January 19, 2010




Nice work Jeff -- you may be interested in my recent op-ed discussing the commonality in a series of events:

Systemic failures, by design

Tarun Srivastava

I agree with "data finds data," but I would like to know about "data builds rules to find data." This would help not only identify consumers of data but also evolve the rules that identify it. For example, some intelligence gathering would result in data finding out that it needs to be shared with USCIS. But what if the rule for sharing does not evolve? The rule engine should be self-feeding and ever-evolving. I would expect my rule engine to feed on all events, pass them to my CEP (complex event processing) system, and at the same time have the CEP evolve itself based on the data set. New rules should emerge from the system instead of having to be written by humans.
Let me give an example. Mr. Abdulmutallab was identified as a potential threat, so some agency needed to create the data for him. Let's say the CIA gathered the information and entered it into their system. Once the data is created, then "data finds data." But someone needs to create the data in the first place. What if the travel pattern of Mr. Abdulmutallab were itself an indication? The system should feed on all known events for Mr. Abdulmutallab and build an algorithm from them. It should look for commonality with other events. For example, once Mr. Abdulmutallab was flagged, everyone who was not a citizen and was in the same country as him would need to be flagged for pattern matching. What is the rule identified here? Foreigners, or non-residents. All these data points should feed into the rule engine and be used to create rules that identify consumers for the data.

With so many complex events happening, it is almost impossible for a human to interpret events and correlate them. Moreover, non-events would be impossible to identify. What if someone stayed under the radar for years and then emerged in some crime? Such patterns need to be identified by the system. Events need to be identified; they may be in order, out of order, or drifting randomly in and out of view. Flagging Mr. Abdulmutallab only after his father approached the agency is waiting too long. Aren't his erratic stays enough on their own to warrant a flag? In today's world, any erratic behaviour needs to be flagged. For example, a policeman not only uses radar to find people who speed but also watches for people changing lanes erratically. We need to build systems which can identify such individuals. Since this would be trying to identify an individual well before any credible intelligence, near real time is as good as real time. Potential individuals and their patterns would continue to evolve. I guess my thoughts aren't structured, but yes, there is a random pattern here. :)
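The feedback loop the comment asks for — rules that consume events and, when triggered, create new derived rules — can be sketched in a few lines. This is a toy illustration, not Jonas's architecture or any real intelligence system: the event schema, the three-country threshold, and the rule names are all invented for the sake of the example.

```python
from collections import defaultdict

class RuleEngine:
    """Toy complex-event-processing (CEP) sketch: rules consume events,
    flag entities, and may register new derived rules, so the rule set
    evolves with the data instead of waiting on a human analyst."""

    def __init__(self):
        self.rules = []                    # callables: (engine, event) -> None
        self.flagged = set()               # entities flagged so far
        self.history = defaultdict(list)   # entity -> events seen

    def add_rule(self, rule):
        self.rules.append(rule)

    def flag(self, entity, reason):
        """Flag an entity; return True only the first time."""
        if entity in self.flagged:
            return False
        self.flagged.add(entity)
        print(f"FLAG {entity}: {reason}")
        return True

    def ingest(self, event):
        self.history[event["entity"]].append(event)
        # Iterate over a snapshot so a rule may safely register new rules.
        for rule in list(self.rules):
            rule(self, event)

# Hand-written seed rule: "erratic travel," crudely defined here as
# visiting three or more distinct countries (the threshold is arbitrary).
def erratic_travel(engine, event):
    if event["type"] != "travel":
        return
    countries = {e["country"] for e in engine.history[event["entity"]]
                 if e["type"] == "travel"}
    if len(countries) >= 3 and engine.flag(event["entity"], "erratic travel"):
        # Derived rule, created by the system itself: flag non-citizens
        # who later appear in the same countries as the flagged traveler.
        def co_location(inner, ev, watched=frozenset(countries)):
            if (ev["type"] == "travel" and not ev.get("citizen", True)
                    and ev["country"] in watched):
                inner.flag(ev["entity"], "co-located with flagged traveler")
        engine.add_rule(co_location)

engine = RuleEngine()
engine.add_rule(erratic_travel)
for ev in [
    {"entity": "A", "type": "travel", "country": "X", "citizen": False},
    {"entity": "A", "type": "travel", "country": "Y", "citizen": False},
    {"entity": "A", "type": "travel", "country": "Z", "citizen": False},  # A flagged
    {"entity": "B", "type": "travel", "country": "Y", "citizen": False},  # caught by derived rule
]:
    engine.ingest(ev)

print(sorted(engine.flagged))  # ['A', 'B']
```

A production CEP engine (Esper, Flink CEP, Drools, and the like) would add temporal windows, persistence, and human review of machine-derived rules; the sketch only shows the self-feeding loop, where one rule firing grows the rule set.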
