Ann Cavoukian, Information and Privacy Commissioner of Ontario, Canada, and I released a joint paper on June 8, 2012, entitled “Privacy by Design in the Era of Big Data,” available here.
In this paper, Ann describes her aspirations for Privacy by Design (PbD), and I call out specific PbD-inspired engineering decisions the team and I made during the development of my latest (software) invention.
Internally code-named “G2,” this now 3.5+ year effort started with a full year of working on paper only. No different from rendering detailed architectural plans on paper before building a house, our first year was dedicated to overcoming the performance limitations of our past systems while simultaneously weaving in as many advances in privacy and civil liberties protections as we could fathom at the time.
As a result, I can say that this new technology (something you may one day come to hear about, called Sensemaking) has more privacy- and civil liberties-protecting features than any technology my team and I have ever created in the past. In fact, this technology may have more baked-in privacy- and civil liberties-enhancing features than any advanced analytic software ever engineered. I would love to be wrong about this: starting a fierce competition over “I have more privacy features than you” would be a good thing for planet Earth.
Here are a few highlights from the report that you may find interesting:
While organizations have practical incentives to make the most of their ever growing observation space (the data they have access to), they also have a pressing need to embed in these systems enhanced privacy protections. We outline in this paper just such an example — how an advanced Big Data sensemaking technology was, from the ground up, engineered with privacy-enhancing features. Some of these features are so critical to accuracy that the team decided they should be mandatory — so deeply baked-in they cannot be turned off.
~ page 2
But as technological advances improve our ability to exploit Big Data, potential privacy concerns could stir a regulatory backlash that would dampen the data economy and stifle innovation.
~ page 3
A new class of analytic capability is emerging that one might characterize as “general purpose sensemaking.” These sensemaking techniques integrate new transactions (observations) with previous transactions — much in the same way one takes a jigsaw puzzle piece and locates its companions on the table — and use this context-accumulating process to improve understanding about what is happening right now. Crucially, this process can occur fast enough to permit the user to do something about whatever is happening while it is still happening. Unlike many existing analytic methods that require users to ask questions of systems, these new systems operate on a different principle: the data finds the data, and the relevance finds the user.
~ page 5
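To make the context-accumulating idea in the excerpt above a bit more concrete, here is a minimal, purely illustrative sketch in Python. It is not G2’s implementation — the class, matching rule, and data structures are my own invention for this post — it only shows the flow in which each arriving observation is matched against previously accumulated context so that, in effect, the data finds the data.

```python
# Hypothetical sketch of context accumulation: each new observation is
# matched against previously seen observations on shared features
# (e.g., a common phone number), so records "find" their companions
# incrementally as they arrive.
from collections import defaultdict


class ContextStore:
    def __init__(self):
        self.entities = {}                      # entity_id -> list of observations
        self.feature_index = defaultdict(set)   # (field, value) -> entity_ids
        self.next_id = 0

    def add_observation(self, obs):
        """Integrate a new observation and return the entity it resolved to."""
        features = {(k, v) for k, v in obs.items() if v}
        # Find candidate entities that share at least one feature.
        candidates = set()
        for f in features:
            candidates |= self.feature_index[f]
        # Naive resolution rule (illustrative only): join the candidate
        # sharing the most features, or start a new entity.
        best = max(candidates, default=None,
                   key=lambda e: len(features & self._features_of(e)))
        if best is None:
            best = self.next_id
            self.next_id += 1
            self.entities[best] = []
        self.entities[best].append(obs)
        for f in features:
            self.feature_index[f].add(best)
        return best

    def _features_of(self, entity_id):
        return {(k, v) for o in self.entities[entity_id]
                for k, v in o.items() if v}


store = ContextStore()
e1 = store.add_observation({"name": "J. Smith", "phone": "555-0100"})
e2 = store.add_observation({"name": "Jane Smith", "phone": "555-0100"})
assert e1 == e2  # the second record found its companion via the shared phone
```

In a real system, the matching rules, thresholds, and the privacy controls discussed in the paper (what gets retained, who may query it, and which protections cannot be turned off) matter far more than this toy resolution rule; the point here is only the incremental, context-accumulating flow.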
However, in these new systems the task of ensuring data security and privacy becomes harder as more copies of information are created. Large data stores containing context-accumulated information are more useful not only to their mission holders but also to those with interests in misuse. That is, the more personally identifiable information Big Data systems contain, the greater the potential risk. This risk arises not only from potential misuse of the data by unauthorized individuals, but also from misuse of the system itself.
~ page 7
PbD prescribes that privacy be built directly into the design and operation, not only of technology, but also of how a system is operationalized (e.g., work processes, management structures, physical spaces and networked infrastructure). Today, PbD is widely recognized internationally as the standard for developing privacy compliant information systems. As a framework for effective privacy protection, PbD’s focus is more about encouraging organizations to both drive and demonstrate their commitment to privacy than some strict technical compliance definition.
In short, in the age of Big Data, we strongly encourage technologists engaged in the design and deployment of advanced analytics to embrace PbD as a way to deliver responsible innovation.
~ page 9
In late 2008, Jeff Jonas embarked on an ambitious journey to create a sensemaking-style system. This effort started with overall architecture planning and design specifications. Over the first year of this project, while drafting and redrafting these blueprints, his team worked to embed properties that would enhance, rather than erode, the privacy and civil liberties of data subjects.
To engineer for privacy, his team weighed performance consequences, default settings, and which, if any, PbD features should be so hard-wired into the system they literally cannot be disabled.
Over the year that spanned the preliminary and detailed design, the team created a robust suite of PbD features.
~ page 9
The dynamic pace of technological innovation requires us to protect privacy in a proactive manner in order to better safeguard privacy within our societies. In order to achieve this goal, system designers should be encouraged to practice responsible innovation in the field of advanced analytics.
With this in mind, we strongly encourage those designing and building next-generation analytics of any kind to carry out this work while being informed by Privacy by Design as it relates to personally identifiable data.
~ pages 13-14
One thing is for sure: our PbD efforts are a work in progress – much has yet to be done. Comments and critiques are most welcome.
In the meantime, if you are an engineer, consider becoming a student of privacy yourself and then begin engineering with PbD in mind. And if you are a privacy/civil liberties advocacy type, find yourself an engineer to take under your wing, and be gentle as you nurse these newbies along.
RELATED POSTS:
Big Data Q&A for the Data Protection Law and Policy Newsletter
Sensemaking on Streams – My G2 Skunk Works Project: Privacy by Design (PbD)
G2 | Sensemaking – One Year Birthday Today. Cognitive Basics Emerging.
Responsible Innovation: Designing for Human Rights
Responsible Innovation: Some Things are Best Left Un-invented
Responsible Innovation: Staying Engaged with the Privacy Community
Not the exact right place for it, but the work on G2 and also the privacy by design stuff makes me think of a TV show, "Person of Interest" that could practically be about you, Jeff. Lots of Hollywood BS in the show, but at some level, there is a lot that reminds me of this, I think you would get a kick out of it (although, I suspect I'm not the first to suggest this).
Posted by: Ian Story | June 20, 2012 at 10:00 AM