Today, July 13, 2006, the Markle Foundation Task Force released its third and final report.
Press release available here.
Full report available here.
It has been an absolute honor to be called upon by the Markle Foundation to serve on this task force. Zoë Baird and James Barksdale were the co-chairs, and the members included Robert D. Atkinson, Rand Beers, Eric Benhamou, Jerry Berman, Robert M. Bryant, Ashton B. Carter, Wesley Clark, William P. Crowell, Bryan Cunningham, Jim Dempsey, Mary DeRosa, Sidney D. Drell, Esther Dyson, Amitai Etzioni, David J. Farber, Richard Falkenrath, John Gage, John Gordon, Slade Gorton, Morton H. Halperin, Margaret A. Hamburg, John J. Hamre, Eric H. Holder, Jr., Jeff Jonas, Arnold Kanter, Tara Lemmey, Gilman Louie, John O. Marsh, Jr., Judith A. Miller, James H. Morris, Craig Mundie, Jeffrey H. Smith, Abraham D. Sofaer, James B. Steinberg, Kim Taipale, Rick White and Richard Wilhelm.
Needless to say I met some amazing people and learned a great deal along this journey.
In today’s public release I was asked to speak for the Task Force about the technology-related recommendations. If you are interested in my opening statement, here is the transcript:
"A number of technologies are available that can be used to better connect the right people with the right information and at the same time these technologies can help enforce policy and enhance public trust. In this report the Markle Task Force has highlighted technologies that will improve information sharing and enhance security, while facilitating greater accountability and higher levels of privacy protections. While the report is not intended to be an exhaustive discussion of specific technology or ongoing research, it does provide an overview of certain technologies and approaches that have particular applicability to implementing a trusted information sharing environment. For example, (on page 59) we call for the use of electronic directory services to enable organizations to locate relevant content in the enterprise; much in the same way one uses the card catalog at the library, as opposed to roaming the halls to find the book. The Task Force has never called for the wholesale transfer of data between systems or agencies; rather, we have called for leaving the data with the original holder. The electronic directory services approach enables information to be discovered while avoiding large party-to-party data dumps. This approach simply enables users to discover who has information specifically relevant to their case. Holders of the information can then grant access, based on policy, to each information request. This approach to discoverability delivers on the "need to share" goal by first answering the question "share what with who?" Further (On page 63,) we encourage the use of data anonymization before transfer between systems wherever possible. While this reduces the risk of unintended disclosure of any transferred information being later stolen and repurposed, it also enhances overall privacy, as personally identifiable information is no longer being exchanged in a human readable form. Notable, we prefer anonymization over encryption (when possible), the difference being encrypted data can be decrypted, whereas anonymized data can only be practically unlocked by requesting the human readable record from the original data holder. Again, information transfer is minimized. (On page 70,) The Task Force also calls for the use of Immutable Audit Logs. This type of technology is intended to permanently record, in a tamper resistant manner, how users have used a system. Even corrupt database administrators cannot alter history. Immutable logs can increase security, build trust among users, measure compliance with policies and guidelines, and improve transparency and the ability to conduct oversight by appropriate stakeholders. This concept is more fully spelled out in a free stranding paper published by the Task Force and available on the Markle web site [here]. Today’s report also calls attention to other technology that will be useful to improve collaboration, security and trust. (Between pages 57 and 71,) We cover standards for greater interoperability, improved analyst tools to organize, visualize and make better sense of the information at hand, subscription-based processes that enable users to be alerted when something becomes relevant (e.g., a watch listed party they are preparing to investigate is later removed from the watch list). Equally important, we cover information rights management – technology that enables the owner of data to control what a recipient can do with the data, much in the same way a PDF can be created without enabling the recipient to print it. 
Improved collaboration tools are discussed, as these are seen as essential to improving communication between specific individuals working on specific problems. Strong user authentication and encryption of data both in transit and at rest are suggested as well. And last but not least, we encourage the use of automated tools to detect users of the information sharing environment who are using the system in a manner inconsistent with policy. We have repeatedly stressed in our reports that technologies and policies must be developed together. By designing systems and employing technologies with features that support and enforce policy, information sharing environment designers can help foster trust that automated systems and their users are conforming to governing laws, rules, and guidelines. All this being said, the Task Force recognizes that technology alone cannot ensure that the information sharing environment is effective, secure, or protective of privacy and civil liberties."
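For readers who think in code, here are a few small sketches of the ideas above. They are my own after-the-fact simplifications, not designs from the report; every class name, field, rule and threshold below is an illustrative assumption. First, the directory-services idea: the index holds only pointers to who has information about a subject, and every actual disclosure remains a separate, policy-checked request back to the original holder.

```python
# Sketch of an electronic directory service: the index holds pointers
# (which holder has records about a subject), never the records themselves.
# All names and the policy rule are hypothetical.
from collections import defaultdict

class DirectoryService:
    def __init__(self):
        # subject identifier -> set of holder names (pointers only)
        self._index = defaultdict(set)

    def register(self, holder_name, subject_id):
        """A data holder advertises that it has something on subject_id."""
        self._index[subject_id].add(holder_name)

    def who_has(self, subject_id):
        """Discovery step: returns holders, not data ("share what with whom?")."""
        return sorted(self._index.get(subject_id, set()))

class Holder:
    def __init__(self, name, records, policy):
        self.name, self._records, self._policy = name, records, policy

    def request_access(self, requester, subject_id, purpose):
        """Each disclosure is an explicit, policy-checked request to the holder."""
        if self._policy(requester, purpose):
            return self._records.get(subject_id)
        return None  # denied; the data never left the holder

# Usage: discovery first, then a per-request access decision by the holder.
directory = DirectoryService()
holders = {"Agency A": Holder("Agency A", {"subject-42": "case file ..."},
                              policy=lambda requester, purpose: purpose == "counterterrorism")}
directory.register("Agency A", "subject-42")

for holder_name in directory.who_has("subject-42"):
    record = holders[holder_name].request_access("analyst-7", "subject-42", "counterterrorism")
    print(holder_name, "->", record)
```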
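The anonymization-before-transfer idea can be sketched with a keyed one-way hash: parties exchange and match hashed identifiers, and only the original holder can map a match back to a readable record. This is one simplified way to get the property described above, not the specific technique in the report; the shared key and field handling are assumptions.

```python
# Sketch of anonymization before transfer: exchange keyed one-way hashes of
# identifying fields instead of the fields themselves.  Unlike encryption,
# the receiver cannot "decrypt" these values; resolving a match requires
# going back to the original data holder.  Key management is simplified here.
import hmac
import hashlib

SHARED_KEY = b"example-key-managed-out-of-band"  # assumption for this sketch

def anonymize(value: str) -> str:
    """One-way, keyed hash of a single identifying field."""
    return hmac.new(SHARED_KEY, value.strip().lower().encode(), hashlib.sha256).hexdigest()

# Holder A keeps the mapping from hash back to its own readable record.
holder_a_records = {"john q. public|1970-01-01": "full record held by Holder A"}
holder_a_index = {anonymize(k): k for k in holder_a_records}

# Holder B sends only anonymized values; no readable PII crosses the wire.
transferred = [anonymize("John Q. Public|1970-01-01"),
               anonymize("Someone Else|1999-12-31")]

for h in transferred:
    if h in holder_a_index:
        # A match: the recipient now knows who to ask, and requests the
        # readable record from Holder A under Holder A's policy.
        print("match found; request readable record from Holder A")
    else:
        print("no match; nothing disclosed")
```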
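One common way to get the tamper-evidence property of an immutable audit log is hash chaining, where each entry commits to the one before it, so an after-the-fact edit breaks every later link. The report does not prescribe a particular mechanism; this is simply one illustrative approach, and real deployments would anchor the chain somewhere an administrator cannot reach.

```python
# Sketch of a tamper-evident (hash-chained) audit log: each entry includes the
# hash of the previous entry, so altering history invalidates the chain.
import hashlib
import json
import time

class AuditLog:
    def __init__(self):
        self.entries = []
        self._prev_hash = "0" * 64  # genesis value

    def append(self, user, action):
        entry = {"ts": time.time(), "user": user, "action": action,
                 "prev": self._prev_hash}
        digest = hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()
        entry["hash"] = digest
        self.entries.append(entry)
        self._prev_hash = digest

    def verify(self):
        """Recompute the chain; any edited or deleted entry breaks a link."""
        prev = "0" * 64
        for e in self.entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            recomputed = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
            if e["prev"] != prev or recomputed != e["hash"]:
                return False
            prev = e["hash"]
        return True

log = AuditLog()
log.append("analyst-7", "queried subject-42")
log.append("dba-1", "exported table")
print(log.verify())            # True
log.entries[0]["user"] = "x"   # a corrupt administrator edits history...
print(log.verify())            # False: tampering is detectable
```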
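Finally, the same audit trail can feed the automated misuse detection mentioned in the remarks. A toy rule-based check might flag activity that falls outside a user's declared role or normal volume; the roles, rules and threshold below are deliberate simplifications of my own, not anything specified by the report.

```python
# Toy sketch of automated policy-compliance monitoring: scan audit events and
# flag usage inconsistent with a user's declared role.  Illustrative only.
from collections import Counter

ALLOWED_ACTIONS = {
    "analyst": {"query", "view"},
    "admin":   {"backup", "configure"},
}
BULK_QUERY_THRESHOLD = 100  # queries per day treated as anomalous in this sketch

def flag_violations(events):
    """events: iterable of dicts like {"user": ..., "role": ..., "action": ...}"""
    flags = []
    query_counts = Counter()
    for e in events:
        if e["action"] not in ALLOWED_ACTIONS.get(e["role"], set()):
            flags.append((e["user"], f"action '{e['action']}' outside role '{e['role']}'"))
        if e["action"] == "query":
            query_counts[e["user"]] += 1
    for user, n in query_counts.items():
        if n > BULK_QUERY_THRESHOLD:
            flags.append((user, f"{n} queries in one day"))
    return flags

events = [{"user": "dba-1", "role": "admin", "action": "query"},
          {"user": "analyst-7", "role": "analyst", "action": "view"}]
print(flag_violations(events))  # [('dba-1', "action 'query' outside role 'admin'")]
```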
> Equally important, we cover information rights
> management – technology that enables the owner of
> data to control what a recipient can do with the
> data, much in the same way a PDF can be created
> without enabling the recipient to print it.
This sounds remarkably like snake oil, and reading page 75 of the report doesn't shed any more light on a safe way to do this. IRM technology tends to fail as soon as someone takes an interest in attacking it... and once someone has figured out how to break it once, they can share the information far and wide.
Why propose a privacy "solution" that doesn't stand a chance against someone willing to attack it?
The report sounds interesting. I'm going to read it.
Posted by: Brian | July 14, 2006 at 07:28 AM