Privacy Technologies
David M. Raab
DM Review
March, 2005

Data privacy matters a lot to some people and hardly at all to others. But computer systems can’t be so flexible; they need consistent rules. Since the people who care about privacy include some lawmakers, some of those rules are dictated by government. Other rules are driven by consumer preference, as interpreted by businesspeople whose job depends in part on making their customers happy. Some organizations even include ethical considerations in their policies.

Many privacy rules reflect a set of “fair information practices” first drafted in the 1970s. (The Center for Democracy & Technology has a good overview at www.cdt.org/privacy/guide. Also see www.cdt.org/security/guidelines for a look at privacy rules by industry.) Although formulations vary, the basic components are:

– notifying the person (a.k.a. “data subject”) that data is being collected

– giving the data subject choices about what is collected and how it may be used

– limiting what is collected, how it may be used, and how long it’s retained to what’s needed for the original purpose

– ensuring the data is accurate, which includes allowing the data subject to review the data and make corrections

– providing reasonable security from unauthorized access or modifications, and ability to verify compliance through audit trails
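The practices above amount to metadata that must travel with each collected record: when notice was given, what uses were consented to, why the data was collected, how long it may be kept, and who changed it. As a rough sketch only (the field names here are illustrative assumptions, not drawn from any standard or product), such a record might look like:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class SubjectRecord:
    """Illustrative per-subject record carrying fair-practice metadata."""
    subject_id: str
    data: dict                 # the personal data itself
    notified_on: date          # notice: when the subject was told of collection
    collected_for: str         # purpose limitation: the original purpose
    consented_uses: set = field(default_factory=set)   # choice
    retain_until: date = date.max                      # retention limit
    audit_trail: list = field(default_factory=list)    # verifiable compliance

    def correct(self, key, new_value, who):
        """Accuracy: let the subject review and fix a value, leaving an audit entry."""
        old = self.data.get(key)
        self.data[key] = new_value
        self.audit_trail.append((who, key, old, new_value))
```

A correction by the data subject then both updates the value and extends the audit trail, satisfying the accuracy and security items in one step.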

The underlying notion is that a person retains some rights over her information even after she gives it to someone else. This goes beyond the usual security issues related to data access. A company’s price list may be tightly guarded, but the company is still free to treat it however it wants. Protected personal information, on the other hand, can only be used in the ways permitted by government regulations and company privacy policies (which are considered a contract with the customer and thus legally binding, although companies can change them almost at will simply by publishing a new version).

Giving control to the data subject has given rise to some delightfully baroque concepts, such as automated agents that negotiate an exchange of personal data for better service or lower prices. Real-world implementations tend to be simpler, such as gaining access to a Web site in return for registration, getting grocery store discounts if you allow your purchasing habits to be tracked, or choosing whether to receive marketing solicitations in return for nothing at all.

Even these prosaic approaches can raise some challenging questions. How long does a preference apply? What if conflicting preferences are expressed in different channels or at different times? If permission to use one set of data expires, should you revert to another, older version of the same data that you know is incorrect? Is data acquired from an outside source, such as a rented list, subject to the same limits as data acquired directly? What’s the exact line between a permitted operational use and a secondary usage which requires additional permission?

No matter how difficult such questions may be, the answers can ultimately be determined, stated as business rules, and embedded in an application. Even individual preferences are just another variable to check when the application accesses its data.
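As a concrete illustration of treating a preference as “just another variable to check,” the following sketch resolves the conflicts raised above with two assumed business rules: the most recent unexpired preference for a purpose wins regardless of channel, and absence of any preference means no. The class name, fields, and 365-day default are hypothetical, not taken from any regulation or product:

```python
from datetime import datetime, timedelta

class Preference:
    """One recorded opt-in or opt-out by a data subject (illustrative only)."""
    def __init__(self, purpose, allowed, recorded_at, channel, ttl_days=365):
        self.purpose = purpose          # e.g. "marketing_email"
        self.allowed = allowed          # opt-in (True) or opt-out (False)
        self.recorded_at = recorded_at
        self.channel = channel          # e.g. "web", "call_center"
        self.ttl_days = ttl_days        # assumed expiry rule

    def expired(self, now):
        return now >= self.recorded_at + timedelta(days=self.ttl_days)

def may_use(preferences, purpose, now):
    """Most recent unexpired preference for the purpose wins, whatever the
    channel; if none is on file, default to refusing the use."""
    live = [p for p in preferences
            if p.purpose == purpose and not p.expired(now)]
    if not live:
        return False
    return max(live, key=lambda p: p.recorded_at).allowed
```

Whether a newer opt-out really overrides an older opt-in from another channel, and whether silence means refusal, are exactly the policy questions a business must answer before the rule can be coded; the code merely makes the chosen answers explicit and testable.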

But data doesn’t stay in the original application. A hospital may legitimately generate a list of patients with diabetes (say, to survey them about their treatment). But if it sells that list to a drug company for marketing, it’s probably a privacy violation. (Or not: your regulations may vary.) In other words, it matters how the data will be used after it leaves the original system. This is more complicated than the access control built into standard applications and databases, which just checks a list of authorized users. It’s a particular issue when building data warehouses and other analytical systems that aggregate information from multiple systems, often without retaining direct ties to the original source.

How can a system be responsible for what happens to data it no longer controls? A cynic might note the beauty of writing regulations is that you don’t have to worry about such details. A more reasonable answer is that the system isn’t responsible, the organization is. And the organization does have the ability to enforce policies about how its data is used. Sure, policies can be broken by dishonest employees or business partners. But preventing such violations is a normal business activity. The relevant security technologies (identity management, authentication, provisioning, encryption, traffic scanning, Web site auditing, intrusion detection, and so on) are not unique to privacy.

There are, however, two sets of products that can legitimately be considered privacy-specific. One intrepidly attempts to meet the challenge of controlling data after it has left home (easily understood by the parent of any college student). The basic approach is to associate each piece of protected data with the specific rules governing its use. The International Security, Trust & Privacy Alliance (www.istpa.org), an industry-funded group, has put together a framework describing the system services and capabilities needed to do this. It has generated little in the way of actual products, however. A more conventional solution is to enforce data usage policies through some type of application-independent middleware. IBM Tivoli Privacy Manager (www-306.ibm.com/software/tivoli/products/privacy-mgr-e-bus), Computer Associates eTrust Access Control (http://www3.ca.com/Solutions/product.asp?ID=154), Vormetric (www.vormetric.com), Teleran (www.teleran.com), and Synomos (www.synomos.com) offer differing forms of such systems.
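The approach of associating data with its governing rules (often called a “sticky policy”) can be sketched in a few lines. This is a minimal illustration of the idea, not how Tivoli Privacy Manager or any other product listed above actually works; every name below is an assumption. Each protected value carries the purposes it may be used for, every read must declare a purpose, and every decision is logged:

```python
class PolicyViolation(Exception):
    """Raised when data is read for a purpose its policy does not permit."""
    pass

class ProtectedValue:
    """A value bundled with the usage purposes attached at collection time."""
    def __init__(self, value, allowed_purposes):
        self._value = value
        self.allowed_purposes = frozenset(allowed_purposes)

    def read(self, purpose, audit_log):
        # Deny by default: any purpose not attached to the data is refused,
        # and both outcomes leave an audit entry.
        if purpose not in self.allowed_purposes:
            audit_log.append(("DENIED", purpose))
            raise PolicyViolation(f"use for {purpose!r} not permitted")
        audit_log.append(("ALLOWED", purpose))
        return self._value

log = []
diagnosis = ProtectedValue("diabetes", {"treatment", "quality_survey"})
diagnosis.read("quality_survey", log)     # permitted: an original purpose
# diagnosis.read("drug_marketing", log)   # would raise PolicyViolation
```

In the hospital example above, the survey use succeeds while the marketing use is refused and recorded, which is precisely the behavior that plain user-list access control cannot express.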

Digital rights management, which likewise aims to control data from a distance, might also be relevant. But it doesn’t seem to have been applied in this way. In fact, conventional DRM is sometimes considered a threat to privacy (see www.epic.org/privacy/drm).

The second set of products is aimed at helping manage compliance with privacy regulations. These are more planning and tracking tools than actual technical solutions. They help users to organize information about regulations, existing systems, and changes needed for compliance. They then track completion and auditing of the changes and of additional change requirements as these emerge. Vendors include Securesoft Systems (www.securesoftsystems.com), TruSecure (www.trusecure.com), SAFE Risk Management (www.saferms.com) and Preventsys (www.preventsys.com).

A third set of products helps consumers enforce their own privacy by allowing anonymous email and Web commerce, erasing browser cookies, avoiding spyware and viruses, encrypting information, and so on. Being aimed at consumers, they are not part of the enterprise privacy toolkit. But they may be worth examining for ideas to limit the amount of private data stored or shared within enterprise systems.

As the relatively short list of privacy products suggests, specialized technology really isn’t the solution to privacy requirements. General-purpose security technology plays a much larger role, and non-technical concerns such as training and policy development are still more important.

Certainly the public discussion is dominated by lawyers. This may be inevitable since legal compliance is the major motivation for most companies’ privacy efforts. But something is lost when technologists play such a secondary role: it’s no accident that creative, promising work by ISTPA and IBM’s Privacy Research Institute (www.zurich.ibm.com/pri/projects) has had so little impact. In a world where privacy requirements can only be expected to become more common and more complex, technology professionals cannot afford to respond piecemeal to each new regulation. Only a broader, systematic view of the long-term problem will let them keep up. So it’s time for technology managers to take a larger role in the privacy discussion, if only in self-defense.

* * *

David M. Raab is a Principal at Raab Associates Inc., a consultancy specializing in marketing technology and analytics. He can be reached at draab@raabassociates.com.
