Hippocratic Databases

Rakesh Agrawal    Jerry Kiernan    Ramakrishnan Srikant    Yirong Xu
IBM Almaden Research Center
650 Harry Road, San Jose, CA 95120

Permission to copy without fee all or part of this material is granted provided that the copies are not made or distributed for direct commercial advantage, the VLDB copyright notice and the title of the publication and its date appear, and notice is given that copying is by permission of the Very Large Data Base Endowment. To copy otherwise, or to republish, requires a fee and/or special permission from the Endowment.
Proceedings of the 28th VLDB Conference, Hong Kong, China, 2002

Abstract

The Hippocratic Oath has guided the conduct of physicians for centuries. Inspired by its tenet of preserving privacy, we argue that future database systems must include responsibility for the privacy of data they manage as a founding tenet. We enunciate the key privacy principles for such Hippocratic database systems. We propose a strawman design for Hippocratic databases, identify the technical challenges and problems in designing such databases, and suggest some approaches that may lead to solutions. Our hope is that this paper will serve to catalyze a fruitful and exciting direction for future database research.

1 Introduction

"And about whatever I may see or hear in treatment, or even without treatment, in the life of human beings – things that should not ever be blurted out outside – I will remain silent, holding such things to be unutterable" – Hippocratic Oath, 8¹

¹ Translation by Heinrich Von Staden. In a Pure and Holy Way: Personal and Professional Conduct in the Hippocratic Oath. Journal of the History of Medicine and Allied Sciences 51 (1996) 406–408.

The explosive progress in networking, storage, and processor technologies is resulting in an unprecedented amount of digitization of information. It is estimated that the amount of information in the world is doubling every 20 months, and the size and number of databases are increasing even faster [37]. In concert with this dramatic and escalating increase in digital data, concerns about the privacy of personal information have emerged globally [15] [17] [37] [51]. Privacy issues are further exacerbated now that the Internet makes it easy for new data to be automatically collected and added to databases [6] [10] [58] [59] [60].

Privacy is the right of individuals to determine for themselves when, how and to what extent information about them is communicated to others.² Privacy concerns are being fueled by an ever increasing list of privacy violations, ranging from privacy accidents to illegal actions. Of equal concern is the lax security for sensitive data. See Appendix A for some examples of recent privacy violations. Database systems, with their ubiquitous acceptance as the primary tool for information management, are in the middle of this gathering storm.

We suggest that the database community has an opportunity to play a central role in this crucial debate involving the most cherished of human freedoms³ by re-architecting our database systems to include responsibility for the privacy of data as a fundamental tenet. We have been inspired by the privacy tenet of the Hippocratic Oath, and propose that databases that include privacy as a central concern be called Hippocratic databases. We enunciate the key principles for such Hippocratic database systems, distilled from the principles behind current privacy legislation and guidelines. We identify the technical challenges and problems in designing Hippocratic databases, and also outline some approaches that may lead to solutions.
Our hope is that future database research will convert the Hippocratic database vision into reality.

We recognize that technology alone cannot address all of the concerns surrounding a complex issue like privacy. The total solution has to be a goulash of laws, societal norms, markets, and technology [32]. However, by advancing what is technically realizable, we can influence the proportion of the ingredients and the overall quality of the solution. We also recognize that all of the world's data does not live in database systems. We hope that Hippocratic databases will provide additional inducement for privacy-sensitive data to move to its right home. If nothing else, Hippocratic databases can provide guidance for incorporating similar principles in other types of data repositories.

² This definition is attributed to Alan Westin, Professor Emeritus of Public Law and Government, Columbia University.
³ Samuel Warren and Louis Brandeis. The right to privacy. Harvard Law Review 4 (1890) 193–220. See also [2].

The structure of the rest of the paper is as follows. Section 2 discusses current database systems, focusing on the closest related work: statistical databases and secure databases. We define the founding principles of Hippocratic databases in Section 3, and sketch out a strawman design for a Hippocratic database in Section 4. We give a set of technical challenges in Section 5, and conclude with some closing remarks in Section 6.

2 Current Database Systems

In [53], the following two properties are considered fundamental for a database system:

1. The ability to manage persistent data.
2. The ability to access a large amount of data efficiently.

In addition, the following capabilities are said to be found universally:

1. Support for at least one data model.
2. Support for certain high-level languages that allow the user to define the structure of data, access data, and manipulate data.
3. Transaction management, the capability to provide correct, concurrent access to the database by many users at once.
4. Access control, the ability to deny access to data by unauthorized users and the ability to check the validity of the data.
5. Resiliency, the ability to recover from system failures without losing data.

Other database textbooks provide a similar list of the capabilities of a database system [16] [40] [48]. For instance, in [48], the primary goal of a database system is said to be providing an environment that is both convenient and efficient to use in retrieving and storing information. The control of redundancy is stated as an additional capability in [16]. Interestingly, access control is not mentioned in [48], although its authors do discuss integrity constraints.

Clearly, a Hippocratic database will need the capabilities provided by current database systems. However, given the design goals of current database systems, it is not surprising that they fall short in providing a platform for privacy-sensitive applications. In fact, efficiency, though it will continue to be important, may not be the central focus of Hippocratic databases. They may place greater emphasis on consented sharing rather than on maximizing concurrency. The need for the database system to completely forget some data beyond the purpose for which it was collected has interesting implications for current resiliency schemes. There will be new demands on the data definition and query languages, query processing, indexing and storage structures, and access control mechanisms. In short, Hippocratic databases will require us to rethink almost all aspects of current database systems.

2.1 Statistical Databases

The research in statistical databases was motivated by the desire to provide statistical information (sum, count, average, maximum, minimum, pth percentile, etc.) without compromising sensitive information about individuals [1] [47].
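This tension between aggregate utility and individual privacy can be illustrated with output perturbation, one of the technique families surveyed below. A toy sketch in Python; the data, noise scale, and interface are illustrative assumptions, not any particular system's design:

```python
import random

# Hypothetical sensitive attribute, keyed by individual.
salaries = {"alice": 61000, "bob": 58000, "carol": 72000}

def noisy_sum(names, scale=500.0, rng=random.Random(0)):
    """Answer a SUM query with additive random noise (output perturbation),
    blurring the contribution of any single individual."""
    true_sum = sum(salaries[n] for n in names)
    return true_sum + rng.gauss(0.0, scale)

# Two overlapping queries no longer differ by exactly Carol's salary,
# blunting the subtraction ("tracker"-style) attack.
a = noisy_sum(["alice", "bob", "carol"])
b = noisy_sum(["alice", "bob"])
print(round(a - b))  # close to, but generally not exactly, 72000
```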
The proposed techniques can be broadly classified into query restriction and data perturbation. The query restriction family includes restricting the size of query results [13] [18], controlling the overlap among successive queries [14], keeping audit trails of all answered queries and constantly checking for possible compromises [8], suppression of data cells of small size [9], and clustering entities into mutually exclusive atomic populations [61]. The perturbation family includes swapping values between records [12], replacing the original database by a sample from the same distribution [33] [42], adding noise to the values in the database [52] [57], adding noise to the results of a query [4], and sampling the result of a query [11].

Hippocratic databases share with statistical databases the goal of preventing disclosure of private information, and hence some of the techniques developed for statistical databases will find application in Hippocratic databases. However, the class of queries that Hippocratic databases have to contend with is much broader.

2.2 Secure Databases

Whenever sensitive information is exchanged, it must be transmitted over a secure channel and stored securely to prevent unauthorized access. There is extensive literature on access control and encryption that is relevant [12] [38] [45] [46]. Hippocratic databases will also benefit from the work on database security [7] [30]. Of particular interest is the work on multilevel relations in the context of multilevel secure databases [23] [24] [50]. It allows multiple levels of security (e.g., top secret, secret, confidential, unclassified) to be defined and associated with individual attribute values. The security level of a query may be higher or lower than that of individual data items. A query with a lower level of security cannot read a data item requiring a higher level of clearance. On the other hand, a higher security query cannot write a lower security data item.
Two queries having different levels of security can thus generate different answers over the same database. Many of our architectural ideas about Hippocratic databases have been inspired by this work.

3 Founding Principles of a Hippocratic Database

We first present a summary of some of the current privacy regulations and guidelines. Our founding principles are motivated by, and based on, the principles underlying these regulations.

[Figure 1: Strawman Architecture. Diagram omitted; it shows the Privacy Policy governing four stages (Data Collection, Queries, Retention, Offline Tools) with these components: Privacy Metadata Creator and Privacy Metadata store, Privacy Constraint Validator, Data Accuracy Analyzer, Attribute Access Control, Record Access Control, Query Intrusion Detector with its Query Intrusion Model, Data Retention Manager, Data Collection Analyzer, Encryption Support, and an Audit Info Store holding the audit trail.]

authorizations implement many of the external-recipients constraints in the privacy-policies table: shipping cannot access credit-card-info, and charge cannot access book-info or shipping-address.

Trent first designs the privacy policy, and then uses the Privacy Metadata Creator to generate the privacy metadata tables. The mapping from the privacy policy to the privacy-policies table makes use of automated tools. Creating the privacy-authorizations table requires understanding of who should have access to what data, which in turn is constrained by the privacy policy.

Alternate Organizations

The above design assumes that purpose together with attribute completely determines the set of recipients and the retention period. There is also an implicit assumption that the set of attributes collected for a purpose is fixed. These assumptions can be limiting in some situations. Consider a scenario where Bob agrees to give his business phone number for the "contact" purpose. He also consents to the "telemarketing" purpose for his home phone number, but opts out of his business phone number for this purpose.
Now if purpose opt-ins or opt-outs are tracked per person, not per ⟨person, attribute⟩ pair, a query will be able to incorrectly obtain Bob's business phone number for telemarketing purposes. These limitations can sometimes be circumvented by splitting a conceptual purpose into multiple database purposes. In the above example, we would split "telemarketing" into "telemarketing-home" and "telemarketing-business".

If the above assumptions do not generally hold, alternate data organizations may be more suitable. For example, one can create a table with the columns {user, table, attribute, purpose, recipient} that allows any desired combination of attribute, purpose, and recipient for each user. The tradeoff is that the run-time overhead for checking whether a query can access a user's data is likely to be substantially higher. It is also possible to design in-between data organizations. For instance, one can store a set of purposes for each attribute in the record, rather than once per record. In this design, purposes will only require that the set of recipients be fixed, not the set of attributes collected for that purpose.
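The finer-grained bookkeeping described above, one row per ⟨user, table, attribute, purpose⟩ combination, amounts to a membership check at query time. A minimal sketch, with hypothetical table contents following the Bob example:

```python
# Hypothetical per-<user, attribute> opt-in store, as in the alternate
# organization above: one entry per (user, table, attribute, purpose).
opt_ins = {
    ("bob", "customer", "home-phone", "contact"),
    ("bob", "customer", "home-phone", "telemarketing"),
    ("bob", "customer", "business-phone", "contact"),
    # Bob opted OUT of telemarketing for his business phone, so there is
    # no ("bob", "customer", "business-phone", "telemarketing") entry.
}

def may_access(user, table, attribute, purpose):
    """True only if this user consented to this purpose for this attribute."""
    return (user, table, attribute, purpose) in opt_ins

# A telemarketing query may see Bob's home phone but not his business phone.
assert may_access("bob", "customer", "home-phone", "telemarketing")
assert not may_access("bob", "customer", "business-phone", "telemarketing")
```

The run-time cost mentioned above shows up here as one lookup per accessed attribute per user, rather than a single per-record purpose check.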
4.2.2 Data Collection

table | attributes
privacy-policies | purpose, table, attribute, {external-recipients}, retention
privacy-authorizations | purpose, table, attribute, {authorized-users}

Figure 2: Privacy Metadata Schema

table | attributes
customer | purpose, customer-id, name, shipping-address, email, credit-card-info
order | purpose, customer-id, transaction-id, book-info, status

Figure 3: Database Schema

purpose | table | attribute | external-recipients | retention
purchase | customer | name | {delivery-company, credit-card-company} | 1 month
purchase | customer | shipping-address | {delivery-company} | 1 month
purchase | customer | email | empty | 1 month
purchase | customer | credit-card-info | {credit-card-company} | 1 month
purchase | order | book-info | empty | 1 month
registration | customer | name | empty | 3 years
registration | customer | shipping-address | empty | 3 years
registration | customer | email | empty | 3 years
recommendations | order | book-info | empty | 10 years
purchase-circles | customer | shipping-address | empty | 1 year
purchase-circles | order | book-info | {aggregated-all} | 1 year

Figure 4: Privacy-Policies Table

purpose | table | attribute | authorized-users
purchase | customer | customer-id | all
purchase | customer | name | {shipping, charge, customer-service}
purchase | customer | shipping-address | {shipping}
purchase | customer | email | {shipping, customer-service}
purchase | customer | credit-card-info | {charge}
purchase | order | customer-id | all
purchase | order | transaction-id | all
purchase | order | book-info | {shipping}
purchase | order | status | {shipping, customer-service}
registration | customer | customer-id | all
registration | customer | name | {registration, customer-service}
registration | customer | shipping-address | {registration}
registration | customer | email | {registration, customer-service}
recommendations | order | customer-id | {mining}
recommendations | order | transaction-id | {mining}
recommendations | order | book-info | {mining}
purchase-circles | customer | customer-id | {olap}
purchase-circles | customer | shipping-address | {olap}
purchase-circles | order | customer-id | {olap}
purchase-circles | order | book-info | {olap}

Figure 5: Privacy-Authorizations Table

Matching Privacy Policy with User Preferences

Before the user provides any information, the Privacy Constraint Validator checks whether the business' privacy policy is acceptable to the user. The input to the validator is the user's privacy preferences (constraints). In our example, Alice's preference would be to opt out of everything except purchase, and she may have a constraint that purchase information should not be kept for more than 3 months. If, on the other hand, Alice required a retention period of 2 weeks, the database would reject the transaction. Similarly, Bob may opt in for the recommendations purpose but not for the purchase-circles purpose. This interaction may occur using a direct encrypted connection between the database and the user's client [39]. An audit trail of the user's acceptance of the database's privacy policy is maintained in order to address challenges regarding compliance.

Data Insertion

Having checked that the privacy policy does not violate the user's privacy preferences, data is transmitted from the user and stored in the tables. Each record has a special attribute, "purpose", that encodes the set of purposes the user agreed to. In our example, Alice's records would have a single purpose: purchase, while Bob's records would have three purposes: purchase, registration and recommendations. The set of purposes combined with the information in the privacy-authorizations table will be used to restrict access.

Data Preprocessing

The Data Accuracy Analyzer may run some data cleansing functions [19] [41] against the data to check for accuracy either before or after data insertion, thus addressing the Principle of Accuracy. In our example, Alice's address may be checked against a database of street addresses to catch typos in the address or zip code.
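The Privacy Constraint Validator's acceptability check can be sketched as a comparison of the user's constraints against the privacy-policies table. A minimal sketch, assuming a per-(purpose, table, attribute) retention map encoded in months; the encoding and the row subset are assumptions, not the paper's format:

```python
# Simplified privacy-policies rows: (purpose, table, attribute) -> retention
# in months. Values follow Figure 4: purchase data is kept for 1 month.
policy_retention = {
    ("purchase", "customer", "name"): 1,
    ("purchase", "customer", "shipping-address"): 1,
    ("purchase", "customer", "credit-card-info"): 1,
}

def policy_acceptable(opted_in_purposes, max_retention_months):
    """Accept the transaction only if, for every purpose the user opted in
    to, no attribute is retained longer than the user's stated maximum."""
    return all(
        months <= max_retention_months
        for (purpose, _tbl, _attr), months in policy_retention.items()
        if purpose in opted_in_purposes
    )

# Alice accepts up to 3 months' retention for purchase data: acceptable.
assert policy_acceptable({"purchase"}, 3)
# Had Alice required at most 2 weeks (~0.5 month), the database would reject.
assert not policy_acceptable({"purchase"}, 0.5)
```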
4.2.3 Queries

Queries are submitted to the database along with their intended purpose. For example, a query that mines associations to build a recommendation model would be tagged with the purpose "recommendations".

Before Query Execution

A query is only allowed to run if the set of authorized users for that purpose in the privacy-authorizations table includes the user who issued the query. Next, the Attribute Access Control analyzes the query to check whether the query accesses any fields that are not explicitly listed for the query's purpose in the privacy-authorizations table. In our example, if Mallory in the customer-service department issues a query tagged "purchase" that accesses credit-card-info, the query will not be allowed to run, since in Figure 5, for the purchase purpose and attribute credit-card-info, authorized-users consists of only charge, and does not include customer-service.

During Query Execution

For any query, the Record Access Control ensures that only records whose purpose attribute includes the query's purpose will be visible to the query. This is similar to the idea of multilevel relations in multilevel secure databases [24] [50]. In our example, queries tagged "recommendations" will be able to see Bob's set of books but not Alice's, since Alice's purpose attribute only lists purchase.

After Query Execution

To motivate the next component, assume Mallory gives up on trying to get the credit card info, and instead decides to steal the email addresses of all registered users in Mississippi. Unlike her previous attempts, neither the Attribute Access Control nor the Record Access Control will be able to stop the query: customer service regularly accesses the email address in order to respond to questions about order status.
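The two query-time checks, Attribute Access Control against the privacy-authorizations table and Record Access Control against each record's purpose attribute, can be sketched as below. The table excerpt and record layout are simplified assumptions; note that a registration-tagged email query like Mallory's passes both checks, which is what motivates the intrusion detector.

```python
# Excerpt of the privacy-authorizations table (Figure 5) and of records
# carrying the special "purpose" attribute. Contents are illustrative.
authorized_users = {
    ("purchase", "customer", "credit-card-info"): {"charge"},
    ("registration", "customer", "email"): {"registration", "customer-service"},
}

records = [
    {"name": "Alice", "email": "alice@example.com", "purposes": {"purchase"}},
    {"name": "Bob", "email": "bob@example.com", "purposes": {"purchase", "registration"}},
]

def attribute_check(user, purpose, table, attrs):
    """Attribute Access Control: every accessed field must list this user
    for the query's purpose in the privacy-authorizations table."""
    return all(user in authorized_users.get((purpose, table, a), set()) for a in attrs)

def visible_records(purpose):
    """Record Access Control: only records whose purpose attribute
    includes the query's purpose are visible."""
    return [r for r in records if purpose in r["purposes"]]

# Mallory (customer-service) tagged "purchase" cannot read credit-card-info...
assert not attribute_check("customer-service", "purchase", "customer", ["credit-card-info"])
# ...but her "registration"-tagged email query passes both checks, and sees
# only Bob, since Alice never agreed to the registration purpose.
assert attribute_check("customer-service", "registration", "customer", ["email"])
assert [r["name"] for r in visible_records("registration")] == ["Bob"]
```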
However, before the query results are returned, the Query Intrusion Detector is run on the query results to spot queries whose access pattern is different from the usual access pattern for queries with that purpose and by that user. The detector uses the Query Intrusion Model built by analyzing past queries for each purpose and each authorized user. This problem is related to that of intrusion detection [3] [34]. In our example, the profile for queries issued by customer-service and tagged purchase might be that the query only accesses customers whose order status is not "fulfilled", and that customer-service queries cumulatively access fewer than 1000 records a day. Thus Mallory's queries will be flagged as highly suspicious on both counts. An audit trail of all queries is maintained for external privacy audits, as well as to address challenges regarding compliance.

4.2.4 Retention

The Data Retention Manager deletes data items that have outlived their purpose. If a certain data item was collected for a set of purposes, it is kept for the retention period of the purpose with the longest retention time. So Alice's information in the order table will be deleted after 1 month, while Bob's information will be kept for 10 years, since Bob's purposes include both purchase and recommendations.

4.2.5 Other Features

The Data Collection Analyzer examines the set of queries for each purpose to determine if any information is being collected but not used, thus supporting the Principle of Limited Collection. It also determines if data is being kept for longer than necessary, and whether people have unused (unnecessary) authorizations to issue queries with a given purpose, thus supporting the Principles of Limited Retention and Limited Use.
In our example, Trent may initially have given customer-service access to shipping-address; the analyzer would spot that customer-service queries never access that field and suggest to Trent that customer-service may not need access to it.

We assume the standard suite of database security features such as access control [7] [30]. Some data items may work on symmetrically private information retrieval [20] [36]. However, the computational cost of these algorithms is still too high for large-scale deployment.

5.8 Compliance

Universal Logging

Generating audit trails that are in the hands of users could provide an extremely powerful tool for protecting privacy. Consider a scenario where Mallory steals the email addresses stored in Mississippi's database. If even a small fraction of the people whose email addresses were accessed by Mallory's query wondered why their email was accessed long after they made their purchase and contacted Mississippi, Trent would know that there might have been a privacy breach. Trent could then look at the audit trail of queries and might catch Mallory.

The challenge is to provide each user whose data is accessed with a log of that access along with the query reading the data, without paying a large performance penalty. A potential approach might be to use an intermediary who aggregates the logs of many users and provides them access on demand. The database then only has to send the log to a small number of intermediaries rather than to a large number of users.

Tracking Privacy Breaches

Another way Mississippi might track whether it has fallen prey to privacy breaches would be to use fingerprinting [27] [56]. Trent signs up with PrivacyGuard, which inserts some number of "fingerprint" records in Mississippi's database, with emails, telephone numbers and credit card numbers.
If Mallory manages to steal email addresses and sells them, PrivacyGuard would know of the privacy breach in Mississippi as soon as they receive an email sent to a fingerprinted address.

The challenge is to get maximum coverage with the minimum number of fingerprint records. For example, assume that Mallory only sold the emails of those Mississippi customers who bought a certain category of books, since those email addresses were much more valuable to spammers. The percentage of Mississippi's customers who buy books in that category may be quite small, say 1%. Thus inserting fingerprint records with random purchases might be less effective than first identifying the broad categories and then inserting fingerprints based on the category.

6 Closing Remarks

Inspired by the Hippocratic Oath, we presented a vision of database systems that take responsibility for the privacy of data they manage. We enunciated the key privacy principles that such Hippocratic databases should support and presented a strawman design for a Hippocratic database. Finally, we identified the technical challenges and problems posed by the concept of Hippocratic databases.

In the landmark book Code and Other Laws of Cyberspace, Prof. Lawrence Lessig observes that "code is law", and that it is all a matter of code: the software and hardware that rule the Internet. We can architect cyberspace to protect values that we believe are fundamental, or we can architect it to allow those values to disappear. The question for us in the database community is: where do we want to go from here?

Acknowledgment

We would like to thank Don Haderle for suggesting that we focus on the principles underlying the current privacy laws and regulations to drive the design of the Hippocratic database.
A Privacy Violations

Examples of recent privacy accidents involving database systems include:

• Kaiser, a major US health provider, accidentally sent out 858 email messages containing member IDs and responses to questions on various illnesses to the wrong members. (Washington Post, 10 August 2000).

• GlobalHealthtrax, which sells health products online, inadvertently revealed customer names, home phone numbers, bank account, and credit card information of thousands of customers on their Web site. (MSNBC, 19 January 2000).

Examples of ethically questionable behavior include:

• Lotus and Equifax considered joining their credit card and demographic data and selling the results on inexpensive CDs. Similarly, Lexis-Nexis considered making Social Security Numbers available through its online news service. (Laura J. Gurak. Privacy and Persuasion in Cyberspace. 1997).

• Medical Marketing Service advertises a database available to pharmaceutical marketers which includes the names of 4.3 million people with allergies, 923,000 with bladder control problems, and 380,000 who suffer from clinical depression. (www.mmslists.com).

• Boston University has created a private company to sell the data collected for more than 50 years as part of the Framingham Heart Study. Data collected on more than 12,000 people, including medical records and genetic samples, will be sold. (New York Times, 17 June 2000).

• The chain drug stores CVS and Giant Food admitted to making patient prescription records available for use by a direct mail and pharmaceutical company. (Washington Post, 15 February 1998).

An example of illegal action:

• Toysmart.com sold confidential, personal customer information collected on the company web site in violation of its own privacy policy. (www.ftc.gov/opa/2000/07/toysmart.htm).
An example of lax security for sensitive data:

• A researcher at Carnegie Mellon University was able to retrieve the health records of 69% of voters in Cambridge, Massachusetts from a supposedly anonymous database of state employee health insurance claims. (www.consumerreports.org/Special/ConsumerInterest/Reports/0008med0.htm).

References

[1] N. R. Adam and J. C. Wortman. Security-control methods for statistical databases. ACM Computing Surveys, 21(4):515–556, Dec. 1989.
[2] D. Banisar. Privacy and human rights. Electronic Privacy Information Center, 2000.
[3] D. Barbara, J. Couto, and S. Jajodia. ADAM: A testbed for exploring the use of data mining in intrusion detection. SIGMOD Record, 30(4):15–24, 2001.
[4] L. L. Beck. A security mechanism for statistical databases. ACM Transactions on Database Systems, 5(3):316–338, September 1980.
[5] C. J. Bennett. Regulating Privacy: Data Protection and Public Policy in Europe and the United States. Cornell Univ. Press, 1992.
[6] Business Week. Privacy on the Net, March 2000.
[7] S. Castano, M. Fugini, G. Martella, and P. Samarati. Database Security. Addison Wesley, 1995.
[8] F. Chin and G. Ozsoyoglu. Auditing and inference control in statistical databases. IEEE Trans. on Software Eng., SE-8(6):113–139, April 1982.
[9] L. Cox. Suppression methodology and statistical disclosure control. J. Am. Stat. Assoc., 75(370):377–395, April 1980.
[10] L. Cranor, J. Reagle, and M. Ackerman. Beyond concern: Understanding net users' attitudes about online privacy. Technical Report TR 99.4.3, AT&T Labs–Research, April 1999.
[11] D. Denning. Secure statistical databases with random sample queries. ACM Transactions on Database Systems, 5(3):291–315, Sept. 1980.
[12] D. Denning. Cryptography and Data Security. Addison-Wesley, 1982.
[13] D. Denning, P. Denning, and M. Schwartz. The tracker: A threat to statistical database security. ACM Transactions on Database Systems, 4(1):76–96, March 1979.
[14] D. Dobkin, A. Jones, and R. Lipton. Secure databases: Protection against user influence. ACM Transactions on Database Systems, 4(1):97–106, March 1979.
[15] The Economist. The End of Privacy, May 1999.
[16] R. Elmasri and S. B. Navathe. Fundamentals of Database Systems. Benjamin/Cummings, Redwood City, California, 1989.
[17] European Union. Directive on Privacy Protection, October 1998.
[18] I. Fellegi. On the question of statistical confidentiality. J. Am. Stat. Assoc., 67(337):7–18, March 1972.
[19] H. Galhardas, D. Florescu, D. Shasha, E. Simon, and C.-A. Saita. Declarative data cleaning: Language, model, and algorithms. In Proc. of the Int'l Conf. on Very Large Data Bases (VLDB), pages 371–380, 2001.
[20] Y. Gertner, Y. Ishai, E. Kushilevitz, and T. Malkin. Protecting data privacy in private information retrieval schemes. In ACM Symposium on Theory of Computing, pages 151–160, 1998.
[21] H. Hacigumus, B. R. Iyer, C. Li, and S. Mehrotra. Executing SQL over encrypted data in the database-service-provider model. In Proc. of the ACM SIGMOD Conf. on Management of Data, Madison, Wisconsin, June 2002.
[22] H. Hacigumus, B. R. Iyer, and S. Mehrotra. Providing database as a service. In Proc. of the Int'l Conf. on Data Engineering, San Jose, California, March 2002.
[23] S. Jajodia, P. Samarati, M. L. Sapino, and V. S. Subrahmanian. Flexible support for multiple access control policies. ACM Transactions on Database Systems, 26(2):214–260, 2001.
[24] S. Jajodia and R. Sandhu. Polyinstantiation integrity in multilevel relations. In IEEE Symp. on Security and Privacy, 1990.
[25] S. Jajodia and R. S. Sandhu. A novel decomposition of multilevel relations into single-level relations. In IEEE Symp. on Security and Privacy, pages 300–315, 1991.
[26] G. Karjoth, M. Schunter, and M. Waidner. The Platform for Enterprise Privacy Practices: privacy-enabled management of customer data. In 2nd Workshop on Privacy Enhancing Technologies (PET 2002), San Francisco, CA, April 2002.
[27] S. Katzenbeisser and F. A. Petitcolas, editors. Information Hiding Techniques for Steganography and Digital Watermarking. Artech House, 2000.
[28] J. Kaufman, S. Edlund, D. Ford, and C. Powers. The social contract core. In Proc. of the Eleventh Int'l World Wide Web Conference (WWW), Honolulu, Hawaii, May 2002.
[29] J. Kleinberg, C. H. Papadimitriou, and P. Raghavan. On the value of private information. In Proc. 8th Conf. on Theoretical Aspects of Rationality and Knowledge (TARK), 2001.
[30] C. Landwehr. Formal models of computer security. ACM Computing Surveys, 13(3):247–278, 1981.
[31] M. Langheinrich, editor. A P3P Preference Exchange Language 1.0 (APPEL1.0). W3C Working Draft, February 2001.
[32] L. Lessig. Code and Other Laws of Cyberspace. Basic Books, 1999.
[33] C. K. Liew, U. J. Choi, and C. J. Liew. A data distortion by probability distribution. ACM Transactions on Database Systems, 10(3):395–411, 1985.
[34] T. F. Lunt. A survey of intrusion detection techniques. Computers & Security, 12(4):405–418, 1993.
[35] M. Marchiori, editor. The Platform for Privacy Preferences 1.0 (P3P1.0) Specification. W3C Proposed Recommendation, January 2002.
[36] S. K. Mishra. On symmetrically private information retrieval. Master's thesis, Indian Statistical Institute, 2000.
[37] Office of the Information and Privacy Commissioner, Ontario. Data Mining: Staking a Claim on Your Privacy, January 1998.
[38] R. Oppliger. Internet security: Firewalls and beyond. Comm. ACM, 40(5):92–102, May 1997.
[39] Oracle Corporation. Database Encryption in Oracle8i, August 2000.
[40] R. Ramakrishnan and J. Gehrke. Database Management Systems. McGraw-Hill, 2000.
[41] V. Raman and J. M. Hellerstein. Potter's Wheel: An interactive data cleaning system. In Proc. of the Int'l Conf. on Very Large Data Bases (VLDB), pages 381–390, 2001.
[42] S. P. Reiss. Practical data-swapping: The first steps. ACM Transactions on Database Systems, 9(1):20–37, 1984.
[43] M. Rotenberg. The Privacy Law Sourcebook 2000: United States Law, International Law, and Recent Developments. Electronic Privacy Information Center, 2000.
[44] M. Rotenberg. Fair information practices and the architecture of privacy. Stanford Technology Law Review, 1, 2001.
[45] A. Rubin and D. Geer. A survey of Web security. IEEE Computer, 31(9):34–41, Sept. 1998.
[46] B. Schneier. Applied Cryptography. John Wiley, second edition, 1996.
[47] A. Shoshani. Statistical databases: Characteristics, problems and some solutions. In Proc. of the Eighth Int'l Conference on Very Large Data Bases, pages 208–213, Mexico City, Mexico, September 1982.
[48] A. Silberschatz, H. F. Korth, and S. Sudarshan. Database System Concepts. McGraw-Hill, 3rd edition, 1997.
[49] D. X. Song, D. Wagner, and A. Perrig. Practical techniques for searches on encrypted data. In IEEE Symp. on Security and Privacy, Oakland, California, 2000.
[50] P. Stachour and B. Thuraisingham. Design of LDV: A multilevel secure relational database management system. IEEE Trans. Knowledge and Data Eng., 2(2):190–209, 1990.
[51] Time. The Death of Privacy, August 1997.
[52] J. Traub, Y. Yemini, and H. Wozniakowski. The statistical security of a statistical database. ACM Transactions on Database Systems, 9(4):672–679, Dec. 1984.
[53] J. D. Ullman. Principles of Database & Knowledge-Base Systems, volume 1. Computer Science Press, 1988.
[54] J. D. Ullman. Principles of Database & Knowledge-Base Systems, volume 2: The New Technologies. Computer Science Press, 1989.
[55] U.S. Department of Health, Education, and Welfare. Records, Computers and the Rights of Citizens: Report of the Secretary's Advisory Committee on Automated Personal Data Systems, pages xx–xxiii, 1973.
[56] N. R. Wagner. Fingerprinting. In IEEE Symp. on Security and Privacy, pages 18–22, Oakland, California, April 1983.
[57] S. Warner. Randomized response: A survey technique for eliminating evasive answer bias. J. Am. Stat. Assoc., 60(309):63–69, March 1965.
[58] A. Westin. E-commerce and privacy: What net users want. Technical report, Louis Harris & Associates, June 1998.
[59] A. Westin. Privacy concerns & consumer choice. Technical report, Louis Harris & Associates, Dec. 1998.
[60] A. Westin. Freebies and privacy: What net users think. Technical report, Opinion Research Corporation, July 1999.
[61] C. Yu and F. Chin. A study on the protection of statistical databases. In Proc. of the ACM SIGMOD Conf. on Management of Data, pages 169–181, 1977.