Example: Data Mining for the NBA

1. Privacy

  • Prof. Bhavani Thuraisingham, The University of Texas at Dallas
  • March 5, 2008, Lecture #18

2. What is Privacy

  • Medical Community
    • Privacy is about a patient determining what medical information the doctor should release about him/her
  • Financial community
    • A bank customer determines what financial information the bank should release about him/her
  • Government community
    • The FBI collects information about US citizens; however, the FBI determines what information about a citizen it can release to, say, the CIA

3. Some Privacy Concerns

  • Medical and Healthcare
    • Employers, marketers, or others knowing of private medical concerns
  • Security
    • Allowing access to individuals' travel and spending data
    • Allowing access to web surfing behavior
  • Marketing, Sales, and Finance
    • Allowing access to individuals' purchases

4. Data Mining as a Threat to Privacy

  • Data mining gives us facts that are not obvious to human analysts of the data
  • Can general trends across individuals be determined without revealing information about individuals?
  • Possible threats:
    • Combine collections of data and infer information that is private
      • Disease information from prescription data
      • Military action inferred from pizza deliveries to the Pentagon
  • Need to protect associations and correlations between data items that are sensitive or private (see the sketch below)
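
To make the inference threat concrete, here is a minimal sketch with entirely invented records: neither dataset names a diagnosis on its own, but joining pharmacy pickups against a public drug reference reveals one.

```python
# Toy illustration: two datasets that are harmless on their own can,
# when combined, reveal a private fact. All records are invented.

# Dataset 1: pharmacy pickups (no diagnosis listed).
pickups = [
    {"name": "Alice", "drug": "metformin"},
    {"name": "Bob",   "drug": "ibuprofen"},
]

# Dataset 2: a public reference mapping each drug to its typical use.
drug_to_condition = {
    "metformin": "type 2 diabetes",
    "ibuprofen": "pain/inflammation",
}

# The join: together the datasets expose a likely medical condition.
for record in pickups:
    condition = drug_to_condition.get(record["drug"], "unknown")
    print(f'{record["name"]} likely has: {condition}')
```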

5. Some Privacy Problems and Potential Solutions

  • Problem: Privacy violations that result from data mining
    • Potential solution: Privacy-preserving data mining
  • Problem: Privacy violations that result from the inference problem
    • Inference is the process of deducing sensitive information from legitimate responses to user queries
    • Potential solution: Privacy Constraint Processing
  • Problem: Privacy violations due to unencrypted data
    • Potential solution: Encryption at different levels
  • Problem: Privacy violation due to poor system design
    • Potential solution: Develop methodology for designing privacy-enhanced systems

6. Privacy Constraint Processing

  • Privacy constraint processing
    • Based on prior research in security constraint processing
    • Simple Constraint: an attribute of a document is private
    • Content-based constraint: If document contains information about X, then it is private
    • Association-based Constraint: Two or more documents taken together are private; individually each document is public
    • Release constraint: After X is released Y becomes private
  • Augment a database system with a privacy controller for constraint processing (a sketch follows below)
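
A minimal sketch of such a privacy controller, assuming in-memory documents. The names and sample constraints are invented for illustration; the actual design augments a DBMS rather than operating on Python dictionaries.

```python
# A toy privacy controller enforcing the four constraint types above.

class PrivacyController:
    def __init__(self):
        self.private_attrs = {"ssn"}              # simple: attribute is private
        self.sensitive_terms = {"cancer"}         # content-based trigger terms
        self.assoc_sets = [{"doc_a", "doc_b"}]    # private only in combination
        self.release_rules = {"doc_x": "doc_y"}   # after doc_x, doc_y is private
        self.released = set()

    def can_release(self, doc_id, doc):
        # Content-based constraint: a document mentioning a sensitive
        # term is private.
        text = doc.get("text", "").lower()
        if any(term in text for term in self.sensitive_terms):
            return False
        # Association-based constraint: block a release that would
        # complete a set of documents that is private in combination.
        for group in self.assoc_sets:
            if doc_id in group and group - {doc_id} <= self.released:
                return False
        # Release constraint: doc_y becomes private once doc_x is out.
        for trigger, becomes_private in self.release_rules.items():
            if doc_id == becomes_private and trigger in self.released:
                return False
        return True

    def release(self, doc_id, doc):
        # Simple constraint: always strip private attributes on release.
        if not self.can_release(doc_id, doc):
            return None
        self.released.add(doc_id)
        return {k: v for k, v in doc.items() if k not in self.private_attrs}

pc = PrivacyController()
print(pc.release("doc_a", {"text": "routine visit", "ssn": "123-45-6789"}))
print(pc.release("doc_b", {"text": "travel history"}))  # blocked: association
```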

7. Architecture for Privacy Constraint Processing

  • [Architecture diagram: a User Interface Manager and a Constraint Manager holding the privacy constraints front the DBMS and its database; the Query Processor applies constraints during query and release operations, the Update Processor applies constraints during update operations, and the Database Design Tool applies constraints during database design.]

8. Semantic Model for Privacy Control

  • [Semantic model diagram: patient John has diseases Cancer and Influenza, travels frequently, and has an England address (John's address); dark lines/boxes contain private information.]

9. Privacy Preserving Data Mining

  • Prevent useful results from mining
    • Introduce cover stories to give false results
    • Only make a sample of data available so that an adversary is unable to come up with useful rules and predictive functions
  • Randomization
    • Introduce random values into the data and/or results
    • Challenge is to introduce random values without significantly affecting the data mining results
    • Give a range of values for results instead of exact values (a randomization sketch follows after this list)
  • Secure Multi-party Computation
    • Each party knows its own inputs; encryption techniques used to compute final results
    • Rules, predictive functions
  • Approach: only make a sample of data available
    • Limits the ability to learn a good classifier
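
A minimal sketch of the randomization approach, with invented values and an arbitrarily chosen noise scale: each record is perturbed with zero-mean noise, hiding individual values while keeping aggregates such as the mean approximately correct.

```python
# Additive randomization: hide individual values with zero-mean noise
# while aggregates stay approximately correct.
import random

ages = [23, 35, 41, 52, 29, 63, 38, 47]          # true (private) values
noisy = [a + random.gauss(0, 10) for a in ages]  # what the miner sees

true_mean = sum(ages) / len(ages)
noisy_mean = sum(noisy) / len(noisy)

# Individual noisy values can be far off, but the zero-mean noise
# largely cancels in the aggregate.
print(f"true mean {true_mean:.1f}, estimated mean {noisy_mean:.1f}")
```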

10. Cryptographic Approaches for Privacy Preserving Data Mining

  • Secure Multi-party Computation (SMC) for PPDM
    • Mainly used for distributed data mining
    • Provably secure under some assumptions
    • Learned models are accurate
    • Efficient/specific cryptographic solutions have been developed for many distributed data mining problems
    • Mainly the semi-honest assumption (i.e., parties follow the protocol)
    • The malicious model has also been explored recently (e.g., the Kantarcioglu and Kardes paper in this workshop)
    • Many SMC-based PPDM algorithms share common sub-protocols (e.g., dot product, summation; a secure-summation sketch follows below)
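
A minimal sketch of the secure-summation sub-protocol under the semi-honest assumption. In a real protocol each party holds its own input and forwards the masked running total over the network; this single-process loop models only the arithmetic.

```python
# Secure summation via masking: party 0 adds a random mask known only
# to itself, each party adds its private input to the running total as
# it passes around the ring, and party 0 removes the mask at the end.
# No party sees another's input, only masked partial sums.
import random

def secure_sum(private_inputs, modulus=10**9):
    mask = random.randrange(modulus)                # party 0's secret mask
    running = (private_inputs[0] + mask) % modulus  # party 0 starts the ring
    for x in private_inputs[1:]:                    # each party adds its input
        running = (running + x) % modulus
    return (running - mask) % modulus               # party 0 unmasks the sum

# Three parties compute the sum of their private inputs.
print(secure_sum([12, 7, 30]))  # -> 49
```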

11. Cryptographic Approaches for Privacy Preserving Data Mining

  • Drawbacks:
    • Still not efficient enough for very large (e.g., petabyte-sized) datasets
    • Semi-honest model may not be realistic
    • Malicious model is even slower
  • Possible new directions
    • New models that trade off better between efficiency and security
    • Game theoretic / incentive issues in PPDM
    • Combining anonymization and cryptographic techniques for PPDM

12. Perturbation Based Approaches for Privacy Preserving Data Mining

  • Goal: Distort the data while still preserving the properties needed for data mining purposes
  • Additive Based
  • Multiplicative Based
  • Condensation based
  • Decomposition
  • Data Swapping (see the sketch after this list)
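
A minimal sketch of data swapping with invented records: the sensitive column is randomly permuted across records, so its overall distribution (and hence aggregate mining results) is preserved while the link between an individual and a value is broken.

```python
# Data swapping: permute the sensitive column across records.
import random

records = [
    {"zip": "75080", "salary": 60000},
    {"zip": "75081", "salary": 85000},
    {"zip": "75082", "salary": 72000},
    {"zip": "75083", "salary": 95000},
]

salaries = [r["salary"] for r in records]
random.shuffle(salaries)                      # permute the sensitive values
for record, salary in zip(records, salaries):
    record["salary"] = salary                 # same multiset, broken linkage

print(sorted(r["salary"] for r in records))   # distribution is unchanged
```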

13. Perturbation Based Approaches for Privacy Preserving Data Mining

  • Goal: Achieve high data mining accuracy with maximum privacy protection

14. Perturbation Based Approaches for Privacy Preserving Data Mining

  • Privacy is a personal choice, so approaches should be individually adaptable (Liu, Kantarcioglu and Thuraisingham, ICDM 06)

15. Perturbation Based Approaches for Privacy Preserving Data Mining

  • The trend is to make PPDM approaches fit reality
  • We investigated perturbation based approaches with real-world data sets
  • We give an applicability study of the current approaches
    • Liu, Kantarcioglu and Thuraisingham, DKE 07
  • We found that:
    • Reconstructing the original distribution may not work well with real-world data sets
    • Distribution reconstruction is a hard problem and should not be used as an intermediate step
    • We try to modify perturbation techniques and adapt data mining tools accordingly (e.g., Liu, Kantarcioglu and Thuraisingham, novel decision tree, UTD technical report 06)

16. CPT: Confidentiality, Privacy and Trust

  • Before I, as a user of organization A, send data about myself to organization B, I read the privacy policies enforced by organization B
    • If I agree to the privacy policies of organization B, then I will send data about myself to organization B
    • If I do not agree with the policies of organization B, then I can negotiate with organization B
  • Even if the web site states that it will not share private information with others, do I trust the web site?
  • Note: While confidentiality is enforced by the organization, privacy is determined by the user. Therefore, for confidentiality, the organization will determine whether a user can have the data; if so, the organization can further determine whether the user can be trusted

17. Platform for Privacy Preferences (P3P): What is it?

  • P3P is an emerging industry standard that enables web sites to express their privacy practices in a standard format
  • The format of the policies can be automatically retrieved and understood by user agents
  • It is a product of the W3C, the World Wide Web Consortium
  • www.w3c.org
  • When a user enters a web site, the privacy policies of the web site are conveyed to the user; if the privacy policies differ from the user's preferences, the user is notified; the user can then decide how to proceed (a sketch of this flow follows below)
  • Several major corporations are working on P3P standards including
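
A simplified sketch of the user-agent flow described above. Real P3P policies are XML documents with a defined vocabulary; the dictionaries and preference keys below are invented stand-ins for the compare-and-notify step.

```python
# Hypothetical preference keys; real P3P uses an XML policy vocabulary.
user_prefs = {"share_with_third_parties": False, "retain_indefinitely": False}
site_policy = {"share_with_third_parties": True, "retain_indefinitely": False}

# The user agent flags every practice that conflicts with the user's
# preferences and notifies the user.
mismatches = [practice for practice, allowed in user_prefs.items()
              if not allowed and site_policy.get(practice)]

if mismatches:
    print("Site policy differs from your preferences:", mismatches)
    # The user can then decide how to proceed (accept, negotiate, leave).
else:
    print("Site policy matches your preferences.")
```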

18. Platform f