Page 1:

Privacy-Preserving Support Vector Machines via Random Kernels

Olvi Mangasarian, UW Madison & UCSD La Jolla
Edward Wild, UW Madison

November 14, 2015

Page 2:

Vertically Partitioned Data / Horizontally Partitioned Data

[Figure: the m × n data matrix A, with rows indexed by examples 1, …, m and columns indexed by features 1, …, n, split two ways: into column blocks A·1, A·2, A·3 (vertically partitioned) and into row blocks A1, A2, A3 (horizontally partitioned).]

Page 3:

Problem Statement

• Entities with related data wish to learn a classifier based on all of the data
• The entities are unwilling to reveal their data to each other
– If each entity holds a different set of features for all examples, the data is said to be vertically partitioned
– If each entity holds a different set of examples with all features, the data is said to be horizontally partitioned
• Our approach: a privacy-preserving support vector machine (PPSVM) using random kernels
– Provides accurate classification
– Does not reveal private information

Page 4:

Outline

• Support vector machines (SVMs)

• Reduced and random kernel SVMs

• Privacy-preserving SVM for vertically partitioned data

• Privacy-preserving SVM for horizontally partitioned data

• Summary

Page 5:

Support Vector Machines

[Figure: + and − training points separated by the nonlinear surface K(x′, A′)u = γ, which lies between the bounding surfaces K(x′, A′)u = γ + 1 and K(x′, A′)u = γ − 1.]

K(A+, A′)u ≥ eγ + e
K(A−, A′)u ≤ eγ − e

• x ∈ R^n
• The SVM is defined by the parameters u and the threshold γ of the nonlinear surface
• A contains all data points: {+ … +} ⊂ A+, {− … −} ⊂ A−
• e is a vector of ones
• The slack variable y ≥ 0 allows points to be on the wrong side of the bounding surfaces

Minimize e′s (equal to ||u||_1 at the solution) to reduce overfitting
Minimize e′y (the hinge loss, i.e., the plus function max{·, 0}) to fit the data

Linear kernel: (K(A, B′))_ij = (AB′)_ij = A_i B′·j = K(A_i, B′·j)
Gaussian kernel with parameter μ: (K(A, B′))_ij = exp(−μ ||A_i′ − B′·j||²)
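The formulation above is a linear program. Below is a minimal sketch, not the authors' code, of the 1-norm SVM with a random Gaussian kernel solved via scipy's linprog; the names (gaussian_kernel, fit_svm) and the parameter values (mu, nu) are illustrative assumptions.

```python
# Minimal 1-norm SVM sketch: min e's + nu*e'y  s.t.  D(K(A,B')u - e*gamma) + y >= e,
# -s <= u <= s, y >= 0.  Illustrative only; not the authors' implementation.
import numpy as np
from scipy.optimize import linprog

def gaussian_kernel(A, B, mu=1.0):
    # (K(A, B'))_ij = exp(-mu * ||A_i - B_j||^2)
    return np.exp(-mu * ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1))

def fit_svm(K, d, nu=1.0):
    # Variables x = [u (k), gamma (1), y (m), s (k)]; D = diag(d), d in {+1, -1}.
    m, k = K.shape
    c = np.concatenate([np.zeros(k), [0.0], nu * np.ones(m), np.ones(k)])
    # d_i*(K_i u - gamma) + y_i >= 1  rewritten as  -d_i*K_i u + d_i*gamma - y_i <= -1
    A1 = np.hstack([-d[:, None] * K, d[:, None], -np.eye(m), np.zeros((m, k))])
    A2 = np.hstack([np.eye(k), np.zeros((k, 1 + m)), -np.eye(k)])   #  u - s <= 0
    A3 = np.hstack([-np.eye(k), np.zeros((k, 1 + m)), -np.eye(k)])  # -u - s <= 0
    b = np.concatenate([-np.ones(m), np.zeros(2 * k)])
    bounds = [(None, None)] * (k + 1) + [(0, None)] * (m + k)       # y, s >= 0
    res = linprog(c, A_ub=np.vstack([A1, A2, A3]), b_ub=b, bounds=bounds)
    return res.x[:k], res.x[k]                                      # u, gamma

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 5))                  # data points, one per row
d = np.where(A[:, 0] > 0, 1.0, -1.0)               # +/- class labels
B = rng.standard_normal((10, 5))                   # random kernel matrix B
u, gamma = fit_svm(gaussian_kernel(A, B), d)
pred = np.sign(gaussian_kernel(A, B) @ u - gamma)  # classifier sign(K(x', B')u - gamma)
print("training accuracy:", (pred == d).mean())
```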

Page 6:

Support Vector Machine → Reduced Support Vector Machine → Random Reduced Support Vector Machine

• Reduced SVM (Lee & Mangasarian, 2001): replace the kernel matrix K(A, A′) with K(A, Ā′), where Ā consists of a randomly selected subset of the rows of A
• Random reduced SVM (Mangasarian & Thompson, 2006): replace the kernel matrix K(A, A′) with K(A, B′), where B is a completely random matrix

Using the random kernel K(A, B′) is the key result enabling a simple and accurate privacy-preserving SVM.
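As a small illustration of the difference between the two reductions (variable names are ours, not from the slides):

```python
# Reduced vs. random kernel for the linear case K(A, B') = AB'.
import numpy as np
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 10))                 # all data points
Abar = A[rng.choice(200, size=20, replace=False)]  # reduced: random subset of rows of A
B = rng.standard_normal((20, 10))                  # random: completely random matrix
K_reduced = A @ Abar.T                             # K(A, Abar') -- L&M 2001
K_random = A @ B.T                                 # K(A, B')    -- M&T 2006
print(K_reduced.shape, K_random.shape)             # both 200 x 20, vs. 200 x 200 for AA'
```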

Page 7:

Error of Random Kernels is Comparable to Full Kernels: Linear Kernels

[Figure: scatter plot of random kernel AB′ error (vertical axis) against full kernel AA′ error (horizontal axis); the diagonal line marks equal error for random and full kernels. Each point represents one of 7 datasets from the UCI repository.]

B is a random matrix with the same number of columns as A and 10% as many rows, so dim(AB′) << dim(AA′).

Page 8:

Error of Random Kernels is Comparable to Full Kernels: Gaussian Kernels

[Figure: scatter plot of random kernel K(A, B′) error (vertical axis) against full kernel K(A, A′) error (horizontal axis).]

Page 9:

Vertically Partitioned Data: Each entity holds different features for the same examples

[Figure: the data matrix A split into column blocks A·1, A·2, A·3, with each entity holding one block.]

Page 10:

Serial Secure Computation of the Linear Kernel AA′ (Yu-Vaidya-Jiang 2006)

Entity 1 masks its local Gram matrix with a random matrix R1 and passes the sum along the chain; each subsequent entity adds its own block:

A·1A·1′ + R1
(A·1A·1′ + R1) + A·2A·2′
((A·1A·1′ + R1) + A·2A·2′) + A·3A·3′
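A toy numpy simulation of the chain may help. The final unmasking step, in which entity 1 subtracts R1 after the last addition, is our assumption about how the protocol completes; the slide shows only the accumulation.

```python
# Serial masked computation of AA' = A.1 A.1' + A.2 A.2' + A.3 A.3'
# for vertically partitioned blocks.  Toy simulation; unmasking step is assumed.
import numpy as np
rng = np.random.default_rng(1)
m = 6
A1, A2, A3 = (rng.standard_normal((m, k)) for k in (3, 2, 4))  # private blocks A.j

R1 = rng.standard_normal((m, m))      # entity 1's random mask
msg = A1 @ A1.T + R1                  # entity 1 sends  A.1 A.1' + R1
msg = msg + A2 @ A2.T                 # entity 2 adds   A.2 A.2'
msg = msg + A3 @ A3.T                 # entity 3 adds   A.3 A.3'
K = msg - R1                          # assumed: entity 1 removes its mask

assert np.allclose(K, np.hstack([A1, A2, A3]) @ np.hstack([A1, A2, A3]).T)
```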

Page 11:

Our Parallel Secure Computation of the Random Linear Kernel AB′

Each entity j computes A·jB·j′ locally; all entities can do so in parallel, and the blocks combine to give the random kernel:

AB′ = A·1B·1′ + A·2B·2′ + A·3B·3′
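This works because, with matching column partitions of A and B, the linear kernel decomposes as AB′ = A·1B·1′ + … + A·qB·q′, so every entity can compute and publish its term independently. A minimal numpy check (names are illustrative):

```python
# Parallel computation of the random linear kernel AB' from per-entity shares.
import numpy as np
rng = np.random.default_rng(2)
cols = (3, 2, 4)                                           # features per entity
A_blocks = [rng.standard_normal((6, k)) for k in cols]     # private A.j
B_blocks = [rng.standard_normal((5, k)) for k in cols]     # private random B.j
shares = [Aj @ Bj.T for Aj, Bj in zip(A_blocks, B_blocks)] # published A.j B.j'
K = sum(shares)                                            # AB', computed in parallel
assert np.allclose(K, np.hstack(A_blocks) @ np.hstack(B_blocks).T)
```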

Page 12:

Privacy-Preserving SVMs for Vertically Partitioned Data via Random Kernels

• Each of q entities privately owns a block of data A·1, …, A·q that it is unwilling to share with the others
• Each entity j picks its own random matrix B·j and distributes K(A·j, B·j′) to the other q − 1 entities
• K(A, B′) = K(A·1, B·1′) ⊕ … ⊕ K(A·q, B·q′)
– ⊕ is + for the linear kernel
– ⊕ is the Hadamard (element-wise) product for the Gaussian kernel (verified numerically in the sketch below)
• A new point x = (x1′, …, xq′)′ can be distributed among the entities, which similarly compute K(x′, B′) = K(x1′, B·1′) ⊕ … ⊕ K(xq′, B·q′)
• Recovering A·j from K(A·j, B·j′) without knowing B·j is essentially impossible
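The element-wise combination works for the Gaussian kernel because the squared distance separates across feature blocks: exp(−μ||A_i − B_j||²) = ∏_l exp(−μ||A_i^(l) − B_j^(l)||²). A small numpy check of this identity (our illustration, not the authors' code):

```python
# Hadamard combination of per-entity Gaussian kernels equals the full kernel.
import numpy as np
rng = np.random.default_rng(3)
mu, cols = 0.5, (3, 2, 4)
A_blocks = [rng.standard_normal((6, k)) for k in cols]   # private A.j
B_blocks = [rng.standard_normal((5, k)) for k in cols]   # private random B.j

def gauss(A, B):
    return np.exp(-mu * ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1))

K = np.ones((6, 5))
for Aj, Bj in zip(A_blocks, B_blocks):
    K *= gauss(Aj, Bj)                                   # entity j contributes K(A.j, B.j')
assert np.allclose(K, gauss(np.hstack(A_blocks), np.hstack(B_blocks)))
```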

Page 13:

Results for PPSVM on Vertically Partitioned Data

• Compare classifiers that share feature data with classifiers that do not share
– Seven datasets from the UCI repository
• Simulate situations in which each entity has only a subset of the features
– In the first situation, the features are evenly divided among 5 entities
– In the second situation, each entity receives about 3 features

Page 14:

Error Rate of Sharing Data is Generally Better than Not Sharing: Linear Kernels

[Figure: scatter plot of error rate with sharing (vertical axis) against error rate without sharing (horizontal axis); the 7 datasets are represented by two points each.]

Page 15:

Error Rate of Sharing Data is Generally Better than Not Sharing: Nonlinear Kernels

[Figure: scatter plot of error rate with sharing (vertical axis) against error rate without sharing (horizontal axis).]

Page 16:

Horizontally Partitioned Data: Each entity holds different examples with the same features

[Figure: the data matrix A split into row blocks A1, A2, A3, with each entity holding one block.]

Page 17:

Privacy-Preserving SVMs for Horizontally Partitioned Data via Random Kernels

• Each of q entities privately owns a block of data A1, …, Aq that it is unwilling to share with the other q − 1 entities
• The entities all agree on the same random matrix B and distribute K(Aj, B′) to all entities
• K(A, B′) is then obtained by stacking the blocks K(A1, B′), …, K(Aq, B′) row-wise (see the sketch below)
• Aj cannot be recovered uniquely from K(Aj, B′)
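A minimal numpy check that stacking the per-entity blocks reproduces K(A, B′), shown here for the linear kernel (names are illustrative):

```python
# Horizontally partitioned data: K(A, B') is the row-wise stack of K(A_j, B').
import numpy as np
rng = np.random.default_rng(4)
n = 7                                                        # shared feature count
A_blocks = [rng.standard_normal((m, n)) for m in (4, 3, 5)]  # private row blocks A_j
B = rng.standard_normal((3, n))                              # agreed-upon random B
K = np.vstack([Aj @ B.T for Aj in A_blocks])                 # stacked K(A_j, B')
assert np.allclose(K, np.vstack(A_blocks) @ B.T)
```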

Page 18:

Privacy Preservation: Infinite Number of Solutions for Ai given AiB′

• Given
– the random matrix B ∈ R^{k×n}, with fewer rows than columns (k < n)
– Pi = AiB′
• Consider an attempt to solve for row r of Ai, 1 ≤ r ≤ mi, from the equation BAir′ = Pir′, with Air′ ∈ R^n
– Every square submatrix of the random matrix B is nonsingular
– There are at least n distinct solutions for each row
• Thus there are at least n^mi solutions Ai to the equation BAi′ = Pi′
• If each entity has 20 points in R^30, there are at least 30^20 solutions
• Furthermore, each of the infinite number of matrices in the affine hull of these solutions is also a solution
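The non-uniqueness is easy to demonstrate numerically: perturbing a row by any null-space vector of B yields a different "solution" that produces exactly the same published share. A sketch, assuming B ∈ R^{k×n} with k < n as above:

```python
# A row A_ir' cannot be recovered uniquely from P_ir' = B A_ir' when k < n.
import numpy as np
from scipy.linalg import null_space
rng = np.random.default_rng(5)
k, n = 3, 30
B = rng.standard_normal((k, n))        # random matrix with k < n
a_true = rng.standard_normal(n)        # a private row A_ir'
p = B @ a_true                         # the value an attacker sees
Z = null_space(B)                      # basis of the (n - k)-dim null space of B
for _ in range(3):
    a_fake = a_true + Z @ rng.standard_normal(n - k)
    assert np.allclose(B @ a_fake, p)  # indistinguishable from a_true
```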

Page 19:

Results for PPSVM on Horizontally Partitioned Data

• Compare classifiers that share examples with classifiers that do not share
– Seven datasets from the UCI repository
• Simulate a situation in which each entity has only a subset of about 25 examples

Page 20:

Error Rate of Sharing Data is Better than Not Sharing: Linear Kernels

[Figure: scatter plot of error rate with sharing (vertical axis) against error rate without sharing (horizontal axis).]

Page 21:

Error Rate of Sharing Data is Better than Not Sharing: Gaussian Kernels

[Figure: scatter plot of error rate with sharing (vertical axis) against error rate without sharing (horizontal axis).]

Page 22:

Summary

• Privacy-preserving SVM for vertically or horizontally partitioned data
– Based on the random kernel K(A, B′)
– Learns a classifier using all the data, but without revealing the privately held data
– Classification accuracy is better than that of an SVM trained without sharing, and comparable to that of an SVM in which all data is shared