Keywords (6)
k-means clustering
Learning Model
Principal Component Analysis
Sampling Error
Value Function
Statistical Query
Related Publications (36)
Revealing information while preserving privacy
Privacy-Preserving Datamining on Vertically Partitioned Databases
Differential Privacy
Privacy-Preserving Data Mining
Calibrating Noise to Sensitivity in Private Data Analysis
Practical privacy: the SuLQ framework
(Citations: 129)
Avrim Blum, Cynthia Dwork, Frank McSherry, Kobbi Nissim
We consider a statistical database in which a trusted administrator introduces noise to the query responses with the goal of maintaining privacy of individual database entries. In such a database, a query consists of a pair (S, f), where S is a set of rows in the database and f is a function mapping database rows to {0, 1}. The true answer is Σ_{i∈S} f(d_i), and a noisy version is released as the response to the query. Results of Dinur, Dwork, and Nissim show that a strong form of privacy can be maintained using a surprisingly small amount of noise (much less than the sampling error), provided the total number of queries is sublinear in the number of database rows. We call this query and (slightly) noisy reply the SuLQ (Sub-Linear Queries) primitive. The assumption of sublinearity becomes reasonable as databases grow increasingly large. We extend this work in two ways. First, we modify the privacy analysis to real-valued functions f and arbitrary row types, as a consequence greatly improving the bounds on noise required for privacy. Second, we examine the computational power of the SuLQ primitive. We show that it is very powerful indeed, in that slightly noisy versions of the following computations can be carried out with very few invocations of the primitive: principal component analysis, k-means clustering, the Perceptron Algorithm, the ID3 algorithm, and (apparently!) all algorithms that operate in the statistical query learning model [11].
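The SuLQ primitive described above, a noisy sum of a {0, 1}-valued function over a set of rows, can be sketched roughly as follows. This is a minimal illustration, not the paper's algorithm: the function names are invented here, and the choice of Laplace noise and its calibration are assumptions for the sake of a runnable example.

```python
import math
import random

def laplace_noise(scale):
    # Sample Laplace noise by inverting its CDF (illustrative helper).
    u = random.random() - 0.5
    return -scale * math.copysign(math.log(1 - 2 * abs(u)), u)

def sulq_query(rows, f, epsilon=1.0):
    """Answer a query (S, f): the true sum of f over the rows, released
    with noise. When f maps to {0, 1}, changing one row shifts the sum
    by at most 1, so Laplace noise of scale 1/epsilon masks any single
    row's contribution. (A sketch; the paper's calibration is more
    refined and the original analysis predates this exact mechanism.)"""
    true_answer = sum(f(row) for row in rows)
    return true_answer + laplace_noise(1.0 / epsilon)
```

For instance, `sulq_query(rows, lambda r: 1 if r["age"] >= 50 else 0)` releases a noisy count of rows with age at least 50; many of the data-mining tasks listed above reduce to sequences of such noisy sums.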
Conference: Symposium on Principles of Database Systems (PODS), pp. 128-138, 2005
DOI: 10.1145/1065167.1065184
Citation Context (104)
...The problem of statistical disclosure control (revealing accurate statistics about a set of respondents while preserving the privacy of individuals) has a venerable history, with an extensive literature spanning statistics, theoretical computer science, security, databases, and cryptography (see, for example, the excellent survey [1], the discussion of related work in [2] and the Journal of Official Statistics 9 to confidentiality and ...
...This is a powerful observation; for example, using only noisy sum queries, it is possible to carry out many standard data-mining tasks, such as singular value decompositions, finding an ID3 decision tree, clustering, learning association rules, and learning anything learnable in the statistical queries learning model, frequently with good accuracy, in a privacy-preserving fashion [2]...
...As an example of "private programming" [2], consider k-means clustering, described first in its usual, non-private form...
Cynthia Dwork. A firm foundation for private data analysis
...In particular, while differentially private data publishing has recently gained favor for static tables [2], [10], [12], [17], [18], [28], applying differential privacy to dynamic anonymization may be problematic for the same reason repeated queries to a differentially anonymizing system are problematic — intuitively, the noise that must be added grows, since adversaries can use differencing to detect and remove the anonymizing noise [4], ...
Yeye He, et al. Preventing equivalence attacks in updated, anonymized data
...For example, if c(D) is the fraction of database rows that satisfy some property — a counting query — then it is known that we can take A(D) to equal c(D) plus random Laplacian noise with standard deviation O(1/(εn)), where n is the number of rows in the database and ε is the measure of differential privacy [5]...
...A sequence of works [7,13,5,11] has provided a very good understanding of differential privacy in an interactive model in which real-valued queries c are made and answered one at a time...
...The SuLQ framework of [5] is a well-studied, efficient technique for achieving (ε, δ)-differential privacy, with non-synthetic output...
Jonathan Ullman, et al. PCPs and the Hardness of Generating Private Synthetic Data
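The counting-query calibration quoted in the context above — a released fraction c(D) perturbed by Laplace noise of magnitude O(1/(εn)) — can be illustrated with a short sketch. The function name and interface here are assumptions for illustration:

```python
import math
import random

def noisy_fraction(db, predicate, epsilon):
    """Return the fraction of rows satisfying `predicate`, plus Laplace
    noise of scale 1/(epsilon * n). Changing one row moves the fraction
    by at most 1/n, so this scale suffices for epsilon-differential
    privacy. (Illustrative sketch of the mechanism described above.)"""
    n = len(db)
    true_fraction = sum(1 for row in db if predicate(row)) / n
    u = random.random() - 0.5
    noise = -(1.0 / (epsilon * n)) * math.copysign(math.log(1 - 2 * abs(u)), u)
    return true_fraction + noise
```

A Laplace distribution with scale b has standard deviation b·√2, so scale 1/(εn) matches the quoted O(1/(εn)) standard deviation up to a constant.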
...SuLQ was proposed by Dwork et al. [7], [14] as a primitive to achieve differential privacy by adding calibrated noise to the actual result...
...A direct adoption of SuLQ for a distributed environment can be modified from the existing single-database algorithm in [14], [15]...
Ning Zhang, et al. Distributed Data Mining with Differential Privacy
...We demonstrate our claim by showing that Airavat can implement the SuLQ primitive introduced by Blum et al. [8]...
...The input consists of a set of vectors and an initial set of cluster centers. The algorithm proceeds in two steps [8]...
Indrajit Roy, et al. Airavat: Security and Privacy for MapReduce
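The two-step k-means computation referenced in the Airavat context (assign points to nearest centers, then recompute centers) can be assembled from noisy sum queries in the spirit of SuLQ "private programming". The sketch below is a rough illustration only: the function names, the Laplace noise helper, and the per-cluster noise calibration are assumptions, not the paper's algorithm.

```python
import math
import random

def laplace(scale):
    # Sample Laplace noise by inverting its CDF (illustrative helper).
    u = random.random() - 0.5
    return -scale * math.copysign(math.log(1 - 2 * abs(u)), u)

def private_kmeans_step(points, centers, epsilon=1.0):
    """One k-means update built from noisy sum queries: each cluster's
    size and per-coordinate sums are released with Laplace noise, and
    the new center is their ratio. (Hedged sketch; a careful treatment
    would calibrate the noise to the data's coordinate range.)"""
    k, dim = len(centers), len(points[0])
    new_centers = []
    for j in range(k):
        # Assignment step: points whose nearest center is center j.
        members = [p for p in points
                   if min(range(k),
                          key=lambda c: sum((p[d] - centers[c][d]) ** 2
                                            for d in range(dim))) == j]
        # Update step: noisy count and noisy per-coordinate sums.
        noisy_n = len(members) + laplace(1.0 / epsilon)
        noisy_sums = [sum(p[d] for p in members) + laplace(1.0 / epsilon)
                      for d in range(dim)]
        if noisy_n > 1:
            new_centers.append([s / noisy_n for s in noisy_sums])
        else:
            new_centers.append(centers[j])  # too few points: keep old center
    return new_centers
```

Only the noisy count and noisy sums touch the data, so each iteration costs a constant number of primitive invocations per cluster, which is the source of SuLQ's power on this task.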
References (20)
Timing Attacks on Implementations of Diffie-Hellman, RSA, DSS, and Other Systems (Citations: 931)
Paul C. Kocher
Conference: International Cryptology Conference (CRYPTO), pp. 104-113, 1996
Revealing information while preserving privacy (Citations: 177)
Irit Dinur, Kobbi Nissim
Conference: Symposium on Principles of Database Systems (PODS), pp. 202-210, 2003
Efficient noise-tolerant learning from statistical queries (Citations: 136)
Michael J. Kearns
Journal: Journal of the ACM (JACM), vol. 45, no. 6, pp. 983-1006, 1998
Privacy-Preserving Data Mining (Citations: 1030)
Rakesh Agrawal, Ramakrishnan Srikant
Journal: SIGMOD Record, vol. 29, no. 2, pp. 439-450, 2000
Induction of decision trees (Citations: 5265)
J. Ross Quinlan
Journal: Machine Learning (ML), vol. 1, no. 1, pp. 81-106, 1986
Citations (129)
A firm foundation for private data analysis (Citations: 17)
Cynthia Dwork
Journal: Communications of the ACM (CACM), vol. 54, no. 1, pp. 86-95, 2011
Preventing equivalence attacks in updated, anonymized data
Yeye He, Siddharth Barman, Jeffrey F. Naughton
Conference: International Conference on Data Engineering (ICDE), pp. 529-540, 2011
Provably Private Data Anonymization: Or, k-Anonymity Meets Differential Privacy
Ninghui Li, Wahbeh H. Qardaji, Dong Su
Journal: Computing Research Repository (CoRR), vol. abs/1101.2, 2011
PCPs and the Hardness of Generating Private Synthetic Data
Jonathan Ullman, Salil P. Vadhan
Conference: Theory of Cryptography, pp. 400-416, 2011
Distributed Data Mining with Differential Privacy
Ning Zhang, Ming Li, Wenjing Lou
Published in 2011.