Differential privacy is a technology
that allows Apple and other companies to analyze user data and learn about the
user community without learning anything about the individuals in that community.
Differential privacy keeps user data anonymous by transforming the information
before it is shared with Apple.
What is Differential Privacy in iPhone and How Can It Protect Your Data?
What is Differential Privacy?
Differential privacy is a mathematical and technical approach for publicly
sharing information about a dataset while limiting the disclosure of the
private information recorded in the database.
The differential privacy technique
maximizes the accuracy of queries from statistical databases while minimizing
the chances of identifying individual iPhone users, increasing the privacy of
the personal habits and personal data of Apple users.
What is Differential Privacy on iPhone?
Differential privacy is a mathematical definition of privacy originally
developed by Dwork, McSherry, Nissim, and Smith, with significant
contributions from many others over the years.
At its Worldwide Developers Conference (WWDC), Apple announced a series of new
security and privacy features, including one that attracted a great deal of
attention - and confusion.
Specifically, Apple announced that
it would use a technology called Differential Privacy (DP) to improve the
privacy of its data collection practices.
iPhone Differential Privacy Policy for Data Protection
Starting with iOS 10, Apple uses
differential privacy technology to help detect the usage patterns of a large
number of users without compromising individual privacy.
Differential privacy adds
mathematical noise to a small sample of an individual's usage pattern in order
to hide that individual's identity. As more people share the same pattern,
global patterns begin to emerge, which helps improve the user experience.
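To make this concrete, here is a minimal sketch of one classic local-noise
technique known as randomized response, which captures the flavor of adding
noise on the device itself; the function name randomizedResponse and the truth
probability of 0.75 are illustrative assumptions, not Apple's actual algorithm
or parameters.

```swift
/// Randomized response: the device reports its true bit only with
/// probability p; otherwise it reports a uniformly random bit.
/// Any single report is deniable, but averages over many users
/// converge toward the true population rate.
func randomizedResponse(trueValue: Bool, truthProbability p: Double) -> Bool {
    if Double.random(in: 0..<1) < p {
        return trueValue      // report honestly
    }
    return Bool.random()      // report a coin flip instead
}

// Example: a device reports (noisily) whether the user typed a given emoji today.
let noisyReport = randomizedResponse(trueValue: true, truthProbability: 0.75)
print(noisyReport)
```

The design point is that any individual report is plausibly deniable - it may
be the truth or a coin flip - while the bias introduced by the coin flips is
known exactly and can be corrected when averaging over many users.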
In iOS 10, this technology will help
improve QuickType and emoji suggestions, Lookup Hints in Notes, and Spotlight
deep link suggestions.
To make a long story short, Apple will collect a lot of data from your phone,
but it does this primarily to make its services better, not to track the
habits of individual users.
Apple intends to apply advanced
statistical techniques to ensure that this aggregate data - the statistical
functions computed across all users' information - does not leak your
individual contributions.
This sounds promising, and it makes now a good time to talk a little about
differential privacy: what it is, how it can be achieved, and what it might
mean for your Apple iPhone.
In the past few years, people have become
used to the idea that they send a lot of personal information to the
providers of the services they use on their phones.
A number of surveys indicate that
average users are beginning to feel uncomfortable about information privacy
and information security.
This discomfort makes sense when you
consider that companies may use your personal data for marketing. But you
should always remember that sometimes there are decent motives for collecting
user information.
For example, Google
famously ran Google Flu Trends. Microsoft recently
announced a tool that can help diagnose pancreatic cancer by monitoring Bing
queries.
And of course, you benefit from crowdsourced
data that improves the quality of the services you use - from map apps to
restaurant reviews.
Unfortunately, even data collection based on good intentions can go wrong.
For example, in the late 2000s, Netflix
ran a competition to develop a better algorithm for recommending films.
To power the competition, Netflix released
an "anonymized" dataset that had been stripped of identifying information.
Unfortunately, this process of de-identification proved inadequate.
In a famous work, Narayanan and
Shmatikov showed that if you know just a little additional information about a
particular user, such datasets can be used to re-identify specific users and
even, for example, predict their political affiliation!
This kind of result should worry us, not only because companies routinely
share data, but because statistics computed over a dataset can sometimes leak
information about the individual records used in their calculation - and
because hacks happen all the time.
Differential privacy is a set of tools designed to identify and solve exactly these types of problems.
Characteristics of Differential Privacy
The characteristics of differential privacy can be intuitively summarized as follows:
The main finding of differential
privacy research is that, in many cases, differential privacy can be achieved
if the party aggregating the data is willing to add random noise to the
result.
For example, instead of merely
collecting and reporting the real data, the aggregator can introduce noise
drawn from a Laplace or Gaussian distribution, producing a result that is not
perfectly accurate - but that masks the contents of any given row.
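As a sketch of that idea for a simple counting query, the snippet below adds
Laplace noise to an aggregate before releasing it; laplaceSample, noisyCount,
and the epsilon value of 0.5 are illustrative assumptions, not a real API.

```swift
import Foundation

/// Laplace mechanism sketch: release an aggregate (here, a count) plus
/// noise drawn from a Laplace distribution whose scale is
/// sensitivity / epsilon. For a counting query, one user's row can
/// change the result by at most 1, so the sensitivity is 1.
func laplaceSample(scale b: Double) -> Double {
    // Inverse-CDF sampling; the max(...) keeps the logarithm finite
    // in the rare edge case where u lands exactly on -0.5.
    let u = Double.random(in: -0.5..<0.5)
    let magnitude = -b * log(max(1 - 2 * abs(u), .leastNonzeroMagnitude))
    return u < 0 ? -magnitude : magnitude
}

func noisyCount(trueCount: Int, epsilon: Double) -> Double {
    let sensitivity = 1.0   // one user changes a count by at most 1
    return Double(trueCount) + laplaceSample(scale: sensitivity / epsilon)
}

// Example: 10,000 users matched a query; publish only a noisy count.
print(noisyCount(trueCount: 10_000, epsilon: 0.5))
```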
Imagine that you have two otherwise identical
databases, one containing your information and the other without it.
Differential privacy ensures that
the probability of any statistical query producing a given result is (almost)
the same whether it is run on the first database or the second.
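In the standard formal statement of epsilon-differential privacy, a randomized
mechanism M satisfies the definition if, for every pair of databases D and D'
that differ in a single record, and every set S of possible outputs:

```latex
\Pr[M(D) \in S] \;\le\; e^{\varepsilon} \cdot \Pr[M(D') \in S]
```

The smaller the privacy parameter epsilon, the less your record can shift the
distribution of any result.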
Even better, the amount of noise to inject can be calculated without knowing
the contents of the database. The noise calculation can be performed based
only on knowledge of the function to be computed and the acceptable amount of
data leakage.
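For the Laplace mechanism sketched above, this works out neatly: the noise
scale b depends only on the sensitivity of the function f (the most that any
single record can change its output) and the privacy parameter epsilon, never
on the data itself:

```latex
\Delta f = \max_{D \sim D'} \bigl\lvert f(D) - f(D') \bigr\rvert,
\qquad
b = \frac{\Delta f}{\varepsilon}
```

Here D \sim D' ranges over pairs of databases differing in one record; for a
counting query, \Delta f = 1.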
One way to look at this is that
differential privacy guarantees that your data cannot have a significant
impact on the query results. If it cannot, you might as well contribute to the
database - since almost no damage can be done to you. Let's take an example:
Imagine you have chosen to enable
a reporting feature on your iPhone that tells Apple whether you routinely
use a particular emoji in your iMessage conversations. The report carries a
single bit of information: 1 indicates you like the emoji, and 0 indicates you
do not.
Apple receives these reports and collects them in a huge database.
At the end of the day, it wants to
be able to derive a count of the users who love this particular emoji.
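If each device randomizes its one-bit report before sending it, in the spirit
of the randomized-response sketch earlier, Apple can still recover an accurate
aggregate. The self-contained simulation below shows the server-side
estimation step; estimateTrueRate, the truth probability of 0.75, and the 30%
true rate are illustrative assumptions, not Apple's actual pipeline.

```swift
/// Server-side sketch: estimate the fraction of users whose true answer
/// is 1, given one-bit reports that were randomized on each device
/// (each report is truthful with probability p, a fair coin otherwise).
func estimateTrueRate(noisyReports: [Bool], truthProbability p: Double) -> Double {
    let observed = Double(noisyReports.filter { $0 }.count) / Double(noisyReports.count)
    // observed ≈ p * trueRate + (1 - p) * 0.5, so invert to debias:
    return (observed - (1 - p) * 0.5) / p
}

// Simulate 100,000 devices where 30% of users truly "like" the emoji.
let p = 0.75
let reports: [Bool] = (0..<100_000).map { _ in
    let truth = Double.random(in: 0..<1) < 0.3      // the user's true answer
    return Double.random(in: 0..<1) < p ? truth : Bool.random()
}
print(estimateTrueRate(noisyReports: reports, truthProbability: p))  // prints ≈ 0.3
```

No single report says much about its sender, yet across 100,000 devices the
estimate lands very close to the true 30% rate.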