OpenDP Initiative Aims To Share Data and Preserve Privacy

Microsoft and Harvard University have been collaborating on open source tools aimed at easing data sharing while also preserving privacy, according to a Wednesday announcement.

The collaborative effort, which got under way "over the past year," is called the "OpenDP Initiative." It is producing so-called "differential privacy" tools that add statistical "noise" to data to obscure information about individuals. Definitions of differential privacy can be found in this Wikipedia entry and in this post by Matthew Green, a professor at Johns Hopkins University.

Differential privacy is a concept that originated in cryptography research. It boils down to a mathematical guarantee: carefully calibrated statistical noise is added to query results so that the output reveals almost nothing about whether any single individual's data was included, which is what de-identifies the data.
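To make that concrete, here is a minimal sketch of the textbook Laplace mechanism, the classic way to achieve differential privacy for numeric queries. This is an illustration only, not OpenDP's actual API; the function and variable names are hypothetical, and it assumes the standard result that noise drawn from a Laplace distribution with scale sensitivity/epsilon gives epsilon-differential privacy.

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    """Return a differentially private version of a numeric query result.

    Noise is drawn from a Laplace distribution with scale
    sensitivity / epsilon -- the textbook construction for
    epsilon-differential privacy. Smaller epsilon means more noise
    and stronger privacy.
    """
    rng = rng or np.random.default_rng()
    scale = sensitivity / epsilon
    return true_value + rng.laplace(loc=0.0, scale=scale)

# Example: privately release a count query. Adding or removing one
# person's record changes a count by at most 1, so sensitivity = 1.
ages = [34, 29, 51, 42, 38, 27]
true_count = sum(1 for a in ages if a > 30)
private_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5)
print(f"true count: {true_count}, private count: {private_count:.2f}")
```

Each individual result is perturbed, but across many queries the noise averages out, which is why the technique preserves the statistical usefulness of a dataset while protecting the people in it.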

The researchers envision OpenDP as being used to share data from archives without compromising privacy, including government data such as U.S. Census Bureau information, according to an OpenDP whitepaper description. OpenDP could also permit companies to share data with social science researchers without "violating user trust (as in the Cambridge Analytica incident)," the whitepaper added. It would enable greater collaboration on COVID-19 data as well, the researchers explained.

The OpenDP Initiative maintains a GitHub page where its tools are released as open source software.

"As a part of this effort, we are granting a royalty-free license under Microsoft's differential privacy patents to the world through OpenDP, encouraging widespread use of the platform, and allowing anyone to begin utilizing the platform to make their datasets widely available to others around the world," Microsoft's announcement explained.

Privacy has been a tricky area for Microsoft and the software industry as a whole, given widespread data mining practices. A few years ago, for instance, a Dutch government privacy authority found that Windows 10 violated the privacy of its users.

Things on the privacy front have more recently come to a boil over the use of facial recognition technology to profile people using artificial intelligence algorithms and machine learning processes. Microsoft President Brad Smith recently announced that Microsoft would hold off on selling facial recognition technology to U.S. police departments, although, as the ACLU civil rights group noted, that pledge does not preclude sales to federal agencies.

This week, a group of academics made public an open letter to the publisher Springer. The letter, whose signatories include researchers from IBM, Google and Microsoft, was noted in this Kaspersky Threatpost article. It condemns the planned publication of a paper proposing the use of deep neural network models to predict crime.

"Such claims are based on unsound scientific premises, research, and methods, which numerous studies spanning our respective disciplines have debunked over the years," the letter explained.

Specifically, the letter argued that such "crime prediction technology reproduces, naturalizes and amplifies discriminatory outcomes." It added that "machine learning programs are not neutral; research agendas and the data sets they work with often inherit dominant cultural beliefs about the world."

About the Author

Kurt Mackie is senior news producer for 1105 Media's Converge360 group.
