What it is: Differential privacy is a way to mine data without compromising the privacy of the people who generate that data.
In the world of artificial intelligence and machine learning, data is king. The more data you have, the more patterns the computer can detect and the “smarter” it gets. That’s why most technology companies focus on collecting as much data as possible, and why Google collects your personal browsing habits to build a better search engine. Because Google gathers so much data every day, its search engine can get progressively more accurate over time, making it nearly impossible for competitors like Microsoft’s Bing to ever catch up.
However, the huge drawback of most machine learning projects is that they send data to the cloud so a company like Google can use it. This causes two problems. First, such data collection requires an Internet connection. Lose that connection and the device suddenly loses its artificial intelligence. Second, such data collection reveals the activities of individuals. Essentially, you’re willingly giving up your privacy to help companies like Google sharpen their machine learning algorithms.
To avoid exposing users’ privacy, Apple uses something called differential privacy. The essential idea is that differential privacy seeks to maximize data collection and analysis while minimizing the ability to pinpoint where that data came from. It typically works by adding random noise to each person’s data before it is collected, so aggregate patterns survive but no single individual’s contribution can be recovered. In other words, differential privacy protects your privacy.
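Apple hasn’t published its exact mechanism here, but the classic textbook example of local differential privacy is randomized response: each user randomly flips their answer before reporting it, giving every individual plausible deniability while still letting the collector estimate the true proportion across many users. A minimal Python sketch (the function names and the probability parameter are illustrative, not Apple’s implementation):

```python
import random

def randomized_response(true_answer: bool, p: float = 0.5) -> bool:
    """Report the truth with probability p; otherwise report a coin flip.

    Any single report could be noise, so it reveals almost nothing
    about one person, yet the noise averages out over a population.
    """
    if random.random() < p:
        return true_answer
    return random.random() < 0.5  # random answer, independent of the truth

def estimate_true_proportion(reports, p: float = 0.5) -> float:
    """Undo the noise in aggregate: observed = p * true + (1 - p) * 0.5."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p) * 0.5) / p

# Simulate 10,000 users, 30% of whom truthfully answer "yes"
random.seed(0)
truths = [i < 3000 for i in range(10_000)]
reports = [randomized_response(t) for t in truths]
estimate = estimate_true_proportion(reports)
```

With 10,000 users the estimate lands close to the true 30%, even though no individual report can be trusted on its own; lowering p strengthens each person’s privacy at the cost of a noisier aggregate.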
In addition to protecting your privacy, Apple is developing its own neural engine on a chip. So instead of relying on a data center to provide artificial intelligence, Apple’s products will use a neural engine to provide the artificial intelligence whether your device has an Internet connection or not.
Apple’s approach to artificial intelligence is to protect your privacy while also making its devices smarter without relying on an always-on Internet connection. Whether this will prove better than current machine learning practices that send data to the cloud remains to be seen. As long as Apple’s solution is good enough while also protecting user privacy, it may be preferable to a “smarter” artificial intelligence device that exposes your activities so anyone (think the government) can examine them and use them against you.
To compete, rivals like Google and Microsoft will need to develop their own artificial intelligence on a chip. Then again, they may simply keep doing what they’re doing and assume people will willingly trade their privacy for results. Some people don’t mind giving up their privacy, but if you do, then you may be wary of approaches to machine learning that do not use differential privacy.