In this short article I will tell the story of how the industry misinterpreted privacy research for years, how that misreading hindered the development of privacy technology, how a recent study corrected the picture, and what lessons we can draw from it.
The story is about browser fingerprinting. Due to differences in operating systems, browser versions, fonts, plugins, and at least a dozen other factors, different users' web browsers tend to look different. Sites and third-party trackers can exploit this to build unique fingerprints that serve as browser identifiers. These fingerprints are far more effective than cookies for tracking users: they leave no traces and cannot be erased.
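To make the mechanism concrete, here is a minimal sketch, in browser TypeScript, of how a tracking script might combine a handful of such attributes into an identifier. The attribute set is illustrative, not any particular tracker's; real fingerprinting libraries combine dozens of signals.

```typescript
// A minimal sketch of attribute-based fingerprinting (hypothetical,
// reduced attribute set). Runs in any modern browser in a secure context.
async function collectFingerprint(): Promise<string> {
  const attributes = [
    navigator.userAgent,                                      // OS and browser version
    navigator.language,                                       // interface language
    `${screen.width}x${screen.height}x${screen.colorDepth}`,  // display properties
    Intl.DateTimeFormat().resolvedOptions().timeZone,         // time zone
    String(navigator.hardwareConcurrency),                    // CPU core count
  ].join('|');

  // Hash the concatenated attributes into a compact, stable identifier.
  const bytes = new TextEncoder().encode(attributes);
  const digest = await crypto.subtle.digest('SHA-256', bytes);
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, '0'))
    .join('');
}

collectFingerprint().then((id) => console.log('fingerprint:', id));
```

Note that none of this requires setting any state on the user's machine, which is exactly why clearing cookies does not help.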
The question is: how effective is fingerprinting? That is, how unique is the fingerprint of a typical user's device? The answer matters a great deal for privacy on the Internet. But the question is difficult to study scientifically: although many companies hold huge databases of fingerprints, they do not share them with researchers.
The first large-scale fingerprinting experiment, called Panopticlick, was launched by the Electronic Frontier Foundation in 2009. Hundreds of thousands of volunteers visited panopticlick.eff.org and agreed to have their browsers fingerprinted for research. The experiment showed a striking result: 83% of users in the sample had unique fingerprints. Among users with Flash or Java enabled, the share was even higher: 94%.
An experiment by researchers from INRIA in France, with an even larger sample, showed broadly similar results. Meanwhile, various researchers, including us, pointed out that browsers kept adding features that can be exploited for fingerprinting: Canvas, Battery, Audio, and WebRTC.
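As an illustration of one of these newer vectors, here is a rough sketch of canvas fingerprinting, under the usual assumption behind the technique: the same drawing commands are rasterized slightly differently depending on the GPU, graphics driver, and installed fonts, so the resulting pixels distinguish devices.

```typescript
// A rough sketch of canvas fingerprinting (illustrative only): render a
// fixed scene off-screen and use the pixel data as a device signature.
function canvasFingerprint(): string {
  const canvas = document.createElement('canvas');
  canvas.width = 220;
  canvas.height = 40;
  const ctx = canvas.getContext('2d');
  if (!ctx) return '';

  // Text plus a colored rectangle exercises font rendering and blending,
  // the parts of the pipeline that vary most across machines.
  ctx.textBaseline = 'top';
  ctx.font = '14px Arial';
  ctx.fillStyle = '#f60';
  ctx.fillRect(100, 5, 80, 20);
  ctx.fillStyle = '#069';
  ctx.fillText('fingerprint test, \u{1F50E}', 2, 15);

  // The base64-encoded pixel data serves as the device-specific signature.
  return canvas.toDataURL();
}

console.log(canvasFingerprint().slice(0, 64)); // prefix of the signature
```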
The conclusion seemed clear: fingerprinting is extremely effective, and browsers cannot counter the threat simply by giving scripts less information: there are too many information leaks, too many attack vectors. The consequences were serious. Browser vendors concluded that they could not cope with third-party tracking, and privacy protection was left to extensions.[1]
But these extensions made no attempt to limit fingerprinting either. Most of them worked in a roundabout way: they blocked thousands of tracker scripts from manually curated lists, perpetually playing catch-up as new trackers appeared.
But then came a turning point: the INRIA team (including some authors of the earlier studies) managed to reach an agreement with a large French website and fingerprint its visitors. The results were published a few months ago, and this time they look completely different: only a third of users had unique fingerprints (compared with 83% and 94% earlier), even though the researchers used a full set of 17 attributes. For mobile users the figure is even lower: less than 20%. The difference has two causes: the larger sample in the new study, and the fact that self-selection of participants appears to have introduced bias into the earlier studies. There are other factors as well: the web is gradually shedding plugins such as Flash and Java, so the effectiveness of fingerprinting should keep declining. A careful reading of the results shows that even the simplest browser interventions, limiting the attributes with the highest entropy, would significantly improve users' ability to hide in the crowd.
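A small illustration of the entropy point: an attribute's Shannon entropy over a sample of visitors measures how much it shrinks a user's anonymity set, so suppressing the few highest-entropy attributes buys a disproportionate amount of crowd to hide in. The sample data below is made up for the example.

```typescript
// Toy illustration: Shannon entropy of an attribute over a (hypothetical)
// sample of visitors. Higher entropy means more identifying power.
function shannonEntropy(values: string[]): number {
  const counts = new Map<string, number>();
  for (const v of values) counts.set(v, (counts.get(v) ?? 0) + 1);
  let entropy = 0;
  for (const c of counts.values()) {
    const p = c / values.length;
    entropy -= p * Math.log2(p);
  }
  return entropy;
}

// Made-up sample of 8 visitors: a nearly unique attribute (e.g., a detailed
// font list) carries far more identifying information than a two-valued one.
const fontLists = ['a', 'b', 'c', 'd', 'e', 'f', 'g', 'h'];
const cookiesEnabled = ['yes', 'yes', 'yes', 'yes', 'yes', 'yes', 'yes', 'no'];

console.log(shannonEntropy(fontLists).toFixed(2));      // 3.00 bits
console.log(shannonEntropy(cookiesEnabled).toFixed(2)); // 0.54 bits
```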
Apple recently announced that Safari will try to limit fingerprinting, and it is likely that this latest experiment influenced the decision. Notably, few privacy experts ever considered fingerprinting protection useless, and the W3C consortium long ago published recommendations for developers of new standards on how to minimize the fingerprinting surface. It is not too late. But if we had known in 2009 what we know today, browsers would likely have developed and deployed such protections long ago.
What is the main reason the results were misinterpreted? One simple lesson is that statistics is a hard science, and non-representative samples can completely distort a study's conclusions. But there is another conclusion that is harder to accept: the recent study measured ordinary users more accurately precisely because the researchers did not ask for their consent or notify them.[2]
In Internet experiments, there is a tension between traditional informed consent and the validity of the results. To resolve it, we need new ethical standards.
Another lesson is that privacy protection does not have to be perfect. Many researchers and engineers think about privacy in all-or-nothing terms: a single mistake ruins everything, and if protection is not perfect it should not be used at all. That may make sense for some applications, such as the Tor Browser, but for ordinary users of mainstream browsers the threat model is death by a thousand cuts, and privacy protections work well by disrupting the economics of surveillance.
Finally, the argument that protection is futile is an example of privacy defeatism. Faced with the onslaught of bad news in this area, we tend to acquire a kind of learned helplessness and jump to the simplistic conclusion that privacy is dying and nothing can be done about it. But this position is not supported by the evidence: in fact, we see the field repeatedly settling at a new point of equilibrium. Privacy violations never stop, but time and again they are offset by legal, technological, and social defense mechanisms.
Browser fingerprinting remains at the forefront of the battle for privacy today. The GDPR has made life difficult for the companies involved. It is time for browser vendors, too, to take the fight against this odious practice seriously.
[1] One obvious exception is the Tor Browser, but it comes at the cost of a serious performance hit and broken features on many sites. Another exception is Brave, whose users are presumably willing to put up with some inconvenience in exchange for privacy. [return]
[2] The experiment covered users who had previously agreed to the site's use of cookies, but they were not specifically informed about the study. [return]