We at Brave oppose FLoC, as we oppose any anti-user functionality. Brave is built on the Chromium engine, so we removed FLoC before it even became trendy, and recently we disabled the code that computes FLoC identifiers outright (that pull request even reached #1 on Hacker News). FLoC won’t work on our websites either: we don’t want information about users interested in Brave leaking out of Chrome. This feature thus joins the long list of tentacles we have severed from Chromium.
Below, we go into detail about the dangers of FLoC.
A step into the abyss
For once, companies are being forced to respect user privacy, if only minimally. People are becoming better informed: they learn about and adopt tools that protect their privacy (such as Brave or Signal), and relevant laws are being passed (CCPA, GDPR). Against this backdrop, Google’s initiative is puzzling. The company could seize the moment and build a new, user-centered, privacy-first web; instead, it proposes, and immediately ships, changes that prop up the advertising ecosystem in its current unsightly form.
As you may know, Google will soon disable third-party cookies in Chrome. To keep ad targeting alive, the company has come up with FLoC (Federated Learning of Cohorts): a technology that assigns the browser’s user to a cohort, or category, based on the history of visited pages, and (surprise) discloses it to websites and tracking scripts. Put simply, the browser takes your recent browsing history (with some caveats) and computes an LSH (locality-sensitive hash) over it, a hash designed to come out the same for similar histories. The claim is that the number of browsers sharing any given hash will be too large to uniquely identify a specific user. This hash, the cohort identifier, is handed to anyone who asks for it.
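The idea behind the cohort hash can be sketched with a toy SimHash, a common form of LSH. Everything below (MD5 as the per-domain hash, a 16-bit output, domain names as input) is our illustrative assumption, not Chrome’s actual implementation; the point is only that similar histories produce similar hashes, which is precisely the signal FLoC then exposes:

```python
import hashlib

def simhash(domains, bits=16):
    """Toy SimHash: histories that share most domains tend to get
    identical or near-identical hashes; dissimilar ones diverge."""
    counts = [0] * bits
    for d in domains:
        h = int(hashlib.md5(d.encode()).hexdigest(), 16)
        for i in range(bits):
            # Each domain votes +1/-1 on every bit position.
            counts[i] += 1 if (h >> i) & 1 else -1
    # Majority vote per bit yields the final cohort-style hash.
    return sum(1 << i for i in range(bits) if counts[i] > 0)

history_a = ["knitting.example", "yarn.example", "wool.example"]
history_b = ["knitting.example", "yarn.example", "crochet.example"]
# Two mostly-overlapping histories land in nearby cohorts; a site
# reading the hash learns roughly what kind of user you are.
print(simhash(history_a), simhash(history_b))
```

The hash is deterministic, so every site that reads it sees the same value, which is what makes it usable for cross-site profiling.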
Obviously, transforming the web from a tracking machine into a garden of Eden will take more than shell games with FLoC and the Privacy Sandbox. Changing the content-creator reward model is not just possible but necessary; the success of our private advertising platform, Brave Rewards, shows that the radical approach works. We invite Google to join our effort to fix the fundamentals, repair the harm done by the ad-tech industry, and build a web that serves the interests of users.
Below we explain why we call FLoC a step into the abyss: why it is bad for users, for websites, and for the web as a whole.
FLoC hurts user privacy
The worst aspect of FLoC is its hypocrisy. Our colleagues at the EFF (and many others) have already spoken out, so we will mention only the most egregious points.
FLoC gives websites your browser history
FLoC and privacy are incompatible by design. First and foremost, FLoC shares information about your browsing behavior with sites and advertisers that would never have received it otherwise. Moreover, it is an entirely new channel for disclosing data, one the web has never had before.
Worse, this information will flow to sites where you are already logged in, as well as to third-party scripts. A pharmacy, or any site where you regularly shop, will receive extra information about your interests and will share it with all of its third-party analytics scripts (including Facebook, Yandex, and the rest of the big-data crowd).
According to Google, FLoC provides more privacy than third-party cookies do. But many browsers do not support third-party cookies at all; Brave, in particular, has always blocked them. So there is nothing to compare FLoC against, except Chrome itself as it stands today.
Google emphasizes that FLoC operates on data not from a specific user but from a whole group, selected under k-anonymity (that is, a group of at least k clients with similar interests), so, the argument goes, no individual user’s privacy is harmed. But this is a sleight of hand. Many of a person’s interests, beliefs, and other “parameters” are not at all unique, yet that does not mean they may be broadcast without the user’s consent. Whether the proverbial Vasily Pupkin wears a jacket or a coat, is an atheist or a Buddhist, prefers anime, rejects GMOs, or supports Stallman is his own business. There are many things we choose to share with some people and hide from others.
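The flaw in the k-anonymity argument can be made concrete. In the toy sketch below (the cohort number, group size, and inferred interest are invented for illustration), a site cannot tell which of the k users you are, yet it still learns the interest shared by every member of the cohort, and no member consented to that disclosure:

```python
# Toy illustration: k-anonymity hides WHICH user you are within a
# cohort, but not WHAT the cohort reveals about every member.
cohorts = {
    1234: {"size": 5000, "inferred_interest": "debt relief"},
}

def what_a_site_learns(cohort_id):
    cohort = cohorts[cohort_id]
    # The site cannot single you out among cohort["size"] users,
    # but it learns the interest shared by all of them -- an
    # attribute none of those users agreed to broadcast.
    return cohort["inferred_interest"]

print(what_a_site_learns(1234))
```

This is why "the data describes a group, not a person" does not amount to privacy: the group-level attribute attaches to every individual in the group.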
Overall, the idea that privacy means nothing more than the absence of cross-site tracking is fundamentally wrong. Any meaningful notion of privacy must include “do not spread information about a person without their permission.” FLoC “improves privacy” only by cynically redefining the word “privacy.”
FLoC makes it easier for sites to track you
FLoC data significantly enriches the browser’s digital fingerprint, which is hard to blur even without FLoC. Google suggests its Privacy Budget approach will help: blocking user-identifying requests from a given site once their number exceeds a certain “trust limit.”
A year ago we explained why we doubt this approach will succeed. Google has not dispelled those fears, and it still has not specified exactly how the Privacy Budget will work; the task remains at the hypothesis-testing stage.
Shipping a privacy-harming feature while still only exploring ways to prevent that harm is the root of the evil from which the fingerprinting hydra has grown.
FLoC misrepresents privacy and its importance
Google is aware of all these concerns but offers no concrete solutions. For example, the company notes that certain categories (sexual orientation, political opinions, medical issues) will be excluded from FLoC, and that it is working to ensure that various “sensitive” parameters are not taken into account. But this is a catch-22: to exclude such groups, you first have to compute them. A system that decides which cohorts involve a “sensitive” parameter by recording exactly how many users exhibit it is absurd.
Moreover, the very idea of a global list of “sensitive categories” is illogical and immoral. What counts as sensitive differs from person to person. Take a man and a woman who are both interested in dresses. For the woman this is considered the norm, but for the man? That is certainly not Google’s call to make. A couple expecting a baby will not hide their interest in baby products, but a frightened, nervous girl might. Two different people may search for the same thing, yet for one it is a socially acceptable norm, while for the other it is a major step outside their comfort zone, and that step is their own business.
It hardly matters that Google’s list of “sensitive categories” will inevitably be incomplete. A privacy system built on a hard-coded list of “sensitive categories”, or on a single supposedly correct (and therefore all-powerful) definition of “sensitive behavior”, has nothing to do with privacy.
FLoC harms both websites and publishers
While most of our concerns about FLoC are on behalf of users, some websites may also be hurt, and precisely the sites users trust the most.
Imagine you own a website that sells, say, polka music. You have an established community of polka lovers. The niche is narrow and you are in your element: everyone who wants to listen to polka comes to you, so you can charge more than less specialized sites do. But FLoC in Chrome will identify your customers as polka lovers and pass that information on to other sites, which will happily lure your audience away.
FLoC will broadcast the polka cohort number to any script that asks for it, and of course those scripts won’t hesitate to take advantage. FLoC will keep enabling this obvious scenario even after third-party cookies die off.
We urge sites to disable FLoC
Given that FLoC can be harmful to site owners, we recommend that sites opt out of FLoC (as many already do, GitHub for example). We believe that any new technology that could threaten privacy should be disabled by default; that is a basic principle of respect for users. Google, obviously, turns FLoC on by default, because otherwise the feature would never collect the volume of data advertisers need.
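At the HTTP level, opting out means sending the `Permissions-Policy: interest-cohort=()` response header, the same mechanism GitHub and others used. A minimal sketch using only Python’s standard library (the handler class and port are our own illustrative choices):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class FlocOptOutHandler(BaseHTTPRequestHandler):
    """Serves every page with the FLoC opt-out header attached."""

    def do_GET(self):
        self.send_response(200)
        # Tells Chrome not to include visits to this site when
        # computing FLoC cohorts, and denies the interest-cohort
        # feature to all embedded content.
        self.send_header("Permissions-Policy", "interest-cohort=()")
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(b"<p>This site opts out of FLoC.</p>")

def run(port=8000):
    """Start the opt-out demo server (blocks until interrupted)."""
    HTTPServer(("127.0.0.1", port), FlocOptOutHandler).serve_forever()
```

In production you would set the same header in your web server or CDN configuration rather than in application code; the header value `interest-cohort=()` (an empty allowlist) is what signals the opt-out.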
FLoC, like many other parts of the Privacy Sandbox, is a turn in the wrong direction. The web needs fundamental change to end user surveillance, but FLoC does the exact opposite. Instead of a radical cure for a rotten advertising industry, Google offers us the same thing in new packaging.
The first question when building any technology should be: “Do users want this?” Instead, FLoC and the Privacy Sandbox ask: “How do we sell more ads so that users don’t notice, or at least don’t complain too loudly?” The Brave project is living proof that taking radical steps in the right direction benefits the web as a whole, along with users, publishers, and even advertisers.