reduce fingerprint produced by uBlock Origin #1121
Comments
Or use only resources from the package.
Yes, that was one of the update mechanisms I considered; basically it's doing snapshots, but at the intervals of extension updates. That of course is not a watershed update: users will frequently be running different versions of the extension, especially around when updates are released, though not very many at once (typically 2). The larger issue is that it leaves the lists out of date for longer periods.
The word you are looking for is linkability. Fingerprinting links traffic (by browser/device); it has nothing to do with privacy (don't get me wrong: linking FPs/traffic can unmask users, but that is not a given). First up, anonymity: the user would want to be on Tor (Tor Browser) or a VPN etc., because if you're not hiding your real IP, then there's no point in anti-fingerprinting. Second, stability and universality: the fingerprint metric would need to be consistent/stable across large chunks (universal) of the internet, e.g. a third-party or common first-party script.
Third, step one: the first method (of about eight) in defeating FPing is to block scripts. uBlock Origin, by its very definition, can block these fingerprinters (but it's a whack-a-mole game), especially in a more hardened configuration. Or, for the use case above with a state actor, they should be using NoScript and the Security Slider on Tor Browser. So, yes, theoretically this is an issue. But recent studies have shown only about 3.5% to 5% of sites use fingerprinting techniques, and the scripts are fairly simple and boilerplate (mainly canvas and a few other properties). There is no widespread blocklist fingerprinting, given such scripts require upkeep and are very slow to run. That said, I'm all about advanced FPing techniques/PoCs and making them useless or lowering their entropy, so I'm certainly not pooh-poohing anything you said. I just think any warning needs to be very carefully worded, with risk assessment and threat models taken into account. Would you rather a Polish user be scared off by an FP warning and not block all the Polish adverts/trackers, on the, e.g., 0.01% chance (made-up figure) that he might get FPed this way, versus stopping unlimited amounts of tracking and adverts (which carry malware risks)? The answer to that rhetorical question is site isolation (FPI or dFPI: these are Firefox only), but I think you get my point.
Indeed, those who choose less linkability will/should not be a high-value target. Such users tend to whitelist NoScript for page breakage, and then it's fingerprintable - even TBB is not bullet-proof once JS is allowed. uBO is a content blocker, not a security or privacy suite, but it can block common fingerprinting scripts with far better granularity than NoScript, thanks to path-level blocking and redirect resources - that's more than enough for the casual user (the majority) and common FP implementations. Those high-value targets (the minority) should learn how to get reasonable anonymity, and then the suggested method is a subtle matter - there are many techniques (I mentioned some here) and blocking only one possible technique doesn't make much sense.
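To illustrate what path-level blocking and redirect resources look like in practice, here is a hypothetical pair of static filters (the domain and paths are made up; `noop.js` is one of uBO's built-in neutered resources):

```
! Hypothetical example: block a fingerprinting script only at its exact path,
! leaving the rest of the domain untouched (made-up domain/path)
||cdn.example.com/js/fingerprint2.min.js$script

! Hypothetical example: instead of outright blocking, replace a script with
! uBO's neutered "noop.js" resource so dependent page code doesn't break
||cdn.example.com/js/tracker.js$script,redirect=noop.js
```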
I don't want to get into an argument over definitions, but this is a form of privacy, which is a fairly broad and pluralistic notion. The entire field of differential privacy is built around the measure of linkability, for example, and advances in fingerprinting are a staple at PETS.
Agreed, Tor Browser is pretty much the only application we should be focusing on here (plus Firefox with the about:config option for upstream testing).
A state actor isn't really the primary threat model I had in mind, so much as just general fingerprinting. Given that websites are already using HSTS cookies, it seems not too far-fetched to use a very similar idea in markets that are harder to reach (Facebook is notorious for this, for example: they openly admit in their privacy policy to using web fingerprinting to link web sessions, browsers, and devices, even for users who are logged out or who don't have an account).
To be clear, the warning I was referring to was about using an ad blocker with Tor Browser at all. While there are quite a few people who use the combination, the reality is that they likely make up a very small fraction of Tor Browser users overall, and it is trivial to tell if someone is using an ad blocker, even with JavaScript disabled (just host a first-party ad that gets blocked, then look at the server logs). While on larger sites there's likely enough cover traffic to blend in, on smaller forums or whistleblower sites I would guess the numbers are closer to the dozens, if that. That's not an edge-case risk, that's a serious compromise for a lot of threat models, and one I would guess a lot of people who need these tools wouldn't think of.
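As a rough illustration of the no-JavaScript detection just described (not code from the thread; the port and the `/ads/` path are hypothetical), a site only needs to serve a first-party "bait" resource whose URL matches common ad-blocking filters and then check which visitors never request it:

```js
// Minimal Node.js sketch: serve a page containing a first-party "bait" image
// whose URL matches typical ad-filter patterns (the /ads/ path is made up).
// A visitor who fetches the page but never requests the bait is very likely
// running an ad blocker -- no JavaScript required on the client.
const http = require('http');

http.createServer((req, res) => {
  if (req.url === '/') {
    res.writeHead(200, { 'Content-Type': 'text/html' });
    res.end('<html><body>Hello<img src="/ads/banner.png" width="1" height="1" alt=""></body></html>');
  } else if (req.url === '/ads/banner.png') {
    // This line only shows up in the logs for visitors whose blocker let it through.
    console.log(`bait fetched by ${req.socket.remoteAddress}`);
    res.writeHead(200, { 'Content-Type': 'image/png' });
    res.end();
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(8080);
```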
Well, again, I think that many users are unaware of the threats of installing extensions like uBO. Hence my prioritization of the lowest-hanging fruit: giving the user a warning that this might be a bad idea. Tor Browser is certainly not bullet-proof, but it tries to do a good job even at lower security slider levels. In any case, I definitely don't see a reason why we would accept making the problem decidedly worse. Just because some users who don't try to make the strongest guarantees are part of a smaller anonymity set, you think they should be abandoned entirely? I don't see why we wouldn't want to give those users (who, yes, are not part of the largest Tor Browser population, but are still using it for a reason) the best shot at staying anonymous.
^^ emphasis mine. Good. The point is to NOT use overly broad terms in some sort of warning, because, as a lowest common denominator, you do not want to confuse end users. Any warning needs to be specific: privacy to most people means the contents shouldn't be able to be read by anyone except the sender and receiver - e.g. sending a sealed letter via mail, or whispering in someone's ear.
Tor Browser doesn't ship with an ad blocker. It ships with FPI and the Tor protocol. They also recommend not installing extensions other than those shipped (maybe it should say so in the add-ons pane as well). There is no real need for TB to use an ad blocker, except that 1. it would help with page loading times and Tor network load and 2. it would make the web more pleasant. If anyone adds an ad blocker to Tor Browser, then that is on them: this isn't uBO's problem. Tails does ship Tor Browser with uBO, so any issues with list FPing are a Tails problem, but maybe one that uBO can help with, such as locking down the lists and updates so everyone is uniform.
I'm not saying that this isn't a problem: I'm saying the risk is practically nil, the benefit can actually make things worse, and the threat model does not fit. Firefox users just want to block ads and noise. Almost no one is going to go to the lengths needed to mitigate FPing. uBO is a content blocker, not an anti-FPing extension (not saying that it shouldn't mask itself where possible etc.).

You're just not going to hide (in most cases) content being modified or resources being blocked: it's trivial to detect randomized canvas, randomized domrect, spoofed OS, spoofed languages, even spoofed browser versions, and it's easy enough to create hashes based on content behavior. Even if FF users did everything they could, they'd still stick out. So you're pandering to a very tiny percentage of uBO users - and it still won't help them (they're already pretty much unique, if not actually so). If this is their concern, they should be using Tor Browser. So, to make that clear: if you don't mitigate FPing, then sticking to default filters makes no difference (you're already unique). If you do mitigate FPing, then sticking to default filters likely makes no difference (you're already in a very tiny group, if not unique).

The other side is that you could make things worse: e.g. take all the Russian users who enable the Russian filter lists - you'd think that would be normal, IDK. Now you're warning users to stick to the default filter lists: a) all Russian users who abide by that are worse off on their Russian sites etc., which defeats the purpose of having the list in the first place; b) you've splintered the group. That's for language-specific filters, and I think they should be OK to leave on: it's really the additional optional non-language lists.

And lastly, the threat just doesn't exist in the wild. No one is going to waste time or resources on slow JS methods when they already have far easier ways to do so. The return (snagging additional unique FPs) isn't worth the cost (implementing, maintaining, computing time and page speeds, resources). That's not to say that it won't ever be used.

tl;dr: You're pandering to the wrong crowd. uBO/browsers don't offer anonymity or anti-FPing or unlinkability - it's Tor Browser and Tails that offer (but do not guarantee) those. It's not a uBO problem, it's a Tails problem.
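For what it's worth, the "trivial to detect randomized canvas" point can be shown in a few lines of client-side code. This is only a sketch of one detection approach, and it assumes the randomizer adds fresh noise on every read (session-consistent noise would need a different check, e.g. comparing against known-good values for the claimed hardware/OS):

```js
// Render the exact same drawing twice and compare the exported pixels.
// A real, unmodified canvas produces identical data URLs; a randomizing
// anti-fingerprinting extension that injects per-read noise does not --
// and that difference is itself a fingerprintable signal.
function canvasSample() {
  const c = document.createElement('canvas');
  c.width = 220;
  c.height = 40;
  const ctx = c.getContext('2d');
  ctx.font = '16px Arial';
  ctx.fillStyle = '#069';
  ctx.fillText('canvas consistency check', 4, 24);
  return c.toDataURL();
}

const looksRandomized = canvasSample() !== canvasSample();
console.log(looksRandomized ? 'canvas appears randomized' : 'canvas appears stable');
```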
uBO has never advertised itself as an "anti-fingerprinting" solution; it's a content blocker first and foremost. This is like opening a whole new can of worms.
and that goes completely against the uBO manifesto -- https://github.com/gorhill/uBlock/blob/master/MANIFESTO.md
Even if all the other problems were cleared, we still lack an actual case of sites using such an FP technique. I'm aware of only this test site.
I feel like everyone is piling on a strawman here, so I'm going to repeat some things that for some reason seem to have been missed.
It does not. When I said "more difficult", I was thinking of something similar to (or just using) the "I am an advanced user" checkbox to hide it for anti-fingerprinting users (i.e., it would still be possible to change lists, and ultimately very easy). That manifesto is about uBO deciding what content is acceptable; this is about making it harder for a user to shoot themselves in the foot. In other words, if this breaks the manifesto, then so do the steps required to access "hard mode".
People who are the target of state actors should stick to the Tor people's guidelines, which is that no extension should be installed, be it uBO or anything else.[1] If you think this is an issue in need of attention, open an issue on the Tor issue tracker to forbid installing any additional extension beside the default ones. For corporate surveillance purposes, I personally do not believe using uBO in the Tor browser is an issue. That said, I have already stated in the past that I am open to working directly with Tor/Tails people if they wish for changes in uBO which they think would be beneficial to the Tor browser. So far the Tails people seem satisfied with the current state of uBO and the way it can currently be configured, since I haven't received any specific request here.

[1] And for that matter, I also fail to understand how NoScript is not also seen as an issue by Tor people, since users are free to configure their trusted lists as they see fit.
Please read my other responses, I've already addressed each of those points. (Aside from why Tor Project is okay with shipping NoScript, which is basically that it's not marketed as an ad blocker, and rarely used as such.) |
I don't "market" uBO, and I always emphasize that uBO is not an "ad blocker". |
Sorry, all I meant is that the description for uBlock Origin lists blocking ads as one of its primary features. In other words, I don't think it would be controversial to say that including a standard uBO install in Tor Browser would be a politically bad move for Tor Project? Maybe if they didn't ship any lists or something, but even then, I would think uBO's reputation has a much different connotation than NoScript's.
FYI: that's no longer used, it was disabled 9 days ago. The new tracker is here: |
It does not help when objective technical assessment is thrown out the window by the NoScript author, who has reduced uBO to a mere "annoyance blocker". If some people out there can't see uBO as anything other than an "ad blocker", that's their call; there is nothing I can do about this besides keep emphasizing that uBO is not an "ad blocker".

The overarching goal of uBO is to reduce 3rd-party exposure to a minimum for a majority of users; this is beneficial for privacy, security, and resource-consumption reasons. It is beneficial to block ads in addition to trackers because the safest assumption is that ads are privacy and security issues, besides consuming an undue amount of resources. In any case, a tangential benefit for people who see uBO as a mere "ad blocker" is that they end up better protected privacy- and security-wise than if they just used an actual self-described "ad blocker" -- and also, in my opinion, even than technical people who use NoScript on its own, because uBO can do the same while compromising less privacy- and security-wise, thanks to its ability to create narrower exception rules.[1]

Maybe it is the case that Tor people disregard uBO because of the label mis-assigned to it -- uBO could very well be configured to just enforce Peter Lowe's list with auto-update disabled, and the firewall pane enabled in high security mode -- but in the end it's their call; they have the technical background to evaluate uBO according to what it is, not according to what label someone stuck to it. If ever they change their mind, I am open to working with them directly.

As it is, I decline adding a warning re. fingerprinting, because it's not clear that such a warning is warranted in the actual world -- it could very well be that in the most typical scenarios, not using uBO increases fingerprinting risks. Reducing 3rd-party exposure is the best way to foil corporate-surveillance fingerprinting in the actual world, and whoever wants to is free to use advanced features in uBO to further reduce the threat of fingerprinting. For the "activist in an oppressive regime", stick to Tor's guidelines. If you feel really strongly about this, I suggest you open an issue with Firefox, because many more extensions than just uBO can have measurable effects in the DOM, so if a warning is really warranted, then best to have it for all extensions.

[1] One can also block scripts, or even any 3rd-party resources, in a default-deny manner in uBO, with the added ability to set per-site rules.
Prerequisites
Description
tl;dr: uBlock Origin should take steps to reduce the fingerprint (i.e., identifying features) it produces: warn anti-fingerprinting users it generates a fingerprint, stop anti-fingerprinting users from changing their subscribed lists, use ~weekly snapshots for updates for anti-fingerprinting users, and implement a watershed update for anti-fingerprinting users.
Despite advice to the contrary, many Tor Browser users install ad blockers. I'm assuming uBlock Origin is the most popular among those who do. People do this despite knowing that Tor Browser should not have extra extensions installed, since any change in browser behavior detectable by a website will partition the anonymity set (i.e., it shrinks the list of possible people who could be the one using connections x, y, and z). Having spoken to probably half a dozen people who use Tor Browser with an ad blocker, I've heard the same justification from all of them: "Ad blocking is important enough to me that I'm okay with being part of the smaller anonymity set that uses an ad blocker with Tor Browser".
What these users haven't realized, in every case I've seen so far, is that they're not partitioning themselves into "users who block ads vs. those who don't", or even "users who use uBlock Origin". A website can fingerprint a user by looking at what specific elements are being blocked (whether network blocking by just looking at resource requests, or cosmetic blocking using browser fingerprinting scripts). From that information, the site can learn not only which filter lists a user has enabled, but when each list was last updated. Since lists are frequently updated server-side multiple times daily, while clients pull from those lists at a ~random time (per list) every 5 days or so, that's a lot of entropy a site can use to fingerprint—far more than should be conveyed on Tor Browser.
For example, suppose an activist in an oppressive regime is using Tor Browser with uBlock Origin installed to access a local social media site with two different accounts: one with their real name, and one with a pseudonym. The social media site uses these fingerprinting techniques on its users, and tells the oppressive regime it noticed that these two accounts always log in blocking the same elements as each other, and are therefore highly likely to be the same person.
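To make the cosmetic-blocking half of the probing described above concrete, here is a minimal sketch (not from the issue) of how a page could test which hiding rules are active. The class names are placeholders; a real fingerprinter would pick selectors unique to particular lists and list revisions:

```js
// Insert "bait" elements whose class names appear in cosmetic filter rules,
// then check whether the content blocker hid them. The pattern of hidden vs.
// visible elements reveals which filter lists (and revisions) are in effect.
function probeCosmetic(className) {
  const bait = document.createElement('div');
  bait.className = className;
  bait.textContent = 'x';
  document.body.appendChild(bait);
  return new Promise(resolve => {
    // Give the blocker's cosmetic filtering a moment to apply.
    setTimeout(() => {
      const hidden = bait.offsetHeight === 0 ||
                     getComputedStyle(bait).display === 'none';
      bait.remove();
      resolve(hidden);
    }, 100);
  });
}

// Hypothetical probes: 'adsbox' is a classic generic bait class; the other two
// stand in for selectors that would only be hidden by specific optional lists.
Promise.all([
  probeCosmetic('adsbox'),
  probeCosmetic('selector-only-in-list-a'),
  probeCosmetic('selector-only-in-list-b'),
]).then(([generic, listA, listB]) => console.log({ generic, listA, listB }));
```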
Ideally, there are three things that should happen when a user uses uBlock Origin with Tor Browser (or more precisely, with the `resistFingerprinting` property of the `privacy.websites` API enabled). First, it should display a warning to the user that, because of how anti-fingerprinting works, extensions like uBlock Origin can make them less private, since websites will be able to tell they're using it. The users I've spoken to already understood this part, but my selection is biased towards the more technically knowledgeable, and I'm guessing many users would find this counter-intuitive. Second, the extension should make it more difficult to change which filter lists are enabled from the default (plus maybe AdGuard Mobile on mobile devices). Third, the updates to those lists should use a watershed update mechanism, where those users start using a new list at the same time as each other.

The first two should be fairly straightforward, but watershed update mechanisms can get a bit tricky, since we probably don't want to have all Tor users hammer the list servers at the exact same time. The following scheme is a simple mechanism I've come up with, but there are obviously other options, with different trade-offs.
Every Monday at midnight, a snapshot of all the default filter lists is taken server side. Users who have anti-fingerprinting enabled pull their updates from this snapshot, rather than the current live version, but do not start using said update. Then, on Saturday at midnight, all of those users switch to the more recent snapshot, which they have already downloaded.
Note that the days of the week/times here are mostly arbitrary; I just picked specific ones for clarity. All that matters is that there is enough time between when the snapshot is updated and the watershed update time that all users should have pulled that update. The above assumes a normal user who uses uBlock Origin multiple times a week. An edge case that has to be considered is a user who doesn't use uBlock Origin for a length of time that spans both when they would normally update their lists and when the watershed update happens. To accommodate that scenario, two snapshots have to be kept by the server (the one users are currently using, and the one they will update to next). When uBlock Origin starts in anti-fingerprinting mode, it needs to check if its lists are too old, and if so, pull both snapshots immediately.
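To sketch how the client side of this scheme might look (this is not a proposal for uBO's actual code; the URL, storage shape, and helper names are all hypothetical):

```js
// Client-side decision logic for the watershed scheme described above.
// The server is assumed to publish two snapshots: "current" (what every
// anti-fingerprinting user enforces now) and "next" (what everyone switches
// to at the watershed time).
const SNAPSHOT_BASE = 'https://lists.example.invalid/snapshots'; // placeholder
const ONE_WEEK_MS = 7 * 24 * 60 * 60 * 1000;

async function fetchSnapshot(which) {
  const res = await fetch(`${SNAPSHOT_BASE}/${which}.txt`);
  return res.text();
}

// `state` persists across browser sessions:
// { current, next, appliedAt, watershedAt } -- all hypothetical fields.
async function watershedUpdate(state, applyLists) {
  const now = Date.now();

  // Stale client (e.g. browser unused for over a week): pull both snapshots
  // immediately so it rejoins the rest of the crowd in a single step.
  if (now - state.appliedAt > ONE_WEEK_MS) {
    state.current = await fetchSnapshot('current');
    state.next = await fetchSnapshot('next');
    state.appliedAt = now;
    applyLists(state.current);
    return;
  }

  // Normal cycle, before the watershed: quietly download the upcoming
  // snapshot (ideally at a randomized time to spread server load) while
  // still enforcing the current one, so all clients stay identical.
  if (!state.next) {
    state.next = await fetchSnapshot('next');
  }

  // At or after the agreed watershed moment, switch to the snapshot that was
  // already downloaded -- every client flips without touching the server.
  if (now >= state.watershedAt) {
    state.current = state.next;
    state.next = null;
    state.appliedAt = now;
    applyLists(state.current);
  }
}
```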
This could be implemented in phases. The warning and disabling of filter list selection should be simple to implement, and would go a long way towards informing users and preventing a good chunk of potential entropy. Then, creating the weekly snapshots and switching anti-fingerprinting users over to them would dramatically reduce entropy for most of those users, since at that point, most of them would be using one of two snapshots (last week's if they haven't updated yet, or this week's if they have). Finally, adding the watershed update would eliminate the last bits of extra entropy for all users, so that the only information an adversary gains from uBlock Origin is that it's installed.
A specific URL where the issue occurs
I made a demonstration fingerprinting page (warning: the page will likely make requests to advertisers you don't block yet—but just headers, not assets). It only fingerprints when (approximately) your EasyList subscription was last updated. It's a little bit buggy, since it's just a 24-line proof of concept, so it doesn't try to use sites that have CORB headers, sites that fail to download with curl, or sites that don't have TLS, and it doesn't try to account for other add-ons like Privacy Badger, etc., but all of these bugs are solvable if someone were actually trying to fingerprint users. As of this writing, there are about 45 updates it can use from the past week (again, a better implementation could use more), so ~5.5 bits of entropy. By fixing the bugs, identifying which lists are in use, and running a similar algorithm on those lists (giving multiplicative growth, not additive), that number would get even larger.
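For readers who can't reach the demo page, the core idea can be sketched in a few lines (the probe URLs and ages are invented placeholders; the actual proof of concept derives them from EasyList's revision history):

```js
// Probe filter entries that were added to EasyList at known points over the
// past week; the newest one that is already blocked tells you roughly when
// the visitor's copy of the list was last refreshed.
const recentAdditions = [ // ordered oldest to newest; all URLs are placeholders
  { addedDaysAgo: 6, url: 'https://ads.example-one.invalid/new.js' },
  { addedDaysAgo: 3, url: 'https://ads.example-two.invalid/new.js' },
  { addedDaysAgo: 1, url: 'https://ads.example-three.invalid/new.js' },
];

async function isBlocked(url) {
  try {
    await fetch(url, { mode: 'no-cors', cache: 'no-store' });
    return false;               // request went out (headers only, nothing rendered)
  } catch (e) {
    return true;                // cancelled by the content blocker (or network error)
  }
}

async function estimateLastUpdate() {
  let newestBlocked = null;
  for (const probe of recentAdditions) {
    if (await isBlocked(probe.url)) newestBlocked = probe.addedDaysAgo;
  }
  // The subscription was refreshed no earlier than the newest blocked addition.
  return newestBlocked === null
    ? 'EasyList not detected'
    : `EasyList copy fetched within the last ~${newestBlocked} day(s)`;
}

estimateLastUpdate().then(console.log);
```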
Steps to Reproduce
Access the above page, including on a browser with the `privacy.resistFingerprinting` about:config option on.
Expected behavior:
uBlock Origin reveals nothing about the user other than the fact it's used (obviously it would be ideal if even that wasn't visible, but that's not feasible right now ;)).
Actual behavior:
uBlock Origin reveals information via the elements blocked, according to the filter lists subscribed to and when they were last updated.
Your environment