Suggestions for better privacy protection #1

Open
opened 2025-04-21 15:12:26 -04:00 by idk · 4 comments
Owner

Assign a separate temporary container to each secondary domain and randomize the browser fingerprint to prevent cross-site browser fingerprint tracking.

Related Firefox extensions for reference:

https://addons.mozilla.org/firefox/addon/temporary-containers

https://addons.mozilla.org/firefox/addon/fingerprint-spoofing
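
For reference, a minimal sketch of what one temporary container per second-level domain could look like with the WebExtension `contextualIdentities` and `tabs` APIs. The domain extraction, the `tmp-` naming scheme, and the reuse policy are illustrative assumptions, not how either extension linked above actually behaves; a real implementation would also want the Public Suffix List rather than a naive hostname split.

```typescript
// Sketch only: one temporary container per second-level domain.
// Assumes the "contextualIdentities" and "tabs" permissions in manifest.json.
declare const browser: any; // WebExtension global provided by Firefox

// Naive second-level-domain extraction for illustration; a real version
// should consult the Public Suffix List (this breaks on e.g. "example.co.uk").
function secondLevelDomain(url: string): string {
  const host = new URL(url).hostname;
  return host.split(".").slice(-2).join(".");
}

// Cache of domain -> cookieStoreId so repeat visits reuse the same container.
const containers = new Map<string, string>();

async function containerFor(domain: string): Promise<string> {
  const cached = containers.get(domain);
  if (cached) return cached;
  const identity = await browser.contextualIdentities.create({
    name: `tmp-${domain}`, // hypothetical naming scheme
    color: "toolbar",
    icon: "fingerprint",
  });
  containers.set(domain, identity.cookieStoreId);
  return identity.cookieStoreId;
}

// Open a URL in the temporary container belonging to its second-level domain.
async function openIsolated(url: string): Promise<void> {
  const cookieStoreId = await containerFor(secondLevelDomain(url));
  await browser.tabs.create({ url, cookieStoreId });
}
```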

Author
Owner

They have an open source version called [fingerprintjs](https://github.com/fingerprintjs/fingerprintjs), though it's not as powerful as the paid version.

Author
Owner

You make some really good points, I hear what you're saying, and your results seem to bear you out, I think. The results I'm most interested in are the ones which come from https://fingerprint.com/demo; that project in particular combines the fingerprinting techniques into an identifier which it attempts to use to track you.

So next version we will use a different container per second-level domain. That much is settled.

What remains is to figure out which things to randomize. I am still hesitant to randomize things which aren't necessary. I know for sure I can't use `fps` as an example, either, because they don't publish source code and the javascript shipped in the `.xpi` is highly minified. It's completely unusable as an example, unfortunately. So instead we need to find examples for the ones we might be able to meaningfully spoof, and then figure out if spoofing them produces a desirable result. Using the `fps` extension settings page as a guideline, possibly:

  • Canvas
  • WebGL
  • Audio
  • Font
  • ClientRects

are useful targets for randomization.
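
To make the canvas case concrete, here is roughly what canvas randomization usually amounts to: wrap the canvas readout APIs in a content script and perturb a few pixel bytes per read, so each container or session hands back a slightly different hash. This is a generic sketch of the technique, not what `fps` actually ships (its code is minified, as noted), and whether noise should vary per read or stay fixed per session is an open design question.

```typescript
// Generic sketch of the usual canvas-noise technique (content-script context);
// not taken from fps, whose shipped code is minified and unreadable.

// Flip the low bit of a handful of random pixel bytes so the canvas hash
// differs between reads without visibly changing the image.
function addNoise(data: Uint8ClampedArray): void {
  for (let i = 0; i < 16; i++) {
    const idx = Math.floor(Math.random() * data.length);
    data[idx] ^= 1;
  }
}

const origGetImageData = CanvasRenderingContext2D.prototype.getImageData;
CanvasRenderingContext2D.prototype.getImageData = function (
  this: CanvasRenderingContext2D, x: number, y: number, w: number, h: number
): ImageData {
  const image = origGetImageData.call(this, x, y, w, h);
  addNoise(image.data);
  return image;
};

const origToDataURL = HTMLCanvasElement.prototype.toDataURL;
HTMLCanvasElement.prototype.toDataURL = function (
  this: HTMLCanvasElement, type?: string, quality?: unknown
): string {
  // Noise the pixels in place before serializing; note this mutates the
  // canvas itself -- a cleaner version would copy to an offscreen canvas first.
  const ctx = this.getContext("2d");
  if (ctx) {
    const image = origGetImageData.call(ctx, 0, 0, this.width, this.height);
    addNoise(image.data);
    ctx.putImageData(image, 0, 0);
  }
  return origToDataURL.call(this, type, quality);
};
```

WebGL, Audio, Font, and ClientRects spoofing follow the same wrap-and-perturb pattern against their respective APIs, just with different targets and noise models.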

Author
Owner

I ran some tests with fingerprint.com/demo, browserleak.com and coveryourtracks on my self-hardened Firefox and TorBrowser, with private browsing mode turned off to use temporary containers.

Test environment: self-hardened Firefox on Manjaro, and Whonix TorBrowser running in VirtualBox.

  1. In self-hardened Firefox (temporary containers, spf, TorBrowser's User-Agent, and uBlock Origin, but no country-specific ruleset): with spf turned off, fingerprint.com still recognizes the same user ID across containers; with all spf options enabled in multiple containers, fingerprint.com no longer recognizes cross-container user IDs. coveryourtracks reports that the browser does not resist fingerprint tracking, but each entry stays completely random every time it is checked.

![screnshots_2022-11-11_00-39-44-obfuscated](/uploads/0c7ddbc4639d1a0fd691d2879c41f8fe/screnshots_2022-11-11_00-39-44-obfuscated.png)

  2. In TorBrowser, with only temporary containers installed and private browsing mode turned off (browsing history unchecked, of course, and data still deleted on reboot), fingerprint.com still recognizes the same user ID across containers unless a separate tor circuit is used for each container (it would be even worse with no containers at all; as described on the Whonix website, the original TorBrowser does not prevent cross-tab tracking at all). coveryourtracks shows that there is no unique fingerprint.

![screnshots-11-11_00-27-01-obfuscated](/uploads/d2c171fd4deab63f0f44b0e6fd6985be/screnshots-11-11_00-27-01-obfuscated.png)

Random fingerprints combined with containers let containers reach a level of isolation that is almost as good as using multiple different browsers. Hiding everyone behind their own randomized fingerprint is easier, and harder to track, than hiding everyone behind one identical fingerprint. Even if TorBrowser gives everyone an identical fingerprint, we still can't hide the fact that you are using TorBrowser, and the price paid for that uniformity is not being able to install most extensions, because they change the fingerprint. If you use randomized fingerprints instead, the fingerprint changes caused by those extensions become harmless.
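
On the "separate tor circuit per container" point: when an extension handles its own proxying, the usual trick is Tor's SOCKS-auth stream isolation, where a different username per container forces a different circuit. A rough sketch, under the assumption that `proxy.onRequest` exposes `cookieStoreId` in its request details (it does in recent Firefox, to my knowledge) and that a local Tor SOCKS listener sits on port 9050:

```typescript
// Sketch: per-container Tor circuits via SOCKS username isolation.
// Assumes the "proxy" permission and a local Tor SOCKS listener on 9050.
declare const browser: any; // WebExtension global provided by Firefox

browser.proxy.onRequest.addListener(
  (details: { cookieStoreId?: string }) => {
    // Tor puts streams with different SOCKS credentials onto different
    // circuits (IsolateSOCKSAuth is on by default), so keying the username
    // on the container ID gives each container its own circuit.
    const isolationKey = details.cookieStoreId ?? "default";
    return {
      type: "socks",
      host: "127.0.0.1",
      port: 9050,
      proxyDNS: true,                    // resolve hostnames through Tor
      username: isolationKey,
      password: "container-isolation",   // value is arbitrary for Tor
    };
  },
  { urls: ["<all_urls>"] }
);
```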

Author
Owner

Leaning ACK on the use of temporary containers per secondary domain. It will require significant changes to the part of the extension where the containers are initialized and the part of the extension where the I2P history is cleared. That's probably going to take at least a week to implement and test.
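
For a sense of the scope, a minimal sketch of the cleanup side, assuming the extension tags its own temporary containers with a `tmp-` name prefix and clears history through `browsingData`. The prefix, the data types cleared, and the startup trigger are placeholders, not how the extension currently does either job.

```typescript
// Sketch: tearing down temporary containers and clearing history.
// Assumes the "contextualIdentities" and "browsingData" permissions, and that
// temporary containers were created with a "tmp-" name prefix.
declare const browser: any; // WebExtension global provided by Firefox

async function cleanupTemporaryContainers(): Promise<void> {
  const identities = await browser.contextualIdentities.query({});
  for (const identity of identities) {
    if (identity.name.startsWith("tmp-")) { // only containers we created
      await browser.contextualIdentities.remove(identity.cookieStoreId);
    }
  }
}

async function clearBrowsingTraces(): Promise<void> {
  // Wipe history, cookies, and cache accumulated since the browser started;
  // scoping this to I2P sites only would need per-URL history.deleteUrl calls.
  await browser.browsingData.remove(
    { since: 0 },
    { history: true, cookies: true, cache: true }
  );
}

// Hypothetical trigger: run both on extension startup.
browser.runtime.onStartup.addListener(async () => {
  await cleanupTemporaryContainers();
  await clearBrowsingTraces();
});
```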

Leaning NACK on the use of spoofing for the purposes of obfuscating the fingerprint at this time, but I could be convinced differently pending some research and/or discussion.

So there are these two broad things I think we can target to help reduce the impact of fingerprinting by reducing the usefulness of a fingerprint, assuming a thorough fingerprinting defense implemented in the browser (e.g. `privacy.resistFingerprinting`, hypothetically):

  1. Convergence: Fingerprinting defenses work best if everybody uses them and if everybody uses the same ones. I2P is in a unique situation where it's pretty much always known that you're using an I2P browser, so we sort of get to define what we want our fingerprint(s) to be. However, it makes sense to keep everybody on the same features at the same time.
  2. Entropy: Once you've got everybody "Converged" on a browser fingerprint, it *may* make sense for *some* use cases to introduce measures to obfuscate or "Spoof" additional data, if for instance we wanted to introduce an anti-fingerprinting measure in advance of it being included in Firefox or Chromium or Brave or whatever.

What the doext/fps extension does is entropic, and it does it in ways which interfere with convergent approaches. In fairness, they have checkboxes to change that behavior. In any case, though I can't just copy that approach, it's safer and more effective to do entropy only when a convergent solution cannot be applied.

  • Right now the only one I am really considering an entropic solution for is Canvas fingerprinting.
  • Right now, we defer to Firefox's consent+allowlist approach only, which is convergent.
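
On the convergent side, the knob that already exists is Firefox's own `privacy.resistFingerprinting`, which an extension can flip through the `privacy.websites` BrowserSetting given the `privacy` permission; the canvas consent prompt then comes from Firefox itself rather than from us. A sketch, assuming that setting is exposed in the running Firefox version:

```typescript
// Sketch: lean on Firefox's built-in (convergent) fingerprinting defense
// instead of extension-side spoofing. Requires the "privacy" permission.
declare const browser: any; // WebExtension global provided by Firefox

async function enableResistFingerprinting(): Promise<void> {
  const setting = browser.privacy.websites.resistFingerprinting;
  const current = await setting.get({});
  // Only take over the setting if no other extension already controls it.
  if (current.levelOfControl === "controllable_by_this_extension") {
    await setting.set({ value: true });
  }
}

enableResistFingerprinting();
```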

I'm open to entropy-based approaches where necessary, though, if you can enumerate some new ones for me.
