Fingerprint blocking can be detected #10481

Closed
abrahamjuliot opened this issue Jun 26, 2020 · 6 comments
Labels
closed/wontfix
needs-discussion: Although the issue is clear, we haven't yet reached a decision about the right solution.
OS/Desktop
privacy/feature: User-facing privacy- & security-focused feature work.
privacy-pod: Feature work for the Privacy & Web Compatibility pod

Comments


abrahamjuliot commented Jun 26, 2020

Brave version (brave://version info)

Tested in Stable (83) and Nightly

Description

Fingerprint blocking can be detected. Issue 822 is related.

Problem

Lie detection lets trackers decide whether or not to trust the fingerprint data.

Steps to Reproduce

  1. Check if brave is present in navigator
  2. [Nightly] The WebGL and 2D canvas data URIs should not match after one is given a style (or custom context): create WebGL and 2D canvas contexts, add minimal styling to each, and compare their data URIs
  3. [83 Stable and Nightly] The WebGL renderer and vendor should not match in string length: remove all characters other than digits and a-zA-Z letters, then compare the string lengths of the WebGL renderer and vendor

Expected result:

  1. If either the WebGL or the 2D canvas context is given unique content, the data URIs should not match when compared to each other. They currently do match in Nightly with strict fingerprint blocking.
    -- possible solution: apply the randomization to the canvas context rather than to the data URI
  2. The vendor and renderer should not match in length, especially after characters other than digits and a-zA-Z letters are removed
    -- possible solution: set the vendor to null for everyone in fingerprint allow mode, then apply subtle randomization to the renderer in standard and strict modes
    -- randomization ideas: [a] include non-digits, spaces, non-a-zA-Z letters, and a random length; [b] inject a random renderer consistent with the platform (a rough sketch of idea [a] follows below)
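
A rough, hypothetical sketch of randomization idea [a] (illustration only, not an implementation; a real version would need to derive its randomness from a per-session, per-eTLD+1 seed instead of Math.random()):

function randomizeRenderer(renderer) {
    // append a random-length suffix drawn from spaces, punctuation, and digits,
    // so the renderer's length (and usually its compressed length) no longer
    // tracks the vendor string
    const filler = ' ()/-_.0123456789'
    const padLength = 1 + Math.floor(Math.random() * 8)
    let suffix = ''
    for (let i = 0; i < padLength; i++) {
        suffix += filler[Math.floor(Math.random() * filler.length)]
    }
    return renderer + suffix
}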

Test Script

function isBrave() {
    if (!('brave' in navigator)) {
        return false
    }
    const compressLen = str => str && str.replace(/(\W|_)/g, '').length // compressed vendor/renderer length (a-zA-Z0-9 only)
    const canvas2d = document.createElement('canvas')
    const canvasWebgl = document.createElement('canvas')
    const context2d = canvas2d.getContext('2d')
    const contextWebgl = canvasWebgl.getContext('webgl')
    if (!context2d || !contextWebgl) {
        return false // canvas contexts unavailable, so the checks cannot run
    }

    // get webgl vendor and renderer
    const rendererInfo = contextWebgl.getExtension('WEBGL_debug_renderer_info')
    const vendor = rendererInfo && contextWebgl.getParameter(rendererInfo.UNMASKED_VENDOR_WEBGL)
    const renderer = rendererInfo && contextWebgl.getParameter(rendererInfo.UNMASKED_RENDERER_WEBGL)

    // draw styled text on the 2d canvas context
    context2d.fillStyle = 'red'
    context2d.fillText('RandomIronyValid&9', 100, 100)

    // apply style to webgl canvas context
    contextWebgl.clearColor(0.2, 0.4, 0.6, 0.8)
    contextWebgl.clear(contextWebgl.COLOR_BUFFER_BIT)

    // get canvas data URIs
    const canvas2dDataURI = canvas2d.toDataURL()
    const canvasWebglDataURI = canvasWebgl.toDataURL()

    // detect lies and return
    const hasLiedVendorRenderer = (
        vendor == renderer || compressLen(vendor) == compressLen(renderer)
    )
    const hasLiedDataURI = canvas2dDataURI == canvasWebglDataURI
	
    // Both are true in Nightly strict fingerprint blocking
    // hasLiedVendorRenderer is true in 83 Stable strict and standard fingerprint blocking
    return {
        hasLiedDataURI,
        hasLiedVendorRenderer
    }
}

const brave = isBrave()
if (brave) {
    console.log(brave)
}
@srirambv added the needs-discussion, privacy-pod, and privacy/feature labels on Jun 26, 2020

srirambv (Contributor) commented

cc: @brave/sec-team

fmarier (Member) commented Jun 26, 2020

cc @pes10k

pes10k (Contributor) commented Jun 26, 2020

Hi @abrahamjuliot, thank you for taking the time to file the issue! However, I'm not sure I understand the concern here. It's not a secret that Brave randomizes fingerprints (we mention it on our wiki and our blog, for example). We also expect that, for some features, scripts can perform statistical attacks to recover the underlying values (in default mode, not strict mode), but they would need to do things that look very different from benign API use.
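
As a generic illustration of what such a statistical attack could look like, here is a sketch assuming a hypothetical API that adds small zero-mean noise to each read (that assumption is for illustration only and is not how Brave's farbling behaves, since Brave's randomized values are stable within a session):

// hypothetical noisy reader: each call returns the underlying value plus small zero-mean noise;
// averaging many reads converges on the underlying value, and this repetitive pattern
// of calls is what makes recovery attempts look very unlike benign API use
function estimateUnderlyingValue(readNoisyValue, samples = 1000) {
    let sum = 0
    for (let i = 0; i < samples; i++) {
        sum += readNoisyValue()
    }
    return sum / samples
}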

The "Farbling Level: Default" section of the blog post gives exactly what the goals of the randomization are (in order):

  1. Create a unique fingerprint, per session, per eTLD+1, for naive scripts (i.e. the ones that consume the randomized values); a rough sketch of this idea appears below
  2. Reduce the amount of identifying information available to more sophisticated scripts (i.e. those that update to not consume the randomized values)
  3. If neither of the above is possible, make malicious use of these APIs look very different from benign use, to allow for possible future interventions
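
A minimal, hypothetical sketch of goal 1, assuming a per-session token and the site's eTLD+1 are available as inputs (this illustrates the approach described in the blog post; it is not Brave's actual code):

function farbleImageData(imageData, sessionToken, etldPlusOne) {
    // derive a deterministic seed from the session token and the site, so the
    // perturbation is stable within a site but differs across sites and sessions
    let seed = 0
    for (const ch of sessionToken + etldPlusOne) {
        seed = (seed * 31 + ch.charCodeAt(0)) >>> 0
    }
    const next = () => (seed = (seed * 1664525 + 1013904223) >>> 0)
    const pixels = imageData.data
    for (let i = 0; i < pixels.length; i++) {
        // flip the lowest bit of a small pseudo-random subset of channel values
        if (next() % 1024 === 0) {
            pixels[i] ^= 1
        }
    }
    return imageData
}

A naive script that hashes the farbled pixels then sees a value that is unique per site and per session, while the rendered image is barely changed.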

What we would be worried about, though, is any of the following:

  1. If scripts could gain access to the consistent underlying values and regain the ability to calculate consistent, identifying fingerprints of users, in a way that is not easily distinguishable from benign use
  2. If the randomization itself were being done in a way that became a consistent tracking vector

So, I think this is not a bug, but please correct me if I've misunderstood @abrahamjuliot. And, in either event, thank you for taking the time to file the bug!

pes10k (Contributor) commented Jun 26, 2020

Marking this wontfix because, whether this is a bug or not, we already assume any site that sees navigator.brave will assume randomization is going on. @abrahamjuliot, if that doesn't seem correct to you though, let's continue to discuss.

abrahamjuliot (Author) commented Jun 26, 2020

Hi @pes10k, thank you for this detailed reply. I will also take a close look at the episode 4 blog post.

...I'm not sure I understand the concern here. It's not a secret that Brave randomizes fingerprints (we mention it on our wiki and our blog, for example).

...I think this is not a bug, but please correct me if I've misunderstood

I think this is not a bug, but rather a weak point in fingerprint blocking that can be improved. The concern is not that trackers can distrust certain APIs in Brave, but that they (more sophisticated scripts) can decide whether or not to distrust those APIs. It would be great if such scripts were left in the dark on this matter and forced to conclude that all Brave users are blocking fingerprinting; this would benefit all Brave users regardless of their fingerprint blocking setting.

At the moment, trackers have these two options:

  1. Trust the data if no lies are detected
  2. Distrust the data if lies are detected

It would be great if trackers were left with these two options instead:

  1. Trust all Brave users
  2. Distrust all Brave users (which benefits all Brave users)

pes10k (Contributor) commented Jun 30, 2020

Thanks @abrahamjuliot, I better understand your suggestion now; thank you for taking the time to explain. I think, though, that trying to protect people who turn a privacy feature off is probably outside what we can reasonably take on. Randomizing the 2D and 3D drawing operations in the same way (they use very different code under the hood, since WebGL work can be shunted to graphics hardware, while 2D usually is not) would be a very large additional amount of work to maintain (especially since we have to keep repatching Blink, which already makes keeping up to date with Chromium complicated enough). And while that would be neat to do in the abstract, I think it unfortunately falls on the wrong side of the cost-benefit trade-off, given the large number of other critical privacy issues we need to tackle in the web platform.

But, all that being said, I'm grateful for the suggestion, and to have suggestions like yours for ways to keep improving the privacy protections we can provide. Thank you again for taking the time to make the suggestion, and I hope you'll share other ideas you might have with us too!
