Privacy TF Minutes - Wed, 02 March 2022

  • Present: Jeffrey, Nick, Wendy, Robin, Don, Christine, Tess
  • Regrets: Dan

Misc

Christine: We should find time to check that we have a list of the core principles we want to include.

Jeffrey: Good idea.

Robin: How do we go about that?

Christine: The list might not be complete, but hopefully the work you've already done will be a good test that we've covered the list. We check the list and what's already written against each other. And then others will hopefully pick it up.

Nick: And it's easier to do that in short form, without a document.

Jeffrey: I hear a proposal that we brainstorm today, and then check that against the document.

Christine: And we should focus on the design principles today.

Nick: Are there PRs to review?

Jeffrey: Nick's data rights PR?

Nick: Not ready.

What principles should we cover?

Summary

  • Data minimization (collection, exposure, and retention)
    • no party should have access to a significant chunk of someone's browsing history
  • Purpose limitation
  • Informed, meaningful consent
  • User rights on un-owned devices
  • User participation: access, correction, deletion
    • Ability to have anonymous and ephemeral interactions
  • Data security: encrypted, confidential, authenticated
  • Don't try to distinguish personal data from non-personal: it's all likely to be personal somehow
  • Identity separation
  • Accountability: compliance & visibility
  • Make identification and personalization visible
  • Automation and delegation of privacy work, including to parents/guardians
  • Being let alone: quiet, free from interruption, ending communications
  • Really Good Reason principle for exemptions
  • Privacy from whom: not just the service, but from other users, friends and family, or from governments
  • Collective privacy
  • Beneficial misrepresentations

Notes

Christine: Data minimization, and in the web privacy context.

a) Collect as little as possible
b) Expose as little as possible
c) Don't keep it (retention)

Robin: Retention is minimization over time.

Christine: Interplays with features. There may be a desire for an API that does lots of different things. Part of minimization is not just about what data you collect from a feature, but also asking whether you need all those features.

Robin: Intersects with fingerprinting?

Christine: Yes. One of the consequences of not-minimizing is that you increase the fingerprintability and the risk that a user will be recognized uniquely or as a class.

Robin: On the web scope: In the private advertising CG, we're thinking that no party should have access to a significant chunk of someone's browsing history. So it's not just minimization but also about no single entity getting it. It might be no problem if your history is split among 100 parties even though it would be a problem if one party got all of it.

Christine: I suspect that's a particular design principle that falls within data minimization, but whether it can be applied across all specs isn't clear. I do agree. Another way to minimize access is to minimize who has access to what.

Nick: Decreasing centralization of the data.

Robin: Makes there be a natural protection against profiling.

Christine: It's not enough to just spread things around, since the data can be recombined. Splitting up comes second to minimizing.

Nick: Could be minimization of scope or of identifiers. Having common identifiers is a separate risk from what's collected.

Don: An adjacent issue to this is purpose limitation, if data was provided for one purpose you don't use it for another, even if it's useful to keep it for this purpose.

Nick: Yes, that's next door.

Christine: How do you see the purpose limitation captured here?

Don: Well, there's collect or store as little as possible, but there's also expose to other purposes as little as possible. How many push-ups you can do shouldn't be exposed to a job search system.

Christine: Would you say that same-origin has purpose limitations?

Don: Well, one origin can have multiple purposes so this does not really help. The same party could have the push-ups table and a jobs query.

Nick: I think about same-origin as related to scope of identifiers. Purpose limitation means you need ways of specifying purposes, to annotate data, etc.

Jeffrey: I suggest that we should go broad rather than into details. I'd like to add something about consent needing to be informed. Also, users have rights even if they use a device they don't own.

Robin: Should we say more than "informed"?

Jeffrey: I mean properly informed, and that includes a lot of things to make sure it happens.

Nick: Looking at the FIPs, we like to pooh-pooh them, but they are useful. We should take user participation (e.g. rectification). And the other principle is basic security.

Robin: There's been a lot of negativity around the FIPs because they have been presented as sufficient when they aren't, but that doesn't mean they're not necessary.

Christine: Is there anything in W3C about how data needs to be encrypted? Notably for things that might not look like personal data.

Jeffrey: We've been moving towards saying everything should have all the TLS guarantees, but I challenged that with Web Packaging.

Christine: It sounds like this should be a principle.

Nick: It doesn't necessarily help to try to distinguish between personal data and not.

Jeffrey: That should be a principle.

Christine: Yes. PROTECT ALL DATA.

Nick: People writing specs might not be great at knowing what is personal data.

Jeffrey: Scanning a new book, *Modern Socio-Technical Perspectives on Privacy* ...

  • separating identities across sites
  • IoT and privacy from the things around us in space
  • group privacy: individuals can't consent to implications about other people
  • different cultural perspectives
  • a11y
  • via Dan: guardianship and people protecting other people

Christine: Another area is building in accountability. Part of it can be demonstrating compliance. Browsers have indicators that, for instance, the mic or camera is on. We could have something similar.

Jeffrey: That ties into visibility-to-researchers.

Nick: I wonder if there's a principle about the user having ultimate authority over what they want their agent to do with the data?

Don: There's something in there about a party revealing to you when they have identified you. Like, don't give people a falsely generic experience if they have identified you. Twitter makes this obvious with the sign-in with Twitter function. (If you are signed in with Twitter, the site must show a Twitter logo and your Twitter profile picture to make clear the logged-in state and the identity that the site is using.)

Jeffrey: That's related to how Google indicates that search results have been personalized. This goes beyond identifying you to also making use of data about you.

Wendy: Anonymity? Ephemerality, and persistence. The choice to have an ephemeral experience.

Jeffrey: And we'll have to be careful writing that to avoid implying things we don't mean, like questions about funding sites and fighting abuse.

Nick: There's an interesting goal there about ephemerality. The security model (which might not be written down) is that it's safe to visit a link: you won't be hacked, you won't have all your data stolen. There's also something about the click not being permanently recorded without the ability to delete it.

Robin: To the extent possible, privacy management should be automated. Regarding access and deletion, there's work from Consumer Reports on a protocol for deletion rights. There's also GPC and ADPC support.

Nick: In order to make choices in practice, you need to be able to use tools to automate it.
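
As a concrete illustration of that kind of automation (not from the minutes; a minimal sketch, assuming a site that wants to honor the Global Privacy Control signal mentioned above), GPC is surfaced as a `Sec-GPC: 1` request header and a `navigator.globalPrivacyControl` boolean that tooling can act on without user intervention:

```ts
// Hypothetical helpers for honoring GPC; names and shapes are illustrative.

// Server side: the GPC proposal sets the Sec-GPC header to the single character "1".
function requestHasGpcOptOut(headers: Record<string, string | undefined>): boolean {
  return headers["sec-gpc"] === "1";
}

// Client side: the same signal appears on the navigator object.
// Cast through `any` because the property is not yet in standard DOM typings.
function pageHasGpcOptOut(): boolean {
  return (navigator as any).globalPrivacyControl === true;
}
```

A site that detects either signal could, for example, suppress sale or sharing of that user's data automatically, which is the sort of delegation to tooling described above.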

Nick: Right to be let alone. Right to quiet and lack of interruption. Should be able to close a tab.

Robin: Journalistic exemption? Most privacy regimes say that you're allowed to violate privacy in pursuit of journalistic purposes. To hold people to account. They might not consent to having people know bad things they've done.

Jeffrey: Overlaps with the anti-abuse carveouts.

Don: Public safety: If you see someone's house is on fire, you can break their window to rescue them.

Wendy: I think we've touched on that a few times; there are a few exemptions that are generally agreed on. We are not saying that our principles are absolute, only that you had better have a really good reason if you are going to impinge on them. Many will say journalists and anti-abuse people have good reasons.

Nick: I do see this as a general principle. For instance, for accounting purposes, even if I delete my data you might need to keep receipts and such.

Robin: Maybe we should have a Really Good Reason principle to capture the fact that there are exemptions but they need to be pretty damn good and to have accountability.

Wendy: We tend with computers to have absolute approaches but we need to give consideration to exceptions (how will they be available, how will their use be governed).

Robin: That's why the blockchain doesn't work.

Jeffrey: It's not quite a privacy principle, but something about how collective governance is useful in making sure privacy decisions and tradeoffs aren't self-serving.

Nick: Expanding on accountability: it's going to be relevant for balancing any of these exceptional cases. It's a useful idea for redress. We're looking for privacy by default, but we also need to accept that maintaining privacy requires corrections and fixes over time.

Nick: Privacy from different people. We often think of it as privacy from the website, but there's also privacy from other site users, other users of your device, from law enforcement and your government.

Jeffrey: Incognito mode is a good example of that, where it's not privacy from the website, just other users of your device.

Wendy: e.g. data collected by one party can be subpoenaed by another one.

Don: Collective privacy. Want to be clear that there's negative-space privacy. "If all the crossfitters and accredited investors are ok sharing their info, it also identifies the non-crossfitters and non-accredited-investors".

Robin: Reminds me of sale-of-data issues. There's this idea that we could give people money in exchange for their data. The typical attack is that if you're paid, then lying about yourself is theft. Which is an added threat.

Jeffrey: White lies are important for society to exist?

Don: Misrepresentation of your own data is okay, unless it causes a certain kind of harm. E.g. misrepresenting yourself as a luxury car intender is almost never a problem, but misrepresenting yourself in other categories can be more of a problem.

Nick: The Web is real people with direct implications, not separate from "the real world".

Jeffrey: Is that everything?

Robin: Now what do we do with these ideas?

Jeffrey: We should go through the summary, make sure it includes everything in the conversation, and then compare it with the document and file issues and PRs for things that are missing.

Robin: This was really useful to catch some things we omitted because we thought they were obvious.