
TAG Privacy Taskforce Call

9 February 2022

Present: Robin, Jeffrey, Pete, Wendy, Nick, Dan, Christine, Don, Tess

Regrets: Amy

Scribe: Wendy

--

Agenda

Are we ready for an FPWD?

dka: I reread the doc top to bottom, and was struck by a few things. Not sure we're ready to release an FPWD, even though I had been pushing for that. I don't think we have enough principles, and feel we need to reorder. One of the main things in the TAG style guide is "getting to the point". In that spirit, I suggest that each of the principles should start with the principle itself: more active voice, direct to the takeaway. I left a couple of PRs re language features. I also think we need another pass to tighten the intro/discussion. Make it less manifesto-y. Appeal to a wider audience.

Jeffrey: I'm happy with that direction, recognizing it will take work

Robin: I'm fine with it. Think we should also hold ourselves accountable to publish. How many principles do we need? Re intro, what's the focus?

DKA: maybe 5 principles with text. Reordered so principle comes first. Tightening of intro can happen after FPWD.

Jeffrey: thanks for the focus on principles. We'd benefit from more people writing principles text. It doesn't have to be polished.

DKA: removed "boil the ocean" as idiomatic. Also removed the extraneous "code is law" ref.

Pete: later in the call, let's go through a principle as model

DKA: the Ethical Web Principles removed the reference to "dark patterns" because there's some controversy in the community about the naming. Also, we didn't use the concept later. We do reference "harmful patterns" in EWP.

robin: is it OK to keep the references for now?

DKA: TAG got some feedback against the term "dark patterns"

Jeffrey: general principle against equating light=good, dark=bad

npdoty: thanks, I had heard the suggestion against white/black

robin: I'd like us to be able to reference the concept

dka: I'll issue a PR to remove the text

robin: I suspect we might want to say something about them if we have a consent principle

Jeffrey: we should talk about how to do consent well; we may not even talk about patterns. If we reintroduce it, we can add the paragraph back

Don: CafeMedia changed "dark patterns" to "manipulative design practices"

Jeffrey: there's a podcast re "dark patterns" not distinguishing between manipulative and misleading

robin: don't think the PR is ready to land

Wendy: my question - what are we trying to do with it?

Robin: I think this might be several principles or considerations for other principles... so feedback not in yet... not sure.

robin: we can add it and remove it when we're happy with the number of principles

tess: +1, it's useful on the editor's draft

[merging]

ok to merge

Jeffrey: I rewrote.

robin: I like the principle of not using security data for growth; do we want to say that keeps applying even to anonymized data?

Jeffrey: aggregated, deidentified, ... We discussed that data shouldn't be used across purposes, and we also cross-reference that.

Christine: is there a way we could avoid saying "privacy principles in conflict"? Generally privacy principles are written to support each other. I don't want to start a thread of conflict.

Jeffrey: how do you feel about examples?

Christine: Children's privacy from the service (and from adults), vs. a service worrying about legal compliance.

Jeffrey: I think it's children's privacy from other users of the service.

Christine: Children's privacy from non-children vs. their privacy from the service. Yet verifying ages is most often about a service doing compliance. Often, identifying a child by age is privacy invasive. So I think this is more privacy versus safety.

Don: also, why just privacy from non-children? Children perpetrate SWAT attacks against other children, too.

Pete: +1 to Christine. Still concerned about "privacy principles in conflict"

Nick: Also concerned about the conflict framing. There are times when, to protect privacy, you need to collect data; that's not conflict. Data collection isn't per se a violation. Misuse of data is the privacy violation. E.g., proving to you I'm the account holder is in service of my privacy, not an invasion. I'm fine with a principle that services shouldn't misuse data, secondary use. Anti-abuse folks are concerned about language like "the platform shouldn't provide/block APIs".

robin: +1 to Nick. The decision process by which you reach, e.g., log retention periods does require balancing multiple principles. If I decide to keep everything, that raises data protection issues; deleting everything raises others.

dka: "tension"? "appear to be in tension?" but sometimes a false dichotomy.

robin: not just "appear to be in conflict". When we're in the trenches ("I'd like to do this, is it appropriate?") we look at principles such as minimization, anti-abuse.

Jeffrey: happy to change to tension. Consider the IRS use of biometrics; they made the wrong tradeoff with respect to privacy rights.

Pete: some want to write a doc that says "these are principles" and some want "when you incorporate privacy considerations, this is what they should say"...

Wendy: on phrasing -- I wonder if it's the recommendations based in the principles that are in tension, rather than the underlying principles.

Robin: Not sure about the distinction ... spec vs principles... or understanding of privacy maximalism: as secrecy maximalism vs figuring out the best role...

Peter: a doc that's

dka: let's try to define more principles. How privacy should be incorporated in specs was the ethos behind writing the S&P questionnaire. This doc was to fill the gap between the questionnaire and the EWP.

npdoty: should the principle speak to browsers/the platform, not just "services"?

Jeffrey: meant to refer to, e.g., browser APIs that may offer/prevent technical measures. MPC, and policies about when you can get around IP blindness, refer to those measures. It says "services must", so we might want to add a reference that browsers should help.

npdoty: is the principle that it may be in the user's own interest to provide some kind of assurance to the service, e.g. about their identity?

Jeffrey: it's not in all users' interests.

dka: back to the vulnerable users: we don't want to expose them via the anti-abuse data.

Jeffrey: that's why I say tension

dka: maybe say less, keep the principle, and continue to iterate.

Pete: can we agree to move the text out of the principles section? It's not itself a principle.

Jeffrey: Happy to move it.

DKA: Section 1 for now.

Drafting Principles

dka: Principle 1: make it simple to read, followed by explanatory text. 2.1.1 is clear. Others need work.

dka: I'll try to write some text for notifications and interruptions

Pete: I'll look at profiling/personal data, which seem overlapping.

robin: don't take anything as gospel. I made empty sections liberally when there was a TODO

Pete: I'll talk to Christine re profiling and look at 2.2

Jeffrey: I think profiling is a particular use of personal data. could be a subsection.

Jeffrey: I have draft text for Guardians and Device Owners

Tess: I'll take Principles for Vulnerable People, and Harassment

Wendy: Influence and Manipulation

Nick: expanding Rights (access, erasure, portability, correction)

Don: Collective Governance is open

Jeffrey: Opt-in, Consent, Opt-Out needs an owner.

Don: I could take that. opt-in/opt-out, consent/withdraw consent (+ object?)

DKA: it needs principle text. What would a principle be around that?

Jeffrey: don't think all the principles need to be in Sec 2

DKA: for writing a spec, it would be helpful to have the TOC as an enumeration, with frequent pointers back to the intro; inter-document links.

Don: I think object fits in the category with opt-in/out, consent/withdraw

robin: agree, and symmetry can be illusory

dka: I'll publish minutes after the call. Work on your assignments!