
Go through list of suggestions from the WebAdv "success criteria" that some had proposed #61

Closed
darobin opened this issue Oct 6, 2021 · 6 comments

@darobin
Member

darobin commented Oct 6, 2021

Sift through it for things that are relevant to privacy.

https://github.com/w3c/web-advertising/blob/main/success-criteria.md#interests-of-society (and following sections).

@darobin darobin self-assigned this Oct 6, 2021
@darobin darobin added the agenda+ Add to the next call's agenda. label Oct 6, 2021
@darobin
Member Author

darobin commented Oct 6, 2021

I have gone through the list, and here is my proposal.

A number of things might be fair values to hold, but are out of scope for privacy principles (e.g. interoperable standards of communication). Proposal: simply skip those.

Others are in scope but already included or implied (e.g. a right to choose private browsing). Proposal: nothing needs to be done.

A number of the things listed fall under collective issues that could arguably contribute to our section on collective privacy problems (which I find insufficiently fleshed out). For instance, the following topics from their list have been caused in part by unfettered data sharing across contexts or by excessive collection by large platforms, and improving privacy on the Web is instrumental in helping solve them:

  • Free elections protected against foreign manipulation, since targeting people with manipulative content has become too easy and unaccountable.
  • Low barriers of entry for individuals to start new businesses and compete against existing incumbents, since data sharing structurally leads to winner-take-all situations.
  • The declining quality and quantity of publisher content, which is caused in part by the fact that publishers can't win if they have to share their audience with other parties.

Proposal: let's discuss whether such things are relevant to include. I am ambivalent because they are more about consequences of bad privacy than about privacy directly — but I could be convinced either way.

A number of things could be considered omissions and may usefully be included (listing candidates but not necessarily all):

  • Freedom from having to self-censor for fear of content consumption being associated with (…) identity
  • Data subject rights (erasure, access, correction, objection). These are in part implied in the tiered model, but that should be made explicit. We should, however, tread carefully around the right to be forgotten in search results. (Though I'm serious that the TAG or some other group should develop a view.)
  • Algorithmic explainability (though that's a hell of a can of worms).

Proposal: let's discuss which ones belong.

A number of things are known to be problematic or false and don't need to be revisited (e.g. that transparency works, or that choice improves autonomy). Proposal: skip.

There is also a set of considerations relating to publishers, advertisers, browsers, and other such constituencies. I do believe that we need to extend the Priority of Constituencies to include more participants and to be clearer about some roles, but 1) that is not for this document to do, and 2) that WebAdv proposal is designed to put people at a significant disadvantage relative to the businesses they engage with, and is incompatible with Web principles (as well as with any consideration that humans are the key driver of ethics).

@dmarti
Collaborator

dmarti commented Oct 6, 2021

The point on free elections might usefully be generalized to cover deceptive communications more broadly. The same data practices that enable targeted political content and advertising are also used for run-of-the-mill scams. (For example, offering a counterfeit product only to people who are unlikely to work for or with the makers of the original.)

https://blog.zgp.org/when-can-deceptive-sellers-outbid-honest-sellers-for-ad-impressions/

@darobin
Member Author

darobin commented Oct 13, 2021

From @sandandsnow:

  • We could have a bit on data portability, in line with pushes towards interoperability (and data rights).
  • Self-censorship: there are studies, e.g. about post-Snowden chilling effects.
  • We should think about interfacing to opaque things (rather than algorithmic explainability): EME, EPub, ad bidding… Making things understandable as a privacy design feature. A sentiment we should capture: "Privacy researchers need to be able to see what is going on."

@darobin
Member Author

darobin commented Oct 13, 2021

@wseltzer adds "User privacy is enhanced by research being possible into how data is processed. Users rely on external analysis."

This could be collective data rights.

@sandandsnow
Collaborator

Link to Assuring a Strong and Secure Web Platform - https://www.w3.org/blog/2015/11/strong-web-platform-statement/

@jyasskin jyasskin removed the agenda+ Add to the next call's agenda. label Oct 27, 2021
@darobin darobin added the agenda+ Add to the next call's agenda. label Jan 5, 2022
@darobin
Member Author

darobin commented Jan 19, 2022

I've done the following:

@darobin darobin closed this as completed Jun 27, 2022