Why should I trust an app? #46
Comments
Both user- and resource-associated Authorization Servers #43 could let users configure some kind of 'trust policies' — for example, apps that have specific certifications, are published by specific entities, etc.
Yes, but that is still just the technical aspect. A random user would not have much to go on to formulate those policies. Dedicated Authorization Servers could serve as a centralization point. I think we need to be much more elaborate in involving the social fabric of the ecosystem, the humans, not the machines.
Hi. I agree that there are two forms of "trust" here: one related to security, which concerns certificates and the like, and the other about humans, which concerns how and why the app is using their data. For the second one, I am of the strong opinion that users should be able to express their own policies (or reuse them from the community) about how they want to let others use their data, and that such policies should be used to assist them in making decisions and to reduce the ability of apps/companies to manipulate and take advantage of people. For example, such (machine-readable) policies could flag concerns or provide additional contextual information when apps request access, so that users can make a better-informed decision.
In order to enable this, both the app's request and the user/community preferences or guides need to be in machine-readable form so that the agent can interpret and use them. Otherwise we will likely continue the current malpractice where users get a notice that only provides a link to a website's T&C that they either don't read or don't fully understand, and end up giving access to do something with their data they had not anticipated or intended.
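To make the idea concrete, here is a minimal sketch of an agent comparing an app's access request against a user's machine-readable preferences. The policy vocabulary (`allowed_data`, `denied_purposes`) is invented purely for illustration — Solid has no agreed format for this today — and the point is only the mechanism: anything the policy does not explicitly pre-approve is surfaced for review rather than silently granted.

```python
# Hypothetical sketch: an agent evaluating an app's access request
# against a user's machine-readable preferences. The policy fields
# here are made up for illustration; they are not part of any Solid
# or OIDC specification.

def evaluate_request(request, policy):
    """Return (decision, reasons) for an app's access request.

    Requested purposes the policy denies produce a hard "deny";
    data the policy has not pre-approved produces a "review" flag
    so the user sees exactly what is new before consenting.
    """
    reasons = []
    for purpose in request["purposes"]:
        if purpose in policy["denied_purposes"]:
            reasons.append(f"purpose '{purpose}' is denied by your policy")
    for item in request["data"]:
        if item not in policy["allowed_data"]:
            reasons.append(f"access to '{item}' is not pre-approved")
    if any("denied" in r for r in reasons):
        decision = "deny"
    elif reasons:
        decision = "review"
    else:
        decision = "allow"
    return decision, reasons

# A user policy, perhaps reused from a community template.
user_policy = {
    "allowed_data": {"profile/name", "contacts"},
    "denied_purposes": {"advertising"},
}

# What an app declares when it asks for access.
app_request = {
    "data": ["profile/name", "location"],
    "purposes": ["advertising"],
}

decision, reasons = evaluate_request(app_request, user_policy)
print(decision)        # the app's advertising purpose is denied
for r in reasons:
    print("-", r)
```

The useful property is the asymmetry: the app must state its request in the same vocabulary the policy uses, so the agent can explain *why* it is flagging something instead of pointing the user at a T&C page.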
IMO we should collaborate with community-driven services like https://tosdr.org/ to address this specific problem.
Currently, at use.id we're using the OIDC Dynamic Client Registration metadata values.
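For readers unfamiliar with that mechanism: OIDC Dynamic Client Registration (building on RFC 7591) lets an app publish self-describing metadata at registration time, which a user's agent can then surface at consent time. The field names below are from the registration specs; the example values are made up, and use.id's actual usage may differ.

```python
# Example client metadata an app might submit via OIDC Dynamic Client
# Registration. Field names (client_name, logo_uri, tos_uri,
# policy_uri, ...) come from RFC 7591 / OpenID Connect Dynamic Client
# Registration; the values are illustrative only.
import json

registration_request = {
    "client_name": "Example Contacts App",
    "client_uri": "https://app.example/",
    "logo_uri": "https://app.example/logo.png",
    "tos_uri": "https://app.example/terms",
    "policy_uri": "https://app.example/privacy",
    "contacts": ["support@app.example"],
    "redirect_uris": ["https://app.example/callback"],
}

print(json.dumps(registration_request, indent=2))
```

Note that this metadata is self-asserted by the app, so by itself it answers "who claims to be asking?" rather than "should I trust them?" — which is exactly why the thread argues it needs to be combined with user policies or community-driven vetting.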
Hi. I have written up an article titled "Making Sense of Solid for Data Governance and GDPR" https://osf.io/m29hn/ that analyses how Solid in its current state relates to GDPR's requirements, what are some of the possible governance models (for Pods and Apps), and some issues that are known to be problematic also apply to Solid. The aim is to emphasise the necessity and importance of answering (through developments) the question this issue has raised. The article also explores some specific ideas for improving things (Section 8). |
We're discussing a lot about the technical management of access control, but there's much more to it in the social space. I'm fortunate enough to have kids who trust me enough to ask me whether they can install a certain app on their mobiles and tablets. Although I'm not revealing that to them, the usual answer is "I have no idea". I have very little to go on in terms of deciding whether an app is trustworthy. It is just a bunch of heuristics, and funnily, the kids develop their own heuristics too.
I think this highlights a much bigger problem than the technical ones: for Solid to be useful beyond tight social groups, we have to enable people to make much more informed decisions about access control on the open Web. If not, Solid is likely to end up accommodating only a small number of large players, or to become a place where social engineering to extract private information is rampant.