From 5181c0fed45f961ff2d35e01785f08b1c9712826 Mon Sep 17 00:00:00 2001
From: Coralie Mercier
Date: Mon, 5 Aug 2024 20:23:22 +0200
Subject: [PATCH] QA lowercase applied to a bunch of words where uppercase wasn't right

---
 index.html | 10 +++++-----
 1 file changed, 5 insertions(+), 5 deletions(-)

diff --git a/index.html b/index.html
index cd72b57..a59d09f 100644
--- a/index.html
+++ b/index.html
@@ -615,16 +615,16 @@

Note: However, like all innovations, these technologies can have downsides. To paraphrase Paul Watzlawick, these innovations must not become “ultra-solutions” where the result is “operation successful, patient dead” [ultra-solutions]. So, the challenge is to enable this technological innovation while being aware of the threats to privacy, security, and human rights.

Therefore, it is necessary to analyze the various threats so that they can be mitigated at their root when designing and implementing these technologies and the related standards.

-As an example, below is an initial analysis of threats to human rights (Harms) concerning government-issued digital identities using Microsoft’s Responsible innovation toolkit:
+As an example, below is an initial analysis of threats to human rights (harms) concerning government-issued digital identities using Microsoft’s responsible innovation toolkit:

-  • Opportunity Loss (Discrimination): This complex issue spans multiple areas. Digital divide: if digital identities are required for access to public services and no alternatives are present, and if they depend on certain hardware, software, or stable connectivity, this can lead to discrimination against people who do not have access to these resources. In addition to discrimination within the same country, there is further discrimination if there is no “cross-border” interoperability between the technologies and implementations used by different governments.
+  • Opportunity loss (discrimination): This complex issue spans multiple areas. Digital divide: if digital identities are required for access to public services and no alternatives are present, and if they depend on certain hardware, software, or stable connectivity, this can lead to discrimination against people who do not have access to these resources. In addition to discrimination within the same country, there is further discrimination if there is no “cross-border” interoperability between the technologies and implementations used by different governments.

-  • Economic loss (Discrimination): The availability of digital identities and related credentials, which can contain a lot of information regarding wealth status, can be used to discriminate in access to credit. This can also be generalized, as was identified during a W3C breakout session, and relates to the Jevons paradox: the more information is available, the more likely it is that its collection, particularly in greedy data-driven contexts, will be abused.
+  • Economic loss (discrimination): The availability of digital identities and related credentials, which can contain a lot of information regarding wealth status, can be used to discriminate in access to credit. This can also be generalized, as was identified during a W3C breakout session, and relates to the Jevons paradox: the more information is available, the more likely it is that its collection, particularly in greedy data-driven contexts, will be abused.

-  • Dignity loss (Dehumanization): For example, if the vocabulary used does not correctly describe people’s characteristics, this can reduce or obscure their humanity.
+  • Dignity loss (dehumanization): For example, if the vocabulary used does not correctly describe people’s characteristics, this can reduce or obscure their humanity.

-  • Privacy Loss (Surveillance): If this technology is not designed and implemented properly, it can lead to surveillance by state and non-state actors, such as governments and private technology providers. For example, centralized or federated models are more prone to these threats, while decentralized models are less so, but much depends on how they are implemented. Therefore, it is necessary to provide privacy-preserving technologies and to implement them properly.
+  • Privacy loss (surveillance): If this technology is not designed and implemented properly, it can lead to surveillance by state and non-state actors, such as governments and private technology providers. For example, centralized or federated models are more prone to these threats, while decentralized models are less so, but much depends on how they are implemented. Therefore, it is necessary to provide privacy-preserving technologies and to implement them properly.

Note: W3C is handling this issue with a Threat Model.

2.3. Digital identity management models