There are many definitions of trust, and all people have their own internal perspective on what THEY trust.
As I outline in this next section, there is a lot of meaning packed into the word “trust,” and it varies with context and scale. Given that the word trust appears 97 times in the NSTIC document and that the NSTIC governing body is going to be in charge of administering “trust marks” to “trust frameworks,” it is important to review its meaning.
I can get behind this statement: There is an emergent property called trust, and if NSTIC is successful, trust on the web would go up, worldwide.
However, the way the word “trust” is used within the NSTIC document often covers far too broad a swath of meaning.
When spoken of in everyday conversation, trust most often means social trust.
Trust in a social context: The typical definition of trust follows the general intuition about trust and contains such elements as:
- the willingness of one party (trustor) to rely on the actions of another party (trustee);
- reasonable expectation (confidence) of the trustor that the trustee will behave in a way beneficial to the trustor;
- risk of harm to the trustor if the trustee does not behave accordingly; and
- the absence of trustor’s enforcement or control over actions performed by the trustee.
When discussing digital systems, there is another meaning of trust related to cryptography, security, and policy enforcement.
Computational Trust – In information security, computational trust is the generation of trusted authorities or user trust through cryptography.
Trusted Systems – In the security engineering subspecialty of computer science, a trusted system is a system that is relied upon to a specified extent to enforce a specified security policy. As such, a trusted system is one whose failure may break a specified security policy.
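To make the contrast with social trust concrete, here is a minimal sketch of computational trust: a message is “trusted” only because a cryptographic check passes, not because of any relationship between the parties. The shared key, message text, and function names are illustrative assumptions, not anything specified by NSTIC.

```python
# Minimal sketch: computational trust as the outcome of a cryptographic check.
# The pre-shared key and messages are invented for illustration.
import hashlib
import hmac

SHARED_KEY = b"key-provisioned-out-of-band"  # hypothetical pre-shared secret

def sign(message: bytes) -> str:
    """Sender attaches an HMAC tag computed with the shared key."""
    return hmac.new(SHARED_KEY, message, hashlib.sha256).hexdigest()

def verify(message: bytes, tag: str) -> bool:
    """Receiver recomputes the tag; 'trust' here is just this check succeeding."""
    expected = hmac.new(SHARED_KEY, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

message = b"policy: allow read access"
tag = sign(message)
print(verify(message, tag))      # True  -> the system "trusts" the message
print(verify(b"tampered", tag))  # False -> trust is withheld
```

Nothing in this check says anything about intent, goodwill, or relationship; it only establishes that the message came from someone holding the key.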
One individual's choice to trust another depends on who the other person is, as well as the context, the relationship, and other factors. This can change over time and can perhaps be tracked.
Trust Metrics – In psychology and sociology, a trust metric is a measurement of the degree to which one social actor (an individual or a group) trusts another social actor.
Trust Operates on Different Scales
In The Speed of TRUST: The One Thing That Changes Everything, Stephen M.R. Covey articulates five different scales of trust. I think this model is helpful because it highlights how much meaning is packed into trust and how it operates differently at different scales.
Covey starts with people trusting themselves: SELF TRUST
Are we credible to ourselves?
- Do we have integrity? Are we congruent inside and out, walking our talk and living in accordance with our own values and beliefs?
- What is our intent when interacting with others? Are our motives straightforward and based on mutual benefit?
- What are our capabilities? Do we have the ability to establish, grow, extend and restore trust? What abilities do we have that inspire confidence: talents, attitudes, skills, knowledge, style?
- What are our results? Do we get the right things done? Are they done well, and what is our consistency of results, our track record?
People in the Quantified Self movement are actually using digital devices and sensors to track themselves. They are using data analysis tools to see how fast they ran or what their caloric intake was. One of the reasons people track themselves is to work on improving themselves, set goals, and measure achievement over time. As they achieve results toward a goal, they increase their credibility – their self trust.
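As a toy sketch of that idea, the snippet below logs results over time and measures progress toward a stated goal; the activity, weeks, and numbers are invented for illustration, and real Quantified Self tools would pull this data from devices and sensors.

```python
# Toy sketch of Quantified Self-style tracking: log results, measure progress
# toward a goal. All names and numbers are hypothetical.
weekly_run_km = {"week 1": 8, "week 2": 11, "week 3": 14, "week 4": 18}
goal_km = 20

for week, km in weekly_run_km.items():
    progress = km / goal_km * 100
    print(f"{week}: {km} km ({progress:.0f}% of goal)")

# Consistent movement toward the goal is, in Covey's terms, building
# credibility with yourself -- self trust.
```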
Covey moves on to people trusting each other: RELATIONSHIP TRUST
One cultivates this kind of trust with others by consistently behaving in ways that build trust. People are biologically wired to track the behavior of others and form opinions about trustworthiness in real time, all the while balancing a wide array of variables. One way to simplify this is to imagine that you have a “trust account” with every person you interact with. You make deposits into someone's account through consistent behavior; withdrawals are made from the “account” when someone does not consistently follow through on agreements.
Behaviors he believes generate trust:
- Create Transparency
- Demonstrate Respect
- Practice Accountability
- Deliver Results
- Get Better
- Extend Trust
- Talk Straight
- Listen First
- Show Loyalty
- Confront Reality
- Clarify Expectations
- Keep Commitments
People are really different: different kinds of behaviors matter more or less to each individual, and therefore the same behavior affects the current balance of any given person's trust account differently.
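Here is a rough sketch of that “trust account” metaphor in code: consistent behaviors make deposits, broken agreements make withdrawals, and each person weights behaviors differently. The weights, amounts, and the idea that breaches cost double are my own illustrative assumptions, not something drawn from Covey.

```python
# Rough sketch of the "trust account" metaphor. Weights and amounts are
# illustrative only.
class TrustAccount:
    def __init__(self, weights):
        self.weights = weights   # how much each behavior matters to *this* person
        self.balance = 0.0

    def observe(self, behavior, kept_agreement=True):
        weight = self.weights.get(behavior, 1.0)
        # Assumption for the sketch: a breach withdraws twice what a deposit adds.
        self.balance += weight if kept_agreement else -2 * weight

# Two people can weight the same behaviors very differently.
alice = TrustAccount({"Keep Commitments": 3.0, "Talk Straight": 1.0})
bob = TrustAccount({"Keep Commitments": 1.0, "Talk Straight": 3.0})

for account in (alice, bob):
    account.observe("Keep Commitments", kept_agreement=False)
    account.observe("Talk Straight", kept_agreement=True)

print(alice.balance, bob.balance)  # the same behavior shifts each balance differently
```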
The Identity Ecosystem is an online environment where individuals and organizations will be able to trust each other because they follow agreed upon standards to obtain and authenticate their digital identities and the digital identities of devices. The Identity Ecosystem Framework is the overarching set of interoperability standards, risk models, privacy and liability policies, requirements, and accountability mechanisms that govern the Identity Ecosystem.
This quote from NSTIC makes a big assertion: that trust is going to flow between people because they have followed agreed-upon standards to obtain and authenticate their digital identities.
The implicit use case might be this: an individual, let's say her name is Jenna, goes to an attribute verifier service provider such as her retail branch bank, bringing attributes like her driver's license, her latest utility bill, and her record showing she has had a bank account with them for 5 years. The bank checks Jenna's physical-world credentials and then issues a digital token she can use to do two-factor authentication online. The digital token, when she goes online, presents Jenna's name as written on her driver's license.
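To make the flow concrete, here is a hypothetical sketch of that use case: the attribute verifier (the bank) checks Jenna's physical-world credentials and, if they pass, issues a digital token bound to her legal name for later two-factor authentication. Every field, check, and function here is an invented illustration, not an NSTIC specification.

```python
# Hypothetical sketch of the implied identity-proofing and token issuance flow.
# All names, checks, and fields are illustrative.
import secrets
from typing import Optional

def verify_physical_attributes(applicant: dict) -> bool:
    """The bank teller's checks, reduced to booleans for this sketch."""
    return (
        applicant["drivers_license_valid"]
        and applicant["utility_bill_matches_address"]
        and applicant["years_as_customer"] >= 5
    )

def issue_token(applicant: dict) -> Optional[dict]:
    """Issue a second-factor credential bound to the verified legal name."""
    if not verify_physical_attributes(applicant):
        return None
    return {
        "legal_name": applicant["legal_name"],        # as written on the license
        "second_factor_seed": secrets.token_hex(16),  # stand-in for a real 2FA token
    }

jenna = {
    "legal_name": "Jenna Example",
    "drivers_license_valid": True,
    "utility_bill_matches_address": True,
    "years_as_customer": 5,
}
print(issue_token(jenna))
```

Notice what the sketch verifies: documents and account history, nothing more. That is the gap the rest of this section explores.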
I see three behaviors in this use case:
Confronting Reality – there is a reality for most people in western liberal democracies that the government of the country or province where you were born issued you a paper saying so, and this ironically named breeder document begets you more forms of identification. If a user has not been using their real name, they will now be forced to do so. The reality is, birthplace can have a huge effect on a person's legal and identity reality.
Creating Transparency – Jenna has linked her “real legal name” to an account, which, when she uses it, will be transparent about who she is and let everyone know. This means people who look her up online can find her street address in real life. Well, it turns out this creates a vulnerability, because others can find where her house is, stalk her, or make threats against her.
Practicing Accountability – The ability to be accountable. If Jenna chose to commit a criminal act online, others would be able to trace her by the real name she was using. But so too, if she were mildly socially rude, people would know to make a withdrawal from her “trust account”.
There are nine other behaviors that really matter in human-to-human trust relationships but that are not covered in any way by the standards for obtaining and authenticating digital identities – the so-called trust frameworks.
There are other aspects of this scenario that are not comparable when you map them to how people trust one another in everyday life. I don't trust people because I know their legal name, or because I checked it on their driver's license. In physical space, I see someone I know and I know it is them because they are in the same body form they were in the last time I saw them. This verisimilitude to the mental picture I have of them allows me to authenticate them visually. When I see them, I can pull up my mental trust account and see how much I have deposited in their account.
In the digital realm, I anchor my mental trust account to identifiers I hold for people in my mind. I need to have confidence that the system they use to authenticate (using a user name and password) is secure, that it isn’t someone else logging in and “being them” because they control the identifier.
When people interact with businesses, they use similar mental models for judging trustworthiness based on observed actions and experiences. The phrase “trust framework,” by its very name, implies that those who have complied with its requirements are trustworthy because they had a standard way to obtain a digital identity and authenticate. There is a great diversity of particular behaviors that people use to make trust judgements. If people are to choose one trust framework over another because one or another ratings agency assesses it to be more “trustworthy,” we have a very messy, convoluted conversation.
In groups of people working together: ORGANIZATIONAL TRUST
This mode of trust is about alignment of the structures, systems and symbols of organizational trust. If trust is low in an organization, then to compensate, certain behaviors or systemic patterns emerge that are costly: redundancy, bureaucracy, politics, disengagement, turnover, churn and fraud.
For organizations there is: MARKET TRUST
This is about the perception of a business entity in the marketplace, where there are all kinds of services that help consumers navigate what products to buy. Market trust is developed through repeated activity observed over time.
Beyond the business or nonprofit is: SOCIETAL TRUST
This is about giving back and contributing to society and the commons. It is particularly important to give back to society the trust assets one owns but that everyone benefits from. It is vital that societal trust be maintained because the other scales of trust rely on it as a support structure. This is where there is a backup when other forms of trust fail: you can trust the court system to give you fair treatment when seeking redress.
“If NSTIC is successful, trust on the web would go up, worldwide.” The trust in this sentence is at the societal scale, and I believe the statement is true. However, the way to succeed in achieving this level of trust is not to name policy-tech frameworks throughout the system “trust frameworks”. I am very keen on NSTIC succeeding; however, I am concerned that naming this critical part of the proposed ecosystem “trust frameworks” will actually generate mistrust of the system. If the term “trust framework” is the way policy-technology frameworks within the ecosystem are named and explained to the public, but people find those frameworks untrustworthy, they will suspect anything self-labeled with “trust”. People will ask themselves: why should we trust a Trust Framework? Who made up the trust frameworks? Individuals will think to themselves: I am the one who decides what to trust… don’t tell me to trust something just because you call it a “Trust Framework.” Given the recent large-scale institutional breakdown of trust in the banking system, consumers are skeptical of large publicly traded companies saying “trust us, we have a ‘trust framework’ to protect you.”
I highlighted the challenge of using the word “trust” for policy-technology frameworks at the NSTIC governance workshop at the beginning of June, where Jeremy Grant asked me if I had a better name. I do have a better name for trust frameworks:
Accountability Frameworks.
Here is some of my reasoning:
- It is 2 words.
- It captures the heart of the intended purpose: Accountability
- Accountability is achieved in these frameworks via both technology standards and policies that are adopted and auditable (see the sketch after this list).
- Trust remains an emergent property of these accountability frameworks.
- There can be real conversations among various stakeholders, who may have different needs and interests, about the nature of the accountability in different frameworks. They can look to see whether particular accountability frameworks are trustworthy from a particular point of view.
- It avoids the problem of talking about the “trustability of trust frameworks”.
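As a minimal, hypothetical sketch of what “adopted and auditable” could mean in practice, the snippet below has a participant record its data-handling actions against the commitments it adopted, and lets an auditor check whether the log honors those commitments. The commitment names, log entries, and actor names are all invented for illustration, not part of any existing framework.

```python
# Hypothetical sketch of an auditable accountability framework: actions are
# logged against adopted commitments and violations are surfaced.
ADOPTED_COMMITMENTS = {"no_sharing_without_consent", "delete_on_request"}

audit_log = [
    {"actor": "relying-party.example", "action": "shared_attribute", "user_consent": True},
    {"actor": "relying-party.example", "action": "shared_attribute", "user_consent": False},
]

def audit(log, commitments):
    """Flag log entries that violate the adopted commitments."""
    violations = []
    for entry in log:
        if "no_sharing_without_consent" in commitments:
            if entry["action"] == "shared_attribute" and not entry["user_consent"]:
                violations.append(entry)
    return violations

print(audit(audit_log, ADOPTED_COMMITMENTS))  # accountability: violations are visible
```

The point of the sketch is that accountability is checkable; whether people then trust the system remains an emergent property, as argued above.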
Trust is absolutely essential in the Identity Ecosystem. People must trust that the information they share will be handled with care and respect, and that human dignity will be maintained by the individual actors within the Identity Ecosystem. This is achieved by having real accountability in the system around respecting users' rights over their data. When the system is functioning well and accountability frameworks are followed, the overall behavior of the Identity Ecosystem will be trustworthy.
This post is from pages 20-24 of Kaliya’s NSTIC Response – please see this page for the overview and links to the rest of the posts. Here is a link to the PDF.
As you nicely outlined, ‘trust’ has many different connotations and contexts, so it is a difficult subject to capture in a winning two- or three-word phrase, as in your NSTIC example. So while you made some good points, all good standards provide a ‘framework’, and adding ‘trust’ on the front end is probably the safest bet and will work best for players in this space, so I suggest sticking with ‘trust framework’.
A couple of examples:
1) What is a Trust Framework? | Open Identity Exchange http://bit.ly/n4ePBX
2) The semantic web stack (or framework) has “trust” and “proof” layers at the top.
Trust does not = proof.
Trust is a feeling that people have. It is a “choice” based on many factors. People who live within the default systems of society generally don’t question whether they are trustable… but others do all the time. How are the systems we are building accountable to the people who are supposed to “trust them”? I don’t just believe the trust
If you have “accountability frameworks” then you could have accountability for pseudonymous identities without having to have them be “proven” “real”. This is where Bob’s concept of Limited Liability Personas comes in.