The Chatham House Rule: Your new collaboration tool?

A novel approach to sharing security data and expertise in an atmosphere of trust

In June 1927, someone had a brilliant idea. Or, at least, that’s when the idea was first codified, at a meeting of the Royal Institute of International Affairs at Chatham House in London. Attendees could quote comments made at the meeting, but they weren’t allowed to say who had made them.

This became known as the Chatham House Rule, and its most recent incarnation is defined thus:

“When a meeting, or part thereof, is held under the Chatham House Rule, participants are free to use the information received, but neither the identity nor the affiliation of the speaker(s), nor that of any other participant, may be revealed.”

This is brilliantly clever. It allows at least two things:

  • The sharing of information which might be sensitive to a particular entity when associated with that entity, but which is still useful when applied without that attribution;
  • The sharing of views or opinions which, when associated with a particular person or organization, might cause wider issues or problems.

The upshot: A powerful kind of trust

The upshot of this is that if somebody (say, Person A) values the expertise, opinion and experience of another person (say, Person B), then they can share that other person’s views with people who may not know Person B, or whose views on Person B may be biased by their background or associations. This is a form of transitive trust, and situations where transitive trust is made explicit are, in my opinion, to be lauded (such trust relationships are too often implicit, rather than explicit).
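For those who like to see such things written down, here is a minimal sketch in Python of what making transitive trust explicit might look like. It is purely illustrative – the names and the one-hop rule are my own assumptions, not any real system – and it simply models Person A valuing Person C’s views because Person B, whom A trusts, vouches for them.

    # Purely illustrative: an explicit record of who trusts whom.
    # A trusts B directly; B trusts C directly; A does not know C.
    trust_relationships = {
        ("Person A", "Person B"),
        ("Person B", "Person C"),
    }

    def trusts(truster, subject, relation=trust_relationships):
        """True if truster trusts subject directly or via one intermediary."""
        if (truster, subject) in relation:
            return True
        # One hop of transitivity: A trusts C if A trusts some B who trusts C.
        intermediaries = {a for (a, _) in relation}
        return any(
            (truster, mid) in relation and (mid, subject) in relation
            for mid in intermediaries
        )

    print(trusts("Person A", "Person C"))  # True: trust flows A -> B -> C
    print(trusts("Person C", "Person A"))  # False: trust is not symmetric

The point of the sketch is simply that the trust relationships are recorded and visible, rather than left implicit.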

Benefits for security discussions

Security is one of those areas which can have an interesting relationship with open source. I’m passionately devoted to the principle that openness is vital to security, but there are times when this is difficult. The first difficulty is to do with data, and the second is to do with perceived expertise.

While we all (hopefully) want to ensure that all our security-related code is open source, the same cannot be said for data. There is absolutely a place for open data – citizen-related data is the most obvious, e.g. bus timetables, town planning information – and there’s data that we’d like to be more open, but not if it can be traced to particular entities. Aggregated health information is great, but people aren’t happy about their personal health records being exposed. The same goes for financial data: Aggregated information about people’s spending and saving habits is extremely useful, but I, for one, don’t want my bank records revealed to all and sundry.

Moving specifically to security, what about data such as the number of cyber attacks – successful and unsuccessful – against companies? The types of attack that were most successful? The techniques that were used to mitigate them? All of these are vastly useful to the wider community, and there’s a need to share them more widely. We’re seeing some initiatives to allow this already, and aggregation of this data is really important.

There comes a time, however, when particular examples are needed. And as soon as you have somebody stand up and say “This is what happened to us”, then they’re likely to be in trouble from a number of directions, which may include: their own organization, their lawyers, their board, their customers, and future attackers, who can use that information to their advantage. This is where the Chatham House Rule can help: It allows experts to give their views and be listened to without so much danger from the parties listed above.

It also allows for other people to say “we hadn’t thought of that”, or “we’re not ready for that”, or similar, without putting their organizations – or their reputations – on the line. Open source needs this, and there are times when those involved in open source security, in particular, need to be able to share the information they know in a way that doesn’t put their organizations in danger.

Tackling bias between organizations

Another area of difficulty is expertise, or more specifically, trust in expertise. Most organizations aim for a meritocratic approach – or say they do – at least within the organization itself. But the world is full of bias, particularly between organizations. I may be biased against views held or expressed by a particular organization, just because of its past history and my interactions with it, but it is quite possible that there are views held and expressed by individuals from that organization which, if separated from their attribution, I might take seriously.

I may be biased against a particular person, based on my previous interactions with them, or just on my underlying prejudices. I only need one person who does not hold my biases to represent those views (as long as they personally trust the organization, or even just the person, expressing them) in order to process and value those views myself, gaining valuable insight from them. The Chatham House Rule can allow that to happen.

In fact, the same goes for intra-organization biases: Maybe product management isn’t interested in the views of marketing, but what if there are important things to learn from within that department that product management can’t hear because of that bias? The Chatham House Rule offers an opportunity to get past that.

To return to open source, many contributors are employed by a particular organization, and it can be very difficult for them to express opinions around open source when that organization may not hold the same views, however carefully they try to separate themselves from the official line. Even more importantly, in terms of security, it may very well be that they can bring insights relevant to a particular security issue which their company is not happy to have publicly known, but which could benefit one or more open source projects. To be clear: I’m not talking, here, about exposing information which is specifically confidential, but about sharing information with the permission of the organization, but within specific constraints.

There are all sorts of biases within society, and open source is, alas, not without its own. When the people in a group get to know each other well, however, it is often the case that they can forge a respect for each other which goes beyond gender, age, academic expertise, sexuality, race, or the like. This is a perfect opportunity for meetings under the Chatham House Rule: It gives such a group the chance to discuss and form opinions which can be represented to their peers – or the rest of the world – without having to worry so much about any prejudices or biases that might be aimed at particular members.

A final caution

The Chatham House Rule provides a great opportunity to share expertise and knowledge, but there is also a danger that it can allow undue weight to be given to anecdotes. Stories are a great way of imparting information, but without data to back them up, they are not as trustworthy as they might be. The fact that the Chatham House Rule inhibits external attribution does not mean that due diligence should not be applied within such a meeting to ensure that information is backed up by data.

About the author

Mike Bursell joined Red Hat in August 2016, following previous roles at Intel and Citrix working on security, virtualisation, and networking. After training in software engineering, he specialised in distributed systems and security, and has worked in architecture and technical strategy for the past few years. His responsibilities as Red Hat’s Chief Security Architect include forming security and blockchain strategy, external and internal visibility, and thought leadership. He regularly speaks at industry events in Europe, North America, and APAC.

Mike has an MA from the University of Cambridge and an MBA from the Open University.