Heriot-Watt University in Scotland is hosting a “Workshop on Serious Games for Cyber Security,” May 21-22.
This is a brief response to Steve Christey Coley, who wrote on Twitter, “but BH CFP reads mostly pure-tech, yet infosec’s more human-driven?” I can’t respond in 140 characters, so here are a few of my thoughts, loosely organized:
- BlackHat started life as a technical conference, and there are certain expectations about topics, content and quality, which have changed and evolved over time.
- The best talk in the world, delivered to the wrong audience, is not the best talk in the world. For example, there’s lots of interesting stuff happening with CRISPR. We probably wouldn’t even accept a talk on the security implications. Similarly, we probably wouldn’t take a talk on mosquito-zapping lasers, as much fun as it would be.
- I and other members of the PC work to change those expectations by getting good content that is at the edge of those expectations. Thus, there’s a human factors track again this year.
- That track gets a lot of “buy a UPS uniform on ebay” submissions, and the audience doesn’t tend to like those. They’re not cutting edge.
- I would love it if we got more SOUPS-like content, redone a little to meet audience expectations for a Blackhat talk, which are different than expectations for an academic talk.
- So what I look for is something new, in a form that I believe will be close enough to the expectations of the audience that we drive and evolve change in useful directions.
- Finding the right balance is hard.
So, what do you think a good BlackHat talk on human factors might be?
(I should be clear: I am one of many reviewers for BlackHat, and I do not speak for them, or any other reviewer. I cannot discuss specific submissions or the discussions we have around them.)
Update: Since this was written quickly, I forgot to link to “How to Get Accepted at Blackhat.” Read every word of that, and ask yourself if your submission is a good one.
Have a survival kit: Ricola, Purell, Gatorade, Advil and antacids can be brought or bought on site.
Favorite talk (not by me): I look forward to Sounil Yu’s talk on “Understanding the Security Vendor Landscape Using the Cyber Defense Matrix.” I’ve seen an earlier version of this, and like the model he’s building a great deal.
Favorite talk I’m giving: “Securing the ‘Weakest Link’.”
A lot of guides, like this one, are not very comprehensive or strategic. John Masserini’s A CISO’s Guide to RSA Conference 2016 is a very solid overview if you’re new, or not getting good value from a conference.
While you’re there, keep notes for a trip report. Sending a trip report helps you remember what happened, helps your boss understand why they spent the money, and helps justify your next trip. I like trip reports that start with a summary, go directly to action items, then a list of planned meetings and notes on them, followed by detailed and organized notes.
Also while you’re there, remember it’s infosec, and drama is common. Remember the drama triangle and how to avoid it.
As we head into summer conference season, drama is as predictable as vulnerabilities. I’m really not fond of either.
What I am fond of, (other than Star Wars), as someone who spends a lot of time thinking about models, is the model of the “drama triangle.” First discussed by Stephen Karpman, the triangle has three roles, those of victim, persecutor and rescuer:
“The Victim-Rescuer-Persecutor Triangle is a psychological model for explaining specific co-dependent, destructive interaction patterns, which negatively impact our lives. Each position on this triangle has unique, readily identifiable characteristics.” (From “Transcending The Victim-Rescuer-Persecutor Triangle.”)
One of the nifty things about this triangle — and one of the things missing from most popular discussion of it — is how the participants put different labels on the roles they are playing.
For example, a vulnerability researcher may perceive themselves as a rescuer, offering valuable advice to a victim of poor coding practice. Meanwhile, the company sees the researcher as a persecutor, making unreasonable demands of their victim-like self. In their response, the company calls their lawyers and becomes a persecutor, and simultaneously allows the rescuer to shift to the role of victim.
Rescuers (doubtless on Twitter) start popping up to vilify the company’s ham-handed response, pushing the company into perceiving themselves as more of a victim. [Note that I’m not saying that all vulnerability disclosure falls into these traps, or that pressuring vendors is not a useful tool for getting issues fixed. Also, the professionalization of bug finding, and the rise of bug bounty management products can help us avoid the triangle by improving communication, in part by learning to not play these roles.]
I like the “Transcending The Victim-Rescuer-Persecutor Triangle” article because it focuses on how “a person becomes entangled in any one of these positions, they literally keep spinning from one position to another, destroying the opportunity for healthy relationships.”
The first step, if I may, is recognizing and admitting you’re in a drama triangle, and refusing to play the game. There’s a lot more and I encourage you to go read “Transcending The Victim-Rescuer-Persecutor Triangle,” and pay attention to the wisdom therein. If you find the language and approach a little “soft”, then Kellen Von Houser’s “The Drama Triangle: Victims, Rescuers and Persecutors” has eight steps, each discussed in good detail:
- Be aware that the game is occurring
- Be willing to acknowledge the role or roles you are playing
- Be willing to look at the payoffs you get from playing those roles
- Avoid being sucked into other people’s battles
- Take responsibility for your behavior
There’s also useful advice at “Manipulation and Relationship Triangles.” I encourage you to spend a few minutes before the big conferences of the summer to think about what the drama triangle means in our professional lives, and see if we can do a little better this year.
So Bill Brenner has a great article on “How to survive security conferences: 4 tips for the socially anxious.” I’d like to stand by my 2010 guide to “Black Hat Best Practices,” and augment it with something new: a word on etiquette.
Etiquette is not about what fork you use (start from the outside, work in), or an excuse to make you uncomfortable because you forgot to call the Duke “Your Grace.” It’s a system of tools to help otherwise awkward social interactions go more smoothly.
We all meet a lot of people at these conferences, and there’s some truth behind the stereotype that people in technology are bad at “the people skills.” Sometimes, when we see someone, there will be recognition, but the name and full context don’t come rushing back. That’s an awkward moment, and it’s worth thinking about the etiquette involved.
When you know you’ve met someone and can’t recall the details, it’s rude to say “remind me who you are,” and so people will do a bunch of things to politely encourage reminders. For example, they’ll say “what’s new” or “what have you been working on lately?” Answers like “nothing new” or “same old stuff” are not helpful to the person who asked. This is an invitation to talk about your work. Even if you haven’t done anything new that’s ready to talk about, you can say something like “I’m still exploring the implications of the work I did on X” or “I’ve wrapped up my project on Y, and I’m looking for a new thing to go frozzle.” If all your work is secret, you can say “Oh, still at DoD, doing stuff for Uncle Sam.”
Whatever your answer will be, it should include something to help people remember who you are.
Why not give it a try this RSA?
BTW, you can get the best list of RSA parties where you can yell your answers to such questions at “RSA Parties Calendar.”
(Posted for friends)
AdaCamp is a conference dedicated to increasing women’s participation in open technology and culture: open source software, Wikipedia-related projects, open data, open geo, fan fiction, remix culture, and more. The conference will be held June 8 and 9 in San Francisco.
There will be two tracks at the conference: one for people who identify as significantly female, and one (likely on just the Saturday) for allies and supporters of all genders.
Attendance to AdaCamp is by invitation following application. Please spread the word and apply. Travel assistance applications are due April 12; all other applications are due April 30. (Yes, the application process looks kind of daunting, but it’s worth it!)
Details at http://sf.adacamp.org/
I forgot to mention onstage that I’ve actually illustrated all eight of the Saltzer and Schroeder principles, and collected them up as a single page. That is “The Security Principles of Saltzer and Schroeder, Illustrated with Scenes from Star Wars.” Enjoy!
I wrote a blog post regarding the BSidesSF/RSA conf dust-up.
(If I knew how to work Adam’s twitter integration thingy, you’d have been spared this)
This looks like it has the potential to be a very interesting event:
The University of Miami School of Law seeks submissions for “We Robot” – an inaugural conference on legal and policy issues relating to robotics to be held in Coral Gables, Florida on April 21 & 22, 2012. We invite contributions by academics, practitioners, and industry in the form of scholarly papers or presentations of relevant projects.
We seek reports from the front lines of robot design and development, and invite contributions for works-in-progress sessions. In so doing, we hope to encourage conversations between the people designing, building, and deploying robots, and the people who design or influence the legal and social structures in which robots will operate.
Robotics seems increasingly likely to become a transformative technology. This conference will build on existing scholarship exploring the role of robotics to examine how the increasing sophistication of robots and their widespread deployment everywhere from the home, to hospitals, to public spaces, and even to the battlefield disrupts existing legal regimes or requires rethinking of various policy issues.
They’re still looking for papers at: http://www.we-robot.com. I encourage you to submit a paper on who will get successfully sued when the newly armed police drones turn out to be no more secure than Predators, with their viruses and unencrypted connections. (Of course, maybe the malware was just spyware.) Bonus points for entertainingly predicting quotes from the manufacturers about how no one could have seen that coming. Alternately, what will happen when the riot-detection algorithms decide that policemen who’ve covered their barcodes are the rioters, and open fire on them?
The possibilities for emergent chaos are nearly endless.
National Institute of Standards and Technology
Gaithersburg, MD USA
April 5-6, 2011
Call for Participation
The field of usable security has gained significant traction in recent years, evidenced by the annual presentation of usability papers at the top security conferences, and security papers at the top human-computer interaction (HCI) conferences. Evidence is growing that significant security vulnerabilities are often caused by security designers’ failure to account for human factors. Despite growing attention to the issue, these problems are likely to continue until the underlying development processes address usable security.
See http://www.thei3p.org/events/sausage2011.html for more details.
My talk at Black Hat this year was “Elevation of Privilege, the Easy Way to Get Started Threat Modeling.” I covered the game, why it works and where games work. The link will take you to the PPTX deck.
Ariel Waissbein has been building security games for a while now. They were kind enough to send a copy of their “Exploit” game after I released Elevation of Privilege. [Update: I had confused Ariel Futoransky and Ariel Waissbein, because Waissbein wrote the blog post. Sorry!] At Defcon, he and his colleagues will be running a more capture-the-flag sort of game, titled “Hide and seek the backdoor:”
For starters, a backdoor is said to be a piece of code intentionally added to a program to grant remote control of the program — or the host that runs it – to its author, that at the same time remains difficult to detect by anybody else.
But this last aspect of the definition actually limits its usefulness, as it implies that the validity of the backdoor’s existence is contingent upon the victim’s failure to detect it. It does not provide any clue at all into how to create or detect a backdoor successfully.
A few years ago, the CoreTex team did an internal experiment at Core and designed the Backdoor Hiding Game, which mimics the old game Dictionary. In this new game, the game master provides a description of the functionalities of a program, together with the setting where it runs, and the players must then develop programs that fulfill these functionalities and have a backdoor. The game master then mixes all these programs with one that he developed and has no backdoors, and gives these to the players. Then, the players must audit all the programs and pick the benign one.
First, I think this is great, and I look forward to seeing it. I do have some questions. What elements of the game can we evaluate and how? A general question we can ask is “Is the game for fun or to advance the state of the art?” (Both are ok and sometimes it’s unclear until knowledge emerges from the chaos of experimentation.) His blog states “We discovered many new hiding techniques,” which is awesome. Games that are fun and advance the state of the art are very hard to create. It’s a seriously cool achievement.
My next question is, how close is the game to the reality of secure software development? How can we transfer knowledge from one to the other? The rules seem to drive backdoors into most of the code (assuming all the planted backdoors work, the backdoored fraction is (n-1)/n). That’s unlike reality: the game has a far higher incidence of backdoors than exists in the wild. I’m assuming that the code will all be custom, and thus short enough to create and audit in a game, which also leads to a higher concentration of backdoors per line of code. That different concentration will reward different techniques from those that could scale to a million lines of code.
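As a sketch of that arithmetic (the function name and sample numbers here are mine, not from the game’s rules): with n programs in a round and only the game master’s single benign entry, the backdoored fraction works out to (n-1)/n.

```python
def game_backdoor_fraction(n_programs: int) -> float:
    """Fraction of backdoored programs in one round of the hiding game:
    every program except the game master's one benign entry carries a
    backdoor, assuming each planted backdoor actually works."""
    return (n_programs - 1) / n_programs

# With 6 programs on the table, 5 of 6 (about 83%) are backdoored --
# a far higher concentration than deployed code plausibly has.
print(game_backdoor_fraction(6))
```

Even a small round keeps the concentration above 80%, which is why audit tactics tuned to the game may not transfer to real codebases.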
More generally, do we know how to evaluate hiding techniques? Do hackers playing a game create the same sort of backdoors as disgruntled employees or industrial spies? Because of this contest and the Underhanded C Contests, we have two corpora of backdoored code. However, I’m not aware of any corpus of deployed backdoor code to which we could compare them.
So anyway, I look forward to seeing this game at Defcon, and in the future, more serious games for information security.
In “Engineers Are People, Too” Adam Shostack will address an often invisible link in the chain between research on usable security and privacy and delivering that usability: the engineer. All too often, engineers are assumed to have infinite time and skills for usability testing and iteration. They have time to read papers, adapt research ideas to the specifics of their product, and still ship cool new features. This talk will bring together lessons from enabling Microsoft’s thousands of engineers to threat model effectively, share some new approaches to engineering security usability, and propose new directions for research.
A fair number of people have asked for the slides, and they’re here: Engineers Are People Too.