Florida workers claim outsourced HR system reveals PII, lacks audit trail

The Tallahassee Democrat reports on an interesting instance of disclosure: whistleblowers revealing allegedly shoddy data security practices at their former employer. The twist is that those doing the talking are not the folks whose jobs were outsourced, but former employees of the outsourcing firm.
From the article:

In an affidavit taken for a lawsuit by five state workers who say they were put at risk of identity theft, a former Convergys employee alleges that some People First workers playfully poked through personnel files of Bush, Attorney General Charlie Crist, Chief Financial Officer Tom Gallagher and DMS Secretary Tom Lewis, whose agency has been laboring with Convergys for two years to work out chronic kinks in People First. Another ex-employee signed an affidavit saying she was told by Convergys bosses not to let state employees know their information was at risk.

If these claims are true, it’d hardly be a shock that a consultant-built system paid for with government money turned out to be lousy. Now that disclosure has garnered widespread acceptance, though, tales of how such systems get built are not confined to departure lounge banter among consultants, or the water-cooler grousing of the rank and file subjected to the resulting “deliverables”.
A separate article describes how a subcontractor of Convergys has also been accused of shenanigans involving PII:

Despite assurances by Convergys that personal information on state employees is safely kept on computers in the United States, a once-secret lawsuit against a former subcontractor alleges that private data was sent to India, Barbados and possibly China.
[… Plaintiff attorney] Newcomer said GDXdata used overseas scanning and indexing services “to save money” without telling Convergys. The suit says Convergys had billed the state at least $32 million when the case was filed, for work the company and the state thought was done domestically.

Now it’s time for me to try out a new toy. Tip o’ the hat to Eric Rescorla, from whose blog I learned of Etymotic.

US Department of Justice, several SSNs, Process Errors

The federal government is responsible for issuing Social Security numbers, but it may not be doing enough to protect these critically personal pieces of information on its own Web sites. Acting on a tip, InformationWeek was able to access Web pages that include the names and Social Security numbers of people involved in Justice Department-related legal actions. It’s a discomforting discovery at a time when identity theft and fraud are on the rise.

One document on the Justice Department Executive Office for Immigration Review’s site listed the name and Social Security number of a woman involved in a 2003 immigration-review case. Another document from 2002 listed the name and Social Security number of a man who was being prosecuted for committing insurance fraud. Other searches of the Justice Department’s site yielded more Social Security numbers and identifying information.

From “Justice Department Reveals Social Security Numbers,” Information Week.

Dodo bones


Scientists have discovered the “beautifully preserved” bones of about 20 dodos at a dig site in Mauritius. Little is known about the dodo, a famous flightless bird thought to have become extinct in the 17th century.

No complete skeleton has ever been found in Mauritius, and the last full set of bones was destroyed in a fire at a museum in Oxford, England, in 1755.

From the BBC, “Scientists find ‘mass dodo grave’.” See also the press release, or the site at the Dutch ‘Naturalis’ National Museum of Natural History. (In Dutch. The Museum’s English home page doesn’t have the cool pictures the Dutch site has.)

Nuclear Surveillance

In search of a terrorist nuclear bomb, the federal government since 9/11 has run a far-reaching, top secret program to monitor radiation levels at over a hundred Muslim sites in the Washington, D.C., area, including mosques, homes, businesses, and warehouses, plus similar sites in at least five other cities, U.S. News has learned. In numerous cases, the monitoring required investigators to go on to the property under surveillance, although no search warrants or court orders were ever obtained, according to those with knowledge of the program. Some participants were threatened with loss of their jobs when they questioned the legality of the operation, according to these accounts.

From US News and World Report, “Nuclear Monitoring of Muslims Done Without Search Warrants,” via Orin Kerr at Volokh Conspiracy. In his major post on “The Security Threat of Unchecked Presidential Power,” Bruce Schneier comments:

This is novel reasoning. It’s as if the police would have greater powers when investigating a murder than a burglary.

[Update: The article also states,

Officials also reject any notion that the program specifically has targeted Muslims. “We categorically do not target places of worship or entities solely based on ethnicity or religious affiliation,” says one. “Our investigations are intelligence driven and based on a criminal predicate.”

If that’s the case, why can’t they get warrants?]

Friday Star Wars and Psychological Acceptability

This week’s Friday Star Wars Security Blogging closes the design principles series. (More on that in the first post of the series, “Economy of Mechanism.”) We close with the principle of psychological acceptability. We do so through the story that ties the six movies together: The fall and redemption of Anakin Skywalker.

There are four key moments in this story. There are other important moments, but none of them are essential to the core story of failure and redemption. Those four key moments are the death of Anakin’s mother Shmi; the decision to go to the dark side to save Padme; Vader’s revelation that he is Luke’s father, and his attempts to turn Luke; and Anakin’s killing of Darth Sidious.

The first two involve Anakin’s failure to save the ones he loves. He becomes bitter and angry. That anger leads him to the dark side. He spends twenty years as the agent of Darth Sidious while his children grow up. Since he began that career by murdering Jedi children, we can only assume that those twenty years involved all manner of evil. Even then, there are limits past which he will not go.

The final straw that allows Anakin to break the Emperor’s grip is the command to kill his son. It is simply unacceptable. It goes so far beyond the pale that the small amount of good left in Anakin comes out. He slays his former master, and pays the ultimate price.


Most issues in security do not involve choices that are quite so weighty, but all have to be weighed against the psychological acceptability test. What is acceptable varies greatly across people. Some refuse to pee in a jar. Others decline to undergo background checks. Still others cry out for more intrusive measures at airports. Some own guns to defend themselves, others feel having a gun puts them at greater risk. Some call for the use of wiretaps without oversight, reassured that someone is doing something, while others oppose it, recalling past abuses.

Issues of psychological acceptability are hard to grapple with, especially when you’ve spent a day immersed in code or network traces. They’re “soft and fuzzy.” They involve people who haven’t made up their minds, or have nuanced opinions. It can be easier to declare that everyone must have eight character passwords with mixed case, numbers, and special characters. That your policy has been approved by executives. That anyone found in non-compliance will be fired. That you have monitoring tools that will tell you who is. (Sound familiar?) The practical difficulties get swept under the rug. The failures of the systems are declared to be a price we all must pay. In the passive voice, usually. Because even those making the decisions know that they are, on a very real level, unacceptable, and that credit, and its evil twin accountability, are to be avoided.
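Part of the appeal of rules like the eight-character mandate is that they are trivially easy to state in code. A minimal sketch (the function name and test strings are mine, purely for illustration):

```python
import re

def meets_policy(password: str) -> bool:
    """Check a password against the kind of rule described above:
    at least eight characters, with mixed case, digits, and
    special characters."""
    return (
        len(password) >= 8
        and re.search(r"[a-z]", password) is not None
        and re.search(r"[A-Z]", password) is not None
        and re.search(r"[0-9]", password) is not None
        and re.search(r"[^A-Za-z0-9]", password) is not None
    )

print(meets_policy("Tr0ub4dor&3"))  # True
print(meets_policy("password"))     # False
```

Whether such a rule is psychologically acceptable to the people who have to type the passwords every day is, of course, exactly the question the code cannot answer.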

Most of the time, people will meekly accept the bizarre and twisted rules. They will also resent them, and find small ways of getting back, rather than throwing their former boss into a reactor core. The NSA wiretapping story, so much in the news today, is news because NSA officials have been strongly indoctrinated that spying on Americans is wrong. There’s thirty years of culture, created by the Foreign Intelligence Surveillance Act, holding that you don’t spy on Americans without a court order. They were ordered to discard that. It was psychologically unacceptable.

A powerful principle, indeed.

(If you enjoyed this post, you can read the others in the “Star Wars” category archive.)

More on Snow’s Assurance Paper

This is a followup to Gunnar Peterson’s comments on “Epstein, Snow and Flake: Three Views of Software Security.” His comments are in an update to the original post, “The Road to Assurance:”

None of these views, by themselves are adequate. The combination of horizontal and vertical views is what yields the most accurate picture. Obviously, iteration is the only way to work towards that. Adam’s brilliant suggestion? OODA Loops.

I think there’s some misunderstanding here. First, I don’t understand what Gunnar means by ‘horizontal’ and ‘vertical’ views. Second, I’m not actually suggesting OODA loops as a means of advancing. Being intelligent about our choice of things to observe, and how to interpret our observations, is essential, and much harder than it seems.

A project I’m working on has an aspect I call “the jell-o slicing problem.” That is, there are lots of valid ways to slice jell-o. None of them are obviously more valid than all the others, but many of them are obviously more valid than some others. Some of the original project descriptions were broad and aspired to really great things. Things that we’ve been meaning to get to for quite some time.
Choosing what to observe and how to measure those observations is causing us much grief.

I think there is probably a simple set of things that we can look at to increase assurance. I think most people probably think so, and when we start digging in, in forums like “build security in” and the NIST/DHS SAMATE project, we realize just how divergent, chaotic, and different our views are.

As I finished this, I see that Gunnar has another article, “Assurance Techniques Review.” I’ll respond in a bit.

It’s Chaos Out There!


  • In “Play Break,” Hilzoy writes:

    Here’s what it’s about: as most parents know, little boys tend to be more interested in toys like trucks, and little girls in toys like dolls. (I was an exception: someone gave me a doll once, and I dissected it.) There is no obvious way to decide whether this is innate or a cultural artifact by watching human children. So why not see whether the same gendered toy preferences exist in, oh, vervet monkeys?

    Guess what? They do.

  • The Matasano folks have five good posts up. Five! Today! Each worth reading! (“Why I Love Vulnerability Analysis In 2005,” “Mark his words,” “Mark my words,” “Pro-Forma ’05-’06 Punditry Results” and “OpenSSH Developer Interview.”)

    Have they no respect for our time?

  • Dan Solove is rounding up the wiretapping news.

(Tony the vervet monkey photo by One more shot Rog. No word on whether a carton is a masculine or a feminine toy.)

Do Wiretap Revelations Help the Terrorists?

[Image: the “Loose Lips Might Sink Ships” poster.]

The question is a fair and natural one to ask, and I’d like to examine it in depth. I think my intuitive answer (“revelations about wiretaps don’t help the terrorists”) is wrong, and that there are surprising effects of revealing investigative measures. Further, those are effects I haven’t seen discussed. Allow me to explain the logic.

First, terrorist organizations need to communicate on a wide variety of levels, from ‘moral support’ to target selection and dates. Second, we can wiretap all their communications, under a variety of legal standards.

So, should we talk about wiretapping of terrorists? The President has asserted that it ‘helps the terrorists’ in some way. Let’s ask how that might be. Does talking about wiretapping help the terrorists? Revelations of wiretapping cause both awareness and fear. Either or both could lead to temporarily improved communications security processes. What could those be? New crypto? New attention to detail? Better shredding? There are others, which I’ll talk about in a minute. For now, let’s work with the assumption that revelations lead to better adherence to security processes, and the second assumption that better security processes are bad for the listeners. Let’s take those two effects one at a time.

The first is enhancing terrorist awareness of their threat environment. This is important. As time passes, people become complacent. As they become complacent, their investment in security processes drops off. (There are lots of interesting analogies to this in the business world.) Complacency thus helps the attacker, and hurts the terrorist. So revealing our wiretapping, reducing complacency, hurts the eavesdroppers. Unfortunately for the eavesdroppers, the terrorist exists in a highly adrenaline-filled environment, with regular revelations that his colleagues have been arrested, tortured, or assassinated. Each and every one of these events causes the terrorist to assess his security posture. So, our first assumption (revelations lead to better adherence to security processes), while true, is but one of many causes for that adherence.

Improved communications security is not the only effect of the revelations. What happens if a terrorist is already under surveillance? They may go to ground, or they may reveal alternate communication methods (phone numbers, email addresses, web sites) not yet known. Their security processes presumably include backup methods, and driving those methods into the view of the security services is an important goal.

At this point, we have something of a balance between two hard-to-quantify ideas: better operational security versus the value of exposing alternate channels. There is, however, one final effect of driving terrorists to ground, and it tips the balance.

The final piece is that al Qaeda terrorists who have gone to ground do not engage in attacks. That gives the investigative services more time to find and arrest them. To me, that tips the balance. Whatever benefits accrue to the terrorists through less complacency are balanced by exposing additional channels. Delaying murder, and giving us another chance to prevent it, tips the balance, even before the benefits of the rule of law are brought in. So! Bring on the revelations!
[Update: Yes, that’s the original poster, with the word “might,” as it appears at archives.gov.]

Epstein, Snow and Flake: Three Views of Software Security

Among those who understand that software is, almost without exception, full of security holes, there are at least three major orientations. I’ve recently seen three articles, all of which I wanted to talk about, but before I do I should explain how I’m using the word orientation, and the connotations it carries.

As used by John Boyd, orientation is the interaction of cultural traditions, genetic heritage, new information, previous experience, and analysis and synthesis, all of which filter new information as decisions are being made. Understanding the orientation of a person or organization is a powerful way to predict how they will act in response to new circumstances. Orientation is shaped by cultural tradition and experiences. Orientation is often presented as part of the Observe, Orient, Decide, Act (OODA) loop. The OODA loop is often seen as a tactical one, but Boyd discussed it on all levels, from a knife fight to grand strategy. I am using orientation in that broad sense here, and will assign labels, grossly oversimplified, to three of them.
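For readers who think in code, the loop can be caricatured as follows. This sketch is not Boyd’s formulation; the function names and the reduction of orientation to a simple relevance filter are my own, meant only to show how orientation sits between observation and decision:

```python
def observe(environment):
    """Gather raw information from the environment."""
    return environment.get("events", [])

def orient(observations, orientation):
    """Filter observations through the observer's orientation --
    cultural tradition, experience, analysis -- here reduced
    to a crude relevance filter."""
    return [o for o in observations if o in orientation["relevant"]]

def decide(filtered):
    """Pick an action based on the filtered picture."""
    return "act" if filtered else "wait"

def act(decision):
    return decision

def ooda_step(environment, orientation):
    """One pass through Observe, Orient, Decide, Act."""
    return act(decide(orient(observe(environment), orientation)))

env = {"events": ["port scan", "coffee break"]}
print(ooda_step(env, {"relevant": {"port scan"}}))  # act
```

The point the caricature makes is Boyd’s: two observers of the same environment can decide differently because their orientations filter the observations differently.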

I realized after I wrote this that all three of the people I’m quoting here are vastly smarter than I perhaps imply. My goal is not to attack any of them, but to contrast some of the background which informs their approaches. To draw out this contrast, I quote a little unfairly.

After the break, a bit of inside baseball on security orientations.

Continue reading