Monthly Archives: January 2009

One more for the deprovisioning files

I have heard several analysts recently say that the current business climate makes IdM (or at least provisioning and access control) more important than ever. It would be easy to dismiss this as wishful thinking until you read something like this:

A former Fannie Mae IT contractor has been indicted for planting a virus that would have nuked the mortgage agency’s computers, caused millions of dollars in damages and even shut down operations. How’d this happen? The contractor was terminated, but his server privileges were not.

That's not to say there aren't disgruntled employees during a normal business cycle. But the more people being laid off, the greater the risk.

How many organizations really have a handle on properly turning off access for all the people they are laying off (especially those in IT)? Of course, if an IdM system is not already in place during an up business climate, it's going to be much harder to put one in place in a down one.
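The core check behind deprovisioning is simple to sketch. This is a minimal illustration, assuming you can get an HR roster and a list of accounts from each managed system; all the names and record shapes here (find_orphaned_accounts, employee_id, status) are hypothetical, not from any real product:

```python
# Hypothetical sketch: flag accounts whose owners no longer appear as
# active in the HR roster. Real provisioning systems do this (plus
# workflow and audit) across many connected systems.

def find_orphaned_accounts(hr_roster, system_accounts):
    """Return accounts that have no matching active employee."""
    active_ids = {person["employee_id"] for person in hr_roster
                  if person["status"] == "active"}
    return [acct for acct in system_accounts
            if acct.get("employee_id") not in active_ids]

hr = [{"employee_id": "e1", "status": "active"},
      {"employee_id": "e2", "status": "terminated"}]
accounts = [{"login": "jbohren", "employee_id": "e1"},
            {"login": "contractor7", "employee_id": "e2"}]

print(find_orphaned_accounts(hr, accounts))
# [{'login': 'contractor7', 'employee_id': 'e2'}]
```

The hard part, of course, is not the set difference; it's having a reliable feed from HR and connectors into every system that holds an account.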

Asymmetric Risk, Malpractice Insurance, and Personal Oxen

Bruce Schneier has two very interesting posts on his blog that stand out (to me at least) by their proximity to each other. Most recently Bruce has this to say about the recent financial meltdown:

The most interesting part explains how the incentives for traders encouraged them to take asymmetric risks: trade-offs that would work out well 99% of the time but fail catastrophically the remaining 1%. So of course, this is exactly what happened.

But three posts earlier Bruce has this to say about software vendors:

So if BitArmor fails and someone steals your data, and then you get ridiculed in the press, sued, and lose your customers to competitors — BitArmor will refund the purchase price.

Bottom line: PR gimmick, nothing more.

Yes, I think that software vendors need to accept liability for their products, and that we won’t see real improvements in security until then. But it has to be real liability, not this sort of token liability. And it won’t happen without the insurance companies; that’s the industry that knows how to buy and sell liability.

Talk about asymmetric risk. If software vendors accepted liability (or even partial liability) for anything that might happen as a result of their product, who in their right mind would ever go into the business? The problem is that the liability is open-ended while the profit on each deal is not. It would be nuts for any vendor to take on such an asymmetric risk. It would be like an MD practicing medicine without malpractice insurance.

Which is, as Bruce alludes, how any such liability would ultimately be made acceptable. Software vendors would buy liability insurance to protect themselves in the event that they are ever found at fault. Like malpractice insurance, this would pool the risk and spread it over all the software vendors.

Which in the end eliminates any real incentive to avoid the mistakes in the first place. Sure, a vendor's premiums would increase if it were found at fault, but just as with malpractice insurance the pain would be diluted by eventually raising everyone's rates. And everyone would just price the rate increase into their business model, exactly like the medical community does today. In the end it won't really be the vendor's money or risk.

And that's what it really boils down to in the end. It's a matter of exactly whose ox is getting gored. You'll notice that the only people suggesting that software vendors be held liable or otherwise punished for defects are not themselves producing software products. I have never seen a zero-defect advocate who could actually deliver zero-defect software.

SPML gateways move forward

Mark Diodati points out this interesting open source SPML gateway. There is an accompanying blog post by Jerry Waldorf of Sun that has a lot of background on the project and presents some interesting ideas about what you could do with an SPML gateway.

This is exactly the kind of thing I had hoped to see happen when I started working on SPML. Working at Access360, it was frustrating to see so much time and effort spent writing connectors to all the disparate systems that needed to be provisioned. If only project keychain had existed back then.

Great stuff if you are interested in provisioning.

If only, if only

I like skeptics. I like to consider myself one. I also thoroughly enjoy reading the IT Skeptic. But this borders on pure fantasy:

Then there is the question of the pace at which this beast is moving. Although the document referenced here is dated October 2008 the changelog ends in January 2008, and it is certainly the only output we have seen this year other than one(?) multi-vendor demo. There are zero commitments from DMTF or from the vendors for any sort of timeline for delivery of anything. As I have pointed out in the past,

“WARNING: vendors will wave this white paper around to overcome buyer resistance to a mixed-vendor solution. For example if you already have availability monitoring from one of them, one of the other vendors will try to sell you their service desk and use this paper as a promise that the two will play nicely.”

All I could think of when I read this was “If only”. If only the vendors cared enough about interoperability standards to make it a selling point. Then you might eventually get real interoperability, even if it started as vaporware.

But the reality is that front-line sales guys usually don't know or care about standards, beyond checking boxes in an RFP. William Vambenepe sums it up nicely in this rebuttal:

Has anyone actually seen this happen? I am asking because so far, both at HP and Oracle, the only sales reps I have ever met who know of CMDBf heard about it from their customers. When asked about it, the sales person (or solutions engineer) sends an email to some internal mailing list asking “customer asking about something called cmdbf, do we do that?” and that's how I get in touch with them. Not the other way around.

Also, if the objective really was to trick customers into “mixed-vendor solutions” then I also don’t really understand why vendors would go through the effort of collaborating on such a scheme since it’s a zero-sum game between them at the end.

I don't mean this to be critical of the sales guys. They care (as they should) about the requirements the customers care about. Until customers start demanding support for interoperability standards like CMDBf (in the ITSM space) and SPML (in the IdM space), these standards will never get robust implementations. And customers will continue to get stuck with siloed solutions.

Reconcilable Differences

Having worked in both IdM and ITSM, I am constantly struck by the similarities, and by how the same problems get reworked over and over again in the two industries. For instance, William Vambenepe has this to say about configuration item reconciliation:

Whether you call it a CMDB or some other name, any repository of IT model elements has the problem of establishing whether two entities are the same or not.

Which is exactly the same account reconciliation problem that provisioning vendors have struggled with for years. When a provisioning system discovers a Linux account with user ID jbohren, does it belong to me, or my father Joe Bohren? BTW, if you email to my first initial and last name at, it won’t reach me. If you do the same at it will.
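The matching step itself is easy to sketch; the pain is in everything around it. Here is a minimal, hypothetical illustration of why the jbohren case is hard: the attribute names (preferred_login, email) and return values are mine, not from any real provisioning product:

```python
# A sketch of account reconciliation: match a discovered account against
# known identities and flag anything that isn't an unambiguous match.

def reconcile(account, identities):
    """Return ('matched', identity), ('orphan', None), or
    ('ambiguous', candidates) when the match is not unique."""
    candidates = [i for i in identities
                  if account["user_id"] == i.get("preferred_login")
                  or (account.get("email")
                      and account.get("email") == i.get("email"))]
    if len(candidates) == 1:
        return ("matched", candidates[0])
    if not candidates:
        return ("orphan", None)
    return ("ambiguous", candidates)

# Two Bohrens, one discovered Linux account: not enough to decide.
identities = [{"name": "Jeff Bohren", "preferred_login": "jbohren"},
              {"name": "Joe Bohren", "preferred_login": "jbohren"}]
status, who = reconcile({"user_id": "jbohren"}, identities)
print(status)  # ambiguous
```

Everything that comes back as orphan or ambiguous lands on a human, which is exactly where the scalability problem discussed below comes from.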

It's also the same problem that role management software is dealing with when trying to determine if roles in different systems represent the same logical business duty. Does the role named Accounting Manager represent the manager of the accounting department, or is it the IT guy who manages the accounting software system?

Reconciliation is a big scalability problem in IdM and ITSM systems. Often there are too many orphaned items (items that cannot be unambiguously matched to a known entity) for the IT staff to handle, and determining what to do with orphaned items can be very difficult.

One interesting approach to account reconciliation is to let the account owners adopt the orphaned accounts. The adoption process would involve the owner providing the credentials to log into the account via a web page. If the system can verify that those are the correct credentials, then that person is assumed (or allowed) to be the owner.
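That adoption flow can be sketched in a few lines. This is a hypothetical illustration, assuming some way to check credentials against the target system (verify_credentials here stands in for a real login check, such as an LDAP bind); none of these names come from an actual product:

```python
# Hypothetical orphan-adoption flow: the claimant proves ownership by
# supplying the account's own credentials; on success the orphan is
# linked to that person in the ownership registry.

def adopt_orphan(orphan_login, claimant, password, verify_credentials, registry):
    """Link orphan_login to claimant if the supplied credentials check out."""
    if verify_credentials(orphan_login, password):
        registry[orphan_login] = claimant
        return True
    return False

# Stand-in credential check for the sketch; real code would hit the
# target system, never a local password table.
fake_store = {"jbohren": "s3cret"}
verify = lambda login, pw: fake_store.get(login) == pw

registry = {}
adopt_orphan("jbohren", "Jeff Bohren", "s3cret", verify, registry)
print(registry)  # {'jbohren': 'Jeff Bohren'}
```

The nice property is that the system never needs to guess: whoever can log into the account is, by definition, in control of it.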

But this approach only works with accounts and account-based systems. For now, reconciling other orphaned items is still mostly a manual process. I would be curious to hear about solutions that other people have found for various reconciliation problems.

Talking SPML

Oddly enough the New Year has seen a spate of SPML discussions. James McGovern gets the whole thing kicked off here. Jackson Shaw adds his thoughts here, and makes the point that SaaS really needs federation and provisioning to work well.

Mark Diodati (who has been following SPML for a long time) has some interesting thoughts about it here. Mark points out that SPML lacks built-in authn and authz capabilities. This was an intentional design decision in both SPML 1.0 and 2.0, as it was felt at the time that authn and authz should be part of the web services infrastructure, not the provisioning standard. In retrospect, that decision put too much faith in how well authn and authz standards would be adopted. It also points out the unique position that identity web services are in: they must be secured, yet they must drive the security as well. It's a real chicken-and-egg dilemma. Or to use the WSDM nomenclature, a real MUWS-MOWS dilemma.

Ian Glazer (a former colleague of mine at Access360 and who also served with me on the PSTC) wants to stop talking about federated provisioning. Ian makes the point that federated provisioning is not really any different than enterprise provisioning. Ian is correct in that they are basically the same, although there are some subtle differences in how they play out in deployment.

I really hope that these discussions lead to some real movement around leveraging SPML to enable SaaS services. I am always up for an SPML conversation. If you want to discuss SPML (or identity or change management), my work email is my first initial and last name at and my personal email is the same at

Much ado about metric

XKCD has put out this great summary of metric units. While the comic is great fun (I especially like all the shiny Firefly/Serenity references), it has regrettably set off a round of bashing the US for clinging to the English system while the rest of the presumably more enlightened world uses the Metric system.

While I agree that the metric system is superior, I find many of the arguments put forth for switching to be specious at best. The most popular of these is the all-time poster child for why we should use the metric system: the loss of the Mars Climate Orbiter in 1999. In that case the orbiter was lost because some of the telemetry data was delivered by an outside contractor in English units while the orbiter software was expecting metric.

I'm sorry, but this is a very unconvincing argument. First of all, just because metric would be better for a satellite guidance system is no reason that I have to buy salsa measured in grams (for the record, the jar of Tostitos salsa I just bought is marked in both English and metric: 24 oz and 680 g respectively). In fact, in college I did all my engineering work in metric and then lived my non-academic life in English. It's really not that big a deal.

Second, software errors caused by unit mismatches can happen even in a consistently all-metric environment. For instance, in the Mars Climate Orbiter case the results would have been just as catastrophic if data that was expected to be in meters was delivered in kilometers. One common software error that I have seen repeatedly over my career is using local time instead of GMT.
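The point generalizes: the fix is not which unit system you pick but making units explicit in the code so mismatches fail loudly instead of silently corrupting a calculation. Here is a minimal sketch of that idea; the Quantity class is purely illustrative (real code would use an established units library):

```python
# Tag every value with its unit; refuse to combine mismatched units.
# meters-vs-kilometers is just as dangerous as pounds-vs-newtons.

class Quantity:
    def __init__(self, value, unit):
        self.value = value
        self.unit = unit

    def __add__(self, other):
        if self.unit != other.unit:
            raise ValueError(f"unit mismatch: {self.unit} vs {other.unit}")
        return Quantity(self.value + other.value, self.unit)

total = Quantity(1.0, "km") + Quantity(2.0, "km")
print(total.value)  # 3.0

try:
    Quantity(1200.0, "m") + Quantity(1.2, "km")   # all metric, still wrong
except ValueError as e:
    print(e)  # unit mismatch: m vs km
```

Note that both operands in the failing case are metric; the bug the Orbiter teams hit is a type error, and no national system of measurement fixes type errors.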

Third, there is very little advantage in switching but it would be hugely expensive.

But here is my dirty secret: I love that the US uses the English system. Not because it’s better (it’s not), but because it represents a libertarian philosophy. Rather than the government forcing everyone to use one system of measure, the choice is left to the consumers. If they decide they want the metric system, they will force the manufacturers to use it. So far, when offered the choice, American consumers have collectively decided that we should keep the English system. I don’t see that changing any time soon.

BTW, I also wrote about the myths and misconceptions of the Metric system here.