Once upon a time in India, in a village (so the story goes), there was a problem with cobras. There were too many of them.
Cobras, those freaky little reptiles, have a bad rap, but the unfortunate truth is they *can* kill you, so it’s understandable that the village wanted them gone. And so the village leaders instituted a bounty for every dead cobra. This would surely be successful, as everyone likes money, and no one likes cobras! Couldn’t miss!
Sure enough, tons of dead cobras were brought in… but the overall cobra problem didn’t seem to subside. This is because just outside the village were people (you guessed it) breeding cobras, so they could kill them, so they could collect the bounty. Naturally the leaders didn’t want to pay for purpose-bred cobras, so they stopped the bounty. And the breeders, with no more financial incentive to breed cobras, let the cobras loose, thereby increasing the cobra population.
This was the precise opposite of what the bounty was meant to accomplish, and this sort of circumstance is called the “Cobra Effect”. You can read about it here (Wikipedia lists the village as the city of Delhi, but I’m not sure I buy that). Another example is the famous pigs of Fort Benning.
Essentially, the Cobra Effect is when your proposed solution actually makes the problem *worse* than it was to begin with. It doesn’t always have to be economic in nature, as I’m currently seeing at work.
Fourteen months ago I took my current job, and in the first couple of weeks I volunteered to work on a particular project. It had been languishing for a few months and was back on someone’s radar, so it needed attention. The basic idea was to take 200,000 records and consolidate them into about 6,000 or 7,000, with minimal disruption. We crafted a comprehensive plan to get the project done, executed it, and…
…it blew up in a horrific, ugly mushroom cloud. Everything that could go wrong did: bad data meant some emails went to the wrong people. Emails that went to the right people invariably succeeded in pissing them off, and emails that had been declared unnecessary turned out to have been rather necessary after all. Data was updated, but incorrectly, thanks to a code artifact whose purpose no one remembered (cue the after-the-fact realization: “Oh, *that’s* why that was there.”). 112 hours later, it was fixed.
After six distinct debriefs and detailed postmortems (“Fix the contact information”, “Vet it with this team in this other fashion even though they originally said the first way was fine”, “Avoid Excel”, “Use Excel”, “Put a PM on it”, “Take the PM off of it”, “Let’s start from scratch”, “Let’s use what we had before and refine it”, “Take it out of this team”, “Give it back to that team”), it looks like the current plan is to…
… do nearly exactly what we originally did. Only now, we’re doing it with 30% more records, because the recipients’ first reaction to the go-around that went awry was to go into the system and… create more records.
Points:
1. Unintended consequences are everywhere, and the best intentions often create more of them, and
2. The Cobra Effect doesn’t just apply to economics, although given a few minutes I could probably put a dollar figure on this project, and it would make me cry, and
3. You can have a fancy name and anecdote for something, and it can even be written about in many management books, but none of that will prevent people from making ill-advised choices (despite best efforts at education).