Poor Performance

By Terry Little, U.S. Air Force

We were near the end of a major source selection. I was feeling somewhat trapped in a box I had built for myself. Several months before the source selection I had persuaded my boss, his boss and the staff to allow me to pilot an approach to source selection that was dramatically different from any we had tried before. That approach was to use the offerors' proposals to evaluate technical details and the price, but to use past performance as the exclusive way to evaluate the credibility of the offerors' processes, plans and promises. The overall effect was that past performance counted for 50 percent of the source selection decision. There had been lots of questions. Most were about three things: (1) how we were going to evaluate past performance, (2) why we thought an offeror's past performance was a reliable indicator of future performance, and (3) how we were going to deal with a protest from an offeror who lost because of past performance. I had convincing answers: convincing in the sense that no one could come up with a compelling reason not to do what we were proposing. However, no one (including me) really knew what pitfalls lay ahead.

Now that we were nearing the end of the source selection, we had a problem. The offeror with the most exciting, innovative technical proposal and the lowest price (H Company) also had what appeared to be a big black mark on its record: substantial cost growth on a program (Program X), which had surfaced almost immediately after H Company won that contract. Making matters worse was the fact that the problem was ongoing and that Program X was very similar to our program. H Company acknowledged the problem on Program X but claimed that the government had knowingly mandated unreasonable requirements and then used the competitive environment to force H Company and its competitor to commit to an unrealistic, underfunded program. H Company also believed that its proposal had actually helped the program get started, because the price was consistent with the money available. To the company, the enormous cost growth was just cold reality setting in once the program had been authorized to move forward. Further, H Company people asserted that they had learned their lesson from the experience and would never again propose an unrealistic, underfunded program. Their explanation had the ring of truth.

The government's side of the story was different. Yes, in retrospect, maybe some of our requirements had been unreasonable, the Program X manager said. Yes, maybe we didn't have enough money to do what we wanted to do. But H Company had never questioned any of our requirements despite having had ample opportunity to do so. With the information I had and the subjectivity of the past performance evaluation, I knew I could make a case either for discounting H Company's poor performance on Program X as an anomaly or for using it as the primary basis for awarding the contract to someone else. I considered that awarding to H Company meant we would not be awarding to I Company, which had a less exciting proposal but a solid past performance record. I also wondered whether H Company had known the government's requirements were unreasonable when it had proposed Program X, or whether the unreasonableness became a revelation only after H Company began trying to execute the program. Neither alternative seemed to bode well. We awarded to I Company, which performed well. H Company protested and lost; Program X was terminated. Just after the protest decision, H Company won a competitive contract and got into immediate trouble because of having agreed to an unexecutable schedule.

Terry Little is the Defense Department's most seasoned program manager and has pioneered many acquisition reform initiatives.

Lessons

  • While an old dog may be able to learn new tricks, past performance is still the best indicator of future performance.
  • Integrity is fragile, and once shattered is slow to be repaired.
  • Judgment is the ability to combine hard data, questionable data and intuitive guesses to arrive at a conclusion that events prove to be correct.
  • Successful leaders rely on both rational and intuitive approaches in making decisions.
  • Perhaps the most irrational assumption we can make is that decision-making is purely an analytical exercise.