Pick One

Once you have developed RFPs for a software solution, sent them to the leading candidates, and received the responses, the hardest work begins: making a choice. Space does not allow a thorough review of everything that needs to take place at this point, but a few matters deserve particular attention.

Evaluation strategies

At this stage you should apply two strategies. First, if this has not already been done, assign team members to specific facets of the evaluation based on their role on the team (or in the company), their availability, and their backgrounds. The CFO or financial team member may not be an appropriate candidate for site visits, but is a key player in all aspects of the financial dimensions of the project (cash flow, ROI, treatment of costs for tax purposes). On the other hand, if multiple site visits are necessary, a single operations person may not be able to spend all the time required to see every site. For continuity, some combination of a core group of three or more team members should be present at each site visit. In short, the roles and responsibilities for each aspect of the evaluation should be assigned, including contingency plans where possible, to avoid unnecessary delays.

Apples to Apples Matrix

Vendor     SS    IE    FN    SU    TR    IM    MO    $R    IR    TA    IN   Rank
Weight      2     2     3     2     2     1     2     3     2     1     1
C         2/4   1/2   2/6   3/6   2/4   1/1   2/4   3/9   2/4   3/3   3/3   5.55
G         1/2   3/6   2/6   2/4   3/6   2/2   3/6   1/3   2/4   3/3   1/1   5.55
D         3/6   1/2   3/9   2/4   1/2   2/2   2/4   2/6   2/4   3/3   3/3   5.45
B         1/2   2/4   2/6   1/2   1/2   2/2   3/6   2/6   3/6   2/2   3/3   5.00
E         2/4   3/6   1/3   2/4   1/2   1/1   1/2   3/9   2/4   2/2   1/1   4.82
F         2/4   1/2   3/9   1/2   1/2   2/2   2/4   2/6   2/4   1/1   2/2   4.73
A         0/0   0/0   2/6   3/6   1/2   3/3   2/4   1/3   2/4   3/3   2/2   4.09
H         0/0   2/4   1/3   1/2   1/2   3/3   1/2   2/6   2/4   1/1   1/1   3.64
TOTAL      22    26    48    30    22    16    32    48    34    18    16
AVERAGE  2.75  3.25  6.00  3.75  2.75  2.00  4.00  6.00  4.25  2.25  2.00

Each cell shows rating/score, where score = rating x weight.
Criteria: SS = Single Source, IE = Individual Experience, FN = Functions,
SU = Support, TR = Training, IM = Impl. Method, MO = Mods., $R = $/ROI,
IR = Internal Resources, TA = Technical Architecture, IN = Intangibles.
Rating scale: 0 = Not available, 1 = Low, 2 = Medium, 3 = Highest.

Second, and also for purposes of continuity, it is important to supply each participating team member with checklists or other materials to use in this phase of the evaluation.

Spell out objectives such as seeing the software in operation and talking to users and members of the site organization’s installation team, ideally not in the presence of the vendor. Develop key questions in advance; among other things, this helps ensure that the most important information is gathered at every site visit, no matter who conducts it. Questions might cover the vendor’s representations about implementation effort, time, and cost; the user’s experience with vendor support; any surprises encountered during implementation; and anything the users would do differently in retrospect.

Check this!

Now, finally, it’s time to pick a solution. But wait! First, check to be sure you’ve really got all the essential information. Why? By way of illustration, a recent industry research study conducted by Noll Research and Prestobiz.com identified several major, unanticipated hindrances to successful WMS projects. Sixty-one percent of the 860 WMS users in the survey identified the extent of training and the size of the user learning curve as the most important unexpected problem they encountered. Modifications and upgrades followed at 45%, and integration problems came in at 38%. One other significant issue was miscalculation of implementation time, which caught 25% of the respondents unprepared. If, after reviewing your selection database, you cannot comfortably describe how the vendor candidates will contend with any of these issues in the project you are undertaking, your work is not yet finished.

Team Summary

Vendor   Eval. 2   Eval. 4   Eval. 1   Eval. 5   Eval. 3    RANK
B           5.55      4.73      5.00      5.45      4.82   20.73
C           3.64      5.55      5.55      5.45      4.73   20.19
G           5.50      4.73      5.55      3.90      5.00   19.68
F           4.80      4.65      4.73      5.30      5.55   19.48
H           4.90      5.40      3.64      5.25      5.40   19.19
E           4.10      5.45      4.82      4.73      5.00   19.10
A           4.86      4.55      4.09      4.75      4.90   18.25
D           3.95      3.64      5.45      4.82      5.55   17.86
TOTAL      37.30     38.70     38.83     39.65     40.95

The ‘apples’ matrix

One good way to establish a solid basis for comparison is to define weighted evaluation criteria that each team member then uses to score each proposed solution. A simple example of such a matrix appears above (see “Apples to Apples Matrix”). While the details of this example are not especially important and will vary by circumstances, the concepts are helpful. The matrix can be filled out independently, with or without names attached.

Column headings identify the criteria used to rate each vendor, with the relative weight (value to the client or user) in the row immediately beneath. Each rating is multiplied by its criterion’s weight to calculate the score, or points earned. In our example, for instance, technical architecture and intangibles (vendor responsiveness, size of installed base, etc.) win fairly high ratings from the team but produce the lowest scores because they are weighted as criteria of low importance (“1”). By comparison, despite a similar variety of ratings by the team, both ROI and functionality produce high scores due to the weight they are accorded (“3”).

Averaging the score columns creates a picture of the impact of the various criteria. The average of all scores by vendor (right-hand column), sorted in descending order, provides a quick way to determine the relative strength of each solution, as perceived by the team. Team Evaluator 1 ranked vendors “C” and “G” the highest, with tied scores. Vendor “D” was a close third, after which the gap widens for all others.
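For readers who prefer to automate the arithmetic, here is a minimal sketch in Python of how one evaluator’s matrix might be computed. It assumes the weights and 0-3 rating scale shown above, abbreviates the data to two vendors, and uses purely illustrative function and variable names; the vendor rank here is simply the average of the weighted scores, as described in the text.

# Minimal sketch of the "apples to apples" scoring for one evaluator.
# Weights and the 0-3 rating scale follow the example matrix above;
# only two vendors are shown, and the names are illustrative.

WEIGHTS = {
    "Single Source": 2, "Individual Experience": 2, "Functions": 3,
    "Support": 2, "Training": 2, "Impl. Method": 1, "Mods.": 2,
    "$/ROI": 3, "Internal Resources": 2, "Technical Architecture": 1,
    "Intangibles": 1,
}

# Ratings: 0 = not available, 1 = low, 2 = medium, 3 = highest.
ratings = {
    "C": {"Single Source": 2, "Individual Experience": 1, "Functions": 2,
          "Support": 3, "Training": 2, "Impl. Method": 1, "Mods.": 2,
          "$/ROI": 3, "Internal Resources": 2, "Technical Architecture": 3,
          "Intangibles": 3},
    "G": {"Single Source": 1, "Individual Experience": 3, "Functions": 2,
          "Support": 2, "Training": 3, "Impl. Method": 2, "Mods.": 3,
          "$/ROI": 1, "Internal Resources": 2, "Technical Architecture": 3,
          "Intangibles": 1},
    # ... remaining vendors omitted for brevity
}

def score_vendor(vendor_ratings):
    """Score each criterion: the rating multiplied by the criterion's weight."""
    return {c: r * WEIGHTS[c] for c, r in vendor_ratings.items()}

scores = {vendor: score_vendor(r) for vendor, r in ratings.items()}

# Vendor rank: average of the weighted scores, sorted high to low.
vendor_rank = {v: sum(s.values()) / len(s) for v, s in scores.items()}
for vendor, rank in sorted(vendor_rank.items(), key=lambda kv: kv[1], reverse=True):
    print(f"Vendor {vendor}: {rank:.2f}")

# Average score per criterion across vendors shows which criteria are
# actually driving the comparison.
criterion_avg = {c: sum(scores[v][c] for v in scores) / len(scores) for c in WEIGHTS}
print(criterion_avg)

A spreadsheet handles this equally well; the point is only that the calculation is simple and should be agreed on by the team before scoring begins.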

You can add another level to this process by combining the individual team members’ vendor rankings to produce a total score (see “Team Summary”). Here the data has been sorted by total rank, and the top three scores identify the likely finalists, against which any other factors would be applied. In our example, vendors “C” and “G” are still in the top three, but the aggregate team ratings suggest that vendor “B” may offer the best solution. Vendor “D” falls from third for Evaluator 1 alone to last in this analysis by the group.

Totaling each evaluator’s ratings across all vendors (bottom row of the summary) lets you compare the consistency with which ratings were applied across the team and helps you readily identify any major variances in the process that need to be addressed.
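The team-summary step can be sketched the same way. Assuming each evaluator’s vendor ranks have been collected into a simple table like the one above (again abbreviated to two vendors, with illustrative names, and using a straight sum as the team total), both aggregations look like this:

# Sketch of the team-summary step: combine each evaluator's vendor ranks
# into a team total, then total each evaluator's column to spot variances.
# Only two vendors from the example are shown; names are illustrative.

team_ranks = {
    # vendor: {evaluator: that evaluator's rank for the vendor}
    "B": {"Eval. 2": 5.55, "Eval. 4": 4.73, "Eval. 1": 5.00,
          "Eval. 5": 5.45, "Eval. 3": 4.82},
    "C": {"Eval. 2": 3.64, "Eval. 4": 5.55, "Eval. 1": 5.55,
          "Eval. 5": 5.45, "Eval. 3": 4.73},
    # ... remaining vendors omitted for brevity
}

# Total rank per vendor, sorted to surface the likely finalists.
vendor_totals = {v: sum(by_eval.values()) for v, by_eval in team_ranks.items()}
for vendor, total in sorted(vendor_totals.items(), key=lambda kv: kv[1], reverse=True):
    print(f"Vendor {vendor}: {total:.2f}")

# Total per evaluator (the bottom row of the summary). A column that is
# markedly higher or lower than the others suggests that evaluator applied
# the rating scale differently, and the difference should be discussed
# before a recommendation goes forward.
evaluators = next(iter(team_ranks.values())).keys()
evaluator_totals = {e: sum(team_ranks[v][e] for v in team_ranks) for e in evaluators}
print(evaluator_totals)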

Final steps

At this stage, you should solicit any final team input and then prepare a ranked recommendation to present to the key decision-makers. Once the decision is made, notify the successful vendor and begin contract negotiations at once. The pace of this process can be accelerated if representative contract materials were solicited for review at the RFP stage. Vendor notification should make clear that the final choice of the solution is contingent on completion of contract negotiations.

While it is never pleasant to be one of the “almost” vendors, everyone should understand the process clearly from the outset. The more vendors feel they have been treated fairly and honestly, the more they will appreciate your efforts, and the better your prospects for getting the information, help, and support you need to make the selection process work well. Given the consolidation of the software vendor community and the rate at which people move from position to position, it is almost certain that paths will cross again, and positive experiences will serve you well.

Ron Hounsell is vice president of software solutions at Tom Zosel Associates, a distribution and logistics consulting firm based in Long Grove, IL. He can be reached by e-mail at [email protected], by phone at (847) 540-6543, and by fax at (847) 540-9988.