Selecting a vendor solution or service provider can be daunting, especially given the vast landscape of options available to address almost any need an asset manager or asset owner may be evaluating. Once a medium or even short list of prospective vendors has been assembled, a typical next step is to issue a request for proposal (RFP), which presents its own challenges to navigate. These include gaining your organization's consensus on the requirements the vendor will be evaluated against, and conducting a multi-dimensional assessment of the vendor's functional capabilities, data environment, technical footprint, market positioning, costs, and more. While creating a comprehensive RFP is itself a time- and resource-intensive effort, evaluating the responses to each question across multiple vendors is an entirely different challenge.
I’ve participated in RFP projects where vendor responses were hundreds of pages long. Wading through responses, digesting the information, and running a comparative analysis across multiple vendor participants is a time-consuming exercise. However, remember that this process has been undertaken by just about every one of our clients and is almost as old as the asset management industry itself. Through our many experiences, we have found that focusing the evaluation process on the following three goals can help streamline the process:
Quickly get to the bottom line of the actual capabilities and functionality the vendor has to offer
Put the vendor to the test, requiring proof that these capabilities exist and will satisfy requirements
Rate vendor proficiencies (or lack thereof) in a way that is weighted toward the highest priority items for the evaluating institution
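To make the third goal concrete, a weighted scorecard can roll per-criterion ratings into a single comparable number per vendor. The following is a minimal Python sketch; the criteria, weights, and ratings shown are hypothetical placeholders for illustration, not figures from any actual evaluation:

```python
# Hypothetical sketch of a priority-weighted vendor scorecard.
# All criteria, weights, and ratings below are illustrative examples.

# Weights reflect the evaluating institution's priorities (should sum to 1.0).
weights = {
    "functional_capabilities": 0.35,
    "data_environment": 0.25,
    "technical_footprint": 0.20,
    "market_positioning": 0.10,
    "cost": 0.10,
}

# Ratings on a 1-5 scale, gathered from RFP responses, follow-ups,
# POC sessions, and user trials.
ratings = {
    "Vendor A": {"functional_capabilities": 4, "data_environment": 3,
                 "technical_footprint": 5, "market_positioning": 4, "cost": 2},
    "Vendor B": {"functional_capabilities": 3, "data_environment": 5,
                 "technical_footprint": 3, "market_positioning": 3, "cost": 4},
}

def weighted_score(vendor_ratings, weights):
    """Combine per-criterion ratings into one priority-weighted score."""
    return sum(weights[c] * vendor_ratings[c] for c in weights)

# Rank vendors from highest to lowest weighted score.
for vendor, r in sorted(ratings.items(),
                        key=lambda kv: weighted_score(kv[1], weights),
                        reverse=True):
    print(f"{vendor}: {weighted_score(r, weights):.2f}")
```

The value of this structure is that a vendor who is merely adequate on a low-priority item is not penalized as heavily as one who falls short on a top-priority requirement, which keeps the final ranking aligned with what matters most to the evaluating institution.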
A major advantage in conducting a vendor evaluation effectively comes from having already been through the RFP process multiple times with the same vendors. From this first-hand knowledge, I have compiled a few fundamental ways to gain a clear understanding of the actual capabilities and functionality provided by each vendor or service provider.
RFP Follow-Up
Carefully evaluating and requesting clarification on broadly stated RFP answers is critical to gaining clarity on vendor capabilities. Submitting explicit written follow-up questions is the first line of defense in reaching the most accurate representation of those capabilities. These questions can include detail on specific pieces of functionality, drawn from industry best practices, to lead the vendor toward more specific answers. Often, a follow-up call with the vendor to walk through these additional questions will help confirm the specific capability requirements being evaluated.
An Effective POC
A proof of concept (POC) is also a great exercise for bringing clarity to the depth of capabilities a vendor's solution has to offer. Citisoft's Mike Walker offers several helpful tips in his blog, How to Get the Most Out of Your Proof of Concept, which summarizes the key elements of setting up and running a successful POC. In short, I have found it very effective to set up scenarios that are critical to the potential business users of the solution, using a short history of investment data from select accounts representative of the evaluator's asset mix. This will help draw out not only the depth of functionality but, almost equally important, the mechanics of how that functionality works to meet these requirements. Full-day onsite sessions are often necessary to allow the vendor to demonstrate this functionality to evaluators while leaving time to answer specific questions about the scenarios being run live in the solution.
A Brief User Trial
Lastly, certain detailed evaluations may necessitate trials of the system for key users. A trial is a great way to gauge the usability of the solution and to test whether the results displayed in the POC can be recreated by a user not as versed in the system as the vendor employee who provided the initial walkthrough. Running a trial can uncover many complexities and nuances, almost requiring a mini-implementation to ensure success. Considerations such as data quality and readiness, training, and the support model for the trial should be thought through before turning the keys over to users. Given the scale of the evaluation process and the scope of the trial exercise, it is recommended to limit the trial to running only the POC scenarios that users are familiar with, that have been demonstrated to them, and that have already been set up in the solution being evaluated.
Being rigorous yet prudent during an evaluation and selection effort is the balance to strike to maximize success. Each of the steps outlined above will help the evaluator gain more clarity on the best fit for their needs. Ultimately, a decision must be made on the appropriate framework for the evaluation to ensure that the selected vendor will be a good partner for the organization and will provide an effective solution both now and in the future.