Chapter 13. Evaluating tools

Selecting appropriate software tools to support your unified content strategy can be a difficult and lengthy process. Your best defense for selecting tools is to be armed with as much information about them as possible. You need to develop detailed evaluation criteria and test before you buy. This chapter describes the process of evaluating tools, from identifying your needs to conducting a proof of concept.

Identifying your needs

Begin your evaluation process by identifying your needs. Part II, “Performing a substantive audit: Determining business requirements” provides guidelines on how you can identify your organizational needs, and Chapter 7, “Envisioning your unified content life cycle,” provides guidelines for creating a vision of the functionality of your unified content life cycle. After you’ve documented your vision, you can use it to identify your criteria for tools selection. For example, Table 13.1 illustrates how Company A’s vision translates into their tools selection criteria. Note that although the task in Table 13.1 is to create new content, many of its associated criteria relate to the content management system.

Table 13.1. Identifying tools criteria

Function: Create new content

Task: Before beginning to create content, authors select an authoring template and identify the subject of the content (for example, a specific product). The content management system searches for existing information elements that are valid for the template and the subject. It then pre-populates (systematic reuse) the template with appropriate content.
Tool: CMS
Criteria (must support): Systematic reuse

Task: Authors can either keep the automatically populated content or choose to remove it (unless the content is identified as mandatory). If authors keep the reusable content, they can retain it untouched or modify it to create derivative content (unless the reusable content element is locked). Reusable content that is mandatory or locked can be modified by only its owner or a supervisor.
Tool: CMS
Criteria (must support): Derivative reuse; locked reuse; element-level security; identifying derivatives

Task: Authors can use the search tool contained within the content management system to retrieve appropriate elements (opportunistic reuse). They can search using text or phrases contained in the reusable element (full-text search), and use metadata to narrow searches to the specific type of information they are looking for (for example, a product description or a procedure).
Tool: CMS
Criteria (must support): Opportunistic reuse; search using full-text retrieval; search based on metadata

Use your criteria to develop a series of questions to ask the vendor. In addition to asking “Does your product support X?” ask “How does it support it?” Company A’s sample search criteria could look something like Table 13.2.

Don’t stop with just the criteria you derive from your vision; do some research about everything the tools can do. For example, the vision in Table 13.1 identifies both full-text searches and searches that use metadata among selection criteria. When evaluating tools, you may find other kinds of searching that are useful. How about searching based on the status of the element (such as all elements that are currently checked out, or all elements that are approved)? What about searching across other content formats? A CMS can store your content elements in one format, but may be able to store whole documents in formats such as PDF. Would that be useful? In finding out more about what the tools can do, you may identify useful functions you hadn’t thought of before.
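
To make these questions concrete, the following is a minimal sketch, in Python, of the kinds of searches discussed above: full-text, metadata-based, and status-based. The element records, field names, and search function are hypothetical illustrations, not features of any particular CMS.

```python
# Hypothetical content elements with metadata attached (illustration only).
elements = [
    {"id": 1, "type": "product description", "status": "approved",
     "text": "The X100 scanner supports duplex scanning."},
    {"id": 2, "type": "procedure", "status": "checked out",
     "text": "To install the X100 driver, run setup.exe."},
]

def search(elements, text=None, **metadata):
    """Return elements matching a full-text substring and/or metadata values."""
    results = []
    for element in elements:
        if text and text.lower() not in element["text"].lower():
            continue
        if any(element.get(field) != value for field, value in metadata.items()):
            continue
        results.append(element)
    return results

print(search(elements, text="X100"))            # full-text search
print(search(elements, type="procedure"))       # metadata search
print(search(elements, status="checked out"))   # status-based search
```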

Table 13.2. Sample criteria

Criteria: What types of searching are supported?

Specifics (how):

  • Full text?

  • Boolean?

  • Natural language?

  • Index?

  • Keyword?

  • Structural?

  • Metadata (standard CMS metadata such as check-in/check-out data; create, modify, obsolete)?

  • Can you search across output files (for example, web, PDF), as well as content?

General criteria

Develop general criteria for the product, as well as specific criteria. General criteria address functionality or support that all vendors should provide, regardless of the type of product they are selling. Don’t forget to evaluate the company, as well as the product, to see how reliable they are. Some of the general criteria you should consider include the following:

  • Usability

    How easy is the tool to learn, use, and master (become better than merely proficient)? Can new users be easily trained to use the system effectively? Can the interface be customized for increased usability or different functionality? If so, how easy is it to customize (can you do it or do you need a specialized programmer)?

  • Training

    What training is provided? Are there different levels of training? Is it all classroom, or is there web-based training as well? What is the cost? Is onsite training provided? Who are the instructors? What are their qualifications?

  • Supporting documentation

    What type of documentation is provided (user manual, reference manual, help, web)? How is it provided (CD, paper, web)? If vendors don’t provide hard copy (manuals), can you buy them? Ask to see a copy (or sample) and evaluate its quality. How are updates to documentation provided?

  • Technical support

    What type of technical support is provided (email, phone)? Is it included in the price of the product or is there a fee? If it is included, for how long? What is the guaranteed turn-around time on answers to questions?

  • Future upgrades/enhancements

    How are future enhancements or upgrades handled? How many updates/enhancements are there per year? Are upgrades included in the price? If so, how many? How receptive is the vendor to your suggestions and requests for new features and bug fixes?

  • Implementation time

    How long will it take to install and configure the tool? How long does it take the vendor to figure out what you need, design any customized components, and get the system up and running?

  • Cost

    What is the total cost of the product? Are consulting services provided and what are their costs? Are training and technical support provided and what are the additional costs, if any? What other costs might you incur (for example, evaluation copy cost)?

    What is the total cost of ownership, including maintenance, lifetime support (life of product), upgrades, fixes, add-ons, documentation, and training?

  • Vendor viability

    Will the vendor survive as the market evolves? Does the vendor have the talent and organization to be successful? How committed is the vendor? What is the company’s strategic plan for the next three to five years? How consistent is the strategy of the vendor with your strategy? What is the vendor’s product road map? How does the vendor plan to fulfill its customer service needs now and into the future? Who are the vendor’s competitors? How dedicated is the vendor to this product, compared to its other products?

  • Partnerships

    Who does the vendor partner with to provide an expanded solution?

  • References

    Who are the vendor’s customers? What types of projects have they done with other customers? (Get descriptions where possible.) Can they provide references? Are there external reviews, endorsements, articles, published comparisons?

Weighting your criteria

Not all criteria should be valued equally in your final scoring; some issues will be more important than others, depending on your needs. Begin by sorting criteria into categories:

  • Must have

If the product does not meet “must have” criteria, it is not eligible for selection. However, it may receive a partial score if the tool provides the required functionality, but not to the level desired.

  • Should have

    If a product does not meet “should have” criteria, it should not be eliminated immediately, but its ranking will be lower, which may eventually eliminate it.

  • Nice to have

    “Nice to have” criteria are just that—nice to have. If a product does not meet “nice to have” criteria, its ranking is not significantly reduced. But products that meet more of your “nice to have” criteria will be ranked higher.

Next you need to give each category a weight. For example:

  • Must have: 5

  • Should have: 3

  • Nice to have: 1

Each of the weights represents a range of scores. The scoring is used to identify how well the feature meets your need. For example, a “must have” score could be 3 out of a possible 5. This means that the tool has the desired feature, but it does not merit a full score for its functionality (for example, it may be very hard to use). Therefore, a 5 represents full functionality and a 1 represents minimal functionality. As you rank each criterion, score it against the maximum value for its category and record why it did not receive a full score. Table 13.3 illustrates how to record your scores, using searching as an example. In this example, searching is considered a “must have,” so its maximum ranking is 5. The product doesn’t receive full marks because it does not have the full functionality requested. However, the missing functionality is not critical, so it loses only one point. The stars make the ranking very easy to see at a glance. You may want to create a simpler chart to provide an at-a-glance summary of the criteria ranking, and leave the detailed criteria and reasons for the ranking scores to an appendix in your report.

Table 13.3. Ranking criteria

Criteria: What types of searching are supported?

  • Full text?

  • Boolean?

  • Natural language?

  • Index?

  • Keyword?

  • Structural?

  • Metadata (standard CMS metadata such as check-in/check-out data; create, modify, obsolete)?

  • Can you search across output files (for example, web, PDF), as well as content?

Ranking: ✦✦✦✦ (4 of a possible 5)

Reason: Can perform full text, keyword, structural, and metadata searching. Cannot search on the index or across output files.

Group criteria according to their category: “must have,” “should have,” and “nice to have,” in that order. If a product has a very low score or a zero on many of the “must have” criteria, it should be eliminated. If all the tools score low in the “must have” section, you may find that the products have not reached the maturity level you desire, or that you may have to do some customizing to make the product properly meet your needs. If all the products score reasonably well in this area, look at the total score and move on to the “should have” category. A poor score on any of the “should have” criteria should not eliminate the product, but an overall poor score in comparison to the other products should eliminate it or classify it as only a “maybe.” The “nice to have” scores can be used as a tie breaker or to give a product an edge over a closely ranked product.
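
The weighting and scoring approach described above can be tallied mechanically. The following is a minimal sketch in Python, assuming the category maximums given earlier (must have: 5, should have: 3, nice to have: 1); the vendor names, criteria, and scores are hypothetical.

```python
# Maximum score a criterion can earn, set by its category.
CATEGORY_MAX = {"must have": 5, "should have": 3, "nice to have": 1}

# Hypothetical scores assigned after demonstrations or RFP responses:
# {criterion: (category, score)} where 0 <= score <= CATEGORY_MAX[category].
vendor_scores = {
    "Vendor A": {
        "Searching (full text, metadata)": ("must have", 4),
        "Derivative reuse": ("must have", 5),
        "Workflow customization": ("should have", 2),
        "Saved searches": ("nice to have", 1),
    },
    "Vendor B": {
        "Searching (full text, metadata)": ("must have", 3),
        "Derivative reuse": ("must have", 0),   # misses a "must have" entirely
        "Workflow customization": ("should have", 3),
        "Saved searches": ("nice to have", 0),
    },
}

def summarize(scores):
    """Total the scores per category and flag any missed 'must have' criteria."""
    totals = {category: 0 for category in CATEGORY_MAX}
    missed_must_haves = []
    for criterion, (category, score) in scores.items():
        totals[category] += score
        if category == "must have" and score == 0:
            missed_must_haves.append(criterion)
    return totals, missed_must_haves

for vendor, scores in vendor_scores.items():
    totals, missed = summarize(scores)
    status = "eliminate" if missed else "keep for comparison"
    print(vendor, totals, status)
```

Printing the totals per category makes it easy to compare vendors category by category and to spot a product that scores zero on a “must have.”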

Creating a list of potential vendors

There are hundreds of options for any type of tool, making it difficult to know where to start looking. Gather information from many sources to help you create a list of potential vendors. Some good sources include:

  • Conferences

    Industry-related conferences, or conferences that focus on key topics such as content management, are a very good place to start. Many have exhibit areas full of vendors interested in selling you their products. Come armed with your criteria and questions and plan to spend a lot of time in the exhibit hall watching demonstrations and asking questions. Sometimes vendors have presentation sessions where you can get a better understanding of their products. Attending sessions where products are compared by objective speakers is also very valuable. Ask other conference attendees which products they use or are planning to use. If a vendor has a product that looks like it may meet your needs, have their representatives get back to you with more information and a customized demonstration at your site.

  • Web conferences or Webinars

Web conferences or Webinars are a great way to learn about products or subjects of interest. Many vendors give regular web conferences. Sometimes these are specifically designed to demonstrate a product, but often they are on a related subject. They are a great way to get a lot of information for only the cost of your time (no travel costs).

  • Electronic mailing lists

    Subscribe to electronic mailing lists about tools, your type of business, or the type of task you hope to accomplish. Although you can get flooded with a lot of irrelevant information, electronic mailing lists are an excellent place to ask about the products that others use.

  • Magazines

    Magazines frequently review products on an annual or semiannual basis. Many also feature products in each issue.

  • Web sites and online discussion boards and newsgroups

    There are web sites that list available products of a certain type. They rarely provide any comparative information, but they do provide a series of links to vendor sites.

Narrowing down the list

After you’ve created a list of potential vendors, you need to narrow it down.

  1. Request a demonstration. Give the vendor advance notice if you want something custom, and be prepared to send them your materials to work with. (A simple canned demo, though, is certainly an easy thing to request and should be considered first.)

    Invite the vendor to your site or request a web demonstration to provide an opportunity to specifically address your questions. If you don’t plan to send out a request for proposal (RFP), use the demonstration to ask all your detailed questions. Allocate a minimum of two hours (potentially, three to four) depending on the complexity of the product and the level of detail in your criteria.

    Review your criteria in light of these demonstrations to determine whether you need to revise them before you create an RFP.

  2. Send out an RFP that includes your detailed criteria.

If you are purchasing a large product like a content management system, sending an RFP to potential vendors enables you to gather a lot of information and directly compare vendor responses. Give vendors as much information about your desired project as possible to enable them to respond to your needs. Have them sign a non-disclosure agreement, if appropriate, to ensure your information is kept confidential while still enabling you to provide them with the appropriate level of detail.

  3. Evaluate the responses.

    Evaluate the responses to the RFP. Add weighting (values) to your criteria to help you determine an objective numeric ranking for each vendor. If you don’t issue an RFP, use the custom demonstration to evaluate the product against your criteria.

  4. Pick the vendors that most effectively meet your criteria.

    Add up the scores for all the vendors, eliminating those with low scores or glaring gaps in their functionality. Reduce the list to three vendors, reviewing the results with others in your organization who may have some insight into selecting tools.

  5. Ask for a content-specific demonstration.

Ask each selected vendor to demonstrate its product, using some of your selected content. Ensure that the selected content is representative of the materials you eventually want to create/manage/deliver. If possible, have vendors incorporate your content into their tools while you watch. This enables you to see what it really takes to work with your content. If you see only the final results, you may not see some of a tool’s complexities.

  6. Narrow your selection further.

    After the demonstration, narrow your selection to one or at most two vendors.

  7. Conduct a proof-of-concept.

    Before purchasing a product, you need to evaluate it thoroughly. Don’t believe everything you read (vendor marketing materials), hear (demonstrations, presentations), or see (demonstrations). Use this information as a starting point, then conduct a proof-of-concept to make sure the tool can perform to your expectations. Proof-of-concept is described in more detail in the following section. Ask for an evaluation copy and try the product yourself.

  8. If the product performs well during the proof-of-concept, purchase it.

Proof-of-concept

Before committing to purchasing a product, evaluate it through a proof-of-concept. If you have narrowed your selection to two tools, you may want to conduct a face-off, where both tools are tested and the results are compared. The proof-of-concept makes sure that the product meets your organization’s needs and provides the desired functionality. Some organizations have been positive that a product would meet their needs, then during a proof-of-concept discovered glaring problems that were not revealed during the initial evaluation. It is better to determine the problems before you buy. A proof-of-concept can also point out what changes or customization are required before you can implement the product in your organization.

Set this up as a test situation with the software running on a test system (that is, not a live system you are currently using) to ensure that there are no incompatibilities with your current software. Most content management systems require a separate system after they are installed for full implementation.

A proof-of-concept involves testing the product in your organization in a small implementation. It is important to identify a series of tasks and outcomes to:

  • Test what the product does

  • Validate your assumptions about the product

  • Test the product’s capability to meet your specific needs

The tasks should be comprehensive enough to effectively test the product’s functionality, but not so comprehensive that they result in a tremendous amount of work and cost for your organization.

To perform a proof-of-concept you must get vendor support. Vendor support includes:

  • Evaluation copy of the software

    An evaluation copy is typically a time-limited but fully functional version of the software, or it may not be time limited, but have restricted functionality. It may be provided to your organization at a nominal fee or free of charge.

  • Consulting services

    You probably do not have the resources or skills in-house to install and configure the software appropriately. These skills can be learned, but typically not in the timeframe of the proof-of-concept. Negotiate a reasonable consulting fee with the vendor to have them install and configure the software to meet your requirements. These requirements should be limited to the scope of the proof-of-concept only, not full configuration.

  • Training

Participants in the proof-of-concept will need training to understand how to use the product. If the vendor already has a training session scheduled close to your test date, you may be able to negotiate low-cost or no-cost participation. Otherwise, you may need a session scheduled just for your needs. This will certainly entail a cost.

  • Technical support

    Technical support to answer questions and troubleshoot problems will be required during the test period.

Do not let the vendor actually conduct the proof-of-concept. The vendor knows the product too well and can rapidly create work-arounds when there are problems. You want to see the product “warts and all” to ensure that it will meet your needs. Give yourself the resources to break the program, find its limitations, and make mistakes that require correcting so you can test the program’s ability to handle changes and failure recovery.

You can conduct your proof-of-concept in two ways: as a functionality test or as a combined usability and functionality test. A functionality test usually involves a small team using sample materials to walk through test scenarios. This enables you to test the features to ensure they provide the functionality you need. A combined usability and functionality test involves representative users of the new system testing the materials in a simulated environment. A usability test is more complex because it involves training the participants. In addition, you have to design the test so that users with minimal training can effectively participate. A usability test is critical before you pilot the product, but may or may not be necessary before you purchase the product. However, if there are concerns about users accepting the product, a usability test that tests the key features of the system at the proof-of-concept stage may be appropriate.

Track the results of the proof-of-concept throughout the test period. If you are doing a usability test, collect results through questionnaires, observation, and participant questions. Create a method for tracking (discussed in the following section) both the test and the results. Use your original criteria list, but this time rank the functionality based on your test. Analyze the results and summarize your findings in a report. Decide whether the product is the most appropriate purchase for your requirements. If it is close but needs some customization, talk to the vendor to see what this will cost. If it isn’t the correct product for your organization, evaluate another one.

One size fits all?

The remaining chapters in Part IV, “Tools and technologies” (Chapters 14-18), introduce the different types of software required to support your unified content strategy and the many different variations within each type of software (for example, web content management versus integrated document management). Your evaluation of the products is certain to reinforce the idea that there are many different options among which to choose. Many vendors have chosen to make “best-of-breed” products that meet a very specific requirement in your organization and do it very well. In isolation, best-of-breed products make a lot of sense because they meet your specific needs at that particular time. Unfortunately, best-of-breed products may reinforce silos (for example, web content functionality only, not enterprise content management), delegating certain tasks and deliverables to separate areas, each with its own best-of-breed tools specific to its tasks.

Some vendors create products that address many different needs within the organization (for example, enterprise content management) with the understanding that their product must serve many needs. Some critics argue that generalized products are mediocre in their effectiveness and performance (jack of all trades, master of none). However, some vendors have succeeded in developing effective broad-based solutions. Sometimes they succeed through the effective design of their product, but often they succeed through the acquisition of other products that complement each other to produce a tightly integrated suite.

Where possible, select a suite of tools that meets your enterprise content needs. If your organization decides to go with best-of-breed tools, the key to a successful unified content strategy lies in your ability to share content among the systems. To support these systems in sharing content, your content and consequently your tools should include the following:

  • Common information models (structures) and tagging

    Common information models with common tag standards (for example, both systems refer to the first heading as “Heading 1,” rather than “Heading 1” in one information model and “Title” in another model) make it possible for elements of content to be interpreted consistently, and they make it possible for content that has been created or maintained in one system to be reused transparently with content created or managed in another system.

  • Common metadata

    Common metadata ensures that the meaning and classification of elements is the same among systems. Common metadata makes it possible to automatically retrieve, reuse, and track content regardless of the system (see Chapter 9, “Designing metadata”).

  • Compatibility with other systems

The capability to integrate with other systems (for example, authoring tools, other content management systems) is important to ensure that content can be shared. If a tool has built-in compatibility with other systems or provides the facility to integrate with other systems, it will be more effective in ensuring that content can be shared. A small sketch of such a cross-system mapping follows this list.
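
As an illustration of the first two points, the following is a minimal sketch in Python of mapping each system’s local tag names onto one common information model. The system names, tag names, and element names are hypothetical; they stand in for whatever your information models define.

```python
# Elements every system must be able to express (the common information model).
COMMON_MODEL = {"title", "short_description", "procedure", "product_name"}

# Hypothetical per-system tag names mapped to the common model.
TAG_MAP = {
    "web_cms": {"Title": "title", "Teaser": "short_description",
                "Steps": "procedure", "Product": "product_name"},
    "doc_cms": {"Heading 1": "title", "Abstract": "short_description",
                "Task": "procedure", "ProductName": "product_name"},
}

def normalize(system, element_name, content):
    """Translate a system-specific element into the common model."""
    common_name = TAG_MAP[system].get(element_name)
    if common_name is None or common_name not in COMMON_MODEL:
        raise ValueError(f"{element_name!r} from {system} has no common-model equivalent")
    return {"element": common_name, "content": content}

# Content authored in the web CMS can now be reused by the document CMS.
print(normalize("web_cms", "Teaser", "A one-paragraph product overview."))
```

Any element that can be normalized this way can be reused transparently in either system; elements that cannot be mapped point to a gap in the shared information model or metadata.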

Summary

Selecting the correct tool can be a difficult and lengthy process. However, arming yourself with as much information as possible, developing detailed criteria, and testing the product before you buy can help you make a successful selection.

  1. Identify your criteria for selection.

  2. Develop a weighting system for your criteria.

  3. Develop a list of vendors to investigate.

  4. Request a custom demonstration from vendors that interest you.

  5. Send selected vendors an RFI or RFP that includes your detailed criteria and ask them to respond to your questions.

  6. Evaluate the responses or compare the custom demonstration against your criteria.

  7. Pick three vendors that most effectively meet your criteria (best ranking).

  8. Ask vendors to use a sample of your content and create a content-specific demonstration for you.

  9. Narrow your selection further to one or two vendors.

  10. Conduct a proof-of-concept to test the required functionality and determine whether the product meets your needs.

  11. Purchase the product if it performs well in the proof-of-concept.
