Tool: AutoPlan
Goal: QA projects scheduled in AutoPlan.
Roadblocks:
  • Stability of AutoPlan as a tool.

Tool: AutoTeam
Goal: 100% of time tracked in AutoTeam.
Roadblocks:
  • Time constraints / business trips.

Tool: DP Umbrella
Goal: Operational defect tracking system.
Roadblocks:
  • Stability of DPU.
  • Systems administration.
  • Time to train.
  • What to track.

Tool: Battlemap
Goal: Have 100% of code paths tested.
Roadblocks:
  • McAfee training is expensive.
  • In-house training may be difficult and take time away from other tasks.
  • Possible licensing issue.

Tool: ACE Insight
Goal: Use ACE Insight to track where data elements are used.
Roadblocks:
  • Time to train.
  • ACE only works with C code.
  • Possible licensing issue.

Tool: Softguide
Goal: Evaluate a tool for measuring customer satisfaction.

Tool: QA Automator (1)
Goal: Have an automated test tool.
Roadblocks:
  • Time to train.
  • Only 3 licenses available.
  • Test cases have to be updated continuously.
  • Licenses are obsolete.

Tool: QA Automator (2)
Goal: Run regression tests automatically.
Roadblocks:
  • Tool may not be suitable as a test harness.

Tool: ClearCase
Goal: Have all builds made from ClearCase.
Roadblocks:
  • Lack of knowledge base.
  • Also depends on SW developers.

Tool: Regression Test Machine
Goal: Have a dedicated regression test machine.
Roadblocks:
  • Not budgeted.
  • Continuing system administration support.

Tools: Objectives, Processes, and Metrics

AutoPlan

  • Objective: Have all QA Projects scheduled in AutoPlan
  • Process:
    • The project managers will enter their projects into AutoPlan.
    • The QA Manager will enter internal QA projects into AutoPlan.
  • Metrics: Number of projects tracked by AutoPlan.

AutoTeam

  • Objective: Have each team member’s time tracked in AutoTeam
  • Process:
    • Set up QA resources in the database.
    • Schedule QA infrastructure project in AutoPlan.
    • Get AutoTeam installed on each team member’s PC.
    • Train team members on using AutoTeam.
    • Conduct periodic audits to ensure that team members are using the tool.
  • Metrics: Percentage of hours tracked in AutoTeam.

DP Umbrella

  • Objective: Have an operational defect tracking system
  • Process:
    • Install DP Umbrella.
    • Decide what information we wish to track with each defect.
    • Train the team on DPU use.
    • Develop Crystal Reports to report these bugs.
  • Metrics: Number of defects tracked.
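
One way to settle "what information we wish to track with each defect" before configuring DP Umbrella is to sketch the defect record first. The field names below are illustrative assumptions for discussion, not DPU's actual schema:

```python
# Illustrative defect record for deciding what DP Umbrella should track.
# Field names are assumptions for discussion, not DPU's real schema.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Defect:
    defect_id: int
    summary: str
    product: str            # which product or executable the bug is in
    severity: str           # e.g. "critical", "major", "minor"
    status: str = "open"    # open -> assigned -> fixed -> verified -> closed
    reported_by: str = ""
    reported_on: date = field(default_factory=date.today)
    assigned_to: str = ""

# The "number of defects tracked" metric is then just a count:
defects = [
    Defect(1, "Crash on save", "Billing", "critical"),
    Defect(2, "Typo in menu", "Billing", "minor", status="closed"),
]
print(len(defects))                                   # defects tracked
print(sum(1 for d in defects if d.status == "open"))  # still open
```

Agreeing on these fields up front also determines which Crystal Reports are possible later: a report can only group or filter on data the record actually captures.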

Battlemap

  • Objective: Have 100% of code paths tested.
  • Process:
    • Get Battlemap up and running.
    • Attend formal training to learn the application.
    • Train staff on the application.
  • Metrics:
    • Number of executables analyzed by Battlemap.
    • Percentage of code paths tested.
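
The two metrics above roll up from the same per-executable data: each analyzed executable contributes to the count, and its path totals feed the overall coverage percentage. A minimal sketch with made-up numbers (Battlemap would supply the real per-executable figures):

```python
# Roll-up of the two Battlemap metrics from per-executable results.
# Executable names and path counts below are made-up examples.
coverage = {
    # executable: (paths tested, total code paths)
    "billing.exe": (120, 150),
    "report.exe":  (48, 90),
}

analyzed = len(coverage)
tested = sum(t for t, _ in coverage.values())
total = sum(n for _, n in coverage.values())
pct = 100.0 * tested / total if total else 0.0

print(analyzed)       # executables analyzed by Battlemap
print(round(pct, 1))  # percentage of code paths tested
```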

ACE Insight

  • Objective: Identify where each data element is being used.
  • Process:
    • Install ACE Insight.
    • Train selected staff members on its use.
    • Interview development team leads.
  • Metrics: Percentage of data elements identified.

Softguide

  • Objective: Evaluate a tool for measuring customer satisfaction.
  • Process: Work with Cherie to see what this tool can deliver.
  • Metrics: Recommendation for use of the tool.

QA Automator (1)

  • Objective: Have an operational automated test tool.
  • Process:
    • Install QA Automator.
    • Train staff.
  • Metrics:
    • Percentage of forms automated.
    • Percentage of fields automated.

QA Automator (2)

  • Objective: Run regression testing automatically.
  • Process: Evaluate QA Automator as a possible test harness.
  • Metrics:
    • Number of regression tests run.
    • Research and recommendations for another tool if required.
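
If QA Automator proves unsuitable, the harness role itself is small: run each regression test, record pass/fail from its exit status, and report the totals. A tool-agnostic sketch, where the test names and commands are placeholders rather than any real product's tests:

```python
# Minimal regression harness sketch: run each test command, record
# pass/fail from its exit status, and report the totals. Test names
# and commands are placeholders, not any real product's tests.
import subprocess
import sys

TESTS = {
    "smoke":        [sys.executable, "-c", "print('ok')"],
    "always_fails": [sys.executable, "-c", "raise SystemExit(1)"],
}

def run_regression(tests):
    """Run every test command; a zero exit status counts as a pass."""
    results = {}
    for name, cmd in tests.items():
        proc = subprocess.run(cmd, capture_output=True)
        results[name] = (proc.returncode == 0)
    return results

results = run_regression(TESTS)
passed = sum(results.values())
print(f"{passed}/{len(results)} regression tests passed")
```

The "number of regression tests run" metric falls out of the same loop, and swapping the placeholder commands for real test invocations is the only product-specific work.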

ClearCase

  • Objective: Test only code that has been through configuration management.
  • Process:
    • Participate with development on drafting the configuration management plan.
    • Learn how to extract builds from ClearCase.
    • Train the staff.
  • Metrics: Number of conforming builds.

Regression Test Machine

  • Objective: Have a dedicated regression test machine.
  • Process:
    • Identify the type of machine needed and software.
    • “Scrounge” around the organization for spares.
  • Metrics: An operational regression test machine.
