MANAGING INFORMATION SYSTEMS: A PRACTICAL ASSESSMENT TOOL


Preview Version February 1999

By the Information Technology Resources Board



FOREWORD

The Information Technology Resources Board (ITRB) is pleased to issue Managing Information Systems: A Practical Assessment Tool. This instrument is designed to assist Federal agencies in understanding how to strategically apply information technology to achieve their missions and deliver services and products.

The Assessment Tool contains a broad array of questions in nine areas for evaluating information technology systems: mission and vision, customers, business focus, executive direction, capital planning, project management, performance management, acquisition, and architecture. These questions reflect the ITRB members' extensive on-the-job experience, as well as insights gained from assessments of critical information systems across the Federal government during the past several years.

This is a preview version of the Assessment Tool, which will continue to be enhanced over time. Comments or suggestions for improving it should be sent to:

Ginni Schaeffer
Interagency IT Strategies Division
1800 F Street, N.W., Room 2227
Washington, DC 20405

Additional information on the ITRB is available at

Arnold Bresnick
Chair, Information Technology Resources Board

ACKNOWLEDGEMENTS

Information Technology Resources Board (ITRB)

Arnold Bresnick (Chair), Department of Agriculture
Valerie Wallick (Vice Chair), Department of the Navy
Sandra Borden, United States Coast Guard
Kenneth Byram, Federal Aviation Administration
Kevin Carroll, Department of the Army
Kay Clarey, Department of the Treasury
Wayne Claybaugh, Social Security Administration
Mary Ellen Condon, Department of Justice
Mark Day, Environmental Protection Agency
Joanne Ellis, Department of Agriculture
George Hyder, Office of Personnel Management
Ken Heitkamp, Department of the Air Force
Myron Kemerer, Nuclear Regulatory Commission
Mike Laughon, Department of the Interior
Jean Lilly, Internal Revenue Service
Eric Mandel, Department of Commerce
Emory Miller, General Services Administration

The ITRB wishes to acknowledge the efforts of the management staff who assisted in the development of this tool.

Management Staff

Jake Asma, General Services Administration
Sandra Hense, General Services Administration
Avis Ryan, General Services Administration
Ginni Schaeffer, General Services Administration

Contents

Purpose
Background
Assessment Framework
Who Should Complete the Assessment
Evaluating Results
    Strategy Results
    Leadership Results
    Technology Results
Strategy
    Mission and Vision
    Customers
    Business Focus
Leadership
    Executive Direction
    Capital Planning
    Project Management
    Performance Management
Technology
    Acquisition
    Architecture

Purpose

This Assessment Tool, which provides an evaluative framework and a checklist, has been developed by the Information Technology Resources Board (ITRB) to assist Federal government organizations in gaining a better understanding of how to strategically apply information technology to achieve their missions and deliver services and products. Mission performance and service delivery are top concerns for Federal agencies as compliance with the Clinger-Cohen and Government Performance and Results Acts requires a strong emphasis on performance improvement, accountability for achieving results, and cost efficiency. Technology is an essential enabler for these improvements.

This is an experience-based instrument that incorporates key issues and experiences culled from ITRB assessments of major information systems and related processes. Designed for executives and other senior-level members of agencies who rely on a specific IT system or an entire IT infrastructure to deliver services and products, this tool helps agency management understand how well they are leveraging technology to improve mission accomplishment. The assessment can be used as an effective way to gauge perceptions of how well IT systems aid business performance.

The assessment tool was developed to focus on specific areas in need of attention, and it provides value to respondents both when they read it and when they complete it. The assessment presents respondents with broad guidance with which to evaluate an organization from strategic, leadership, and technology perspectives. Completing an assessment will focus an organization on the key issues identified through the ITRB's experience.

This preview version of the Assessment Tool highlights the key issues identified through the ITRB's experience. Use of the instrument and feedback will further our collective knowledge in these areas.

Background

The ITRB was formalized in July 1996 by Executive Order 13011 to provide peer assessments of mission-critical information systems by identifying critical issues and findings, framing them in management and technical perspectives, and recommending actions to mitigate risks or resolve issues. ITRB members are experienced practitioners from across the Federal government who bring broad program, technical, and acquisition management expertise to managing and developing major IT systems. The ITRB's activities promote measurable improvements in mission performance and service delivery through the strategic application of information technology.


Assessment Framework

The assessment framework includes three broad perspectives: strategy, leadership, and technology. In the strategy section, respondents consider where an organization is going and answer questions about mission, vision, customers, and business focus. In the leadership section, respondents consider how an organization is guided and answer questions about executive direction, capital planning, project management, and performance management. In the technology section, respondents assess how well an organization's systems support performance and answer questions about acquisition and architecture. The framework is outlined below; an illustrative sketch for agencies that tabulate results electronically follows the outline.

I. Strategy

Questions about where the organization is going.

• Mission and Vision
• Customers
• Business Focus

II. Leadership

Questions about how the organization is guided.

• Executive Direction
• Capital Planning
• Project Management
• Performance Management

III. Technology

Questions about the organization's systems.

• Acquisition
• Architecture
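For agencies that choose to tabulate completed assessments electronically, the nine areas group naturally under the three perspectives. The short Python sketch below is purely illustrative and is not part of the ITRB instrument; the structure and names in it are hypothetical, offered only as one possible encoding of the framework above.

# Illustrative sketch only: a hypothetical encoding of the assessment
# framework for electronic tabulation; not part of the ITRB instrument.
ASSESSMENT_FRAMEWORK = {
    "Strategy": ["Mission and Vision", "Customers", "Business Focus"],
    "Leadership": ["Executive Direction", "Capital Planning",
                   "Project Management", "Performance Management"],
    "Technology": ["Acquisition", "Architecture"],
}

# List each perspective with its assessment areas.
for perspective, areas in ASSESSMENT_FRAMEWORK.items():
    print(f"{perspective}: {', '.join(areas)}")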

Who Should Complete the Assessment

This assessment should be completed by Federal government executives, senior program and project managers, project management team members, support contractor managers, support contractor team members, and customers (which may include internal and external customers, end-users, and stakeholders) who rely on Federal IT systems. If a question does not apply or the respondent does not know the answer, the question should be skipped.

DEFINITIONS

Customer: A direct or indirect beneficiary of a technology related process or program. May include internal and external customers, end-users, and stakeholders.

Top Executive: A member of executive management of a Federal agency (political or career).

Senior Manager: An official with general responsibility for a number of Federal programs or projects.

Program/Project Manager: One who is directly responsible for acquisition, service delivery, or operations.

Project Team Member: One who participates directly in the design, implementation, or maintenance of a Federal program or system.

Support Contractor Manager: One who leads a team of contractors responsible for acquisition, service delivery, or operations.

Support Contractor Team Member: A member of a contractor team who participates directly in the design, implementation, or maintenance of a Federal program or system.


Respondents should answer questions for their selected role only. It is also important that for a specific assessment, all respondents answer their questions within the same context, either:

• A specific, identified IT system, or
• The general IT infrastructure of an organization.

At various points in the assessment tool, respondents are prompted to answer certain questions only if they influence key decisions about the use of technology in their selected respondent role. Examples of such decisions include participation in:

• Capital investment decisions about technology
• Decisions about the acquisition strategy for technology
• Decisions about whether a new technology is compliant with the agency architecture.

To increase candor, it is recommended that responses remain confidential.

Evaluating Results

Collecting, analyzing, and comparing results among respondents within an organization is highly recommended. The greatest value comes from comparing responses across all respondent roles to obtain a clear picture of trends, patterns, and common themes. This comparison provides useful insight into problem areas.

Because each agency is different, there is no single approach to ensure improvement. Issues highlighted in the assessment offer insight into the crucial steps that executives, managers, and organizations need to take to effectively address problem areas.

To understand the organizational impact of the responses, review the answers marked YES and NO on the completed assessment for each category within the three broad perspectives: strategy, leadership, and technology.
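As a hedged illustration of this review step, the Python sketch below tallies YES and NO answers by perspective and by respondent role so that clusters of NO responses stand out. The record format, sample data, and variable names are hypothetical and not part of the instrument; they are one possible way to carry out the comparison described above.

from collections import Counter, defaultdict

# Hypothetical completed-assessment records:
# (respondent role, perspective, area, answer).
# Skipped questions are simply absent from the list.
responses = [
    ("Senior Manager", "Strategy", "Customers", "YES"),
    ("Project Team Member", "Strategy", "Customers", "NO"),
    ("Senior Manager", "Leadership", "Capital Planning", "NO"),
    ("Support Contractor Manager", "Technology", "Acquisition", "YES"),
]

# Tally answers per perspective, and separately per role, so that
# patterns and divergence between roles become visible.
by_perspective = defaultdict(Counter)
by_role = defaultdict(Counter)
for role, perspective, area, answer in responses:
    by_perspective[perspective][answer] += 1
    by_role[role][answer] += 1

for perspective, counts in sorted(by_perspective.items()):
    total = counts["YES"] + counts["NO"]
    print(f"{perspective}: {counts['YES']} of {total} answers were YES")

for role, counts in sorted(by_role.items()):
    print(f"{role}: {counts['NO']} NO answers")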

Strategy Results

If most of the responses were YES in the strategy section, this indicates that the organization has a solid grasp of how IT systems link to the agency's overall business objectives as articulated in the mission, vision, and respective management plans. This would also indicate that customers (which may include internal and external customers, end-users, and stakeholders) are well integrated into the process of determining the agency's strategic direction.

Carefully examine the topic areas where responses were NO to find those that will require additional attention. If most responses were NO, then more attention should focus on clarifying the agency mission and vision. Examine the customer communications and feedback processes to strengthen their role in mission accomplishment, and further clarify business objectives and priorities.


Leadership Results

If most of the responses were YES in the leadership section, this indicates that the organization is being guided and managed effectively in the critical areas identified. Leaders of the agency are integrating its business priorities through established processes for managing technology and technology performance.

Carefully examine the topic areas where responses were NO to find those that will require further attention. If most responses were NO, then more attention should be focused on whether agency leaders are managing the business effectiveness of information technology. Focus on capital planning and reevaluate processes for making significant technology investments that support mission achievement.

Project management and performance management processes should be examined. Agency leadership should identify issues with existing or developing performance measures to strengthen performance management systems. Agency leadership should also identify issues around how projects are chosen, whether they support the agency mission and performance goals, and whether the organization's leaders are significantly involved in choosing and monitoring projects.

Technology Results

If most of the responses were YES in the technology section, this indicates that the agency is effectively integrating its business processes and goals with its information technology architecture. In addition, acquisition and contract administration processes are playing an effective role in acquiring and managing IT systems.

Carefully examine the topic areas where responses were NO to find those that will require additional attention. If most responses were NO, take a closer look at acquisition processes to gauge whether the agency's vision is reflected in clearly defined requirements and deliverables. Examine the agency's IT architecture to discern how well it focuses on work processes, information flows, and technical standards to provide services that achieve the agency's business objectives.

