UNCLASSIFIED

Defense Civilian Intelligence Personnel System (DCIPS): Evaluation of Army Employee Performance Plans

Technical Report #769

Prepared by: Kara R. Jeansonne, Kevin G. Smith, and Meredith L. Ferro

January 29, 2013

PDRI, An SHL Company
3000 Wilson Boulevard, Suite 250
Arlington, VA 22201
voice: (703) 276-4680
fax: (703) 276-7567
e-mail: info@


Table of Contents

Table of Contents ........................................................... 1
List of Tables .............................................................. 2
Introduction ................................................................ 3
Evaluation Methodology ...................................................... 5
    Sampling ................................................................ 5
    Evaluation Procedure .................................................... 7
Results ..................................................................... 9
    Similarity of Objectives ............................................... 13
    Progression of Objective Difficulty .................................... 13
    Leadership Objectives .................................................. 13
    Recurring versus Non-Recurring Objectives .............................. 14
    Narrative Justifications ............................................... 15
    Qualitative Themes ..................................................... 19
Conclusions ................................................................ 20
    Research Questions ..................................................... 20
Limitations ................................................................ 22
References ................................................................. 23
Appendix A: Evaluation Criteria ............................................ 24
Appendix B: Performance Rating Descriptors ................................. 31


List of Tables

Table 1: Performance Plan Sample Characteristics ............................ 5
Table 2: Specific Element of SMART Framework ............................... 10
Table 3: Measurable Element of SMART Framework ............................. 10
Table 4: Achievable Element of SMART Framework ............................. 11
Table 5: Job Relevance Element of SMART Framework .......................... 11
Table 6: Organizational Relevance Element of SMART Framework ............... 12
Table 7: Time-bound Element of SMART Framework ............................. 12
Table 8: Recurring versus Non-Recurring Objectives ......................... 14
Table 9: Adequacy of Information Provided in Objective Self-Assessments .... 15
Table 10: Adequacy of Information Provided in Objective Rating Official Assessments ... 16
Table 11: Adequacy of Information Provided in Performance Element Self-Assessments ... 18
Table 12: Adequacy of Information Provided in Performance Element Rating Official Assessments ... 18


Introduction

Given the significance of the information provided in the performance plans and appraisals for civilian employees covered under the Defense Civilian Intelligence Personnel System (DCIPS), the Under Secretary of Defense for Intelligence (USD(I)) and the Office of the Director of National Intelligence (ODNI) have jointly pursued rigorous evaluations of DCIPS implementation across the Defense Intelligence Enterprise. The purpose of this evaluation is to determine the extent to which performance plans and appraisals from the Army were prepared according to relevant performance management guidance and aligned with training.

For employees covered under DCIPS, performance plans and appraisals are used to: (a) document the key outcomes/results an employee is expected to achieve during the period of performance, (b) provide ratings on the extent to which the employee accomplished these outcomes and on the manner in which work is performed (as defined by common performance elements), and (c) provide narratives that justify these ratings.

Employees and supervisors mutually agree on an employee's performance objectives at the start of the period of performance. These objectives are then translated into meaningful projects or day-to-day activities. During the period of performance, at least one mid-term review of employee performance relative to the performance objectives and performance elements is conducted. At the end of the period of performance, the employee completes a narrative self-assessment of the work conducted over the year. The rater then completes a final assessment, in which he or she reviews the employee's self-assessment, evaluates the employee's performance relative to the objectives and elements, provides a numeric rating for the objectives and elements, and completes a narrative supporting the rating of record. A higher-level review is then conducted. Lastly, the ratings provided in the performance appraisal are used in the pay pool panel process to determine final ratings and performance-based bonus payouts.1

The content included in the performance plans and appraisals as well as the processes for their development and use were designed to support DCIPS' stated objectives:

1. Ensure alignment between individual performance objectives and the higher-level mission and objectives of the Intelligence Community (IC).

2. Ensure ongoing feedback between employees and supervisors regarding progress toward objectives and relative to standard behavioral elements.

3. Provide a basis for measuring and assigning accountability for individual and organizational performance for the accomplishment of these objectives.

4. Provide a fair and equitable process for appraising and evaluating employee performance within and across IC elements.

5. Maintain adherence to merit system principles.2

1 Except for NGA, where the ratings still inform performance-based raises.
2 Department of Defense. (2010). DoD Civilian Personnel Management System: Defense Civilian Intelligence Personnel System (DCIPS) Performance Management (1400.25-V2011).


The overall purpose of the evaluation study was to determine the extent to which Fiscal Year 2011 (FY11) Army performance plans and appraisals were prepared according to established performance management guidance and aligned with training. The research questions that guided the evaluation were:

To what extent do the objectives adhere to the "SMART+" framework (i.e., specific, measurable, achievable, relevant, time-specific, can be exceeded)?

To what degree do the objectives align with the intelligence mission, goals, and objectives of the Army?

To what extent are the objectives consistent by occupation and level (i.e., are employees in similar jobs and at similar levels held to the same standards)?

To what extent do objectives represent long-term outcomes versus recurring activities related to day-to-day work (i.e., recurring vs. non-recurring objectives)?

To what extent is the information provided in the self-assessment narratives adequate to support the performance ratings?

To what extent is the information provided in the raters' appraisal narratives adequate to support the performance ratings?

The remaining sections of this report detail the evaluation methodology, the results and conclusions of this study, and recommendations for evolving performance management at Army.


Evaluation Methodology

This section describes the process by which the samples of performance plans and accompanying appraisals were drawn and the procedures followed to evaluate them.

Sampling

PDRI, USD(I), and Army agreed upon a desired sample size of 300 plans and appraisals for the current study. This determination was made considering a variety of factors, including the size of the organization, the cost associated with reviewing the plans and appraisals, comparability to other evaluation studies, and the extent to which generalizable conclusions could be inferred from the number of plans reviewed. For the current study, the sampling strategy was to achieve representativeness such that the conclusions drawn would reflect the agency as a whole, rather than specific subpopulations (e.g., only Analysts). To meet these goals, PDRI requested that Army draw a stratified random sample of plans and appraisals using several important background variables (e.g., occupation, gender).
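The stratified sampling approach described above can be sketched in a few lines of code. This is an illustrative sketch only: the report does not specify the exact allocation rule used, so proportional allocation (each stratum contributes in proportion to its share of the population) is assumed here, and the function and field names are hypothetical.

```python
import random
from collections import defaultdict

def stratified_sample(records, strata_keys, total_n, seed=None):
    """Draw a stratified random sample of roughly `total_n` records.

    Records are grouped by the combination of background variables named
    in `strata_keys` (e.g., occupation, gender); each stratum contributes
    in proportion to its share of the population (proportional allocation,
    rounded to the nearest whole record).
    """
    rng = random.Random(seed)

    # Group the population into strata keyed by the background variables.
    strata = defaultdict(list)
    for rec in records:
        strata[tuple(rec[k] for k in strata_keys)].append(rec)

    sample = []
    for members in strata.values():
        # Proportional allocation for this stratum.
        k = round(total_n * len(members) / len(records))
        sample.extend(rng.sample(members, min(k, len(members))))
    return sample
```

For example, drawing 30 records from a population of 200 Analysts and 100 IT specialists, stratified by occupation, yields 20 Analysts and 10 IT specialists.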

PDRI raters reviewed 300 plans and appraisals covering the FY11 administration cycle. The characteristics of the sample evaluated are presented in Table 1.

Table 1: Performance Plan Sample Characteristics

                                                               n    Percent of Sample
Work Category
  Technician/Administrative Support                           25     8.3%
  Professional                                               228    76.0%
  Supervision/Management                                      47    15.7%
Work Level
  Entry                                                       13     4.3%
  Full Performance                                           169    56.3%
  Senior                                                     112    37.3%
  Expert                                                       6     2.0%
Pay Band
  1                                                           13     4.3%
  2                                                           18     6.0%
  3                                                          154    51.3%
  4                                                          110    36.7%
  5                                                            5     1.7%
Occupational Series
  80 - Security Administration Series                         57    19.0%
  83 - Police Series                                           1     0.3%
  85 - Security Guard Series                                   1     0.3%
  86 - Security Clerical and Assistant                        15     5.0%
  101 - Social Sciences Series                                 7     2.3%
  132 - Intelligence Series                                  110    36.7%
  134 - Intelligence Aid and Clerk Series                      2     0.7%
  201 - Human Resources Series                                 2     0.7%
  301 - Miscellaneous Administration and Program Series        6     2.0%
  303 - Miscellaneous Clerk and Assistant Series               1     0.3%
  318 - Secretary Series                                       2     0.7%
  340 - Program Management Series                              1     0.3%
  341 - Administrative Officer Series                          1     0.3%
  343 - Management and Program Analysis Series                12     4.0%
  346 - Logistics Management Series                            1     0.3%
  391 - Telecommunication Series                               1     0.3%
  501 - Financial Administration Series                        1     0.3%
  544 - Civilian Pay Series                                    1     0.3%
  560 - Budget Analysis Series                                 3     1.0%
  802 - Engineering Technical Series                           1     0.3%
  810 - Civil Engineering Series                               1     0.3%
  855 - Electronics Engineering Series                         6     2.0%
  905 - General Attorney Series                                1     0.3%
  1040 - Language Specialist Series                            2     0.7%
  1071 - Audiovisual Production Series                         1     0.3%
  1084 - Visual Information Series                             1     0.3%
  1102 - Contracting Series                                    3     1.0%
  1301 - General Physical Science Series                       2     0.7%
  1310 - Physics Series                                        1     0.3%
  1320 - Chemistry Series                                      2     0.7%
  1410 - Librarian Series                                      1     0.3%
  1411 - Librarian Technician Series                           1     0.3%
  1412 - Technical Information Services Series                 2     0.7%
  1603 - Equipment, Facilities, and Services Assistance Series 1     0.3%
  1670 - Equipment Services Series                             1     0.3%
  1701 - General Education and Training Series                 1     0.3%
  1712 - Training Instruction Series                          20     6.7%
  1750 - Instructional Systems Series                          2     0.7%
  2001 - General Supply Series                                 2     0.7%
  2003 - Supply Program Management Series                      1     0.3%
  2005 - Supply Clerical and Technician Series                 1     0.3%
  2010 - Inventory Management Series                           1     0.3%
  2210 - Information Technology Series                        20     6.7%
Sex
  Male                                                       195    65.0%
  Female                                                     105    35.0%
Race/Ethnicity
  American Indian or Alaskan Native                            0     0.0%
  Asian                                                        6     2.0%
  Black                                                       47    15.7%
  Hispanic/Latino                                              6     2.0%
  Native Hawaiian or Other Pacific Islander                    1     0.3%
  Two or More Races                                           11     3.7%
  White                                                      229    76.3%

Evaluation Procedure

The first step in the evaluation process was to finalize the set of evaluation criteria against which the effectiveness and adequacy of the performance plans and appraisals would be assessed. These criteria were developed to answer the pertinent research questions presented earlier in the report and were leveraged in previous evaluation studies of similar systems (i.e., the National Security Personnel System, NSPS). The current criteria are very similar to the criteria used during reviews at the Defense Intelligence Agency (DIA), Naval Intelligence, National Geospatial-Intelligence Agency (NGA), National Security Agency (NSA), USD(I), and the National Intelligence Civilian Compensation Program (NICCP) implementation at ODNI. The final criteria are presented in Appendix A of this report.

The study team was composed of six Industrial/Organizational Psychologists who, on average, had approximately four years of experience. After the evaluation criteria were finalized, the study team members were trained on assessing the plans using the established criteria to ensure there was consistency in team members' rating approach. Training began with a review of background information on Army performance management and discussion of the rating criteria, research questions, and evaluation methodology. Then, each assessor independently evaluated the same subset of performance plans using the evaluation criteria. The assessors then met as a group, compared ratings, discussed any discrepancies, and came to agreement on a final rating for each performance plan.
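One simple way to check whether the calibration training described above produced consistency across assessors is a pairwise percent-agreement statistic over the shared subset of plans. The report describes consensus discussion rather than a specific agreement index, so the following is only an illustrative sketch; the function name and data layout are hypothetical.

```python
from itertools import combinations

def pairwise_agreement(ratings):
    """Exact pairwise percent agreement across assessors.

    `ratings` maps assessor name -> list of ratings for the same ordered
    set of performance plans. Returns the proportion of
    (assessor pair, plan) comparisons whose ratings match exactly.
    """
    matches = total = 0
    # Compare every pair of assessors, plan by plan.
    for a, b in combinations(ratings.values(), 2):
        for x, y in zip(a, b):
            total += 1
            matches += (x == y)
    return matches / total
```

For instance, three assessors who each rate the same three plans and disagree on two of the nine pairwise comparisons would score 7/9, or about 78% agreement.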

