
Development of Professional Certification

for

Community Resource Specialist – Aging/Disabilities (CRS-A/D)

The Alliance of Information and Referral Systems (AIRS) is a nonprofit association of more than 1,000 organizations that provide I&R across North America. In addition to the creation of international standards, AIRS offers a professional credentialing program for individuals working within the I&R sector of human services.

AIRS Certification

Certification is a measurement of documented knowledge in the field of I&R reflecting specific competencies and related performance criteria, which describe the knowledge, skills, attitudes and work-related behaviors needed by I&R practitioners to successfully execute their responsibilities.

The AIRS Certification Program is operated in alignment with national standards for credentialing organizations. There are currently more than 4,400 AIRS Certified practitioners in the United States and Canada.

Candidates may only apply to take the certification examination after establishing eligibility, which is based on a combination of professional experience and educational background (for example, at least one year of I&R employment for applicants with a Bachelor's degree or higher).

The examinations are administered by approved proctors. Once secured, AIRS Certification must be renewed every two years through an application requiring proof of continuing professional development.

AIRS Certification provides value to individuals, organizations and communities.

From Aging to Aging/Disabilities

In 2015, AIRS, in partnership with the National Association of States United for Aging and Disabilities (NASUAD) and the National Association of Area Agencies on Aging (n4a), conducted a sectoral analysis. As a result, the credential, and the knowledge needed to obtain it, was expanded to cover ‘Aging and Disabilities’ rather than ‘Aging’ alone.

From I&R Specialist to Community Resource Specialist

Professional qualifications need to keep pace with the work they represent, and this has led to another major enhancement. Again, AIRS worked with its national partners, and a survey of more than 500 certification holders revealed the following:

|Does your I&R/A work involve moving between different roles? (select all that apply): |

|Answer Choices |Responses (%) |Responses (count) | |

|Yes, I engage in service coordination |54% |279 | |

|Yes, I work with the client and family at length to fully determine their needs |58% |300 | |

|Yes, I engage in person-centered decision-support |67% |346 | |

|Yes, I help set up their assessment appointments |36% |186 | |

|Yes, I assist clients to complete applications and forms |60% |309 | |

|Yes, I engage in case management |31% |162 | |

|No, I am focused solely on I&R/A work |15% |75 | |

| |Answered |516 | |

The bottom line was that only 15% of our certification holders were engaged solely in I&R/A work. The remainder were extending their expertise across a number of additional areas of responsibility.

Clearly, this required more exploration, and in 2018 we launched our regular psychometric process to create new exams with this consideration paramount.

The process stretched over several months and contained nine distinct stages (summarized below and described in more detail within the document).

Stage 1: Job Task Analysis

Stage 2: Job Task Analysis Validation

Stage 3: Assessment of Existing Questions

Stage 4: New Question Development

Stage 5: Question Review

Stage 6: Cut Score Development

Stage 7: Exam Creation

Stage 8: Exam Review

Stage 9: Final Exam Simulation

Volunteer Subject Matter Experts (SMEs)

Central to the process was the participation of nearly 50 volunteers drawn from a wide range of aging and disabilities organizations and responsibilities (from directors to front-line staff) from 25 states who served as subject matter experts.

Each stage required the work of various volunteer groups chosen to reflect the diversity of our sector: the type of work performed, the geographic region, the nature of the agencies (small and large, urban and rural), and the individuals themselves (from 2 to 20 years of I&R experience, and different levels of education and cultural backgrounds). Most groups also included members for whom English is a second language.

AIRS would like to thank all of the following individuals who volunteered for this challenging and highly confidential process, and their organizations for allowing their participation.


|Name of subject matter expert |Organization |City |State |

|Amber Chapin |Aging and Disability Resource Center of the Wolf River Region |Shawano |WI |

|Angel Eakins |SeniorAge Area Agency on Aging |Springfield |MO |

|AnnMarie Abbott |Olympic Area Agency on Aging |Aberdeen |WA |

|Becky Romshek |Aging Partners |Lincoln |NE |

|Bill Weathers |River Valley Area Agency on Aging |Columbus |GA |

|Charles Palmer |Northeast Georgia AAA |Athens |GA |

|Chelsea Crittle |Central MS Planning and Development |Jackson |MS |

|Christi Rocha |Leyden Family Service |Franklin Park |IL |

|Cindy Pischke |ADRC of Winnebago County |Oshkosh |WI |

|Crystal Shafiabady |Sourcewise |San Jose |CA |

|David Aldrich |Olympic Area Agency on Aging |Aberdeen |WA |

|Deborah Danner-Gulley |Area Agency on Aging, District 7 |Wheelersburg |OH |

|Debra Morgan |Valley Area Agency on Aging |Flint |MI |

|Diana G Barnard |Oregon Medicare Savings Connect-Multnomah County |Portland |OR |

|Elizabeth Brown |Sourcewise |San Jose |CA |

|Eram Abbasi  |Maryland Department of Aging |Baltimore |MD |

|Eric Rymer |Allegheny County Area Agency on Aging |Pittsburgh |PA |

|Gina Fox |Area 12 Agency on Aging |Sonora |CA |

|Holly Anderson |Deep East Texas COG |Jasper |TX |

|Jalawnda Bailey |Lt. Governor's Office on Aging |Columbia |SC |

|Joyce Cameron |ServiceLink - Belknap County |Laconia |NH |

|K Alison Hammond |Prince William Area Agency on Aging |Woodbridge |VA |

|Kimberly Rivard |Northeast Kingdom Council on Aging |Saint Johnsbury |VT |

|Laverdia McCullough |Tennessee Commission on Aging and Disability |Nashville |TN |

|Lynda Southard |Cajun Area Agency on Aging |Lafayette |LA |

|Mary Bengston |Kenosha ADRC |Kenosha |WI |

|Melissa Ladd Patnode |Central Michigan 2-1-1 |Jackson |MI |

|Mitchell Forrest |Central Illinois Agency on Aging, Inc. |Peoria |IL |

|Nadine Autry |The Freedom Center, Inc. |Frederick |MD |

|Nanette Relave |NASUAD |Washington |DC |

|Olivia Harvey |Mountain Empire Older Citizens Inc |Big Stone Gap |VA |

|Patricia E. Correa |Christopher & Dana Reeve Foundation |Short Hills |NJ |

|Patti Mueller |Waukesha County Aging and Disability Resource Center |Waukesha |WI |

|Rachel J Anderson |Garfield County Aging and Disability Resource Center |Pomeroy |WA |

|Rachel Kaehny |ADRC of Washington County |West Bend |WI |

|Rebecca Rostron |ServiceLink Resource Center |Claremont |NH |

|Robyn Kistler |Care Connection for Aging Services |Salisbury |MO |

|Sally Williams |Macon County Health Department |Decatur |IL |

|Samantha Gardner |NASUAD |Washington |DC |

|Sarah Milhouse |Multnomah County ADRC |Portland |OR |

|Sharon Williamson |Department of BHDD |Savannah |GA |

|Shay Brandon |CICOA Aging & In-Home Solutions |Indianapolis |IN |

|Tasha Jones |Connections Area Agency on Aging |Council Bluffs |IA |

|Terri Free |North Central Flint Hills Area Agency on Aging |Salina |KS |

|Wenda Black |Harvey County Dept on Aging |Newton |KS |

|Youa Xiong |ADRC-CW |Wausau |WI |

Stage 1: Job Task Analysis

The first and most crucial stage involves the development of a Job Task Analysis (JTA). In order to create a testing instrument (i.e. an exam), you must know what needs to be tested. What does an I&R Specialist working in the Aging and/or Disabilities sectors actually do? What skills and knowledge are needed to carry out those tasks competently? Which parts of the job are the most important?

The additional challenge of the JTA was to ensure that the skills and knowledge identified as necessary to handle the focused work of Area Agencies on Aging (AAAs) and Aging and Disability Resource Centers (ADRCs) reflected the changing nature of their profession.

Our guides through this and all of the subsequent stages were AIRS’ certification consultants, Michael Hamm and Dr. Gerald Rosen.

Michael Hamm managed the accreditation program for the National Organization for Competency Assurance (NOCA) for eight years and is an American National Standards Institute (ANSI) certified auditor for compliance determination. He is a national authority on certification and accreditation – indeed he has literally written the book on it, authoring Fundamentals of Accreditation.

Our psychometrician, Dr. Gerald Rosen, is a consulting psychologist specializing in the design and administration of testing programs, test validation, statistical analysis, and vocational testing. Dr. Rosen has more than 20 years of experience in testing and measurement, and has produced numerous professional publications on these topics. He has managed the development of many national certification and licensure programs. His background also includes a slice of I&R as he once worked at a crisis center in Pennsylvania.

The draft document established 7 Domains, 19 Tasks, 37 Knowledge Areas and 24 Skill Sets.

The Tasks were weighted in terms of their relative importance to the job and to the client (for example, “Screening/Assessment” was determined to comprise 19% of the overall job).

Stage 2: Job Task Analysis Validation

However, the JTA only represented the combined views of those SMEs who attended the session. Did those conclusions hold true for everyone else?

The draft document was shared with more than 1,600 current CRS-A/D holders, providing them with a chance to make general and specific comments through an online survey.

The results validated the draft and enhanced it through several minor word improvements and subtle adjustments to the weightings of the various tasks. Here are the final approved domains:

|Domain |Subject of Domain |Weight of Domain |

|1 |Rapport |19% |

|2 |Screening/Assessment |20% |

|3 |Identification of Resources and Preferences |19% |

|4 |Information, Assistance, Referral and Advocacy |18% |

|5 |Documentation |11% |

|6 |Follow-up |9% |

|7 |Ethics, Professional, and Legal Issues |4% |

The document was approved by the AIRS Certification Commission. The full JTA report is available separately.

Stage 3: Assessment of Existing Questions

Once we knew what needed to be tested, the questions became: how many of our existing CRS-A/D questions were still relevant, how many new questions would be required, and in what areas?

Another volunteer SME team was assembled.

The challenge for this group was to go through the entire CRS-A/D question database of more than 300 items and decide which questions should be kept and which should be removed. Any retained items had to be assigned to a Domain/Task within the new JTA. At the end of this process, we were able to quantify how many new questions were needed and in which areas (for example, we needed 12 new questions that tested the understanding of documentation).

Only about 60% of the existing questions were deemed still relevant.
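
As a simple illustration of that gap analysis, the sketch below (using invented domain targets and counts, not actual AIRS figures) tallies the retained questions per domain against a target blueprint to show how many new questions each area still needs.

from collections import Counter

# Hypothetical target number of items per domain in the refreshed bank,
# alongside the domains of the questions that survived the review.
targets = {"Rapport": 60, "Screening/Assessment": 65, "Documentation": 35}
retained = ["Rapport"] * 48 + ["Screening/Assessment"] * 50 + ["Documentation"] * 23

retained_counts = Counter(retained)

# Shortfall per domain = target minus what was kept (never below zero).
for domain, target in targets.items():
    needed = max(0, target - retained_counts[domain])
    print(f"{domain}: keep {retained_counts[domain]}, write {needed} new questions")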

Stage 4: New Question Development

We now needed to write over 100 new questions, and we needed subject matter experts to write them.

These volunteers received training on question writing techniques.

This was the most challenging stage. Writing good questions is hard, but what is really hard is writing good ‘wrong’ answers. This involves coming up with three distractors for each question that are plausible without being misleading, while still leaving one answer that is obviously correct (but only obvious to individuals who properly understand the issue).

The group spent two days together engaged in individual and collaborative exercises that facilitated the process of question writing.

The session resulted in a final bank of about 300 new and revised questions. Each question was linked to a specific Domain within the new JTA and had a verifiable source.

Stage 5: Question Review

It was now time for the entire item bank to undergo an extensive review from volunteers who had not been involved in any earlier stage.

As a result, some questions were completely eliminated while the majority benefited from detailed editing. Very few questions emerged unaltered.

Stage 6: Cut Score Development

A cut score (or pass mark) is not a random number. It should represent the percentage of answers that most accurately reflects the ability of an individual who is competent in the issues being tested.

A cut score is never perfect, but it should be the score that eliminates as many false positives and false negatives as possible (that is, it tries to ensure that people who are competent pass, and that people who are not yet competent do not). Within this context, an exam might have a pass mark as high as 95 or as low as 40 if that mark represents the “border line” between someone with the desired amount of understanding and someone who has yet to reach that level.

The AIRS Certification Program, in common with many examinations, uses a methodology known as modified Angoff ratings to determine cut scores. Basically, this involves SMEs assigning a ‘degree of difficulty’ rating to each question in the item bank. Technically, each exam could have a different cut score depending on the difficulty of the questions it contains. However, AIRS applies a mathematical formula to ensure that each exam contains the same balance of difficult and easy questions (that is, each exam has the same cut score).
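
The arithmetic behind an Angoff-style cut score can be sketched as follows. This is a minimal illustration with invented ratings (not actual AIRS data), in which each SME estimates the percentage of just-competent candidates who would answer each item correctly, and the recommended cut score is the average of those estimates.

from statistics import mean

# Hypothetical ratings: rows are SMEs, columns are items (values are percentages
# of "just competent" candidates expected to answer the item correctly).
angoff_ratings = [
    [80, 65, 90, 70, 75],   # SME 1
    [85, 60, 95, 65, 70],   # SME 2
    [75, 70, 85, 75, 80],   # SME 3
]

# Average rating per item: the item's expected difficulty for a borderline candidate.
item_means = [mean(item) for item in zip(*angoff_ratings)]

# The recommended cut score is the mean of the item means, i.e. the expected
# percentage score of a minimally competent candidate on this set of items.
cut_score = mean(item_means)

print(f"Item means: {[round(m, 1) for m in item_means]}")
print(f"Recommended cut score: {cut_score:.1f}%")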

As a natural part of this process, further changes were made to many questions to improve their clarity and some questions were eliminated.

Stage 7: Exam Creation

AIRS now had a database of more than 300 extensively reviewed questions but no actual examinations.

Using mathematical models from the cut score analysis, three new exams were created with an identical (to within two decimal places) balance of difficult/easy questions that accurately reflected the weighting of the JTA (for example, 10% of the questions involved a knowledge of the issues surrounding follow-up). By the end, each new exam had a cut score of 75.
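
As a rough illustration of that balancing act, the sketch below assumes a hypothetical item pool in which each question is tagged with its JTA domain and its average Angoff rating; the build_form function, its field names, and the random-retry approach are illustrative assumptions, not the actual model AIRS used.

import random

# JTA domain weights taken from the validated Job Task Analysis above.
DOMAIN_WEIGHTS = {
    "Rapport": 0.19,
    "Screening/Assessment": 0.20,
    "Identification of Resources and Preferences": 0.19,
    "Information, Assistance, Referral and Advocacy": 0.18,
    "Documentation": 0.11,
    "Follow-up": 0.09,
    "Ethics, Professional, and Legal Issues": 0.04,
}

def build_form(pool, exam_length, target_cut, tolerance=0.5, attempts=1000):
    # pool: list of dicts such as {"id": "Q123", "domain": "Documentation", "angoff": 74.0}.
    # Selects a quota of items per domain in line with the JTA weights, then keeps
    # redrawing until the form's mean Angoff rating is within `tolerance` points of
    # the target cut score. Assumes the pool has enough items in every domain.
    by_domain = {d: [q for q in pool if q["domain"] == d] for d in DOMAIN_WEIGHTS}
    quota = {d: round(w * exam_length) for d, w in DOMAIN_WEIGHTS.items()}  # rounding may shift the total by an item or two

    for _ in range(attempts):
        form = []
        for domain, n in quota.items():
            form.extend(random.sample(by_domain[domain], n))
        mean_rating = sum(q["angoff"] for q in form) / len(form)
        if abs(mean_rating - target_cut) <= tolerance:
            return form
    raise RuntimeError("No draw met the difficulty target; widen the tolerance or adjust the pool.")

In practice the balance would be engineered rather than found by trial and error, but the constraint is the same: the domain quotas follow the JTA weights and the overall difficulty matches the common cut score.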

Stage 8: Exam Review

But there were still two more stages required to ensure that each exam was a fair test in its own right (for example, that an exam did not contain two questions addressing the same issue in similar words … or a question that inadvertently gave away the answer to another).

This process involved yet another opportunity to confirm and further hone item clarity and accuracy, and even at this stage, a handful of questions were removed as the SME team verified that there was one and only one correct answer for each question.

Stage 9: Final Exam Simulation

In the final stage, each of the three new exams underwent a simulation exercise, which involved having experienced CRS-A/D holders who had not been involved in earlier stages take the exam under ‘real’ conditions. This enabled us to analyze the performance of each individual question in terms of whether it was clear to examinees and did not inadvertently lead competent individuals to choose an incorrect answer.

The process also ensured that we could be confident that each of the three exams returned equivalent results. As a bonus, this group of volunteers provided further suggestions and recommendations on the need to change certain questions.
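
As a rough illustration of the kind of item-level statistics such a simulation makes possible, the sketch below uses invented response data to compute the proportion of examinees answering each question correctly on each form and to compare mean scores across forms; the 0.4 flagging threshold is an arbitrary example rather than an AIRS rule.

from statistics import mean

# responses: {form_name: list of examinee answer sheets, each a list of 0/1 flags
# where 1 means that item was answered correctly}. Values below are invented.
responses = {
    "Form A": [[1, 1, 0, 1], [1, 0, 1, 1], [1, 1, 1, 0]],
    "Form B": [[1, 0, 1, 1], [0, 1, 1, 1], [1, 1, 0, 1]],
    "Form C": [[1, 1, 1, 0], [1, 1, 0, 1], [0, 1, 1, 1]],
}

for form, sheets in responses.items():
    # Per-item difficulty: the share of examinees who got each item right.
    item_p_values = [mean(item) for item in zip(*sheets)]
    # Per-examinee percentage scores, and the form's overall mean.
    scores = [100 * mean(sheet) for sheet in sheets]
    flagged = [i for i, p in enumerate(item_p_values) if p < 0.4]
    print(f"{form}: mean score {mean(scores):.1f}%, "
          f"items flagged for review (low p-value): {flagged}")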

Continuous Cycle

The process of reviewing JTAs, creating new questions and refreshing existing exams occurs about every 4 years, takes about 12 months to complete and follows national standards for credentialing organizations.

Appendix

Benefits of AIRS Certification

For the individual, AIRS Certification:

• Adds professional recognition to what you do. It addresses the misconception that I&R people “just answer phones.”

• Provides a transferable qualification. Many job postings state a preference for applicants with AIRS certification. As a consequence, there is much more mobility of I&R staff.

• Can bring a pay increase, as some agencies provide one for Certified staff.

For the agency, AIRS Certification:

• Builds confidence among staff – they believe more in their skills if they have been validated by an external body.

• Enhances agency quality assurance and consistency of service levels within your own I&R.

• Helps funders and other stakeholders understand and appreciate the professionalism involved in I&R. It shows that there is an emphasis on quality, as the competencies of I&R positions have been defined and are being externally tested.

• Certification of the majority of frontline staff is, along with AIRS Accreditation, often one of the criteria for securing and maintaining funding.

• The process of studying forces people to understand the context in which they perform their job and the skills that they need in a more systematic way.

• The alignment of training resources (e.g. ABCs of I&R and online training) with the Standards and the Certification process provides a continual enhancement of service.

For the general public and the human services sector, AIRS Certification:

• Enhances agency quality assurance and consistency of service levels between different I&Rs.

• Improves customer service. Staff are aware of the requirements for quality performance and are more ready and capable of meeting them.
