Pediatrician Perspectives on Feasibility and Acceptability of the MOCA-Peds 2017 Pilot
Laurel K. Leslie, MD, MPH,a,b Adam L. Turner, MPH,a Amanda C. Smith, MA,c Victoria Dounoucos, PhD,c Murrey G. Olmsted, PhD,c Linda Althouse, PhDa
BACKGROUND AND OBJECTIVES: The American Board of Pediatrics (ABP) certifies that general and subspecialty pediatricians meet standards of excellence established by their peers, immediately after training and over the course of their careers (ie, Maintenance of Certification [MOC]). In 2015–2016, the ABP developed the Maintenance of Certification Assessment for Pediatrics (MOCA-Peds) as an alternative assessment to the current proctored, closed-book general pediatrics (GP) MOC examination. This article is 1 of a 2-part series examining results from the MOCA-Peds pilot in 2017.
METHODS: We conducted quantitative and qualitative analyses with 5081 eligible pediatricians who registered to participate in the 2017 pilot; 83.4% (n = 4238) completed a quarter 4 survey and/or end-of-year survey (January 2018) and compose the analytic sample.
RESULTS: The majority of pediatricians considered the MOCA-Peds to be feasible and acceptable as an alternative to the proctored MOC GP examination. More than 90% of respondents indicated they would participate in the proposed MOCA-Peds model instead of the examination. Participants also offered recommendations to improve the MOCA-Peds (eg, enhanced focus of questions on outpatient GP, references provided before taking questions); the ABP is carefully considering these as the MOCA-Peds is further refined.
CONCLUSIONS: Pilot participant feedback in 2017 suggested that the MOCA-Peds could be implemented for GP starting in January 2019, with all 15 subspecialties launched by 2022. Current and future evaluations will continue to explore feasibility, acceptability, and learning and practice change as well as sustainability of participation.
aThe American Board of Pediatrics, Chapel Hill, North Carolina; bTufts University School of Medicine, Boston, Massachusetts; and cRTI International, Durham, North Carolina
Drs Leslie and Olmsted, Mr Turner, and Ms Smith contributed to the conception, design, and evaluation of the Maintenance of Certification Assessment for Pediatrics pilot and its formative evaluation, contributed to the acquisition, analysis, and interpretation of the data, and drafted the initial manuscript; Dr Dounoucos contributed to the analyses and interpretation of the qualitative data and drafted the initial manuscript; Dr Althouse contributed to the conception, design, and evaluation of the Maintenance of Certification Assessment for Pediatrics pilot and its formative evaluation; and all authors reviewed and revised the manuscript, approved the final manuscript as submitted, and agree to be accountable for all aspects of the work.
DOI:
Accepted for publication Sep 5, 2019
Address correspondence to Laurel K. Leslie, MD, MPH, The American Board of Pediatrics, 111 Silver Cedar Court, Chapel Hill, NC 27514. E-mail: lleslie@
WHAT'S KNOWN ON THIS SUBJECT: The Maintenance of Certification Assessment for Pediatrics (MOCA-Peds) is a Web-based knowledge assessment developed by The American Board of Pediatrics as an alternative to an examination at a secure test center. Both the model and Web-based platform were developed with extensive input from practicing pediatricians in 2015–2016.
WHAT THIS STUDY ADDS: This study is 1 of 2 articles in which authors report results from the MOCA-Peds 2017 pilot conducted with >5000 pediatricians. We examine the feasibility and acceptability of the MOCA-Peds for generalists and subspecialists maintaining their general pediatrics certification and summarize their recommendations.
To cite: Leslie LK, Turner AL, Smith AC, et al. Pediatrician Perspectives on Feasibility and Acceptability of the MOCA-Peds 2017 Pilot. Pediatrics. 2019;144(6):e20192303
The American Board of Pediatrics (ABP), 1 of 24 boards affiliated with the American Board of Medical Specialties (ABMS), provides certification for general and subspecialty pediatricians after completion of training and throughout their careers through its Maintenance of Certification (MOC) program. In 2013, Hawkins et al,1 on behalf of the ABMS, published an article describing the MOC's theoretical basis. Continuing certification aims to address gaps in the quality of medical care related to the exponential increase in new medical knowledge, knowledge lost over time, and gaps and delays in practice change in response to new knowledge.2–5 In their article, Hawkins et al1 highlighted the role of the ABMS and its member boards in continually examining their programs to "assure the public and the profession that they are meeting expectations, are clinically relevant, and provide value to patients and participating physicians, and to refine and improve them as ongoing research indicates."
In 2015, the ABP hosted a Future of Testing Conference to consider changes to its assessment programs. Presentations highlighted the role of assessments both in quantifying a priori medical knowledge and in increasing opportunities to refresh previous knowledge and learn new information.6 Informed by these presentations, emerging theories on how adult physicians learn,7,8 and technological advances in assessment,9–12 the ABP Board of Directors decided to design and pilot an alternative assessment approach to the existing proctored, closed-book MOC examination.13 The Maintenance of Certification Assessment for Pediatrics (MOCA-Peds) was modeled after the American Board of Anesthesiology's (ABA's) recently piloted assessment, termed MOCA-Minute,14,15 and was envisioned as delivering multiple-choice questions to pediatricians quarterly through a Web-based interface, with immediate feedback after answering to encourage learning.
In 2015–2016, the ABP partnered with RTI International (RTI), a nonprofit research institution, to obtain input from certified pediatricians, through user panels and focus groups, on the proposed MOCA-Peds model to be tested in 2017–2018 and on early prototypes of the Web-based platform. The model had 3 primary goals: maintain a valid assessment of medical knowledge, incorporate opportunities for learning, and improve the experience for pediatricians participating in continuing certification.16 User panels and focus groups conducted in 2016 recommended the ABP focus on offering opportunities for learning that incorporated flexibility and accessibility. Proposed features included (1) 80 total multiple-choice questions per year, delivered quarterly through a Web-based platform (with an alternative mobile interface); (2) flexible use of resources (eg, books, Internet); (3) immediate feedback on each question (eg, rationale, references, peer benchmarking); and (4) review of completed questions and answers through a question history page.
User panels and focus groups also provided feedback on the Web-based platform to ensure usability of the interface. They raised concerns about the potential stresses associated with a timed, Web-based assessment conducted continuously over a pediatrician's career. Policies put in place to address these concerns consisted of (1) learning objectives for each year's questions to permit studying if desired, (2) a consistent time allocation to answer each question (5 minutes), and (3) opportunities to improve retention of information by receiving 2 questions for each learning objective. (For additional details, see the ABP's MOCA-Peds Web page.)
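For readers who think in code, the pilot's delivery parameters can be collected into a single structure; a minimal sketch is below. The values come from the description above, but the structure, names, and per-quarter arithmetic are illustrative assumptions, not ABP software.

```python
# Illustrative summary of the 2017 pilot's delivery parameters as described
# above. The values come from the text; the dictionary structure and the
# derived per-quarter count are assumptions for illustration only.
MOCA_PEDS_2017_PILOT = {
    "questions_per_year": 80,                 # delivered through a Web-based platform
    "quarters_per_year": 4,                   # questions released quarterly
    "minutes_per_question": 5,                # consistent time allocation per question
    "questions_per_learning_objective": 2,    # repetition to improve retention
    "open_resources": ["books", "Internet"],  # flexible use of resources allowed
    "feedback_per_question": ["rationale", "references", "peer benchmarking"],
}

# 80 questions per y across 4 quarters implies 20 questions per quarter.
per_quarter = (MOCA_PEDS_2017_PILOT["questions_per_year"]
               // MOCA_PEDS_2017_PILOT["quarters_per_year"])
print(f"{per_quarter} questions per quarter")  # 20
```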
A joint RTI-ABP research team proposed a formative evaluation process in 2017–2018, with initial piloting of the model and Web-based platform in 2017 followed by refinement of both in 2018. Because one-fifth of subspecialists maintaining at least 1 subspecialty also took the general pediatrics (GP) MOC part 3 examination in 2014–2016, the ABP was committed to developing an alternative GP assessment that would be feasible and acceptable for both general pediatricians and subspecialists maintaining their GP certification. In addition, the ABP hoped a similar model might work for the MOC examinations for its 15 subspecialties. With this article, we thus address the following questions. Is the MOCA-Peds feasible and acceptable to both general pediatricians and subspecialists as an alternative to the GP MOC examination? Will this same model be feasible and acceptable as an alternative for subspecialty MOC examinations? In the partner article, "Pediatrician Perspectives on Learning and Practice Change in the MOCA-Peds 2017 Pilot," the authors describe in more detail participant reports related to learning and practice change.
METHODS
Design and Methodologic Approach
The research team employed elements from the context, input, process, and product evaluation model17 and the intervention mapping model,18 both formal frameworks for program planning and design, production, implementation, and evaluation. These approaches are focused on the measurement of practical aspects of implementation to ensure success, including feasibility and acceptability,
with iterative refinement and testing based on feedback from program developers (the ABP) and end users (pediatricians). We defined feasibility and acceptability as whether pediatricians can easily participate in the MOCA-Peds as designed (eg, technological capacity, time) and whether the MOCA-Peds meets pediatricians' expectations (eg, satisfaction, quality, intent to use in the future).
Procedures
After review by the ABP's Research Advisory Committee and an exempt designation by the RTI Institutional Review Board, all pediatricians (n = 6814) eligible to take the GP MOC examination in 2017 were contacted to participate in the pilot and enrolled via a registration survey. In total, 5081 pediatricians enrolled. As part of participation, pediatricians were sent surveys after completion of each quarter's questions on elements under development; 2017 MOCA-Peds participants were also randomly selected to participate in focus groups to (1) provide feedback on the results of the quarterly surveys and (2) review any recommended changes to the Web-based platform before implementation of those changes. In January 2018, a voluntary end-of-year survey was administered to those pediatricians who completed the 2017 pilot and met the passing standard.
Analytic Sample
Of the 5081 pilot participants, 4238 responded to the quarter 4 and/or voluntary end-of-year survey and composed the analytic sample for this article; 62.2% (n = 2634) completed both surveys, 32.6% (n = 1382) completed the quarter 4 survey only, and 5.2% (n = 222) completed the end-of-year survey only. Available demographic characteristics (sex, age, medical school type, certification status) were examined for differences between the analytic sample and participants not completing these surveys. No significant differences were observed (data not shown; P > .05).
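As a quick arithmetic check, the reported counts reproduce the stated percentages (a minimal sketch using only the numbers in this paragraph):

```python
# Consistency check of the analytic sample composition reported above.
both_surveys, q4_only, eoy_only = 2634, 1382, 222
total = both_surveys + q4_only + eoy_only  # 4238, the analytic sample
for label, n in [("both surveys", both_surveys),
                 ("quarter 4 survey only", q4_only),
                 ("end-of-year survey only", eoy_only)]:
    print(f"{label}: {n / total:.1%}")  # 62.2%, 32.6%, 5.2%
```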
Survey Instruments
The ABP's Certification Management System (CMS) provided baseline demographic characteristics (age, sex, medical school type, certification status [general pediatricians versus subspecialists]); the registration survey captured additional characteristics identified in focus groups that might impact feasibility of the MOCA-Peds, specifically clinical hours worked, comfort with technology, and library access for articles. Surveys after quarters 1 to 3 and user panel and focus group results are not reported in this article. The quarter 4 survey targeted participants' experience over the entire pilot year and included domains related to feasibility and
acceptability (Table 1). The end-of-year survey garnered feedback on the 5-year model proposed to be implemented in 2019, which specified (1) alignment of the MOCA-Peds with the 5-year MOC cycle, reducing confusion with different examination and MOC deadlines; (2) automatic dropping of the 4 lowest quarters in an individual's 5-year cycle to accommodate work and/or life events (see the sketch below); and (3) a dedicated fifth year in the MOC cycle as an optional year to take the proctored MOC examination should an individual not meet the MOCA-Peds passing standard in years 1 to 4. In addition, it included 1 open-ended item: "What one change would you make to MOCA-Peds to provide more value to your clinical practice?" The quarter 4 and end-of-year surveys also investigated subspecialty issues, specifically (1) the feasibility and acceptability of the GP version of the MOCA-Peds for subspecialists and (2) the MOCA-Peds platform as an alternative to the MOC examination for the 15 subspecialties.
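The "drop the 4 lowest quarters" provision amounts to a simple selection rule. The sketch below is a hypothetical rendering of that rule under the assumption that quarterly performance is expressed as a numeric score; the function name and logic are ours, not the ABP's actual scoring algorithm.

```python
# Hypothetical sketch of the proposed model's accommodation for work and/or
# life events: the 4 lowest quarterly scores in a cycle are dropped
# automatically before performance is judged. The numeric-score representation
# and function name are assumptions; this is not the ABP's scoring code.
def counted_quarters(quarterly_scores: list[float], n_dropped: int = 4) -> list[float]:
    """Return the quarterly scores that still count after dropping the lowest n_dropped."""
    if len(quarterly_scores) <= n_dropped:
        raise ValueError("need more quarters than the number dropped")
    return sorted(quarterly_scores)[n_dropped:]

# Example: a participant with one weak year still has 12 counted quarters.
scores = [0.9, 0.85, 0.8, 0.9, 0.4, 0.5, 0.45, 0.55,
          0.88, 0.92, 0.9, 0.87, 0.91, 0.86, 0.9, 0.89]
print(counted_quarters(scores))  # the 4 lowest (0.4, 0.45, 0.5, 0.55) are dropped
```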
Analyses
Sample characteristics were calculated by using data from the ABP's CMS and the registration survey. Response scales for key variables of interest were collapsed into categories (eg, responses "strongly agree" and "agree" were recoded to "agree," and responses "strongly disagree" and "disagree" were recoded to "disagree").
TABLE 1 Working Definitions of Feasibility and Acceptability and Sample Outcomes of Interest for the MOCA-Peds 2017–2018 Pilot

Feasibility
  Definition: whether the MOCA-Peds can be implemented as designed (eg, sufficient time and technological capacity of pediatricians to participate; fit with daily life activities without undue burden).
  Sample outcomes of interest: hours worked; comfort with technology; academic library access to read current journal articles; perceived time to complete questions (individually and per quarter); mechanisms in place to address (1) work and/or life events and (2) potentially not meeting the MOCA-Peds passing standard after 4 y of participation.

Acceptability
  Definition: whether the MOCA-Peds meets the expectations of participating pediatricians (eg, satisfaction).
  Sample outcomes of interest: quality of the MOCA-Peds questions; overall satisfaction with the MOCA-Peds components and model; decreased anxiety with the MOCA-Peds compared with the proctored examination; intent to continue to use once the MOCA-Peds was offered as an alternative to the proctored examination.
χ² tests were conducted to determine statistically significant differences between general pediatricians and subspecialists. SPSS version 25 (IBM SPSS Statistics, IBM Corporation) was used for all quantitative analyses.19
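To make the collapsing-and-comparison procedure concrete, the sketch below performs the same steps on toy data. The Python, pandas, and SciPy stack, the column names, and the data are all illustrative; as noted above, the authors conducted the analyses in SPSS version 25.

```python
# Minimal sketch of the quantitative analysis described above: collapse
# 5-point agreement responses into 3 categories, then compare general
# pediatricians and subspecialists with a chi-square test of independence.
import pandas as pd
from scipy.stats import chi2_contingency

df = pd.DataFrame({
    "cert_type": ["GP", "GP", "GP", "Subspecialist", "Subspecialist", "Subspecialist"],
    "response": ["Strongly agree", "Agree", "Strongly disagree",
                 "Neither agree nor disagree", "Disagree", "Agree"],
})

# Recode the 5-point scale to agree/neither/disagree, as in the text.
collapse = {
    "Strongly agree": "Agree", "Agree": "Agree",
    "Neither agree nor disagree": "Neither",
    "Disagree": "Disagree", "Strongly disagree": "Disagree",
}
df["response_3cat"] = df["response"].map(collapse)

# Chi-square test on the certification type x collapsed response table.
table = pd.crosstab(df["cert_type"], df["response_3cat"])
chi2, p, dof, expected = chi2_contingency(table)
print(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, P = {p:.3f}")
```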
Qualitative responses to the open-ended question regarding 1 recommended change were evaluated by using thematic analysis techniques.20 After an initial review of the open-ended responses, a set of 4 broad categories was identified. A second round of thematic analysis involved reviewing the open-ended text to ensure the code assigned to each response was appropriate and examining themes with relatively few counts to determine whether they could be reclassified. The third and final round of analysis was focused on narrowing the existing categories to 3 and determining whether subcategories were needed. The first round of analyses was conducted independently by 2 of the authors (M.G.O., V.D.); this was followed by a review of the coding by both authors and then the second round of coding by the same authors. For the final round of coding, a third author (L.K.L.) reviewed the coding, and where there were differences, the authors worked to adjudicate the coding to ensure consistency (see the sketch below). In this process, 9 subcategories were identified, including counts and sample quotes, providing a more specific classification of responses.
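A minimal sketch of that adjudication step, assuming hypothetical response IDs and category labels, is shown below: responses on which the 2 coders disagree are flagged for third-party review.

```python
# Hypothetical sketch of the adjudication step in the thematic analysis:
# flag every open-ended response where the 2 independent coders disagree,
# so a third reviewer can resolve it. IDs and category labels are made up.
coder1 = {"resp_001": "question content", "resp_002": "platform features",
          "resp_003": "timing and scoring"}
coder2 = {"resp_001": "question content", "resp_002": "timing and scoring",
          "resp_003": "timing and scoring"}

disagreements = sorted(rid for rid in coder1 if coder1[rid] != coder2.get(rid))
print(f"{len(disagreements)} response(s) need adjudication: {disagreements}")
```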
RESULTS
Sample Description and Feasibility of Web-Based Assessment
The majority of the analytic sample comprised general pediatricians (73.9%); the remaining sample was subspecialists. Ages ranged from 35 to 75 years; the majority were female (66.7%) and American medical graduates (78.0%). With respect to feasibility, the average hours worked per week varied, with the largest group working 40 to 50 hours per week (32.8%). The majority displayed a high comfort level with technology, and just more than half (56.2%) had access to an academic library. These results, in particular the widespread comfort with technology, indicate that a Web-based assessment is feasible for diplomates (Table 2).
Acceptability: The MOCA-Peds Question Format
Quarter 4 survey respondents agreed or strongly agreed (hereafter "agreed") that questions aligned with the learning objectives (88.2%), assessed clinical judgment (81.8%),
and were relevant to the practice of GP (81.5%) and to their specific practice setting (59.1%). For most, the time to answer questions was sufficient (78.4%). Differences were noted by certification status, with general pediatricians reporting a higher rate of agreement than subspecialists for all survey items except for question relevance to GP, for which subspecialists reported higher agreement (Table 3).
Acceptability: The MOCA-Peds as an Alternative Assessment
Acceptability of the MOCA-Peds as an alternative assessment was also measured on the quarter 4 survey (n = 4016) (Table 4).
TABLE 2 Characteristics of Analytic Sample by Certification Type (N = 4238)

Sample Characteristics                        GP Certificate Only, %   GP and ≥1 Subspecialty Certificate(s), %
Sex
  Male                                        29.4                     44.3
  Female                                      70.6                     55.7
Age group
  30–39                                       18.3                     25.9
  40–49                                       47.4                     40.5
  50–59                                       29.6                     28.9
  60+                                         4.7                      26.3
Medical school graduate type
  American medical                            80.3                     71.5
  International medical                       19.7                     28.5
Average hours worked per wk^a                 n = 3080                 n = 1098
  <20 h per wk                                5.2                      2.2
  20 to <30 h per wk                          10.9                     4.5
  30 to <40 h per wk                          21.5                     8.8
  40 to <50 h per wk                          34.3                     28.8
  50 to <60 h per wk                          19.0                     30.6
  ≥60 h per wk                                9.2                      25.1
Comfort with technology^b                     n = 3130                 n = 1106
  Using a smartphone                          80.8                     86.2
  Using a tablet                              72.9                     79.1
  Using a laptop or desktop computer          91.0                     96.5
  Finding clinical information online         82.0                     91.7
  Completing CME online                       82.7                     90.5
Academic library access                       n = 3130                 n = 1106
  Yes                                         48.6                     77.8
  No                                          51.4                     22.2

N = 4238 reflects the total number of participants who responded to the 2017 pilot quarter 4 survey and/or the 2017 pilot end-of-year survey. Demographics are pulled from the ABP CMS; work characteristics are pulled from the 2017 pilot registration data set. CME, continuing medical education.
a Participants were instructed to include all time spent in administrative tasks, professional activities, research, medical education, and direct patient care and to exclude time on call when not actually working. Reference period = past 6 mo.
b Respondents reporting moderately, very, or extremely comfortable.
TABLE 3 Participant Report on Feasibility and Acceptability of MOCA-Peds Questions by Certification Type (n = 4016)

Survey Item on Quarter 4 Survey / Certification Type^a    Disagree, %   Neither Agree nor Disagree, %   Agree, %   P^b
Questions aligned with the learning objectives
  GP                                                      2.7           8.9                             88.4       .32
  Subspecialist                                           2.1           10.1                            87.8       --
  Total                                                   2.5           9.3                             88.2       --
Questions assessed my clinical judgment, going beyond factual recall
  GP                                                      6.9           10.3                            82.8       .002
  Subspecialist                                           6.7           14.3                            79.0       --
  Total                                                   6.8           11.4                            81.8       --
Questions were relevant to GP
  GP                                                      7.9           12.3                            79.8       <.001
  Subspecialist                                           3.4           10.1                            86.4       --
  Total                                                   6.7           11.8                            81.5       --
Questions were relevant to my practice, if practicing
  GP                                                      11.9          19.9                            68.2       <.001
  Subspecialist                                           34.5          32.0                            33.5       --
  Total                                                   17.8          23.1                            59.1       --
I had enough time to answer each question
  GP                                                      13.1          8.0                             78.8       .45
  Subspecialist                                           14.1          8.9                             77.0       --
  Total                                                   13.4          8.3                             78.4       --

--, not applicable.
a Certification type includes maintenance of GP certificate only (GP) or maintenance of GP certificate plus ≥1 subspecialty certificate(s) (subspecialist).
b P values indicate the level of significance of comparisons between the GP and subspecialist participants on the survey items.
Approximately three-quarters (73.7%) agreed that the MOCA-Peds was an adequate assessment of fundamental GP practice knowledge, and 79.6% agreed that it helped them stay current in GP; most were satisfied with it as a replacement for the current part 3 examination (93.1%), would prefer to participate in the MOCA-Peds (88.7%), and were likely to recommend the MOCA-Peds to a friend (92.8%). Both general pediatricians (71.4%) and subspecialists (90.0%) perceived the MOCA-Peds as feasible for subspecialists to maintain their GP knowledge.
With respect to potential anxiety associated with continuous assessment, the majority (88.7%) agreed they felt less anxiety participating in the MOCA-Peds compared with the proctored examination, and 81.1% agreed that their stress about maintaining GP certification through the MOCA-Peds decreased during the pilot (Table 4).
TABLE 4 Participant Report on Acceptability of the MOCA-Peds as an Alternative Assessment to the Proctored Examination

Survey Item on Quarter 4 Survey / Certification Type^a    Disagree, %   Neither Agree nor Disagree, %   Agree, %   P^b
The MOCA-Peds is an adequate assessment of the fundamental knowledge used in everyday general pediatric practice.
  GP                                                      9.2           16.5                            74.3       .005
  Subspecialist                                           7.4           20.6                            72.0       --
  Total                                                   8.7           17.6                            73.7       --
The MOCA-Peds program helps me stay current in GP.
  GP                                                      6.0           14.8                            79.2       .58
  Subspecialist                                           5.2           14.2                            80.6       --
  Total                                                   5.8           14.6                            79.6       --
I am overall satisfied with the MOCA-Peds as a replacement for the previous part 3 testing model for MOC.
  GP                                                      2.2           5.2                             92.6       .12
  Subspecialist                                           1.3           4.4                             94.3       --
  Total                                                   2.0           5.0                             93.1       --
I prefer to take a continuous assessment like the MOCA-Peds over the current, secure testing center examination.
  GP                                                      5.5           6.1                             88.4       .20
  Subspecialist                                           5.7           4.6                             89.7       --
  Total                                                   5.6           5.7                             88.7       --
I am likely to recommend the MOCA-Peds to a friend or colleague instead of taking the secure testing center examination.
  GP                                                      2.3           5.4                             92.4       .12
  Subspecialist                                           1.3           4.7                             94.0       --
  Total                                                   2.0           5.2                             92.8       --
Participation in this new MOCA-Peds model is a feasible method for subspecialists to keep up to date with GP knowledge.
  GP                                                      2.7           25.9                            71.4       <.001
  Subspecialist                                           3.1           6.9                             90.0       --
  Total                                                   2.8           20.9                            76.3       --
I am less anxious about taking an examination using the online MOCA-Peds system compared to taking a proctored examination at a secure testing center.
  GP                                                      3.6           7.4                             89.0       .23
  Subspecialist                                           3.1           9.0                             88.0       --
  Total                                                   3.4           7.8                             88.7       --
My level of stress about maintaining my GP certification through the MOCA-Peds has decreased during the pilot.
  GP                                                      7.9           11.2                            81.0       .52
  Subspecialist                                           6.9           11.8                            81.3       --
  Total                                                   7.6           11.3                            81.1       --

--, not applicable.
a Certification type includes maintenance of GP certificate only (GP) or maintenance of GP certificate plus ≥1 subspecialty certificate(s) (subspecialist).
b P values indicate the level of significance of comparisons between the GP and subspecialist participants on the survey items.