GEORGIA PROFESSIONAL STANDARDS COMMISSION

Dr. F. D. Toth, Executive Secretary

Georgia Data Management, Assessment & Reporting System

PSC Steering Committee & BOR Exploratory Committee

Joint Working Session Report

June 23-24, 2003

Table of Contents

Intended Outcomes
Activities
I. Understand the stages of developing an electronic assessment system
   Step 1: Analysis of Current Process
   Step 2: Identification of Key Steps in Assessment System
   Step 3: List of Attributes
   Step 4: Standardization
   Step 5: State Reports
II. View CampusTools Reports and Assess their Value to IHEs
   Report 1: Publications by Departmental Faculty (Pie Chart)
   Report 2: Retention Rate of Teachers over Time (Bar Chart)
   Report 3: Teaching Experience of Entering Graduate Candidates (Bar Chart)
   Report 4: Unit Admissions by Program (Line Chart)
   Report 5: Unit Percentage Expense Share by Category
III. Define Transition Points Data
   Activity 1
   Activity 2
   Key Differences in Transition Points
   Terminology Differences
   General Questions and Discussion Points
IV. Clarify Key Questions
V. Identify Next Steps
Appendix A: Agenda
Appendix B: Questions Generated from Small Group Discussions, May 21, 2003 Meeting
Appendix C: Identifying Transition Points Worksheet #1
Appendix D: Georgia State University Sample List of Data Elements
Appendix E: Armstrong Atlantic State University Department of Early Childhood Education Professional and Ethical Behavior Assessment
Appendix F: CampusTools Sample Reports
Appendix G: Tk20 Initial Client Survey
Appendix H: Meeting Evaluation Summary

Intended Outcomes:

I. Understand the stages of developing an electronic assessment system

II. View CampusTools reports and assess their value to IHEs

III. Define Transition Points data

IV. Clarify key questions for any vendor, specifically Tk20

V. Identify next steps

Activities:

I. Understand the stages of developing an electronic assessment system

Step 1: Analysis of Current Process

• Identification of current methodology, facilities, resources

• Identification of changes to methodology

• Tk20 Initial Client Survey (Appendix G)

Group Comments:

• This work will take place at the institutions

• It will help if a Louisiana representative (not associated with the company) can be at our next meeting to help identify gaps in our process and framework.

• Tk20 should make changes to the system to accommodate our needs rather than asking us to change our methodology to fit CampusTools.

Step 2: Identification of Key Steps in Assessment System

• Transition Points

• Standards

• Artifacts

• Evaluation Instruments

• Institutional Reports

Group Comments:

• What does PSC need from IHEs to keep us on schedule with NCATE?

• What is needed over and above NCATE requirements?

• What does BOR need?

• How often?

• Induction data incorporates the 3 years after graduation; DOE’s involvement in this is key.

o Invite Steve Preston to be a member of the Steering Committee.

• Is PSC/BOR interested in artifacts?

• Artifacts more so than numbers?

• Transition from INPUTS to OUTPUTS is key; the focus has to be on student learning.

• We need a baseline set of data to compare the 3-year induction data to.

• Support from LEAs and RESAs is critical.

• Are high-level institutional discussions happening?

o Yes, at some.

o Faculty evaluation information is online at UGA

o Such a system has the potential for university-wide use; this might make the system “sell” better to IHE leaders.

Step 3: List of Attributes

• Student information

• Faculty information

• Standards

• Field experience artifacts

• School Information (where field experiences and clinicals take place)

Group Comments:

• Clarify that “student” = teacher-candidate

• We need a statewide identification/definition of K-12 schools listed in the program.

• Everyone has to use a common set of definitions (e.g., teacher-candidate); see the illustrative sketch following this list.

• Include the Banner OIIT in this work

• Data derived from CampusTools can be used for trend analyses related to program completers and induction

o to compare with DOE data about teacher shortages, projected hiring needs, and high-needs schools.

o IHE recruiting can be based on this data and will therefore be much more efficient.
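To make the idea of common definitions concrete, the attribute list above could eventually be expressed as a small set of shared record definitions. The sketch below is illustrative only; no schema was agreed upon at this meeting, and every name and field in it is an assumption.

from dataclasses import dataclass, field
from typing import List

@dataclass
class School:
    """K-12 school where field experiences and clinicals take place (needs a statewide identifier)."""
    state_school_id: str   # hypothetical statewide K-12 school identifier
    name: str
    district: str

@dataclass
class FieldExperienceArtifact:
    """One artifact produced during a field experience, aligned to standards."""
    artifact_id: str
    standard_codes: List[str]   # codes for the standards the artifact addresses
    school: School

@dataclass
class TeacherCandidate:
    """'Student' in this context always means teacher-candidate."""
    unique_student_id: str      # the unique student identifier noted under Step 2
    program: str
    artifacts: List[FieldExperienceArtifact] = field(default_factory=list)

Shared definitions of this kind would be the concrete outcome of the gap analysis and common-definition work described under Steps 3 and 4; nothing here reflects a decision by the committee.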

Step 4: Standardization

• Identify similar data points from institutions

• Discuss importance of dissimilar data points

• Include or discard dissimilar data points for common application

Group Comments:

• IHE data needs will differ; don’t discard any data

• We need to conduct a gap analysis to determine which data points are needed

• Consistent data definitions across IHEs are critical

Step 5: State Reports

• Identify similarities in data definitions and data requirements

• Resolve differences in data

• Create a limited set of data requirements

• Create “anchor” attributes

• Define anchors

Group Comments:

• What are the federal data requirements?

• We need to create an awareness of data definition differences

• What are the AACTE requirements?

• Anchors: need a clear definition of what they are

• What data will the state have automatic/unlimited access to?

• IHEs prefer that some data not be available to the state at all times but rather be “sent” to the state at the appropriate time; timing is important with some data. This is another topic on which consensus is important.

II. View CampusTools Reports and Assess their Value to IHEs

Several CampusTools sample reports were displayed (included in Appendix F). For each report, the group was asked to respond to the following questions.

• Will this report be of value to your institution?

• Will this report be valuable aggregated at the state level? Which data points could be used?

• Which data points were used to create this report?

Report 1: Publications by Departmental Faculty (Pie Chart, Appendix F, page 26)

1. Will this report be of value to your institution?

• Yes.

• This report is used to show program quality to the Board.

• This report is critical for tenure, promotion, AACTE, etc.

• This helps to answer the question, “How do we, as a state, define the quality of our faculty/programs?”

• This report can be used to show the quality of the people preparing educators.

2. Will this report be valuable aggregated at the state level? Which data points could be used?

• NCATE

• AACTE

• Definitions are critical

3. Which data points were used to create this report?

• Chapters, books, presentations

• Is this self-reported?

Report 2: Retention Rate of Teachers over Time (Bar Chart, Appendix F, page 27)

1. Will this report be of value to your institution?

• Yes.

• Compare with Praxis scores

• Identify changes to alert to problems or improvements

• Make clear what group this includes (i.e. those “counseled out”)

• Need the “Why” behind the #s

• Provides a broad picture for institutions

2. Will this report be valuable aggregated at the state level? Which data points could be used?

• Compare to national averages

• Compare to similar institutions

• Identify type of schools

3. Which data points were used to create this report?

• Program entry

• 1st year in program

• Alternative preparation
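As a purely illustrative note on Report 2, a retention rate is simply the share of an entering cohort still teaching at a later check point. The numbers below are invented, not data from the meeting.

def retention_rate(entering_cohort: int, still_teaching: int) -> float:
    """Percentage of an entering cohort still in the classroom at a later check point."""
    if entering_cohort == 0:
        return 0.0
    return 100.0 * still_teaching / entering_cohort

# Hypothetical cohort: 120 program entrants, 96 still teaching three years later.
print(retention_rate(120, 96))   # 80.0

The “Why” behind the numbers, noted above, would still have to come from exit codes or follow-up surveys rather than from the rate itself.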

Report 3: Teaching Experience of Entering Graduate Candidates

(Bar Chart, Appendix F, page 28)

1. Will this report be of value to your institution?

• We need to know how much teaching experience; the chart doesn’t provide that information

• Other categories, such as Speech and Psychology, would be helpful

• Not as valuable as the other reports

• MAT muddies the water; leave it out of this report

2. Will this report be valuable aggregated at the state level? Which data points could be used?

• Break out by programs

• Allow targeting for marketing

• More valuable: the % preparing for or planning for NBPTS

3. Which data points were used to create this report?

• Raw #s

• # of faculty with P-12 experience

Report 4: Unit Admissions by Program (Line Chart, Appendix F, page 29)

1. Will this report be of value to your institution?

• Enrollment trends

• Upcoming needs

• Identify hiring needs

2. Will this report be valuable aggregated at the state level? Which data points could be used?

• Trend with retirement projections data from DOE

• Break into content areas

Report 5: Unit Percentage Expense Share by Category

(Bar Chart, Appendix F, page 30)

1. Will this report be of value to your institution?

• Align to Standard 6 and compare to other units on campus

• Not a priority

2. Will this report be valuable aggregated at the state level? Which data points could be used?

• Trend analysis—budget cuts; how we are making up for them

• Student teaching costs

• Percentages are helpful

3. Which data points were used to create this report?

• Unit $ spent

• How will this data be entered?

III. Define Transition Points Data

Activity 1

In small groups, participants were asked to define transition points and identify the activities that occur at each. Transition points were identified as Beginning, Middle and End. Groups were asked to share their lists with the entire group and those responses were recorded and are shown below.

Beginning

• Events: Admission to university; Praxis I score; Admission to College of Education; Admission to program

• Data collected: Praxis I; GPA; # hours completed successfully; criminal background check; Areas A-E/core curriculum and part of Area F; pass Regents exam; SAT; application (interview, writing sample, pre-professional experience); unique student identifier; faculty recommendations; English proficiency; math proficiency

Middle

• Events: Admission to Teacher Education; Application to Student Teaching; Field experience; Student teaching; Program completion

• Data collected: Dispositions; efficacy scale; interview; Praxis II (before student teaching); portfolio; GPA; methods and all coursework; advisor approval; background check; insurance; philosophy statement

End

• Events: Licensure; Induction

• Data collected: Praxis II; portfolio; academic profile

A discussion ensued about the term “transition points” and what it conveys. NCATE uses that terminology but does not specify a minimum number of transition points. The group discussed the fact that this terminology makes one think of timing: the process is often thought of in terms of the number of courses completed, the time at which a student reaches a transition point, or the passing of tests. The group agreed that the focus has to be on outcomes/outputs rather than on the inputs that such measures imply. Part of this discussion centered on the fact that alternative preparation programs cannot be held to the same “transition points” because of the condensed timeframe in which their teacher-candidates move through the program. The term “benchmark” was discussed and the NCATE definition was displayed for the group to consider. The group reached consensus to use the term “Benchmarks” rather than “Transition Points,” signaling that outcomes will be considered rather than inputs and that timing is not important.
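To make the benchmark idea concrete, a benchmark can be recorded as the set of outcome evidence it requires rather than as a point in time, so candidates in condensed alternative-preparation programs are held to the same evidence rather than the same calendar. The sketch below is only an illustration; the structure and names are assumptions, not a design adopted by the committee.

from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Benchmark:
    """A benchmark is defined by the outcome evidence it requires, not by when it occurs."""
    name: str
    required_evidence: List[str]

def benchmark_met(benchmark: Benchmark, evidence_on_file: Dict[str, bool]) -> bool:
    """A candidate meets the benchmark once all required evidence is on file,
    regardless of how long it took to assemble."""
    return all(evidence_on_file.get(item, False) for item in benchmark.required_evidence)

# Hypothetical benchmark drawn from the Activity 1 lists above.
admission = Benchmark("Admission to Teacher Education",
                      ["Dispositions", "Efficacy scale", "Interview", "Portfolio", "GPA"])
print(benchmark_met(admission, {"Dispositions": True, "Efficacy scale": True,
                                "Interview": True, "Portfolio": True, "GPA": True}))  # True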

Activity 2

Participants were encouraged to think in terms of program completion outcomes and, again in small groups, to list the evidence or data that would indicate teacher-candidates’ successful acquisition of knowledge, skills and dispositions. Results of those discussions are listed below.

Knowledge

• Grades in core courses
• Pre-test & remediation
• Candidate Performance Instrument (KSU sample available)
• M.A.P. (external, nationally recognized instrument)
• GPA
• Content GPA

Skills

• Field experience evaluations
• Cultural diversity assessment (pre- and post-tests incorporated with field experience assessments)
• Candidate Performance Instrument (KSU sample available)
• M.A.P. (external, nationally recognized instrument)

Dispositions

• Haberman Assessment
• Student teaching evaluations (outside instrument)
• Candidate Performance Instrument (KSU sample available)
• M.A.P. (external, nationally recognized instrument)
• Ethics evaluation
• Professional Dispositional Statements (a self-diagnostic tool used at UGA)
• Professional and Ethical Behavior Assessment (AASU tool, Appendix E)
• Content faculty evaluation of teacher-candidates’ dispositions; a prescriptive tool followed by remediation if necessary

Throughout Activities 1 and 2, running lists were maintained of key differences in transition points and terminology differences. Those lists are below.

Key Differences in Transition Points

• Praxis II use

• Timeline—by program and by type of student

• Definitions

• BOR & PSC

• Admission requirements and enrollment

• Live vs. timed data

• Quality

• Assessments for outcomes

Terminology Differences

• Teacher education

• Admission

• Student teaching

• Intern

• Practica

• Field Experience

• Mentor

• Supervisor

• Cooperating Teacher/Collaborating Teacher

• Transition Point (consensus to change to Benchmark(s))

General Questions and Discussion Points

• Should we follow the NCATE terminology across all programs and institutions?

• It is difficult to identify where graduates are in the induction process; is there any way to use the Program Completer report, combined with certification reports, to help alleviate this problem? The HiQ system may be a possible solution, but it needs to be linked to program completer data.

• We need a common set of codes that identify the reason(s) a teacher leaves the district or profession

• DOE could very easily require that information be submitted on some kind of form when keys are turned in (before the last paycheck is released).

• We need some kind of summative evaluation instrument to indicate teacher success in the classroom.

• Transition points are not as important as collecting data on K.S.D. (knowledge, skills, and dispositions)

• Get an ERD/data map from Tk20

• E-mail Tk20 initial client survey documents to meeting participants

• Georgia has more required field experiences before student teaching than Louisiana

• Value-added concept: is something in place to address candidates who don’t meet benchmarks at the usual expected time? It may take longer for some people than others. The goal should be to recognize potential and put remediation steps in place to help the teacher-candidate grow between benchmarks. That progression/growth between benchmarks represents the “value added” (see the sketch following this list).

• Why is this system worth $90.00 to our teacher-candidates?

• How have Louisiana IHEs adapted? We need to see their use of the product at various IHEs (small, large, private, public).

• Allow access to CampusTools prior to the next meeting

• Challenges and questions lie around the issue of standardization of data, related to performance issues across IHEs

• Need to outline a common set of K.S.D. based on NCATE and BOR principles.

• Benchmarks will replace Transition Points in our terminology.

• How will policy issues be resolved by BOR and PSC?

• The 900-hour field experience and 120 cap requirements are two issues that seem debilitating; we need to look at quality as opposed to a number of hours.

• Get an IHE person who worked on this process to our next meeting to help with the process and identify gaps.

• Will standardization of data produce data that will be used to compare programs and institutions?
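As a purely illustrative reading of the value-added comment above, growth between benchmarks could be tracked as the change in an assessment score from one benchmark to the next. The rubric scale and scores below are invented for the example.

from typing import Dict, List

def growth_between_benchmarks(scores: Dict[str, float], order: List[str]) -> Dict[str, float]:
    """Change in an assessment score from each benchmark to the next (the 'value added')."""
    return {f"{earlier} to {later}": round(scores[later] - scores[earlier], 2)
            for earlier, later in zip(order, order[1:])}

# Hypothetical rubric scores for one teacher-candidate at three benchmarks.
rubric = {"Program admission": 2.1, "Admission to student teaching": 2.8, "Program completion": 3.6}
print(growth_between_benchmarks(rubric, list(rubric)))
# {'Program admission to Admission to student teaching': 0.7,
#  'Admission to student teaching to Program completion': 0.8}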

IV. Clarify Key Questions

Participants were asked to read the set of questions derived from the May 21 meeting (Appendix B) and prioritize them by category in terms of how important they are to the decision-making process. Prioritizations will be shared at the next meeting.

V. Identify Next Steps

1. KSU to receive Tk20 with server

a. Explore system with sample data

b. Evaluate the system

2. Standards in the system

3. Collect Knowledge, Skills and Dispositions evaluations and assessments from IHEs

4. Create a set of common definitions that all IHEs will adopt

5. Send out report from this meeting; make reports available on the PSC website

6. Get data maps from Tk20

7. Share add-on functionality needs with Tk20 (reports and queries)

Appendix A: Agenda

Georgia Data Management, Assessment & Reporting System

PSC Steering Committee and BOR Exploratory Committee

Joint Working Session

June 23-24, 2003

Georgia Southern University

AGENDA

Monday, June 23

1:30 - 2:00 Introduction, Project Status and Meeting Outcomes

2:00 - 2:30 Developing an Electronic Data Collection System

2:30 - 3:30 CampusTools Demonstrations and Discussions

3:30 - 3:45 Break

3:45 - 5:30 Defining Transition Points and Designing an Electronic Assessment System

5:30 - 8:00 Dinner & discussion

Tuesday, June 24

8:30 - 10:00 Identifying Transition Points Data

10:00 - 10:15 Break

10:15 - 12:00 Converting qualitative performance measures into quantitative data

12:00 - 1:00 Working Lunch: Meeting Wrap-up and Next Steps

Appendix B: Questions Generated from Small Group Discussions May 21, 2003 Meeting

Key: G = Government (PSC/BOR/DOE); I = Institution; T = Tk20. The code in parentheses indicates who should answer each question.

Question Category: Company & Product Performance

• (T) Is assessment data on the system available from Louisiana?
• (T) What guarantee do we have that Tk20 will be around in the future?
• (T) How long has Louisiana used the system?

Question Category: Cost

• (T) What is the cost model?
• (T) Is the $10,000 fee a one-time fee? If not, how long will it be in effect?
• (T) What are the fees for upgrades?
• (G) Will the PSC pay for the product for the institutions?
• (G) Can we charge $90/year per student? Can it be assessed through a “Lab Fee”? Will this need to be approved by a state agency?
• (T) Will teacher-candidates have to pay a fee after graduation from colleges of education?
• (T) What is the charge for tech support? Will the institution or Tk20 provide it?

Question Category: Functionality

• (T) Which product roles are in development now (state and/or university users)?
• (T) Will program portfolios and unit portfolios be possible?
• (T) Will it accommodate qualitative data (e.g., admissions video, dispositions)?
• (T) How can the institutions access data from reports generated across the state?
• (T) How will the system deal with graduates who do not teach right away?
• (T) Are system checks and balances in place for assuring data accuracy? If so, what are they?
• (T) Will we have access to data from outside sources, such as Praxis II subtest scores, SSNs to locate places of employment post-graduation, and public school achievement data to assess impact on student achievement (CRTs, standardized tests, discipline reports, IEP, ISFP data)?
• (T) How is student data transferred from one institution to another?
• (T) Will the system be flexible enough to simplify reporting to multiple agencies (SPAs, PSC, NCATE, and SACS)?
• (T) Will it be individualized for the needs of each institution?
• (T) Can components (e.g., surveys) be sent to people external to institutions (teachers, reporting agencies, etc.)?
• (T) Some students can perform well on portfolios but not teach well. How will the product accommodate/address this?
• (T) How automated is the account management system?
• (T) Address privacy issues and security.
• (T) How long will the data be archived?
• (T) When will statistical enhancements be available?
• (T) Is there, or can there be, an advising component?
• (T) Can we track students who leave the state?
• (T) Will the system be linked to IHE websites?
• (T) Will the system track students after they leave an IHE?
• (T) How can this system be used to apply to school psychologists, counselors, etc.?
• (T) Is it adaptable to other needs (advising, certification, graduation, credit-hour reports)?

Question Category: General Use and Statewide Implementation Issues

• (G) Are there common definitions for terms dealing with programs, applicants, candidates, recent graduates, and faculty (such as “unit”)? Who will set these?
• (G) How do we account for various admissions requirements across institutions?
• (G) How do we quantify candidate performance?
• (G) The wide variety of programs and conceptual frameworks needs to be considered.
• (G) Will the preparation programs’ transition points be standardized throughout the state?
• (G) Can it, must it, will it be tied to a tiered certification system or other state-level accountability system?
• (T/G) How will we assure that we will receive feedback about our graduates from school principals and mentor teachers?
• (G) Can we pilot the software with a private institution? (Berry College is participating in the Pilot Project.)
• (I) Will the use of this system create the need for additional clerical staff? (Problems with students self-reporting)
• (G) Is this an attempt to make institutions look more alike?
• (G) Will it be mandated for each institution?
• (G) Will RESAs & LEAs be required to use this?
• (G/I) How will it be evaluated?

Question Category: Implementation, Training & Support

• (T) What type of personnel and support services (internal/external) will be available?
• (T) What are the technology support personnel requirements that institutions must provide?
• (T) Will training be provided? Startup and incremental?
• (T) Are there going to be multi-level training sessions (technical, faculty, student)?
• (T) Who will train IHE faculty and students?
• (T/G) What is the proposed time frame for implementation?

Question Category: Reporting

• (G/I) Will it answer the question, “Does the Conceptual Framework align with state and national standards?”
• (G) With the data warehouse in place and connected to IHEs, why will universities need to create a report? Why not have agencies download data from each university?
• (G) Who will determine the data for Tk20 (triggers/reports)?
• (G) What types of information are we required to report?
• (G) How will political issues that arise as a result of “poor”-performing or private institutions be addressed?
• (G) Who will be responsible for determining common needs across agencies (DOE, PSC, and BOR)?
• (G) How will measures for PSC, NCATE, BOR, Title II, P-16, and professional organizations be aligned?
• (G/I) Will it answer the question, “How well are our programs doing for students and faculty?”

Question Category: System & Hardware

• (T) Can each institution individualize its system?
• (T) What hardware resources are needed?
• (T) Will data backup systems be in place?
• (T) Is server capability protected? How often are backups performed? Where is the backup data stored?
• (T) Will it interface with Banner or other student information systems?
• (T) Is it compatible with infrastructures already in place?
• (T) Will data sources actually merge successfully?
• (T) How does Tk20 connect with DOE school-based assessment or student data?

Appendix C: Identifying Transition Points Worksheet #1

Identifying Transition Points

Worksheet #1

Transition Point: ____________________________________________________________

____________________________________________________________

|Data to be collected |Format & Types of Instruments |Evaluator(s) |
| | | |
| | | |
| | | |
| | | |
| | | |

Appendix D: Georgia State University Sample List of Data Elements

Content Knowledge

a. Grades in key courses

b. Praxis II scores

c. Assignments scored with rubrics

d. Analysis of lesson and unit plans

e. Observation by faculty and clinical teachers

f. Portfolios

g. Reflection tasks

h. National Board Certification

i. Others

Teaching/professional performance

a. Analysis of lesson plans

b. Observation by faculty and clinical teachers with rating scale

c. Observation by faculty and clinical teachers with rubrics

d. Self and peer observation/report with rating scale

e. Self and peer observation/report with rubric

f. Observations by building administrators with rating scale

g. Observations by building administrators with rubric

h. Interviews of self, peer, faculty, clinical teachers, administrators, pupils

i. Reflection tasks

j. Portfolios

k. Non-teaching professional performance indicators (list)

l. Supervisors’ evaluations

m. National Board Certification

n. Others

o. List all sources used for INTASC or NBPTS assessment reports

Pupil Outcomes, Learning, Change

a. Criterion tests (describe design and analysis)

b. Normed tests (describe test/s and analysis)

c. Student work samples (projects, portfolios, reports) (describe assessment procedures, e.g., rubrics)

d. Students’ grades

e. Anecdotal reports from other teachers, parents, etc.

f. National Board Certification (?)

g. Others

Dispositions

a. Interviews (entrance, in-stream, exit, follow up)

b. Psychometric scales

c. Observations with rating scales

d. Observations with rubrics

e. Writing sample

f. Peer assessments (e.g., in cooperative tasks)

g. Logs/reports of professional development activities

h. Reflection tasks

i. National Board Certification

j. Others

Unit Data Sources

a. Biodata

b. Major/field

c. SAT/ACT

d. GRE

e. Transcript analysis

f. GPA

g. Praxis I

h. Praxis II

i. INTASC assessment reports

j. Other

Appendix E: Armstrong Atlantic State University Department of Early Childhood Education Professional and Ethical Behavior Assessment

DEPARTMENT OF EARLY CHILDHOOD EDUCATION

PROFESSIONAL and ETHICAL BEHAVIOR ASSESSMENT

Successful teacher candidates must exhibit not only knowledge and performance competencies but also an appropriate disposition for teaching. The AASU candidates are expected to abide by the Georgia Professional Standards Commission Code of Ethics and demonstrate the highest qualities of character and ethical, professional behaviors that include, but are not limited to, personal integrity, respect and a commitment to all students, parents, faculty and colleagues.

CHECK ONE:

_______ All the teacher candidates in this class or field experience consistently exhibit the appropriate disposition and professional and ethical behaviors needed to be an effective educator.

_______ With the exception of the following candidate, who has not exhibited the expected behavior(s) checked below, the remaining teacher candidates in this class or field experience demonstrate the appropriate disposition and professional and ethical behaviors needed to be an effective educator.

Name: ____________________ (Use a separate form for each candidate.)

_____ 1. Dependable, prepared, prompt, and uses time wisely
_____ 2. Follows school and program procedures and policies
_____ 3. Takes initiative and responsibility for his/her own learning
_____ 4. Accepts and implements suggestions for improvement
_____ 5. Interacts appropriately with instructors, supervisors, peers, and students during university and school-based experiences
_____ 6. Offers assistance and works cooperatively during university and school-based experiences
_____ 7. Makes decisions based on reflection
_____ 8. Maintains proper professional dress and hygiene
_____ 9. Uses correct and proper oral and/or written language
_____ 10. Other

CLASS OR PRACTICUM: ____________________ SEMESTER: ____________________

SIGNATURE: ____________________ DATE: ____________________

This form is completed every semester by all faculty and clinical supervisors, and the documented area(s) of concern are discussed with the candidate. If two or more faculty and/or supervisors complete this form about a candidate, then a written remediation plan will be implemented. A candidate who fails to show substantial improvement during the following semester will be dismissed from the program.

4/2000-ECE Dept.

Appendix F: CampusTools Sample Reports

Appendix G: Tk20 Initial Client Survey

Tk20 Initial Client Survey

INSTITUTION DATE

TITLE OF RESPONDENT

1) Approximately how many candidates are currently enrolled in the following programs at your institution?

a. Undergraduate teacher preparation program

b. Master’s of education program

c. Doctoral education program

d. Practitioner and/or post-baccalaureate programs

2) Approximately how many of the following are employed in your unit?

a. Tenure-track and Tenured Professors

b. Full-time Instructors

c. Part-time Adjunct Professors

3) Your institution is which of the following?

a. Public institution b. Private institution

4) What is the date scheduled for your unit’s NCATE review?

Month Year

5) What information on student field experiences is systematically collected and analyzed?

(Check all that apply)

a. Evaluation of candidate by cooperating teacher

a1. collected a2. analyzed

b. Evaluation of cooperating teacher by candidate

b1. collected b2. analyzed

c. Evaluation of field experience site by candidate

c1. collected c2. analyzed

d. Candidate reflection on field experience

d1. collected d2. analyzed

If collected, briefly describe mechanism for collecting the reflections and how they are used:

e. Number of students taught in field experience

e1. collected e2. analyzed

f. Race/ ethnic makeup of students

f1. collected f2. analyzed

g. Subjects taught by candidate

g1. collected g2. analyzed

h. Exceptionalities of students

h1. collected h2. analyzed

i. Gender of students

i1. collected i2. analyzed

j. Other

j1. collected j2. analyzed

6) Briefly describe how information on the diversity of a candidate’s field experiences is collected and analyzed individually and aggregately.

a. Collected:

b. Analyzed individually:

c. Analyzed aggregately:

7) Are any of the data provided by field-based personnel on student field experiences collected electronically, that is, submitted on disk, via email, or to a specific server location?

a. Yes b. No

If yes, briefly describe.

8) Is there an established mechanism allowing candidates, faculty and administrators to electronically communicate with and be contacted by field-based personnel, as opposed to via mail, telephone or personal contact?

a. Yes b. No

If yes, briefly describe the mechanism.

9) Has your unit identified candidate outcomes that are aligned with your conceptual framework?

a. Yes b. No

9.1 If yes, what are the tools currently being used to assess these candidate outcomes?

__________________________

10) Does your current system of assessment use a portfolio approach, that is, a collection of a candidate’s work which can be used to demonstrate his or her skills and accomplishments, in order to evaluate the knowledge, skills, and dispositions of teacher candidates

a. In the unit’s undergraduate program?

a1. Yes a2. No

b. In the unit’s graduate program?

b1. Yes b2. No

c. In the unit’s practitioner/post-baccalaureate program?

c1. Yes c2. No

11) Regarding your unit’s candidate assessment system, are any of the data used for candidate assessment gathered electronically, that is, submitted by students on disk, via email, or to a specific server location?

a. Yes b. No

If yes, briefly describe.

12) Is there an established mechanism allowing candidates to electronically communicate with and be contacted by faculty and administrators, as opposed to via mail, telephone or personal contact?

a. Yes b. No

If yes, briefly describe the mechanism.

13) Is an instructional technology course required for

a. Initial candidates? Yes No

b. Advanced candidates? Yes No

14) Does your unit assess the “infusion” of technology, that is, how well faculty and students utilize technology for teaching and learning?

a. Yes b. No

If yes, what tools are currently being used for this assessment?

15) On a scale of 1 to 5 with 1 being “not familiar at all” and 5 being “very familiar,” how familiar are you with NCATE Standard 2: Assessment System and Unit Evaluation?

(Circle a number)

1 2 3 4 5

not familiar at all very familiar

16) On a scale of 1 to 5 with 1 being “not familiar at all” and 5 being “very familiar,” how familiar are you with NCATE Standard 3: Field Experience and Clinical Practice?

(Circle a number)

1 2 3 4 5

not familiar at all very familiar

17) On a scale of 1 to 5 with 1 being “not well at all” and 5 being “very well,” in your opinion, how well is your institution currently meeting the expectations of NCATE Standard 2: Assessment System and Unit Evaluation?

(Circle a number)

1 2 3 4 5

not well at all very well

18) On a scale of 1 to 5 with 1 being “not well at all” and 5 being “very well,” in your opinion, how well is your institution currently meeting the expectations of NCATE Standard 3: Field Experience and Clinical Practice?

(Circle a number)

1 2 3 4 5

not well at all very well

19) NCATE expects a unit not only to collect data on candidate and program performance, but also to analyze it and then make changes to the programs based on the findings. What processes, if any, does your unit currently employ to analyze data and then use that data to “loop back” and improve programs?

20) What mechanisms are currently used to systematically evaluate candidate knowledge, skills, and dispositions at admission, at transition points and at program completion?

Undergraduate Program

1. Admission

Data Collected:

ACT/SAT

High School Transcript

2. Entrance to Teacher Education Program

Data Collected:

3. Entrance to Student Teaching

Data Collected:

4. Exit of Program

Data Collected:

Graduate Program

1. Admission

Data Collected:

GRE scores

Undergraduate Transcript

2. Candidacy

Data Collected:

3. Exit of Program

Data Collected:

Practitioner/ Post-Baccalaureate Program

1. Admission

Data Collected:

ACT/SAT

High School Transcript

2. Candidacy

Data Collected:

3. Exit of Program

Data Collected:

Appendix H: Meeting Evaluation Summary

Participants were asked to rate small and large-group discussions and activities using the following scale:

Not Informative (1) to Very Informative (4)

Not Useful (1) to Very Useful (4)

Tallies of participants’ ratings are shown in the table below.

|Meeting Activities/Discussions |Not Informative | | |Very Informative |Not Useful | | |Very Useful |
|Scale |1 |2 |3 |4 |1 |2 |3 |4 |
|Developing an Electronic Data Collection System | |2 |4 |6 | |3 |3 |4 |
|CampusTools Demonstrations & Discussions |1 |4 |4 |3 |1 |6 | |3 |
|Defining Transition Points | |2 |4 |6 |1 |2 |3 |4 |
|Identifying Transition Points Data | |1 |5 |6 | |1 |3 |6 |

Participants were also asked to respond to open-ended questions. Responses and comments are below.

What additional support and/or tools do you need to evaluate a data collection tool (such as CampusTools)?

• Will it work with the UGA system? If not, are there additional costs associated with making it work?

• Needs to interface with SPSS and qualitative software (Nudist, Ethnographer, etc.)

• Hands-on work

• Access to the program for a longer period of time.

• Software evaluation tool/instrument.

• A rep. (faculty, Dean) from a college/university in the state of LA.

• Actually have access to CampusTools.

What additional support and/or tools do you need to assess the viability of such a reporting system for your programs?

• Capacity: UGA enrollment is 4,700+ (grad and undergrad)

• Hands-on and demonstrations

• Definitions, standardized rubrics

• Input of IT

• Actually have access to CampusTools.

What suggestions for improvements would you make for Committee sessions?

• We need to see the actual application and be able to manipulate it to see what does and does not work, what is already there, what needs to be added, etc. I think we all agree we need a database application to assist with report generation.

• Electronic meetings

• Representative from institution that currently uses this system and who helped develop the framework for his/her institution.

• I like the idea of having committee members work on, think about, and research information regarding the meeting prior to our coming together as a group.

• Not all in Statesboro, please.

Additional comments?

• What else is out there? It would be useful to know what competition is available.

• Information needed about Tk20:

o place of incorporation

o latest financial information

o size of staff, etc.; do they have the capacity to handle LA and GA institutions?

o solvency issues

o list of Board of Directors, company officers, etc.

• I think whatever state system we develop, it needs to be simple and elegant for the widest buy-in. Anything that is too complex or cumbersome will not be effective.

• Is there some way to integrate qualitative data analysis software (Nudist, perhaps)?

• Thank you for shifting the focus and abandoning the agenda.
