NQC 26 - Item 09c - Att C - Materials used in workshops



Workshop materials

Communication and Dissemination NQC products: validation and moderation

June 2011

Contact

NQC Secretariat

TVET Australia

Level 21/390 St Kilda Road Melbourne Vic 3004

Telephone: +61 3 9832 8100

Email: nqc.secretariat@.au

Web: .au

Disclaimer

This work has been produced on behalf of the National Quality Council with funding provided through the Australian Government Department of Education, Employment and Workplace Relations and state and territory governments. The views expressed herein are not necessarily those of the Australian Government or state and territory governments.

Acknowledgement

This work was undertaken by Andrea Bateman and Chloe Dyson of Quorum Australia Pty Ltd as part of a project commissioned by the National Quality Council in 2011, with funding provided through the Australian Government Department of Education, Employment and Workplace Relations and state and territory governments. The materials were used to support the national workshops conducted as part of that project.

The materials are reproduced in this document as Word documents to enable users to adapt them within their own contexts.

©Commonwealth of Australia 2011

This document is available under a “Preserve Integrity” licence.

For details:

All other rights reserved. For licensing enquiries contact sales@.au


Activity 1: Consulting with industry

In your groups, discuss what input employers (you might wish to specify a vocational area) could provide to help develop valid assessment tools and processes.

For each of the following scenarios, note down two or three questions you could ask employers, and how the responses would inform the development or review of assessment tools and/or processes.

***********************************************************************

Scenario 1

You are consulting with employers who have agreed to take students on work placement:

Questions:

Impact on assessment tools and/or processes:

***********************************************************************

Scenario 2

You are introducing a new VET qualification. All assessment will be carried out at the school so you need to construct a simulated environment for assessment purposes. What questions can you ask employers to assist you to develop an environment that will meet AQTF requirements? Refer to the next page for the AQTF definition of a simulated environment.

Questions:

Impact on assessment tools and/or processes:

A simulated work environment[1]

The requirement for a unit of competency to be assessed in a simulated workplace environment may be identified either within the unit of competency itself or within the relevant Training Package Assessment Guidelines.

A simulated workplace may be required for the following reasons:

• The learner may not have access to a workplace.

• The available workplace may not use the relevant skill, equipment or process.

• Conducting assessments may be disruptive or interfere with work requirements, e.g. there may be ethical, privacy or confidentiality issues to consider.

• It may not be appropriate to apply the skills in the workplace due to potential risks such as health and safety or equipment being damaged.

For the purposes of assessment, a simulated workplace may be described as one in which all of the required skills, with respect to the provision of paid services to an employer or the public, can be demonstrated as though the business were actually operating.

In order to be valid and reliable, the simulation must closely resemble what occurs in a real work environment.

The simulated workplace should involve a range of activities that reflect real work experience. The simulated workplace should allow the performance of all of the required skills and demonstration of the required knowledge.

It is critical that when a simulated workplace is being set up, the assessor is thoroughly familiar with the competency standard/s as well as experienced in the current circumstances and environment of the workplace.

In deciding whether a simulation or an assessment environment has been adequately set up, the following should be considered.

Are there opportunities to:

• Test the full range of equipment

• Use up-to-date equipment and software

• Reflect times and deadlines

• Show the complexity of dealing with multiple tasks

• Involve prioritising among competing tasks

• Deal with customers, including difficult ones

• Work with others in a team

• Communicate with diverse groups

• Find, discuss and test solutions to problems

• Explore health and safety issues

• Answer practically oriented, applied knowledge questions

• Show the level of written and verbal expression sufficient for, but not exceeding, the work requirements.

Activity 2: Self-Assessment

The following self-assessment is useful for the assessor when reviewing the administration, scoring, recording and reporting components of an assessment tool.

Check to see that the tool has the following information documented to enable another assessor to implement the tool in a consistent manner.

The Context
• The purpose of assessment (e.g. formative, summative)
• Target group (including a description of any background characteristics that may impact on performance)
• Unit(s) of Competency
• Selected methods
• Intended uses of the outcomes

Competency Mapping
• Mapping of key components of the task to the Unit(s) of Competency (see Template A.2)

Information to candidate
The nature of the task to be performed (how). This component outlines the information to be provided to the candidate, which may include:
• Standard instructions on what the assessor has to say or do to get the candidate to perform the task in a consistent manner (e.g. a listing of questions to be asked by the assessor)
• Required materials and equipment
• Any reasonable adjustments allowed to the standard procedures
• Level of assistance permitted (if any)
• Ordering of the task(s)

Evidence from candidate
• The response format, i.e. how the candidate will respond to the task (e.g. oral response, written response, creating a product and/or performance demonstration)

Decision-making rules
• Instructions for making Competent/Not Yet Competent decisions (i.e. the evidence criteria)
• Scoring rules if grades and/or marks are to be reported (if applicable)
• Decision-making rules for handling multiple sources of evidence across different methods and/or tasks
• Decision-making rules for determining authenticity, currency and sufficiency of evidence

Range and conditions
• Location (where)
• Time restrictions (when)
• Any specific assessor qualifications and/or training required to administer the tool

Materials/resources required
• Resources required by the candidate
• Resources required by the assessor to administer the tool

Assessor intervention
• Type and amount of intervention and/or support permitted

Reasonable adjustments
• Justification that the alternative procedures for collecting candidate evidence do not impact on the standard expected by the workplace, as expressed by the relevant Unit(s) of Competency

Evidence of validity
• The assessment tasks are based on or reflect work-based contexts and situations (i.e. face validity)
• The tool, as a whole, represents the full range of skills and knowledge specified within the Unit(s) of Competency (i.e. content validity)
• The tool has been designed to assess a variety of evidence over time and contexts (i.e. predictive validity)
• The boundaries and limitations of the tool are stated in accordance with the purpose and context for the assessment (i.e. consequential validity)
• The tool has been designed to minimise the influence of extraneous factors (i.e. factors that are not related to the unit of competency) on candidate performance (i.e. construct validity)
• The tool has been designed to adhere to the literacy and numeracy requirements of the Unit(s) of Competency (i.e. construct validity)

Evidence of reliability
• There is clear documentation of the required training, experience and/or qualifications of assessors to administer the tool (i.e. inter-rater reliability)
• The tool provides model responses and/or examples of performance at varying levels (e.g. competent/not yet competent) to guide assessors in their decision making (i.e. inter- and intra-rater reliability)
• There are clear instructions on how to synthesise multiple sources of evidence to make an overall judgement of performance (i.e. inter-rater reliability)
• If marks or grades are to be reported, there are clear procedures for scoring performance (e.g. marking guidelines, scoring rules and/or grading criteria) (i.e. inter-rater reliability)

Recording requirements
• The type of information to be recorded
• How it is to be recorded and stored, including duration

Reporting requirements
• What will be reported and to whom
• The stakes and consequences of the assessment outcomes

Supplementary information
• Any other information that will assist the assessor in administering the tool and judging the performance of the candidate

What gaps, if any, can you identify in the tool?

______________________________________________________________________________________________________________________________________________________________________________________________________

Discuss the pros and cons of including such additional information within the tool.

Activity 3: Assessment Quality Management

What strategies does your RTO implement to ensure quality assessments within its programs?

For each approach, note the strategies implemented:

Quality Assurance

______________________________________________________________________________________________________________________________________________________________________________________________________

Quality Control

______________________________________________________________________________________________________________________________________________________________________________________________________

Quality Review

______________________________________________________________________________________________________________________________________________________________________________________________________

Table 1: Examples of processes for enhancing quality assurance, quality control and quality review of assessments in vocational education and training contexts.

Quality Assurance (input approach)
Examples include:
• Industry competency standards as the benchmarks for assessment
• National assessment principles
• Minimum qualifications for assessors (i.e. TAA40104)
• Development of a Professional Code of Practice
• Standardisation of reporting formats
• Assessment Guidelines and Policy Documents
• Benchmark examples of varying levels of performance
• Assessment tool banks
• Common assessment tasks
• Exemplar assessment tools
• Panelling, piloting and/or trialling of assessment tools
• Professional development programs/workshops for assessors

Quality Control (outcome approach)
Examples include:
• Moderation, in which adjustments to assessor judgements are made to overcome differences in the difficulty of the assessment tool and/or the severity of the judgement

Quality Review (retrospective approach)
Examples include:
• Monitoring and auditing of registered training organisations
• Review and validation of assessment tools, processes and outcomes to identify future improvements
• Follow-up surveys with key stakeholders (e.g. student destination surveys, employer feedback on how well the assessment outcomes predicted workplace performance)

Source: A Code of Professional Practice for Validation and Moderation, NQC 2009 (Prof. Shelley Gillis and Andrea Bateman, Work-based Education Research Centre, Victoria University, in conjunction with Bateman and Giles Pty Ltd)

Table 2: Comparison of the assessment quality management processes currently being implemented (or proposed)

Case study: The TAE10 network of VET practitioners

Assessment roles and responsibilities: Assessments are conducted at the individual RTOs. The role of the network is simply to provide opportunities for continuous improvement, by providing opportunities for sharing of resources and good practice, validating tools, and disseminating information across RTOs within the state delivering the same qualification.

Quality assurance:
• Minimum requirements for network members
• Establishment of a standardised checklist for tool review
• Establishment of a network validation coordinator
• Delivery of professional development on assessment and validation

Quality control: No processes in place due to logistical challenges and costs; the network would, however, like to explore e-moderation options.

Quality review:
• Establishment of a validation network across RTOs (using a consensus approach)
• Quarterly meetings
• At this point in time, only the tools have been reviewed, not evidence of judged candidates' work; the network is looking to expand its role to include this activity
• At network level there is no authority to follow up on any recommendations, nor any requirement to report back to the network on actions undertaken; the onus is entirely on individual members to implement changes to the tool in their own RTO
• No formal process for internal review of the validation network

Case study: Beauty Training Package network of practitioners

Assessment roles and responsibilities: Assessments are conducted at the individual RTOs; there is no joint delivery of assessment or training across the member RTOs. The role of the network is to provide opportunities for continuous improvement, by providing opportunities for sharing of resources and good practice, validating tools, and disseminating information, using a collaborative approach to raise perceptions of the beauty industry in SA.

Quality assurance:
• Minimum requirements for network members
• Establishment of a Network Coordinator, a role currently undertaken by the SA Industry Skills Council
• Accessing professional development opportunities for assessment and validation
• Proposed establishment of terms of reference and operation for the network

Quality control: No processes in place due to logistical challenges and costs.

Quality review:
• Proposed establishment of a process for quality review of assessment online
• Proposed participation in validation by industry experts not connected to an RTO
• Currently, only informal assessment validation between network members

Case study: School-based traineeship (partnership between school, TAFE and enterprise)

Assessment roles and responsibilities: Each partner organisation conducts its own assessments of the same cohort of students against the same units of competency (or a subset of the units), but the partners come together to validate tools and outcomes. They are planning to introduce assessment panels through staff exchange.

Quality assurance:
• Establishment of an Assessment Quality Management Group (QAMG) with representation of all partner organisations
• Minimum requirements for partnership members
• Introduction of assessor exchange to facilitate the establishment of assessment panels across the workplace and classroom contexts
• Establishment of terms of reference between partner organisations
• Panelling of new tools being developed in either context prior to use
• Establishment of a Learning Contract for the student outlining roles and responsibilities in both workplace and classroom contexts

Quality control: Using a consensus approach to moderation, the panel will:
• Identify at-risk students to provide early intervention (monitor adequate student progress)
• Monitor the students' adherence to their Learning Contracts
• Resolve discrepancies between the on- and off-the-job assessments in relation to competence prior to finalisation of results

Quality review:
• In the establishment phase of the QAMG, meetings will be monthly, with the long-term view of extending the timeframe throughout the maintenance phase
• Validate existing tools and/or any customisation using a consensus approach
• Minutes recorded of both moderation and validation activities and outcomes, with subsequent meetings requiring review of evidence of the actions undertaken within the given timeframes
• Introduction of an internal review process in which the principles underpinning the NQC (2009a) Code would form the focus of the review

Case study: A small regional private RTO operating across three states

Assessment roles and responsibilities: Assessments are conducted by a small number of sessional trainers, all practising tradesmen selected from their own region to provide training and/or assessment services. All have their own businesses, so training and assessment is not their core business. The RTO uses a mix of simulated and workplace assessments, underpinning knowledge assessment, and third-party reports. The Director, if required, provides supervision and co-assessment processes for those trainers/assessors without the requisite training and assessing competencies.

Quality assurance:
• Assessment panels formed to meet minimum requirements as a collective group (e.g. trade and/or Cert IV TAA04)
• Procedure for validation
• Procedure for assessment and record keeping
• Complaints and appeals procedure
• Standardised assessment tools

Quality control: No processes in place due to logistical challenges and costs.

Quality review:
• Informal assessor partnerships among delivery teams where distance allows (the RTO is aiming to formalise this process to satisfy AQTF requirements)
• Annual validation meeting to deliver professional development and review training and assessment tools/materials internally

Case study: A large public RTO

Assessment roles and responsibilities: Employers participate in practical assessment where there is a workplace component; the RTO conducts simulated assessments.

Quality assurance:
• Minimum requirements for assessors
• Intends to develop a set of professional teaching standards and a graduate diploma in teaching that will embed lesser qualifications (e.g. TAE10)
• Policy and procedures on assessment
• Draft validation guidelines and forms
• Development of standardised assessment tools

Quality control: No processes in place due to logistical challenges and costs.

Quality review:
• Formal consensus validation approach, as well as informal assessor partnerships
• In the past, the focus has largely been on the assessment tool, but the RTO is moving toward reviewing judged candidate evidence
• A self-assessment and review cycle to be undertaken by each teaching team at both the beginning of the year (self-assessment) and the end of the year (review)

Source: Validation and Moderation in Diverse Settings (NQC, December 2010)

-----------------------

[1] DEEWR (2010) AQTF Users’ Guide to the essential conditions and standards for continuing registration
