VCE Environmental Science 2017–2022



School-based assessment report

This report is based on the School-based Assessment Audit and VCAA statistical data.

All official communications regarding the Victorian Certificate of Education (VCE) Environmental Science Study Design are provided in the VCAA Bulletin. It is recommended that teachers individually subscribe to the VCAA Bulletin to receive updated information regarding the study. The VCE and VCAL Administrative Handbook and Important Administrative Dates are published on the Administration page of the VCAA website.

General comments

Responses to the School-based Assessment Audit for VCE Environmental Science indicate that most audited schools had either just introduced the study or had a first-time teacher of the study. The audit responses from all other schools showed that they made a successful transition from the previous study design to the reaccredited VCE Environmental Science Study Design 2017–2022.

School-based assessment enhances the validity of student assessment by providing opportunities for non-routine and open-ended environmental science contexts and applications to be explored in greater depth and breadth than is possible in an examination.

The audit process

The School-based Assessment Audit enables the VCAA to check that School-assessed Coursework (SAC) tasks comply with the VCE assessment principles and the requirements of the VCE Environmental Science Study Design.

Schools are advised to prepare their responses to the study-specific audit questionnaire using the PDF available on VASS before they begin to complete the audit online. Some questions in the audit questionnaire may require consultation with school leadership or other colleagues: for example, locating the school provider number or details of the school redemption policy.

Schools should ensure that all requested materials are submitted for the audit. Materials potentially required for submission are listed in the VCE and VCAL Administrative Handbook. If materials are not submitted as requested, the Audit Panel cannot reach a judgment as to whether the school concerned has satisfied VCAA requirements for school-based assessment.

The first stage of the audit requires schools to complete a study-specific audit questionnaire providing information about assessment planning, timelines, resources, the types of assessment tasks set and the conditions under which students sit the tasks. Most audited schools designed and used tasks that met the requirements of the reaccredited study design and the VCE assessment principles. Some schools proceeded to the second stage of the audit process due to issues such as providing insufficient information about how SAC tasks were developed. In other cases, it was not clear from the audit response whether the task/s met the definition of the task type as specified in the study design.

Most schools reported that the questionnaire was a useful undertaking, as it provided an opportunity to consider or review their practice against the new study design and helped ensure a good overview of the whole unit in terms of the relationship between key knowledge and key science skills.

Assessment planning

All schools provided an assessment timetable to students at the beginning of the school year, or occasionally during the previous year’s orientation program, to assist them in planning for assessment.
In many schools, SAC tasks were used for both formative and summative purposes.

Prior to each SAC task being undertaken, students should be given a clear and accurate statement of:

- the outcome being assessed
- the selected task type
- the requirements and conditions of the task
- the contribution of the task to the final outcome score.

Many schools provided students with the assessment rubrics that would be used to assess the SAC task before the task was undertaken. For some outcomes, multiple tasks had been selected; in this case, schools are advised to consider the length of the tasks so that the total assessment time for the outcome is not excessive.

Although most teachers reported using VCAA performance descriptors to assess student work, these are not mandated, and in some cases school-developed marking schemes may be more appropriate.

Task development

The audit has shown that most teachers have paid close attention to both the key knowledge and the key skills in the reaccredited study design and have designed content-aligned assessment tasks. Many schools have used the opportunities that come with a new study design to rethink the types of case studies investigated and practical activities undertaken by students, including the nature of discussion and assessment questions. A greater emphasis on practical skills has been noted with the new study design, particularly in identifying independent, dependent and controlled variables, formulating hypotheses, making predictions, considering sources of error, suggesting how validity, reliability, accuracy and precision may be improved in investigations, considering the importance of replication, and the graphical presentation and analysis of results.

Overall, SAC tasks addressed a wide range of key knowledge and required students to demonstrate relevant key science skills. SAC tasks were created with an understanding of the need to differentiate the performance of the student cohort and to provide the opportunity for all students to demonstrate the highest level of performance on each outcome. Prior fieldwork and laboratory activities were generally used as the basis for SAC tasks, and different task types were selected across Units 3 and 4. Many schools described how assessment tasks were developed to ensure that higher-order, more complex questions were included and weighted appropriately. Reference was made to both Marzano’s and Costa’s levels of questioning as well as Bloom’s taxonomy, with a range of command terms (from ‘describe’ and ‘list’ to ‘evaluate’ and ‘design’) being used. Schools also provided assessment rubrics and discussed performance descriptors for each outcome with students prior to conducting the assessment task. One school also offered dyslexic students the opportunity to demonstrate understanding via oral tasks.

Despite a broader range of assessment task types being offered across Units 3 and 4 of the reaccredited study design, most schools adopted a conventional approach to assessment, with very little uptake of the newer task types. Schools are advised to refer to the online VCE Environmental Science Advice for teachers resource, which outlines different approaches to assessment tasks. This publication also includes advice about developing School-assessed Coursework tasks, the nature of scientific inquiry, sample teaching and learning activities and programs, elaborations of assessment tasks, and examples of performance descriptors.

A range of methods were used to develop SAC tasks.
In most cases, new tasks were developed each year by the individual teacher. Given the location-specific nature of most assessment tasks, many teachers used teacher networks and/or Environmental Science teachers at other schools to cross-check task validity. Although constructing unique tasks each year is time-consuming, schools are reminded of the importance of returning SAC tasks to students so that feedback can be provided.

A significant audit issue related to schools using past VCAA examination questions and/or other schools’ SAC task questions and scenarios as part of their own SAC task. Although these had been checked for alignment with the VCE Environmental Science Study Design, schools were reminded that significant modifications must be made to all questions used, to ensure the task is unique to the school. Past examination questions (including associated examination reports) are available in the public domain and therefore are accessible to students. Tasks developed collaboratively with other teachers and/or accessed through environmental education support groups must also be modified, since it is possible that students can access them. A useful modification strategy adopted by some schools was to adapt scenarios to their own local contexts and to strengthen the newer inquiry aspects of the study design related to hypothesis formulation, identification and control of variables, data analysis, and consideration of accuracy, precision, validity and reliability. Another useful strategy was to add relevant investigation data (for example, de-identified student-generated data from previous years) for students to compare with the data in their own logbooks and to comment on the comparisons and conclusions that could be drawn.

A further audit issue related to the assessment principle of ‘balance’. A few schools included multiple examination-style assessment tasks across Units 3 and 4. Schools are advised to develop a suite of assessment tasks across each unit that provides a range of opportunities for students to demonstrate their knowledge and skills in different contexts and through different assessment task types.

A significant audit issue related to over-assessment. Although all outcomes in each area of study must be met by students in order to be awarded an ‘S’ for a unit, a representative selection of the relevant key knowledge and key skills is sufficient as the basis for SAC task development. Many assessment tasks were overly lengthy or tried to encompass every aspect of the key knowledge and key skills in a single task.

All SAC tasks were fully supervised, and material requirements were, in general, similar to external examination requirements, including pens, pencils, highlighters, erasers, sharpeners, rulers, a scientific calculator and a clear water bottle. Sometimes students were also instructed as to what materials could not be used during SAC tasks; usually blank sheets of paper and mobile phones were not permitted.

The VCAA performance descriptors, or modified versions thereof, provided the assessment framework for many assessment tasks. Schools are reminded that these performance descriptors are not mandatory. For tasks involving a series of questions, a marking scheme is more appropriate.

Authentication

Schools must be aware of the authentication requirements set out in the VCE and VCAL Administrative Handbook. Any work set over an extended period should include a process for authentication of student work.
This applies particularly to Unit 3 work, such as background research related to a threatened endemic species or preparation of flowcharts related to strategies for the management of a specific environmental science case study, and to the Unit 4 Area of Study 3 student-designed investigation. Teachers must monitor and record each student’s work through to completion. The teacher may also consider it appropriate to ask the student to demonstrate understanding of the task at the time of submission of the work.

All audited schools indicated that SAC tasks were completed under teacher supervision, making authentication of student work less problematic. Authentication processes also included schools requiring that student logbooks be kept at school for those assessment tasks that involved preparatory laboratory and/or fieldwork. Some schools reported that satisfactory authentication was facilitated by small class sizes and that collecting enough evidence for authentication was not an issue.

Schools with multiple classes and more than one teacher indicated that marking consistency was achieved by using a prepared answer sheet, discussion and/or cross-marking. For schools with only one Environmental Science class, marking validation was often achieved by working with another Environmental Science teacher or the Science Coordinator, either within the school or at a different school, to mark a sample of ‘top’, ‘middle’ and ‘low’ student work. These practices are important to ensure an accurate student rank order is attained.

One school reported a faculty-based approach to monitoring and authentication of the Unit 4 Area of Study 3 assessment task by providing students with an experimental investigation booklet that detailed a common set of stages for the investigation, methods of assessment including rubrics and allocated timeframes, and minor subject-specific modifications. All audited schools had in place thorough and appropriate processes related to authentication of student work and student redemption of an ‘N’ outcome.

Practical work

Practical work is an important part of the VCE Environmental Science Study Design. Most schools followed the study design recommendations related to hours of practical work to be undertaken in each area of study, with most schools approaching the upper limits. The format of practical work ranged from 15-minute teacher demonstrations to student-led investigations conducted over a period of weeks. One school reported extensive use of practical work (20 hours in Unit 3 Area of Study 1) as a hands-on, experiential means through which concepts could be taught and assessed.

Many schools used an extensive range of ecological monitoring equipment, including data loggers and relevant probes. In some schools, teachers and/or laboratory technicians constructed their own equipment: for example, an aquaponics installation, a small wind generator with wind speed meter, solar panel models, and a mini hydro-electric power generator. Other schools had limited resources and accessed resources from other schools and/or organisations such as the EPA and WaterWatch.

Fieldwork

All audited schools undertook fieldwork in both areas of study for Unit 3.
A range of sites external to the school, sometimes involving external experts and/or equipment, were accessed to support learning and the development of SAC tasks, including school-based fieldwork, local fieldwork and site visits to consolidate analysis and evaluation of case studies.

In some cases, Unit 3 fieldwork was selected to cover both Outcomes 1 and 2. For example, investigations related to water or soil quality – both within the school and/or at local sites – were used to enable students to generate data related to biodiversity (Outcome 1) as well as environmental management (Outcome 2). Both natural and ‘built’ environments may be suitable for study.

Few schools reported undertaking fieldwork in Unit 4, generally limiting practical experiences to laboratory work and simulations. With Area of Study 2 and Area of Study 3 being new inclusions in the study design, schools may consider local landmarks and points of interest as potential future fieldwork sites.

Student-designed practical investigation

The Unit 4 Outcome 3 student practical investigation can be undertaken at any time across Units 3 and/or 4. Almost 80 per cent of schools undertook this assessment task before or during Unit 4, as shown in the following graph:

[Graph: timing of the Unit 4 Outcome 3 practical investigation]

The recommended timeframe for the investigation is between 7 and 10 hours. Most audited schools followed this recommendation, often including half a day for undertaking the associated preparatory fieldwork.

The audit findings also showed that, while just over 40 per cent of student investigations were based solely on Unit 3, almost a third of schools offered students a choice of topics across Units 3 and 4, as shown in the graph below:

[Graph: unit basis of student investigation topics]

Often, a field trip preceded the investigation. A ‘coupled inquiry’ approach was also used to develop student questions following a set of general class activities that included data generation. In most cases, a single field trip provided sufficient opportunities for each student to generate their own investigation topic or question. Schools are reminded that the scientific poster title should be a question, so ‘Human impacts on the food sources of Port Phillip Bay’s dolphins and seals’ should be rephrased as ‘How do humans impact on the food sources of Port Phillip Bay’s dolphins and seals?’

All audited schools reported appropriate management (including authentication) strategies for the student investigation. Successful strategies included: appropriate lead-in class investigations; activities and secondary research prior to beginning the investigation; class discussion of scientific measurement and the assessment rubric prior to undertaking the task; use of half-day or full-day field trips to enable students to generate primary data; teachers signing off on logbooks after each lesson; progressive marking of logbook and/or scientific poster sections; writing up poster sections under test conditions; and a summary five-minute oral presentation to the class.

Successful approaches to developing the SAC task included:
- providing the same overarching question to the class (e.g. ‘How does the distance from the river affect biodiversity?’), with students then working to determine how the question variables could be measured, using a coupled inquiry with an initial question (e.g. ‘What methods do environmental scientists use to study biodiversity within an ecosystem?’) followed by specific questions developed by students: for example, ‘What impact have humans had on the upper and middle storey vegetation in areas of Wombolano State Forest?’
- providing students with a general topic, such as biodiversity, with some students investigating the effectiveness of conservation areas while others looked at bird or invertebrate biodiversity
- combining fieldwork sampling and laboratory work: for example, ‘Is water quality consistent across all habitats?’

Some student investigations combined different areas of study in a unit. For example, the question ‘How does the percentage of cloud cover affect the efficiency of electricity generation by a photovoltaic system?’ linked Unit 4 Areas of Study 1 and 2. It is also appropriate to link areas of study across Units 3 and 4.

Each student should be assessed on their individual capacity to design, undertake and report on an investigation. In cases where schools have multiple classes or large numbers of students in a single class, it may not be practicable for each student to undertake a unique investigation. In these cases, it is acceptable for students to work in groups to generate data after they have been assessed on their capacity to design an investigation. Teachers must approve all student investigations before they are undertaken; not all planned student investigations can proceed, due to issues including safety, equipment availability, time constraints and/or management of large student numbers.

Sometimes student practical investigations may not yield enough data for analysis. For example, a student investigation related to the effects of different salt concentrations on the growth of Chlorella yielded no growth at any salt concentration. In this case, time constraints prevented the student from repeating the experiment, so the student was provided with secondary data on which to base data analysis and the subsequent production of a scientific poster. Assessment of the student’s capacity to design an experiment was based on the student’s original work, whereas all other aspects of assessment were based on analysis and evaluation of the secondary data.

Schools are reminded that an ‘investigation’ can take many forms, as outlined in the VCE Environmental Science Advice for teachers. The focus is on developing questions leading to the generation of primary data and its subsequent analysis and evaluation. Construction-type investigations, such as building and testing the effectiveness of bird boxes or developing a climate monitoring system, would be appropriate for this area of study. Any time spent working on the constructions outside class time should be recorded in students’ logbooks and may include photographs as a record of progress.

Schools are advised to check the VCE Environmental Science Advice for teachers resource, which has an extensive list of possible topics. However, schools are reminded that since these resources are available in the public domain, they must be modified prior to use as an assessment task.

VCAA reports

Several VCAA reports are available to assist in informing teaching and assessment practices. Schools used Examination Reports, Statistical Moderation Reports and School-assessed Coursework Reports to improve the learning outcomes of students and to create assessment tasks that met the VCE assessment principles.
The reports were used at the individual teacher level and, in many cases, at the departmental, faculty and/or school level.

Specific information

Unit 3: How can biodiversity and development be sustained?

For Unit 3, students are required to demonstrate the achievement of two outcomes. In addition, students may undertake the Unit 4 Outcome 3 task (the design and undertaking of a practical investigation related to biodiversity or energy use from an environmental management perspective, and the presentation of methodologies, findings and conclusions in a scientific poster) across Units 3 and/or 4. Practical work averaged 6 hours for each of Areas of Study 1 and 2. Most commonly, assessment tasks involved written reports or presentations, as shown in the following graph:

[Graph: Unit 3 assessment task types selected by audited schools]

Area of Study 1: Is maintaining biodiversity worth a sustained effort?

A few audited schools allowed some degree of choice in the SAC task by allowing students to select a threatened endemic species of interest while requiring that all students respond to the same set of questions, making the task equitable. A range of species were investigated, including the Eastern Barred Bandicoot, the Helmeted Honeyeater, the Regent Honeyeater, and the Pink-tailed Worm-lizard. One school also considered a threatened plant species by comparing Leadbeater’s Possum with the Basalt Greenhood, an orchid.

All audited schools used Simpson’s Index as a measure of biodiversity, as stipulated in the study design, with other indices being used as a comparison: for example, the Shannon-Wiener Index and Jaccard’s Index of Similarity.
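For teachers cross-checking student calculations, the sketch below shows one commonly used form of these indices. It is illustrative only: the quadrat counts are hypothetical, and the exact form of Simpson’s Index to be applied (D, or the diversity measure 1 − D) should be confirmed against the study design.

```python
# Illustrative sketch only: computes biodiversity indices from hypothetical
# species counts. Confirm the required form of Simpson's Index against the
# study design before using this with students.
import math

counts = {"ant": 40, "beetle": 25, "spider": 10, "millipede": 5}  # hypothetical quadrat data

N = sum(counts.values())  # total number of individuals sampled

# One common form of Simpson's Index: D = sum(n(n-1)) / (N(N-1)),
# with diversity often reported as 1 - D (higher = more diverse)
D = sum(n * (n - 1) for n in counts.values()) / (N * (N - 1))

# Shannon-Wiener Index: H = -sum(p * ln(p)) over species proportions p = n / N
H = -sum((n / N) * math.log(n / N) for n in counts.values())

print(f"Species richness: {len(counts)}")
print(f"Simpson's Index D = {D:.3f} (diversity 1 - D = {1 - D:.3f})")
print(f"Shannon-Wiener Index H = {H:.3f}")
```

Comparing the simple species count (richness) with these proportion-weighted indices can also support the distinction between ‘species richness’ and ‘species diversity’ discussed later in this report.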
The most common assessment task type was a response to structured questions. High-quality data analysis questions utilised data from students’ logbooks as well as current data obtained from relevant environmental organisations and/or their websites. Commonly the structured questions related to field trip data, but in some cases ‘unfamiliar’ scenarios were used to assess students’ capacity to apply their knowledge and skills to new contexts.

While multimodal tasks were not commonly chosen by schools, they provided good opportunities to assess scientific knowledge as well as communication skills. At one audited school, students each selected a different threatened endemic species, conducted library research using a set of scaffolded questions, and presented responses in a set of PowerPoint slide templates, including speaking notes. At another school, students created a website on the school’s intranet with a focus on explaining the importance of maintaining biodiversity.

Fieldwork on which SAC tasks were based involved both schoolyard and local environments. Schoolyard fieldwork included ant baiting, sampling using quadrats, bird identification, soil invertebrate studies and comparisons of the biodiversity of different areas and/or across different school campuses. Local fieldwork included investigation of soil and water systems, scat comparisons, snorkelling counts, and biodiversity management strategy analysis.

Area of Study 2: Is development sustainable?

The most popular assessment task type involved a written report or a response to structured questions. The ‘report’ took a few forms, including a booklet and a field report. A set of prompts or scaffolded questions was often provided to students: for example, ‘Why is the implementation of ecologically sustainable development important in today’s world?’; ‘How does <the selected environmental management project> contribute to ecologically sustainable development?’; and ‘How can the main stakeholders involved in <the selected environmental management project> influence decision-making?’

In all audited schools the SAC task related directly to a field trip undertaken by students prior to the SAC task and/or to case studies investigated in class. In some audited schools, this was supplemented by modified VCAA examination questions. Schools analysed and evaluated a range of case studies and/or sites to explain the principles of sustainability and environmental management: for example, Barwon Water Black Rock Water Reclamation Plant, City of Bendigo Urban Stormwater Management Plan, Organ Pipes National Park, Ravenswood Interchange Development, Western Treatment Plant, and Wonga Wetlands. One innovative approach to the task was to compare the construction of the Mordialloc bypass (a ‘before’ example) with the outcomes of the construction of the Frankston bypass (an ‘after’ example). Another interesting approach, which allowed for student choice, was for students to undertake fieldwork at two different local wheat farms – biodynamic and organic – as a comparison with traditional farming techniques.

(If conducted during Unit 3): Unit 4 Outcome 3 (Area of Study 3: practical investigation)

Most audited schools based the investigation on student interest and choice, including whether the investigation utilised fieldwork (e.g. ‘Is predator-proof fencing an effective way to conserve species?’) or controlled experiments (e.g. ‘How does the concentration of acid (simulating acid rain) affect the germination of radish seeds?’) as a methodology. This had been preceded by relevant activities such as fieldwork at Mt Rothwell Biodiversity Centre or CERES, as well as biodiversity activities in the school grounds. In another approach, fieldwork was combined with the use of databases to make comparisons (e.g. ‘Is biodiversity greater at the Portsea or Mornington piers?’) or to evaluate environmental management strategies (e.g. ‘Seahorses and blue gropers are protected in Victoria: are the management strategies for these species working?’). This latter approach involved developing students’ skills in snorkelling and in using the fish count method with underwater slates as preparation for the SAC task.

In some schools, ‘citizen science’ opportunities arose that were suitable as the basis for student investigations and SAC tasks; for example, one audited school participated in a regional audit of birdlife organised by BirdLife Australia. The project allowed students to use Simpson’s Index to analyse the data they obtained and to consider the difference between ‘species richness’ and ‘species diversity’.
Identified strengths of the bird audit project were that it was relevant to the local area, gave clear results that could be evaluated, and allowed students to see how science is applied to investigate science-based issues.

Schools that undertook this outcome during Unit 3 used it for both formative and summative purposes, particularly in developing students’ capacity to design their own investigations and to critique the investigations of others, including published research.

Unit 4: How can the impacts of human energy use be reduced?

For Unit 4, students are required to demonstrate the achievement of three outcomes. Outcome 1 allows a choice of at least one task from a set of six task options, while Outcome 2 allows a choice of one task from a set of four task options. Although the Unit 4 Outcome 3 task may be undertaken across Units 3 and/or 4, most audited schools undertook the task before the start of Unit 4. Practical work averaged 4.5 hours for Area of Study 1 and 4 hours for Area of Study 2. The student investigation for Outcome 3 was staged on average over 9.5 hours.

Tasks selected by audited schools are shown in the following graph (note that percentages add up to more than 100 since multiple task selections could be made):

[Graph: Unit 4 assessment task types selected by audited schools]

The audit showed that SAC tasks selected by schools were similar to those undertaken by students in previous years. The newer SAC task types, such as a model of energy/climate concepts, a graphic organiser, a media response and a reflective learning blog, provide good opportunities to assess a broader range of scientific skills and employability skills/capabilities than has been possible previously. Schools are reminded that students should be provided with a variety of opportunities and task types to demonstrate a range of skills and knowledge.

Area of Study 1: What is a sustainable mix of energy sources?

Assessment tasks may be combined for efficiency and to assess different skills. One audited school appropriately included a media article about the proposed diesel/gas peak-load power generator for South Australia as part of a ‘response to structured questions’ task. In many audited schools, the structured questions task also involved a SWOT analysis chart for the evaluation of different energy sources, often a comparison of a non-renewable and a renewable energy source for a particular purpose. Such tasks can be readily modified from year to year by changing the scenario and/or the energy sources.

Data analysis tasks should require students to plot data and/or compare different data sets. Typically, students were provided with previously unseen data sets and required to produce a report that recommended and justified a sustainable mix of energy sources at a particular location for a specified period: for example, for the next five years compared with the next 30 years. Data was sourced from the Clean Energy Council, CSIRO and the Australian Renewable Energy Agency.
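For teachers assembling such data sets, the sketch below illustrates one way a two-scenario comparison could be charted. The energy sources and percentage figures are invented for illustration only and are not drawn from any of the organisations named above.

```python
# Illustrative sketch only: plots a hypothetical comparison of two energy-mix
# scenarios of the kind students might be asked to analyse and justify.
import matplotlib.pyplot as plt

sources = ["Coal", "Gas", "Wind", "Solar", "Hydro"]
share_5yr = [55, 20, 12, 8, 5]     # hypothetical percentage mix, next 5 years
share_30yr = [15, 15, 35, 25, 10]  # hypothetical percentage mix, next 30 years

x = range(len(sources))
width = 0.35  # offset paired bars so the two scenarios sit side by side
plt.bar([i - width / 2 for i in x], share_5yr, width, label="Next 5 years")
plt.bar([i + width / 2 for i in x], share_30yr, width, label="Next 30 years")
plt.xticks(list(x), sources)
plt.ylabel("Share of electricity generation (%)")
plt.title("Hypothetical energy mix scenarios for comparison")
plt.legend()
plt.show()
```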
While multiple tasks may be selected, the same task type should not be selected for the same outcome. Schools should also ensure that the VCE assessment principle of ‘balance’ is adhered to by selecting varied task types across the unit.

Schools should be careful to distinguish between the SAC tasks themselves (e.g. a report or a response) and the formative preparation that leads up to undertaking the SAC task (e.g. completing practical work or discussing energy-related data from different sources using different representations). This will minimise issues associated with perceptions of over-assessment.

Area of Study 2: Is climate predictable?

A good range of key science skills and key knowledge was assessed through the SAC tasks in this area of study. Audited schools generally met the task length guidelines specified in the study design, although some schools indicated assessment times between 2.5 and 3 hours. These schools progressed to the next stage of the audit to clarify whether this time included formative work.

Assessment tasks using structured questions were often based on provided scenarios that were unfamiliar to students, or required students to research a range of web-based articles to provide responses to questions such as ‘What evidence is present in the articles that would help you argue that there is currently a change in Earth’s climate?’ Schools are reminded that work completed outside class should be accompanied by a signed Authentication Record for School-based Assessment form, available through VASS.

Some schools interpreted the ‘response to structured questions’ task as a test, and often replicated the format of external VCE examinations. This task should instead be designed around stimulus material, with a set of questions related to relevant key knowledge and key science skills. Valid questions for this task could include interpretation of data presented in the form of tables, graphs or images, and the use of this data to explain and/or describe various climate phenomena. Stimulus materials included experimental set-ups, research snippets, media items and/or inventions such as carbon storage technologies and low-carbon-emission devices. It is not appropriate for this task type to include multiple-choice questions.

Although simulations are appropriate practical activities as the basis of the ‘report of a student investigation’ task type, they should not be the preferred option for practical work where equipment is readily available: for example, experiments related to atmospheric monitoring. Practical experiments expose students to data ‘outliers’ as well as to the practice of experimental replication. A different assessment task type – ‘a model of climate concepts’ – would include activities such as simulations and climate modelling, including the use of secondary data.

Area of Study 3: Practical investigation

Specific dates for the monitoring of stages should be provided to students before the start of the task.

Audit findings were that almost 30 per cent of schools provided a choice of topic across Units 3 and 4 rather than restricting the investigation to a single topic. The same assessment rubric (often the VCAA performance descriptors, or a modified version) was used to ensure comparability of task scope and demand.

While most student investigations extended scientific understanding (e.g. ‘How is the presence of migratory birds in a wetland environment related to species diversity?’), others were based on applying scientific skills (e.g. ‘How is bait type related to the effectiveness of insect traps, and what are the implications for biodiversity audits?’). Many schools focused on the development of hypotheses using an ‘If … then …’ structure: for example, ‘If controlled burning does not impact negatively on environmental sustainability, then bird diversity in a controlled burn site should not be affected when compared with a “control” site.’

Schools are advised to assist students in narrowing the topic of investigation so that it is manageable within allocated timeframes.
Often this requires greater specificity in topic definition. For example, a question such as ‘What factors affect the species diversity of soil?’ could be narrowed to ‘How is the species diversity of soil affected by <factor>?’ following preliminary tests on soil for factors such as light, moisture, pH and nutrient concentration.

Audit concerns for Outcome 3 included:

- investigation titles that were not phrased as a question
- investigation questions that did not rely on practical experimentation to find an answer
- investigation questions that were too broad
- investigation questions that did not involve the generation of primary data
- students who were provided with a topic and possible variables, and therefore did not have an authentic opportunity to work independently to design their own investigations
- investigations that were unsafe, too expensive or did not meet ethical guidelines
- investigation questions that could not be investigated in the allocated timeframe
- investigation questions that could simply be answered with ‘yes’ or ‘no’
- investigations that did not identify the variables being investigated and/or controlled
- investigations that were difficult to complete successfully
- too much time spent on researching background information related to topics outside the scope of the study design
- confusion between ‘affect’ and ‘effect’ in investigation titles
- too much time spent on poster production
- schools that accepted submission of multiple drafts.

Schools may provide data to students in situations where, after designing and running their own investigation, students do not generate a viable set of data that can be analysed.