Harvard University



Sociology 1130—Higher Education Policy and Service: On Campus and Beyond
Final Research Paper

Gauging and Engaging: Surveying Harvard’s First-Year Outdoor Program

Abstract: Harvard University’s First-Year Outdoor Program (FOP) provides incoming students with social support and self-reflection in a wilderness context. However, though it is an organization invested in leadership development and constant growth, FOP lacks the up-to-date survey tools needed to properly measure the success of its programming. The central aim of this action research project is to assemble an improved tool that can measure student experience during preorientation. By combining one-on-one interviews and focus groups with members of the community and experts, and by bridging past tools with modern analysis, this study formulates a new survey that will be officially implemented in August 2019. The new survey is guided by a set of key findings, summarized by the need for a tool that creates space to share constructive criticism, breaks down and evaluates the experience quantitatively by specific practice, and removes confusing jargon from the wording of the survey. The tool will also be implemented with greater leader involvement in order to ensure that the survey is filled out at a significant rate. As a whole, this project has improved the tools available to FOP’s Steering Committee to make decisions, and should help inform a new generation of leaders as they endeavor to make FOP into the transformative and affirming force that it sets out to be.

Author: Rick Li
Professor: Dr. Manja Klemenčič
Teaching Fellow: Nicolette Bardele
Date of Submission: 08 May 2019

Honor Code Affirmation

I affirm my awareness of the standards of the Harvard College Honor Code.

Rick Li

I would also like to take this space to make some acknowledgements. I would like to thank Manja and Nicolette for their guidance throughout this project. I would like to thank all of my wonderful classmates and the entire FOP community for inspiring me and reminding me why I serve as a student leader. I would like to thank all of the University offices and the OIR for creating the infrastructure that makes this work possible. Finally, I would like to thank our outgoing Director Paul “Coz” Teplitz for a decade of service both to this program and to generations of students.

Executive Summary—Blueprint for Action

FOP’s mission statement, broadly, is to provide social support for incoming students and leadership development for students on campus. Central to the structure of preorientation is the idea of critical, formative windows. Much in the same way individuals prepare for trips by packing up gear, we as a leader community work year-round to make sure that those first six days are affirming, engaging, and prepare students to tackle the trail that is Harvard University. These windows exist for the leadership of our program as well. As a Steering Committee, we turn over each year, and though we operate on a consensus decision-making model, we recognize that we do not usually have participant opinions in the room. Without robust survey tools—tools that intentionally offer space for student voice—we cannot make meaningful programmatic change. The central question of this project, therefore, is how to craft a survey tool that better measures student experience on FOP. This project was split into several separate phases.
The first involved reaching out to FOP community members, along with experts in administration and at the Office of Institutional Research, in order to learn what sorts of information are important and how best to collect them. I next retrieved and evaluated the existing surveys that FOP uses, keeping what was helpful and reworking what was not. Using a combination of past questions and new insights gleaned from the interviews, I created a new survey. I then pilot tested this survey with a focus group consisting of past Steering Committee members, walking through it question by question and hearing people’s thoughts. I also went through the survey with a couple of FOPpers from this past cycle. Their comments and suggestions went through another round of analysis and informed the final survey tool, which is provided in the appendix. Summarized findings are as follows. First, the survey should ask participants about points for improvement, soliciting useful feedback without overly pushing students to reflect poorly on their experience. Second, the survey should offer a section which gives a quantitative evaluation of specific aspects of FOP. Third, the survey should remove unnecessary jargon and make descriptions of various activities as understandable as possible. Finally, the survey should be implemented in a way which uses the leader community to drive up the response rate, so that length does not have to be sacrificed for data. These points are all built into the ultimate deliverable of this project, the new survey referenced above. Excitingly, this survey has been cleared and will be put into practice for the next round of FOPpers this coming fall.

Introduction

The First-Year Outdoor Program, Harvard’s largest and longest-running preorientation program, centers on the idea of newness as a catalyst for transition. Its mission is to provide a space for reflection and expedite the formation of support networks for incoming students on campus. In addition, FOP seeks to provide resources, encourage honest conversation, and promote the values of equity, diversity, and inclusion before students formally begin their time at Harvard. As a member of FOP’s Steering Committee, I recognize that our program has perennial room to improve as an inclusive and enjoyable space for incoming students. In order to help our leadership body better accomplish these tasks, my research question asks how we can build a survey tool to more effectively measure student experience on preorientation, and how we can use this information to tailor our program to participants. The work that we do is two-pronged. The most visible work we engage in is preorientation: each August, we send trips of around ten participants and a couple of experienced co-leaders into the New England wilderness, most often along the Appalachian Trail network. These trips showcase a wide variety of outdoor experiences, such as backpacking along ridges, canoeing on Northeastern lakes, and participating in trail maintenance. Upon arriving at campsites, time is spent learning technical and interpersonal skills, talking about Harvard life, and sharing one’s own background and aspirations for the upcoming four years. However, a large part of our program is also leadership development for Harvard students, many of whom do not themselves have extensive outdoor experience. FOP as a program values reflection and growth.
While each Steering Committee is energetic and believes in improving the experience of leaders and participants, we often find it difficult to know what steps we need to take. Although Steering Committee turns over each year, one fact is consistently true: there are never FOPpers in the room when we are making decisions. The information-gathering capacity of our organization is currently severely limited, which complicates the process of taking action. My action research, therefore, seeks to address this issue. In terms of methods, the project first involved communicating with current leaders and related experts to find out what sorts of information would be useful to improve trips. It then entailed creating a sample survey and testing it through a focus group. The key recommendations were that an effective survey tool would offer more space to provide constructive feedback, break down and evaluate specific programs within FOP, and improve clarity. These findings were then incorporated into the final deliverable, the survey tool itself. The proposed recommendations have not yet been implemented, since FOP does not receive its new class of participants until the fall. However, our leadership body has signed off on the project and will be putting this survey into practice at the first opportunity we have. The actionable knowledge—these findings and the tool they inform—has two main audiences. At the institutional level, the hope is that the information gathered from this project will serve as a template to be implemented by the university for preorientation programs as a whole. The key audience, however, is ultimately FOP, which will be able to use this aggregated data to gauge the effectiveness of current practices and find opportunities for new ones, a process which will be iterative and span far beyond our time in the organization. At its heart, this project makes the assertion that without robust tools—tools which intentionally call in and elevate the participant voice—our program cannot make meaningful programmatic change.

Literature Review

As a wilderness program, Harvard’s First-Year Outdoor Program is fortunate to have a fairly long legacy of outdoor orientation programs that it can learn from. Michael Harris, a researcher studying the pre-college wilderness orientation program at Clemson University, finds that these programs are useful in providing students with support as they arrive on campus (2014, 31). This closely aligns with the experiences of students at Harvard, who from past data show an overwhelmingly positive reaction to FOP. That said, knowing the benefits of these programs is the easy part; what we are interested in is finding out where the pitfalls are. In order to come to conclusions about the effectiveness of the program and how to improve it, Harris relied heavily on long one-on-one interviews, which were not conducive to large-scale data collection (2014, 21). This aligns fairly well with our own practices, where we do not have a systematized channel of feedback beyond individuals bringing up points that they feel strongly about on their own time. And while there has been some research done on the logistics of pre-orientation—Bell even went so far as to survey Harvard due to its long history—there seems to be a clear gap in research with regard to participant experience (2010, 3-4). There is insufficient data regarding how students view reentry onto campus and the year after orientation.
Moreover, there is very little research which examines why students choose not to participate and how their needs might be met. Although this project focuses on a program at Harvard, it is important to first situate our work in the context of the outdoors and the challenges that outdoor spaces face. Recent work has shown that, while we often like to think of America’s extensive national park system as an open way to reconnect with nature and each other, citizens face differential levels of access. Emily Mott, writing for the Vermont Journal of Environmental Law, notes that there are statistically significant disparities—particularly with regard to race—among visitors to this nation’s parks (2016, 454). This engages with other literature, such as the Schlossberg piece discussed in class on marginality and mattering, which argues that it is exactly in these times of transition that marginalized students can feel the most alienated (1989, 6). These findings corroborate challenges that FOP itself faces and point to significant barriers which constrain the transformative nature of the outdoors. What is even more telling, however, is that Mott’s piece lacks a comprehensive plan to tackle this challenge; it cites disjointed policy actions, such as celebrity endorsements and vague practices, as a sort of preliminary step (2016, 464-7). In many ways, our program suffers from the same dearth of information. While Steering Committee members know that action must be taken and have the energy to take it, they do not yet have data to guide them. What we now need is a way of collecting useful information, particularly the experiences of those who might otherwise feel distanced from our program, and putting it into action. An important point to note is that programs geared specifically towards the First Year Experience (FYE) are rather new, particularly juxtaposed against the long institutional history of Harvard. Gore and Metz describe the predecessor of the modern preorientation program arising in the 1970s, with more modern developments planned to “integrate student success skills and campus and community engagement” (2017, 2). The authors note that questions of access, some of them identical to the ones described earlier by Mott, inform the dialogue around improving these types of experiences. These conversations are highly analogous to the ones that Steering Committee has each week, joining our praxis with the broader theoretical research around this topic. There is quite a large body of work which uses survey tools to gauge student experience. Jones et al. studied the first-year transition of international students in the UK using a blended strategy of interviews and surveys, and were ultimately able to draw out interesting conclusions about the role of social relationships and the International Foundation Year (2018, 8). This is very similar to how I planned to structure my project, which gives me confidence that my action plan has been tested and troubleshot in the past. The challenges are the same as well: Jones et al. write that a key limitation is that the experiences are only generalizable to the student body from which they were drawn. Although this is certainly a pitfall, it is also expected given the specificity of action research for one organization, and I do not foresee it affecting my ability to develop or implement a useful tool in my community.
As a whole, I found that the literature supported much of my plan and validated the need for surveys of participant experience, a characteristic shared across multiple programs. As described earlier, there is limited peer-reviewed research currently available on the topic of wilderness pre-orientation, especially in the specific context of Harvard. The information which does exist, however, carries through certain themes which this project holds close to its mission. Former FOP Director Brent Bell and co-researcher Brady Williams piloted a method of data-gathering by directing leaders to collect anonymous handwritten responses in an exercise exploring students’ worries and fears, which allowed the program to acquire a new touch point around which to organize its training (2006, 48). Their research design demonstrates the power of direct feedback through surveys in driving change. Current leader Amy Danoff conducted a Harvard-specific project in 2017 which interviewed participants in order to understand the effects of programming emphasizing equity, diversity, and inclusion. Her use of qualitative interviewing closely tracks the methods in this project and serves as a scaffold upon which to consider best practices. This iterative culture of understanding participant experience, evaluating practices, and building up a community from within interfaces closely with the development of servant leaders espoused by Kiersch and Peters in the leadership models explored in class (2017). It is abundantly clear that FOP cares deeply about student feedback and wishes to develop ways of procuring more of it, and it follows that the goal of this project is to build upon and engage in new dialogue with the existing scholarship in order to bring about actionable changes which improve student experience.

Methods

On the broad scale, this project encompasses a single cycle of action research. It involved gathering appropriate information through interviews, building the survey tool, and developing recommendations for its implementation. This is not to say that the survey will not be tested—it will eventually be publicized to incoming pre-frosh in order to inform later Steering Committees—but it is relevant to note that the project will not see full use of this tool given the time constraints of the semester and the schedule that FOP runs on (most of the actual information will likely be gathered in the fall). That said, my project simulates using the tool and should provide a useful bellwether. This action research project was deliberately structured to be participatory and to invite testimony from leaders on campus. The whole reason this tool is being developed in the first place is that current leaders have found it significantly difficult to structure curriculum and processing, and that current Steering Committee members have found it equally difficult to set a course for the upcoming year without ample knowledge. The community, therefore, consists primarily of the leader community and Steering Committee, because this is the group the project sets out to serve. That said, the information ultimately gathered will come from first-years and from participants, so I would argue that the community reaches beyond simply FOP. This project moved through a series of discrete steps. Since a survey is at its core made to collect information, I first wanted to identify what sorts of data points are useful.
I sent a recruitment email over a FOP email list soliciting participants and conducted interviews with members of the leader community to bring together common themes. At the same time, I wanted to pair student thoughts with expert opinions, so I engaged in several expert interviews. These included an administrator in our program and two members of the Office of Institutional Research. We had discussed in class needing around six interviews to begin to tease out useful themes, so once I had collected seven interviews (four leaders, three experts), I moved on to analysis. The analysis phase involved looking over transcripts (generated roughly by an online transcription tool for basic notes and transcribed by hand for key quotes) and identifying shared ideas: common themes across interviews were noted, and representative quotes were taken as evidence. Together, this evidence formed the broad basis of a series of recommendations I used to make changes, and it allowed all changes I made to be rooted in student opinion.
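Although the coding itself was done by hand, the first-pass tally that supported it can be sketched in a few lines. Everything below—the file layout, theme names, and keyword lists—is hypothetical and meant only to illustrate the analysis step, not to reproduce the actual codebook.

```python
from collections import Counter
from pathlib import Path

# Hypothetical codebook mapping candidate themes to signal phrases.
# The real codes were developed and applied by hand; this sketch only
# surfaces which transcripts merit a closer read for each theme.
CODEBOOK = {
    "constructive feedback": ["improve", "criticism", "hurdle"],
    "itemized evaluation": ["specific", "break down", "scale"],
    "jargon and clarity": ["jargon", "confusing", "leader language"],
}

def tally_themes(transcript_dir: str) -> Counter:
    """Count how many transcripts mention each theme at least once."""
    counts: Counter = Counter()
    for path in Path(transcript_dir).glob("*.txt"):
        text = path.read_text(encoding="utf-8").lower()
        for theme, phrases in CODEBOOK.items():
            if any(phrase in text for phrase in phrases):
                counts[theme] += 1
    return counts

if __name__ == "__main__":
    # Assumes one plain-text transcript per interview in transcripts/.
    for theme, n in tally_themes("transcripts").most_common():
        print(f"{theme}: flagged in {n} transcripts")
```

A tally like this only points the analyst toward transcripts worth rereading; the representative quotes themselves were still selected by hand.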
For the second phase of the project, I wanted to build upon FOP’s past survey tools. I collected FOP’s old exit survey, which consists of a leader evaluation and a couple of open-ended questions. Since the goal of this project is to evaluate the program, I focused more on the latter than on specific leader feedback. Using a combination of old and new questions informed by the common themes from the analysis of the earlier interviews, I created a new survey. At this point, the project slightly diverges from the action research design. Originally, the plan was to take the edited survey and send it out directly to one or two first-year dorms. However, after a conversation with the Office of Institutional Research, I was advised to try a different option. Although sending out a survey can be useful in some cases, most of the first-years that would have received the survey would not have actually participated in FOP (only a quarter of the incoming class does), which would have further depressed an already-low response rate and provided less data. Moreover—and this was the critical rationale for switching strategies—many of the respondents would not have had sufficient context to offer constructive feedback on the tool at a meta level. The recommendation was to instead run a focus group to evaluate the survey; the survey would then undergo another round of editing and be piloted with the actual incoming students this fall. This would provide far more useful validation, since fall response rates are traditionally high and since FOP leaders are in close contact with their FOPpers to solicit active feedback. Upon making this judgment call, I set up a focus group of Steering Committee members to evaluate the survey. Since Steering Committees often have different visions and priorities, the focus group contained representation from three generations of the leadership body to add balance to the conversation. Their comments were again evaluated using the same interview analysis. The survey was finally tested by two individuals who went on FOP this past cycle. They went through the survey and were asked to identify any points of confusion and suggestions for improving clarity. Finally, all comments were combined into a final survey. As mentioned before, the survey has not actually been sent to FOPpers, since FOP occurs during August. However, the recommendations have been essentially implemented and packaged into a finished tool which is ready for use and has the backing of the program. Although the recommendations cannot be completely evaluated for effectiveness until after this next round of FOP, the focus group and the follow-up with past FOPpers from this cycle offer at least a preliminary vote of confidence in the tool.

As a member of and leader in the communities I am working with, I have tried to be as intentional as I can in acknowledging and planning around my relationships with my subjects. There were many steps I took to adhere to ethical conduct. I maintained confidentiality almost entirely: for first-years, the responses are what matter, and I used either confidential interviews stored only on my computer or anonymous surveys that do not collect personal information. As an extension of that, I respected privacy by making questions opt-in and not probing beyond where people were comfortable; moreover, their information was treated as privileged and not shared beyond myself (or, if shared, not associated with identifying information). I was careful to secure informed consent at two levels, both during recruitment and before each interview or survey began: I used the forms provided in the appendix to ensure that participants knew why I was gathering the information and what it might be used for.

It is important as a member of Steering Committee to recognize my positionality. Before I spoke to people during interviews, I affirmed to them that I understood I was in a position of power and that they should never feel as though the answers they gave would have any effect on their standing in the community or our relationship. When I sent communications over the list, I was careful to phrase everything as entirely voluntary. In all of my correspondence, I tried to be professional but affirming, avoiding an interrogatory interview dynamic. I also recognized that, as the person who designed this project, I have room for bias. To combat this, I was careful during interview and focus group selection to choose a wide cross section of the FOP community (for example, choosing multiple members from different Steering Committees for the focus group). I avoided reaching out to individuals that I knew would automatically validate my opinions, and I did not share my personal opinions about the subject. I also made myself open to change when the research called for it; the Office of Institutional Research made recommendations that caused me to change my plan from sending my survey over a listserv to instead running a focus group. Ultimately, it was these intentional steps to avoid bias which made my project stronger and more useful for our community.

There are important limitations that should be considered. The first is that there is an asymmetry of information whenever we design survey tools. Since my project is primarily made for leaders, I interviewed members of the FOP community. We think very deliberately about FOP, and for that reason may have a conception of our program which is different from what our FOPpers think. This was well evidenced by the discussion of jargon which makes total sense to leaders but little sense to participants outside of our circles. Another limitation is sample size. Ideally, this survey would have gone out to all of our FOP participants right after their trips, which would have given a high response rate and useful feedback.
Unfortunately, because of the timing of classes versus preorientation, as well as the realities of survey fatigue at this point in the year, it is impossible to have a perfect test run of the survey until fall comes. The third limitation is that there is always a tension between response rate and data collected. As survey designers, we have two competing objectives: to get as many responses as possible and to get as much data from each response as possible. In order to make sure that students are not overburdened by an extensive survey, some of the information requested by leaders had to be left out of the survey’s questions. While the survey tool created by this project looks like it will be highly effective, understanding these limitations allows us to evaluate potential shortcomings and keep improving our work.

Results

The data collected over the course of this action research project aligned well with the general expectations that prompted the project. The survey tools currently in place are not very useful for the community at large, and are certainly not factoring into Steering Committee’s decisions; in order to properly measure student experience, significant changes to the survey must be implemented. The key findings, as outlined in the sections below, fall into a few large buckets: (1) creating space in the tool to invite constructive feedback which informs programmatic improvement; (2) using an itemized policy breakdown tied to a quantitative Likert scale rather than only open-ended questions about programs; and (3) improving clarity and removing jargon wherever possible. Beyond creating this survey, it is also important that our Steering Committee put together a creative and effective plan of action to ensure a high response rate in the fall.

Preliminary Interviews—Unmet Needs

From the first round of interviews with current leaders at large and several experts, it became quite clear that FOP suffers from an information gap: useful data which could improve the program is being lost. When prompted about how they currently make decisions about curriculum on trips, leaders barely referenced the specific tools FOP has offered in the past to gather data (such as the pre- and post-FOP surveys). Most of them chose instead, whether due to perceived effectiveness or limited knowledge of available resources, to fall back on the experiences of their own training and other anecdotal evidence. Amy Danoff, a current junior in the program, cited a collection of informal data points including “how enthusiastic [her] FOPpers were, how many applied to lead, and how active the group chat is,” noting that they were all “imperfect measures of how successful a trip is.” As interviews unfolded, it was evident that most leaders were shaping their trips based on how they themselves felt and had limited outlets for feedback. Part of the reason for this trend is that not all of the feedback provided to leaders was actually useful. Past surveys have regularly been administered right after the trips return to campus, when participant spirits are high and groups feel closest to one another. When evaluating the current survey options, there was a tendency for FOPpers to provide overly positive data; this is fairly constant every year and has been noticed by leaders.
Andrew Aoyama, a leader who himself attended FOP, reflected, “A lot of the aspects of the survey I look back at with rose-colored glasses,” an observation he noticed was shared by a good number of his own FOPpers. These observations track closely with the data: in a confidential conversation, a representative with knowledge of the program noted that over time, mean scores have remained constant and extremely high on a scale of one to ten. Although hitting high benchmarks can be seen as a sign of success, it also means that important feedback is likely being glossed over. The bottom line is that the information which is most useful for leaders is currently not being collected. Across multiple interviews, there was general agreement that constructive, occasionally critical feedback is extremely useful for driving change. Current Steering Committee member Conlan Olson notes, “I wish we knew more about was why people are less comfortable doing these things, which we can make a lot of guesses about. I would like to know what the big hurdles are for people.” The survey that has been in place since 2014 does not offer ample space to explore what did not go well; it has only one main space for suggested improvements outside of logistics. A revamped survey, therefore, should search more pointedly for changes which could improve FOPper experience.

Preliminary Interviews—An Itemized List

An actionable step that could vastly improve how FOP’s leadership uses data is incorporating specificity into the design of the survey. Rather than relying simply on vague open-ended questions, a Likert scale matched with targeted program elements can help. In a conversation with a confidential representative from the Office of Institutional Research, this point was heavily emphasized early on: “A lot of times we have a survey and put the different program elements […] it allows you to get at the relative level of helpfulness.” The interviewee stressed the importance of using mixed methods, especially when confronting broad issues such as student experience. Program leaders may have ideological strategies for pushing their organizations forward, but the effects they exert on participants usually come through specific policies, and being able to analyze these at the micro level helps leaders understand where their resources may be most useful. The suggestions from the expert interviews were well corroborated by data from current leaders. Multiple members of FOP independently brought up the idea of splitting the survey into an itemized form. Though there was some heterogeneity of opinion as to how this could best be implemented—Aoyama pointed out that successful trips never look the same, and that quantification could be difficult—a common theme was dividing the survey into a logistics section and an interpersonal section. Doing so would improve the survey experience for a variety of reasons. First, it would offer participants a better chance to voice how they felt about disparate aspects of the FOP experience; while they are currently given free-response questions to do so, they are not always able to recall all aspects or to distinguish what is intentionally planned by leaders from what may be trip-specific. Implementing this change with a Likert scale also allows data to be easily benchmarked across years, providing helpful quantitative data to match the qualitative data currently collected.
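As a rough illustration of how itemized Likert data could be benchmarked year over year, consider the following sketch. The program elements and scores are invented for the example and do not come from FOP’s data.

```python
import statistics

# Invented example data: itemized Likert responses (1 = poor, 5 = excellent),
# keyed by year and by program element. Element names are placeholders, not
# the actual wording of the survey.
responses = {
    2018: {"technical skills sessions": [4, 5, 3, 4],
           "evening conversations": [5, 4, 5, 5]},
    2019: {"technical skills sessions": [5, 4, 4, 5],
           "evening conversations": [4, 3, 4, 5]},
}

def benchmark(data):
    """Mean score per program element per year, rounded for comparison."""
    return {year: {item: round(statistics.mean(scores), 2)
                   for item, scores in items.items()}
            for year, items in data.items()}

for year, means in benchmark(responses).items():
    print(year, means)
```

Even a summary this simple lets a future Steering Committee see at a glance whether a specific element is drifting down over time, rather than relying on a single overall score.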
This itemization is particularly useful for Steering Committee, which must act quickly and often looks at specific policies. Being able to have a broad overview of all of the small policies that FOP uses is a simple and efficient way of understanding what should be improved in the window of time that Steering Committee has to act.

Note: at this point, the previous survey was combined with the above two recommendations to form a prototype survey, which is located in the appendix of this paper. This improved prototype was the document used for the focus group’s evaluation.

Survey Evaluation—Reflections on Previous Changes

When the updated survey was offered to a focus group for analysis, many of the earlier edits were affirmed by their comments. Despite serving in different contexts with different groups of people, Steering Committee members across all years agreed on the importance of increasing access to the outdoors, a question that can only be answered by hearing from students what barriers still exist. That said, there was not total consensus on all of the points discussed. Some members felt that even in its edited form, the survey presented too much of a positive bent. Akweley “Q” Okine, from Steering Committee ’18, pointed out that, “Right now, many of these questions are about ‘What is so great about FOP?’ That’s not everyone’s experience,” a feeling which was underscored by a member of Steering Committee ’19, Gabrielle Fernandopulle, who argued that asking participants to provide improvements and suggestions may be difficult given the lack of programmatic context they have. The general tone of the conversation indicated that it is important to create as much space for FOPpers as possible in the survey while leaving the onus of finding solutions on the actual leadership body. Evaluators also responded well to the creation of an itemized list. Most replies to that change brought additional ideas on how to make the breakdown more specific. Amie Garcia, a member of Steering Committee ’17, stated, “I like how we separated things. I think I would maybe separate the physical and emotional safety statement,” pointing out that the former could be affected by external variables such as weather, while the latter could be a red flag for failures in leader training. In general, suggested divisions that were specific and split programming into sensible subcategories were met with support within the group.

Survey Evaluation—Improving Clarity

One of the most significant points of discussion in the focus group setting was clarity. Steering Committee members, perhaps by virtue of their position, often use terms in training and in decision-making which are incomprehensible to participants. A fair number of these terms made their way into the survey and were discussed at length during the session. One contingent of members, for example, felt that certain terms were too vague within the FOP context. Steering Committee ’19’s Kalena Wang stated that she preferred not to use the term “processing activity” (a catch-all term used by leaders to describe a series of informal conversations about background, feelings, or experiences) as an item on the survey, as the term can feel like jargon and is actively avoided by some leaders on trips. It was agreed that, wherever possible, eliminating these sorts of terms would make the survey feel more accessible to participants and would elicit better answers.
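One lightweight way to enforce this recommendation during survey drafting is a simple jargon check over the draft questions. The watchlist below is hypothetical, seeded with the one term the focus group actually flagged; the other entries and the sample questions are placeholders.

```python
# Hypothetical watchlist of insider terms. "processing activity" was flagged
# in the focus group; the other entries are placeholders a committee might add.
JARGON = ["processing activity", "edi conversation", "trip debrief"]

def flag_jargon(questions):
    """Yield (question, offending terms) for draft questions to rewrite."""
    for q in questions:
        hits = [term for term in JARGON if term in q.lower()]
        if hits:
            yield q, hits

draft = [
    "How valuable did you find the processing activity on your trip?",
    "How comfortable were you sharing your background with your group?",
]
for question, terms in flag_jargon(draft):
    print(f"Revise: {question!r} contains {terms}")
```

A check like this cannot judge tone or context, but it gives each year’s Steering Committee a mechanical backstop as flagged terms accumulate.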
Others were worried about the signaling that certain terminology could have on the student experience. FOP has added a mandatory conversation about engaging diverse backgrounds and identities to its curriculum, called the “Equity, Diversity, and Inclusion” conversation. Several focus group members were hesitant to use this specific phrasing in the survey. Okine pointed out that “different people manifest that kind of conversation very differently on their trips,” and that framing this sort of programming as a singular conversation might both alienate FOPpers and misrepresent the priorities of leaders who might otherwise try to encourage more extended dialogue. Wherever possible, interviewees preferred to describe this content with descriptors such as “conversations engaging with diversity and background” rather than using specific workshop titles in the survey. These points were echoed by actual participants. Jocelyn Wang, a FOPper from the Class of 2022, flagged the term “processing activity” in her read-through of the survey, commenting, “I feel like people might not know what processing activities means. That’s FOP leader language.” The creation of a byzantine, disconnected leader lexicon is exactly the dynamic that the community is trying to avoid. FOPpers also touched on the potential issues of vagueness and overlap. Cooper Tretter, a FOPper from the Class of 2022, posited that although “processing activities that did touch on equity, diversity, and inclusion were particularly poignant and particularly memorable,” the survey—and FOP leaders in general—did not make it clear whether a given conversation was one or the other. From the broad contours of the discussion, both within the focus group and among past participants, it seems that these sorts of distinctions must be made entirely clear lest they confuse survey respondents and call into question the intentionality of leaders on trips.

Note: at this point, the prototype survey was updated with suggestions developed from this new wave of comments. This became the final survey, which is available either in the appendix or at the following link on Google Forms:

General Evaluation—Combatting Survey Fatigue

The final major point of discussion was maintaining a high response rate. The meeting with the Office of Institutional Research served as a reminder that students quickly abandon surveys that become too long, and some suggestions from the OIR actually involved cutting many of the open-ended questions. For any survey, this is a trade-off: the longer a survey is, the more information each response provides, but the fewer responses it tends to attract. Since most of the interviews for this project involved students requesting more information, and since response rates over the past few years have remained rather high, I ultimately chose a longer survey over a quicker one to fill out, a decision which had the backing of most members of the community. As a result, those implementing the survey will need to be very careful about how they administer it in order to keep response rates high. The OIR representatives pointed out that even incentives such as food or raffle items often cannot induce turnout because of survey fatigue. One strategy that FOP does have at its disposal is its leader network. At the start of the semester, FOPpers spend a great deal of time with one another and with their leaders, since these are their strongest support networks entering college.
One member of Steering Committee ’18, Kenton Shimozaki, described how survey responses could be ensured if pre-surveys were given in person at the start of the week. Similarly, by leveraging leaders through some guided structure—for example, a lunch break where everyone gets together and fills out the survey during the first ten minutes—we would be able to personally guarantee that the tool is being used effectively. Pushing leaders to take charge of their own FOPpers during that early critical window will help make sure that this tool is used to its fullest extent.

Discussion and Conclusions

As a whole, this action research project has achieved its goal. It has laid out a set of concrete guidelines for improving FOP’s survey tools in order to better measure and analyze student experience. Past surveys, though consistently employed, have not been entirely thorough: they have been much heavier on leader evaluations than on programmatic evaluations (which is not especially helpful to Steering Committee), offer few and vague open-ended questions, and do not systematically move through the various aspects of FOP in the way we do through training. In short, these tools have been unable to give us the information needed to make useful programmatic change. This project has managed to address some of those concerns. It has put forth a set of recommendations: (1) offering more space for criticism in a way which does not prime students to provide unnecessarily negative feedback while still giving an opportunity to be honest and critical; (2) using a specific, item-by-item list to allow participants to score certain aspects of FOP; and (3) improving clarity and removing ambiguity or jargon from the wording of the surveys. As a corollary, the survey will also be implemented with a push by Steering Committee in the fall to delegate responsibility for survey responses onto leaders as a way of combatting fatigue given an extended survey.

The implementation of this project should be relatively simple. The survey has been created and will be ready to put into place this fall to begin gathering data. That said, the project was not without its changes along the way. As discussed earlier in the paper, the realities of survey fatigue, coupled with the need for proper feedback, meant that the “pilot test” for this survey was actually a focus group; the first time it is administered to first-years will be when it goes live. Although this should not be a problem, it would have been nice to have a consistent, large body of trusted individuals (instead of a listserv that would have drawn responses from non-participants) on which to test it. I had also originally hoped to fix the pre-FOP survey, which has not been edited since the 1990s. However, there was significant pushback within the community, since many people believe that the earlier survey is not meant to be serious. Moreover, having additional information before a trip might affect the dynamics of the trip and create assumptions based on short responses by FOPpers. In the end, I decided that a well-constructed exit survey alone would provide sufficient information for our purposes. Perhaps most excitingly, these interviews unearthed a plethora of additional directions in which to take the research.
During one confidential sit-down at the Office of Institutional Research, an interviewee explained that “comparing main findings against institutional data to see if there are any interesting patterns” can allow program leaders to understand their work in context. At the moment, there is very limited crosstalk between FOP and its fellow preorientation programs. Creating a longer-lasting partnership in which different programs share best practices and observe larger trends about mattering in these transition periods of students’ lives would allow us to take the template of what works well in our organization and create an impact beyond just our membership and participants. While looking outward is great, it is just as crucial that we look inward. A confidential interview with an administration official mentioned that as Steering Committees take on more responsibilities, it is true that “[they] aren’t looking as closely at the leader experience and ensuring that that loop gets closed.” Creating this survey tool has provided many useful lessons in gathering information at a scale of hundreds of respondents. It would be relatively simple and entirely feasible to turn these practices back toward our own community in order to ensure that our program upholds its responsibilities to leadership development and social space for current students as well.

Bibliography

Bell, Brent J. “A Census of Outdoor Orientation Programs at Four-Year Colleges in the United States.” Journal of Experiential Education 33, no. 1 (2010): 1-18.
Bell, Brent J., and Brady G. Williams. “Learning from First-Year Fears: An Analysis of the Harvard First-Year Outdoor Program’s ‘Fear in a Hat’ Exercise.” Journal of College Orientation and Transition 14, no. 1 (2006): 47-59.
Danoff, Amy. “Accessibility in Exploration: Examining the Varying Impact of FOP at Harvard.” Sociology 104 Final Paper, 15 December 2017.
Gore, Paul, and A.J. Metz. “Promoting Successful Student Transition: First-Year Experience Programs.” Encyclopedia of International Higher Education Systems and Institutions. Dordrecht: Springer, 31 July 2017.
Harris, Michael W. “Examining a Pre-College Wilderness Orientation Experience and Its Role in Facilitating Transitions to College.” Clemson University, August 2014.
Jones, et al. “The International Foundation Year and First Year Transition: Building Capital, Evolving Habitus, Developing Belonging, Preparing for Success.” Teaching in Higher Education, 06 November 2018: 1-16.
Kiersch, Christa, and Janet Peters. “Leadership from the Inside Out: Student Leadership Development Within Authentic Leadership and Servant Leadership Frameworks.” Journal of Leadership Education 16, no. 1 (2017): 148-168.
Mott, Emily. “Mind the Gap: How to Promote Racial Diversity Among National Park Visitors.” Vermont Journal of Environmental Law 17 (2016): 444-468.
Schlossberg, Nancy K. “Marginality and Mattering: Key Issues in Building Community.” New Directions for Student Services 48 (1989): 5-15.