Pilot Study Plan

For

CAREER: The Role of Representation in the Synthesis Process

NSF Grant REU-9984484

Durward K. Sobek, II

Mechanical & Industrial Engineering

Montana State University

Bozeman, MT 59717-3800

Tel. 406-994-7140

Fax 406-994-6292

Dsobek@ie.montana.edu

August 10, 2000

Abstract

This document describes the research plan for the pilot phase of the above NSF CAREER award. The pilot will be conducted in the upcoming fall semester (September-December 2000). Comments and feedback are welcome.

This research project is a cross-disciplinary study of the design processes used by student design teams at Montana State University. The primary purpose is to gain a deeper understanding of the human synthesis process—what makes a good design engineer. Here I give a brief overview of the grant and the research issues. The next section describes the pilot study I have planned for this fall.

Grant Overview

In the grant proposal, I hypothesize that representation plays a central role in designers' reasoning and creative processes, and I outline an emerging theory of design representation that draws on current research in design theory and cognitive science. However, since we do not know how dominant the effect of representation is, its impact must be assessed in the context of other factors that affect the success of design projects.

The original proposal was to study senior 'capstone' design projects in the engineering and architecture schools at MSU. Data collection will focus on characterizing the design process of each student team and measuring project outcomes. Design process data will include design representations, time/effort expended designing with these representations, design progressions, and the timing of key decisions. These data will be collected primarily through design journals kept by the students, supplemented by direct observation. Project attribute data will also be collected on key variables such as team composition and diversity, technical skills, resources available, and advisor interactions; these data will be collected through a student background questionnaire, student and advisor interviews, and direct observation of team meetings. Finally, project outcome data will be assessed from the students' records of design activity (person-hours), a design jury's evaluation of design quality, and an objective evaluation of the teams' final reports.

Qualitative thematic analysis will look for patterns that distinguish "good" projects from "poor" projects. Statistical analysis will correlate design process attributes with project outcomes such as person-hours spent on the project, creativity, feasibility, and completeness.
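
To make the intended statistical step concrete, the following is a minimal sketch (Python/pandas) of correlating per-project process attributes with a jury quality rating. The column names and values are entirely hypothetical placeholders, not study data; the actual analysis will be refined after the pilot.

```python
# Minimal sketch of the planned correlation analysis. All column names and
# values are illustrative placeholders, not collected data.
import pandas as pd
from scipy import stats

# One (hypothetical) row per project, aggregated from journals and jury ratings.
projects = pd.DataFrame({
    "sketch_hours": [14, 22, 9, 30, 18, 11],         # hours using sketch representations
    "cad_hours":    [40, 25, 55, 20, 35, 60],        # hours using CAD representations
    "total_hours":  [210, 180, 240, 200, 190, 260],  # total person-hours
    "jury_quality": [4.2, 4.5, 3.1, 4.8, 3.9, 3.0],  # mean expert-panel rating (1-5)
})

# Correlate each process attribute with the outcome measure.
for attr in ["sketch_hours", "cad_hours", "total_hours"]:
    r, p = stats.pearsonr(projects[attr], projects["jury_quality"])
    print(f"{attr}: r = {r:.2f}, p = {p:.3f}")
```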

Research Questions

I'm extremely interested in gaining insight into the fundamental, cognitive aspects of the synthesis process. Why do some people seem to be naturally 'good' designers? Why are many very intelligent people not terribly good designers? Additionally, I would like to study these issues in situ--in other words, in the context of a 'real' design problem rather than in an artificial laboratory setting.

Much of the work in design does not deal with this fundamental issue. Interdisciplinary projects, industry-sponsored 'real world' problems, good communication, teamwork, and so forth are all well and good, but they somehow miss what fundamentally makes one a good designer. The question seems to go beyond just 'some people are more creative.' Though clearly important, creativity in the form of lots of ideas does not, on its own, seem to be sufficient.

Since design at its heart is a cognitive task, it makes sense to build on the sizeable body of work in cognitive theory. One of the emerging schools of thought in cognitive theory is distributed cognition, which holds that human intellectual activity is not purely a mental phenomenon internal to one's mind. Rather, intellectual activity results from an interaction between a person's internal mental processes and his/her environment, including interactions with images, physical objects and dynamics, and other people. The result is a radically different view of cognition: sketches, diagrams, experiences, and relationships are integral to intellectual activity, not mere peripheral aids or by-products.

With this in mind, this project will specifically investigate the role that choice of representation plays in the design process. How do these choices affect the design process and outcomes? Does use of certain representations help build skills that enhance synthesis ability? What impact do computer-generated representations have on human cognitive processes with respect to synthesis? We may not be able to generate conclusive answers to these questions, but we hope to generate compelling evidence to spur further investigation.

In addition to the cognitive questions (and related to them), I'm interested in design process questions. In particular, models of engineering design are nearly always iterative in nature--the designer focuses on one idea and modifies it incrementally until it meets the problem objectives. Part of my dissertation research indicates that a set-narrowing (rather than purely iterative) approach may be much more effective. I would like to gather quantitative evidence to test this idea. Efficient exploration of alternatives will likely involve specialized representations to develop, reason about, and compare alternatives.

Of course, these questions need to be addressed within a rich and complex set of contextual factors. The ones we will primarily look at are: student background/experience/skills, group dynamics and teamwork, advisor interaction, client interaction, and project complexity. A set of research questions on these "control factors" is emerging as we consider them, and will be formally incorporated as the project progresses. Specifically, a couple of questions that have emerged from this summer's work are: Is design performance positively associated with spatial reasoning (i.e., visualization) ability? And what effect does the advisor have on the project (mounting evidence suggests the effect is likely huge), and what makes for effective advising?

The Pilot Study

We will study our first group of projects this fall semester, then assess the pilot (especially data collection methods, data quality, and resource requirements) to decide on the future direction based on the results. Most likely we will have to limit the scope of the full study through some combination of sampling and restriction to fewer disciplines.

The course chosen for the pilot is ME 404, the ME senior capstone course. We’re expecting 20-25 students to enroll. They will be divided into teams of 3-4 students, so we’re looking at 6-8 projects. The ME 404 instructor (Dr. Mike Wells) is enthusiastic about the project and has been very cooperative.

The following pages represent the current state of the pilot study plan (admittedly, a work in progress). The first subsection itemizes the key study variables, measures, and data sources. The next subsection contains a detailed data collection plan. The third subsection, still the fuzziest, outlines our current thinking on analysis.

Study Variables and Measures

Table 1 lists the process data variables we will study. The primary source of information will be the student design journals, in which students will be asked to keep a detailed chronological record of their design activity and the time spent on each activity.

Table 1: Process Data

|Variable |Measure |Source |
|Representations used |Representation category |Student journals |
|Activities and activity sequence |Project plan (i.e., PERT chart) |Student journals |
|Design process stage |Stage category |Student journals |
|Ideation -- | | |
|sources of ideas |Qualitative; categorical |Student journals |
|consideration of alternatives |# of alternatives | |
|Design tools and methods |Qualitative; categorical |Student journals |
|Design decisions (& rationale) |Qualitative; categorical |Student journals |
|Time spent on individual activities |Hours |Student journals |

Table 2 lists how I plan to measure project outcomes. Time will be tracked through self-reported times in the journals. Design quality will be assessed by an expert panel (a design jury) on a number of criteria, based on the teams' final project presentations. The quality measures are listed in order from least subjective to most subjective. The panel assessment will be cross-verified with the instructor's assessment and the research team's assessment of the written reports; a brief sketch of how these ratings might be combined follows Table 2. Customer satisfaction is a subjective assessment by the project sponsor.

Table 2: Outcomes Data

|Variable |Measure |Source |
|Time / efficiency / effort |Hours (total) |Student journals |
|Quality -- | | |
|objectives met? |Scale ratings |Expert panel |
|thoroughness |Rankings (?) |Written reports |
|feasibility | |Instructor assessment |
|creativity | | |
|elegance | | |
|Customer satisfaction |Scale ratings |Client |
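
As a concrete illustration of how panel ratings might be combined and cross-verified, here is a minimal Python sketch. The criteria columns, rating values, and instructor scores are hypothetical placeholders; the actual evaluation instrument has not yet been designed.

```python
# Minimal sketch: combine per-juror scale ratings into panel scores and
# cross-verify against instructor ratings. All values are placeholders.
import pandas as pd

ratings = pd.DataFrame({
    "project":        ["A", "A", "A", "B", "B", "B", "C", "C", "C"],
    "juror":          [1, 2, 3, 1, 2, 3, 1, 2, 3],
    "objectives_met": [4, 5, 4, 3, 3, 4, 5, 4, 5],
    "feasibility":    [4, 4, 5, 2, 3, 3, 4, 5, 4],
    "creativity":     [3, 4, 3, 5, 4, 4, 4, 4, 5],
})

# Panel score per project = mean across jurors on each criterion.
panel = ratings.groupby("project")[["objectives_met", "feasibility", "creativity"]].mean()

# Cross-verification: compare overall panel scores with instructor ratings.
instructor = pd.Series({"A": 4.0, "B": 3.2, "C": 4.6}, name="instructor_overall")
print(panel.join(instructor))
print("Agreement (correlation):", panel.mean(axis=1).corr(instructor))
```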

Table 3 itemizes the "control variables," for lack of a better term. These are project characteristics outside of the process and outcome variables.

Table 3: Control Variables

|Variable |Measure |Source |
|Project characteristics -- | | |
|scope |undecided |Project documents |
|deliverables | | |
|Team characteristics -- | | |
|size |# of team members |Student interviews/questionnaires |
|diversity (gender, age, ethnicity) |undecided |Observation |
|background | | |
|skill sets | | |
|cohesion | | |
|organization, roles, and leadership | | |
|individual contribution | | |
|Advisor-team relationship -- | | |
|advising style |undecided |Student interviews/questionnaires |
|quality of interaction | |Advisor interviews/questionnaires |
| | |Observation |
|Client-team relationship -- | | |
|industry sponsored? |Yes/no |Student interviews/questionnaires |
|client direction: strict or open? |undecided |Observation |
|level of interaction | | |
|level of client support | | |

Data Collection Plan

Figure 1 lists the key events for data collection, and when they will occur (approximately) over the semester. Each is described in more detail in the paragraphs following.

Figure 1: Data Collection Timeline

Kick-Off Meeting

The first day of class is September 5. Part of that class will be dedicated to a brief presentation of the project. We will also hand out the design journals and team notebooks (see Student Journals below) and give instructions on completing the design journals.

Final Report

All teams will submit a final written report at semester’s end. After the instructor has finished grading the reports, we will receive them and evaluate them using a Design Report Rubric developed at the Colorado School of Mines.

Student Journals

Each student will keep an individual design journal to record all project-related work and activity, along with the time spent on each activity. A few notes on design journal logistics follow:

• The design journal will comprise 15% of each student’s course grade. The research team will grade the journals, lifting this responsibility from the faculty advisor.

• The research team will monitor design journal progress by periodically collecting and reviewing journals. We will take care to provide feedback to encourage complete and accurate documentation, without prescribing specific data or representations. Fast turnaround will be paramount.

• I have designed and printed customized journals for this class, complete with instructions and space for dates and times. The cover, instructions, and a sample page are in the attached appendix. Students will be given the journals free of charge.

• To handle computer work, each team will maintain a team notebook (a 3-ring binder, provided). At the end of each computer session, the student will print his/her work, date it, and place it in the team notebook in chronological order. The student will also create a journal entry as for any other activity, noting the date and time and describing the computer work with a reference to the output in the team notebook. The team notebook can also be used to file shared or common information, such as client data, results of patent searches, etc.

Observation of Advisor Meetings

In ME 404, each team is assigned a faculty advisor and meets with that advisor regularly. Most teams schedule a regular weekly meeting time. A member of the research team will observe every advisor meeting (if possible) as a 'fly on the wall.'

Observations will be recorded in field notes. Our assessment is that videotaping (and even audiotaping) would be too intrusive, and thus would interfere with faculty-team interaction and discourage faculty cooperation.

A couple of sticking points have yet to be resolved: 1) faculty cooperation, and 2) small offices. We will hold an advisor meeting before the semester starts, explain the project, and hopefully gain faculty buy-in. We will ask faculty with small offices to meet in the departmental conference room.

Observation without Advisor

It would be ideal to gather some observational data of team interaction without the advisor present. At this point, we do not have a good method for obtaining these data, as the course does not currently have a project room. We are working to resolve this issue.

Student Data

Student data will come in three forms: 1) background information, 2) a midterm assessment of the project, and 3) an end-of-project assessment.

Most of the background information will be collected via a questionnaire that asks for personal information, a self-assessment of design-related skills, and some open-ended questions on work experience and career aspirations. A draft version of the questionnaire is located in the appendix. Additionally, students will be asked to take a spatial reasoning test. I am currently researching an appropriate, validated test.

The midterm and end-of-project assessments will concern group dynamics, individual contribution, and advisor interaction, as well as an overall assessment of the project and course. These data will be collected via interviews, possibly supplemented by a written peer evaluation to assess individual contribution.

Advisor Data

Two sets of data will be collected from advisors via brief interviews. The first, near the beginning of the term, will concern the advisor's philosophy toward advising--his or her preferred approach assuming a near-ideal situation. The second, at the end of the term, will concern the approach(es) actually used during the semester--what adjustments to the 'ideal' the advisor made when faced with non-ideal conditions. The second set of data can be cross-verified with the student interview data.

Design Jury Evaluation

We will assemble a panel of ‘expert’ design engineers to evaluate the quality of each project’s design. We will design an instrument for this purpose.

In the past, ME 404 final presentations have used a formal oral presentation format, so the original plan was to have the design jury watch all the presentations and evaluate each one in series. However, we are considering changing to a "design show" format in which each project team sets up a poster display with product demos as appropriate. Jurors could then circulate and interact face-to-face with the teams to conduct their evaluations.

Analysis Plan

The planned data collection will result in a large amount of data to analyze. How that analysis will be carried out is still at an embryonic stage; the pilot study is intended to help us make these decisions once we see what the data actually look like. We will probably conduct qualitative thematic analysis first, followed by quantitative analysis based on the time data. The unit of study will be the project, so we will have to figure out a way to aggregate individual team members' data to the project level.
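
For the aggregation step, a minimal sketch of what this could look like follows. It assumes the journal entries are eventually transcribed into records with student, project, activity, and hours fields; that record format is an assumption for illustration, not something already in place, and the values are placeholders.

```python
# Minimal sketch: roll individual journal records up to the project level.
# The record format and all values below are illustrative placeholders.
import pandas as pd

entries = pd.DataFrame({
    "student":  ["s1", "s1", "s2", "s3", "s3"],
    "project":  ["P1", "P1", "P1", "P2", "P2"],
    "activity": ["sketching", "CAD", "CAD", "sketching", "calculation"],
    "hours":    [2.0, 3.5, 4.0, 1.5, 2.5],
})

# Total person-hours per project (an outcome variable).
person_hours = entries.groupby("project")["hours"].sum()

# Hours per activity category per project (process variables).
process_profile = entries.pivot_table(index="project", columns="activity",
                                      values="hours", aggfunc="sum", fill_value=0)
print(person_hours)
print(process_profile)
```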

For the qualitative analysis, we will need to develop a coding scheme. My initial thought is to code based on the representations we see, categorizing them by design stage, for example. We can parse the data in other ways, such as by design tool or method, by activity type (planning vs. idea generation vs. selection, etc.), or by decision.
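
To illustrate, here is a first-cut sketch of what such a coding scheme might look like. The specific categories and codes are placeholders I am proposing for discussion, not a finalized scheme.

```python
# Provisional coding scheme sketch; categories will be revised once we see
# what actually appears in the journals.
REPRESENTATION_CODES = {
    "SK":  "freehand sketch",
    "CAD": "CAD model or drawing",
    "MA":  "mathematical model / calculation",
    "TX":  "text (notes, lists, specifications)",
    "PH":  "physical prototype or mock-up",
}

DESIGN_STAGE_CODES = {
    "PD": "problem definition",
    "CG": "concept generation",
    "CS": "concept selection",
    "DD": "detail design",
    "TV": "testing and verification",
}

# A coded journal entry might then be recorded as a simple tuple:
# (date, student, representation code, design stage code, hours)
example_entry = ("2000-10-03", "s1", "SK", "CG", 1.5)
print(example_entry)
```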

From a quantitative standpoint, some sort of time series analysis would seem appropriate. I want to preserve the stochastic nature of the data, and the iterative nature of the processes, so this will take some serious thought and will likely be the focus of research activity next summer.
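
One possible arrangement of the time data, purely as a sketch under the same assumed record format as above, is a weekly series of hours per representation category for each project; whether this is the right structure for the eventual analysis is exactly the open question.

```python
# Minimal sketch: weekly hours per representation category for one project.
# Dates, codes, and hours are placeholders; the record format is assumed.
import pandas as pd

coded = pd.DataFrame({
    "date": pd.to_datetime(["2000-09-12", "2000-09-14", "2000-09-20", "2000-10-02"]),
    "representation": ["SK", "SK", "CAD", "CAD"],
    "hours": [1.5, 2.0, 3.0, 4.5],
})

# Sum hours by representation within each calendar week.
weekly = (coded.set_index("date")
               .groupby("representation")["hours"]
               .resample("W")
               .sum()
               .unstack(level=0)
               .fillna(0))
print(weekly)
```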

Parallel Activities

Parallel to project planning activities (the results of which are contained in the preceding pages), I have been engaged in a number of complementary activities. First, I’ve been seeking out other researchers whose work considers cognition in design. I’ve also been reading up on relevant cognitive theory.

Second, I’ve been interacting with two colleagues from MSU’s education research department. They have been advising on research methods, and helping me develop grounding in relevant, current cognitive science work.

Third, I've been working with a graduate student in the last weeks of this summer, getting him up to speed on design theory and appropriate research methods. He has a mechanical engineering undergraduate degree, so he has domain knowledge relevant to the course we are piloting. He does not, however, have any background in the research methods needed for this project, so a good deal of training is still required.

Finally, I have visited MIT and the Center for Engineering Learning and Teaching (CELT) at UW this summer, briefed interested design researchers on the project, and received valuable input. I hope to squeeze in a visit to the Center for Design Research (CDR) at Stanford sometime this fall.

Appendix

ME 404 Design Journal

___________________________________________

Name

___________________________________________

Team

Mechanical Engineering 404C

Dr. Mike Wells, Instructor

Fall 2000

Department of Mechanical and Industrial Engineering

Montana State University

220 Roberts Hall

Bozeman, MT 59717-3800

Instructions

Please use the following pages to keep a complete record of all activity related to ME 404. We want to track the evolution of your thinking about this design problem, from beginning to bitter end. We want everything -- the good, the bad, and the ugly -- recorded on these pages. Thoroughness is paramount.

Here are some instructions for journaling:

1. Record the date in the upper left corner. Start each day on a new page.

2. Use ink. Do not erase. Delete an entry by neatly drawing a single line through it.

3. Use consecutive pages. Do not remove pages, and do not skip pages.

4. Record the start time for the activity in the left column.

5. Record journal contents in the main column. This is your individual design journal, so record only your activities or group activities in which you participated (see below for computer work).

6. When you’ve finished the activity, record the end time in the left column. Record a new start/end time every time you change activities. The difference between start and end times should represent total time spent on that activity (including journal entry writing).

What should I put in my design journal?

In a word, everything related to your design project … good, bad, and ugly. Some things to include are:

|Sketches |Customer needs/requirements |
|Doodling |Project objectives |
|Brainstorming |Specifications |
|Half-baked ideas |Math calculations |
|Work-in-progress |Design alternatives |
|Scratch work |Design reviews |
|Data collection |Decision criteria |
|Class notes |Decision process and results |
|Meeting notes |Decision rationale |
|Sources of ideas |Evaluation results |

How do I handle computer work like ProE, AutoCAD, or MathCAD?

Your team should keep a team notebook (3-ring binder provided). Whenever you do computer work, at the end of the session, print it out, date it clearly, and place it in the team notebook. Keep the team notebook contents time ordered.

For your design journal entry, simply write a reasonably detailed description of the activity, and reference the printout in the team notebook. Use the same date and time recording procedure as for any other entry.

How will my design journal be graded?

Your journal grade will be based on thoroughness: how well have you documented your process? Is the record complete?

We will not grade your journal on whether ideas are 'good' or 'bad,' or on the amount of time spent on a given activity. Great designers go through lots of crummy ideas before they find one that works (Thomas Edison tried literally hundreds of filament designs before he found 'the one' that made electric light possible). The time data simply give us a way to model the design process quantitatively, so accuracy is important. But outside of the thoroughness criterion, you will not be graded on how much time you put into this course.

Why am I doing this?

For starters, you won’t get an ‘A’ in senior design without a good journal grade. Remember, it constitutes 15% of your final grade. So do a good job.

Second, keeping a design journal is an excellent habit to carry forward into your professional career. You will find that it records ideas and information that you would’ve forgotten otherwise; that the act of writing things down helps you think through problems more thoroughly, systematically, and efficiently; that a detailed record of your professional activity will come in handy on lots of occasions such as promotion time or in a legal dispute; that having an accurate estimate of how much time you spent on an activity is often essential information, like when calculating billable hours for a client or creating a project plan. So start the habit now, and you’ll benefit for many years to come.

Third, by taking this course and creating an excellent record of your activities, you are participating in an exciting research project to learn what design process characteristics are associated with good project outcomes. This information will help future engineering students at MSU and nationally become better design engineers. And who wouldn’t be all for that!

Finally, keeping a journal is fun. You’ll get lots of chuckles when you read through it at semester’s end. Have at it, do good work, and have a great time!

Student Background Survey

ME 404, Fall 2000

Please fill out the information below as accurately as possible. Your response to any question is voluntary, and any personal information you provide will be kept strictly confidential. We ask for your name so that we can correlate to future data we’ll collect. Thank you!

Name: ____________________________________________________ Age: ________

Are you a Montana state resident? Yes No

Gender: Male / Female

Ethnicity: Caucasian   Hispanic   Black   Native American   Other: _________________

Cumulative GPA: _______

What’s been your favorite engineering course so far? ____________________________

What courses are you taking this semester? ____________________________________

_____________________________________________________________________________

Self-assessment questions (circle one: 1 = Strongly Disagree, 3 = Neutral, 5 = Strongly Agree):

I am good at mathematical modeling. 1 2 3 4 5

I am skilled at using (circle one) AutoCAD/ProE. 1 2 3 4 5

I am a good designer. 1 2 3 4 5

My technical writing ability is better than most. 1 2 3 4 5

I enjoy freehand sketching. 1 2 3 4 5

I can deliver a solid oral presentation. 1 2 3 4 5

I have the potential to be an excellent engineer. 1 2 3 4 5

Please describe your internship/summer job experience(s). Include the company name, time, and briefly the nature of the work.

Please describe significant experience “working with your hands” (e.g., shop experience, carpentry, wood work, auto repair, etc.).

Why did you choose this major?

What do you hope to do with your engineering degree? What kind of work?
