Another written introduction to the exhibit read, in part:
Multimedia Playground '95 is designed to provide both technophobes and aficionados of all ages the opportunity to have direct personal experience with emerging digital technologies. It features such sections as The Internet Roundtable, Virtual Environments, a Digital Snack Bar, and a Media Workshop, where you can create your own multimedia projects. The exhibition also includes presentations, panels and discussion groups. Get started with a beginner's area, continuously staffed with real live human beings whose mission in life is to introduce you to technology. Check out over 50 recently released commercially available CD-ROM titles for both Macintosh and PC platforms. Never tried a CD-ROM before? Explainers will help you through. Still haven't surfed the Internet? Or haven't experienced the slightly outer-space quality of seeing and speaking to strangers around the globe with real-time video images? Find out about creating short-term multimedia projects on-site. Don't miss the public panels. Or the opportunity to experience some of the latest developments in human-to-computer interaction. Multimedia Playground '95 provides hands-on access to some of the most compelling artistic, educational, and commercial applications of multimedia. This year's Playground explores the ways in which these new technologies are stimulating creative partnerships, particularly within the areas of art, science, education and commerce, as well as encouraging new models for information gathering and communicating (http://www.exploratorium.edu/January_1995.html).
My project was situated in the "beginner's area" mentioned above. This was perhaps a misnomer, since the area contained displays meant to elucidate the inner workings of computers. My project was installed on a PowerPC 6100 sandwiched between two displays: to the left was a display demonstrating how hard drives work (donated, I believe, by Quantum); on the other side was a plexiglass case containing a disassembled computer (a Mac II). Also in this subject area, but positioned under the Information banner a few yards away, was the "computer dissection" table. This "dissection" was a live demonstration in which a computer CPU was opened up, disassembled, and reassembled.
For the first few weeks my program shared a computer with a Time Warner CD-ROM entitled How Computers Work. Either program could be accessed through At Ease, a Macintosh utility that limits certain system functions so that they cannot be tampered with in a museum (or school or home) environment. About halfway through the run of the exhibit the Time Warner CD was stolen. From that point on my project, securely installed on the internal hard disk, was the only program running on the PowerPC 6100.
In designing the questionnaire I tried to follow advice given to me by professional survey writers: always begin with a question that will interest the respondent and get his or her intelligence engaged. My first couple of questions were designed to do this rather than to address any specific testing issues. I was also advised not to ask users to divulge their age; if user age was important I should approximate it from other parameters. My questionnaire posed questions about the respondent's background, computer literacy, and desire to know more about the subject of computers. A combination of open-ended and closed-ended questions was asked. After the user completed the first screen of this survey the Computer Magic program was launched automatically. The user was asked to return to the survey when he was ready to evaluate the program. Responses to the questionnaire were stored as records in a Hypercard stack and were later exported to a spreadsheet and database for analysis. This method of data collection was effective since it was not dependent on my supervisory presence at all times, but it also had its downside. Although many museum-goers responded to the first screen of questions, the vast majority of these people did not complete the second screen of evaluative questions. The answers to the first screen were intended primarily for background information; the answers to the second screen were intended to yield far more significant data. It was frustrating to be left with numerous incomplete datasets. Even with this problem, there were twenty-one complete sets of data available for analysis when the Exploratorium project concluded. And although actual participation in the survey did not conform to my original design, I found that the large number of pre-use responses yielded interesting information about the population who visited this exhibit and tried out my program.
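The mechanics of the export step can be illustrated with a short HyperTalk sketch. This is a minimal illustration only, assuming one response record per card in the survey stack; the background, field, and file names here are hypothetical stand-ins, not the actual names used in my stack:

  on exportResponses
    -- hypothetical sketch: visit each card of the response background and
    -- write its record out as one line of a tab-delimited text file, which
    -- a spreadsheet or database program can then import
    push card
    open file "Responses.txt"
    repeat with i = 1 to the number of cards of background "Responses"
      go to card i of background "Responses"
      write background field "record" & return to file "Responses.txt"
    end repeat
    close file "Responses.txt"
    pop card
  end exportResponses

Because each response was already one record on one card, the export amounted to little more than a pass over the stack.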
The survey questions underwent several changes during the course of the exhibit. The first version of the pre-use questions was as follows, with the possible responses listed below each question:
1. Name one thing that computers can't do that you wish they could:
(open ended response)

1a. Do you think computers will ever be able to do that?
(Yes - No)

2. What is your occupation? (if you're a student give grade level, if you're in college give your major):
(open ended response)

3. Approximately how old were you when you first used a computer?
(open ended response)

4. Do you use a computer at work or school?
(Yes - No)

5. Do you have a computer at home?
(Yes - No)

6. How would you describe your level of knowledge about computers?
(Beginner - Experienced User - Can write programs)

7. Did you learn about binary mathematics in school?
(Yes - No - Don't remember)

8. Questions or comments for the author (please leave an e-mail address if you want a response):
(open ended response)

The first two questions were simply meant to be attention-getters - questions that would engage the user and get him thinking. Questions two through five gathered information about computer use and background. Question seven was included as an attempt to determine whether there was a link between knowledge of binary math and a better understanding of computer science. This question was later removed; I reckoned that no pattern would emerge since most users were failing to answer the post-use questions. Users were asked to give their initials so that pre- and post-use responses could be merged into one record during analysis.
Here is the first version of the post-use questions:
1. Which part(s) of the program did you like the most?
(open ended response)

2. Which part(s) of the program did you find confusing?
(open ended response)

3. Is there something about computers that you know now that you didn't know previously?
(Yes - No)

3a. If so, what?
(open ended response)

4. Does programming remind you of any other process you're familiar with?
(open ended response)

5. Does digital encoding remind you of anything else?
(open ended response)

6. What else would you like to learn about computers?
(open ended response)

7. Any questions or comments?
(open ended response)

Some of the questions asked in this evaluative questionnaire (the post-use questions) were based on software evaluation guidelines (Gomoll, 1990). The questions which asked users to compare programming and encoding with other processes reflected my own interest in uncovering appropriate metaphors for these processes. These questions were also meant to be a measure of how well users were able to use existing mental models to explain processing. I reasoned that if they were able to create new connections to existing models - and if these models seemed applicable - it would indicate that my software was effective in generating new understanding.
At first there was a technical problem with the questionnaire: I found that a subroutine used to clear the answer fields for the next user also had the unpleasant side effect of obliterating the previous user's responses to most questions. By the second week in February the scripting had been changed so that user responses were saved and stored correctly (a sketch of the corrected ordering appears after the revised questions below). On February 18 I examined the data collected up to that point and changed the content of several questions; in some ways I consider the period prior to this date a pilot phase of the project. After February 18 pre-use question seven was changed to read:
7. Are you interested in knowing more about how computers work?
(Yes - No)

and question eight became:

8. If you have ever tried to find out more about how computers work, describe how you did it below:
(open ended response)

These questions were included to confirm my premise about the Exploratorium audience; I supposed that the Exploratorium would provide an audience that already had an interest in this subject matter. Question eight was a way to determine what kinds of resources these people would turn to in order to further their knowledge of this subject.
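To show the shape of the scripting fix mentioned earlier, here is a minimal HyperTalk sketch of the corrected ordering. The handler, background, and field names are hypothetical; the point is simply that the previous user's answers must be copied into storage before the entry fields are cleared:

  on saveAndReset
    -- hypothetical sketch of the corrected ordering: copy the answers
    -- into a new record card FIRST, then clear the entry fields
    put empty into theRecord
    repeat with i = 1 to the number of card fields
      put card field i & tab after theRecord
    end repeat
    push card
    go to last card of background "Responses"
    doMenu "New Card"
    put theRecord into background field "record"
    pop card
    -- only now is it safe to clear the fields for the next user; the
    -- original bug was that this clearing step ran before the copy above
    repeat with i = 1 to the number of card fields
      put empty into card field i
    end repeat
  end saveAndReset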
My field notes also include notes I took during several interviews, conducted with exhibit attendees who were asked to use and evaluate my program. I asked the exhibit manager for the use of a specially configured computer in the "Special Projects" area of the exhibit to conduct these interviews. The configuration of this computer included output to a large (35-inch) presentation monitor; it was my hope that the program would attract more attention if the large display monitor was used. Multimedia developers were encouraged to use this computer to demonstrate their latest software or for evaluative situations like my own. This type of arrangement was consistent with the goals of the Multimedia Playground, which encouraged cooperation between developers and end-users; the exhibit planners hoped to involve the general public in the future development of multimedia. The exhibit manager cleared me to use this demo set-up for my interviews, and in the interest of conducting more efficient interviews I prepared questionnaire forms and program evaluation forms.

My plan was to compile many interviews in this setting; however, I soon discovered that in an exhibit environment interviews of this kind were nearly impossible. I realized very quickly that meaningful data would not emerge from such interviews because the attention span of most people in this environment is far too short; despite my careful planning, not a single question and answer form was used. During the course of the exhibit I discovered that my project wasn't the only one to run into this attention-span problem. Part of the problem was the location of the display; my program was positioned in an area which received little traffic, so it was difficult to pull a crowd even with the large monitor working like a billboard. When I was able to attract someone's attention, another problem emerged: any depth of questioning in this environment felt intrusive. In addition, a practical problem impaired my interview efforts: the volume level of ambient sound in the exhibit hall was so high that it was virtually impossible to use an ordinary tape recorder to collect interviews. In spite of these problems I did interview several people and recorded their reactions. During these interviews I observed the users as they used my software and interrupted them to explain the project only if necessary. Unfortunately, this process did not yield sufficient data for any meaningful analysis, so another approach was needed.
Ultimately I found that the questionnaire format, although it yielded interesting data, was not a good match for at least one objective of this project. The distance between questioner and respondent was problematic: I found that I wanted to ask follow-up questions, and I wanted to be present, listening to the users as they navigated the computer interface. Furthermore, the questions specifically designed to evaluate the software (the "post" questions) went unanswered by the majority of respondents. Since part of my evaluation required that I determine whether this prototype was effective, I decided to add another phase of evaluation to my project.
I chose to conduct the second phase of my project in a school environment. To facilitate this I restructured my methodology to conform to the needs of a middle school computer lab. But first a site had to be chosen. A teacher's guide was prepared and sent to several Bay Area computer lab directors and math teachers. This guide described the software and the project, including an explanation of the pedagogical objectives of each section (the teacher's guide appears in Appendix F). Pam Miller, computer lab manager at the Mill Valley Middle School, responded that she would be interested in testing the program. Mill Valley is a predominantly upper-middle-class community located in southern Marin County, in Northern California. The town is about twenty minutes from downtown San Francisco and there are many high-tech companies located nearby. MVMS is a relatively small school of 660 students in grades six through eight. The school is physically laid out in an open classroom plan which was common for schools built in the 1970s. The school's facilities include two computer labs, a Macintosh lab and an IBM lab. In the Macintosh lab about two dozen computers are used by students to complete projects assigned by their regular subject teachers; for example, computer-generated presentations may be used by a science class to deliver a report. The computers in this lab are integrated into the curriculum as production tools. The other lab, an IBM (DOS-based) lab, is used primarily for teaching "keyboarding" skills.
The school's Macintosh computer lab contains roughly twenty-four Macintosh LC575s, but at the time I tested the software (April 1995) none of these computers were networked together. The LC575, while not as fast as the model used during the Multimedia Playground, was perfectly adequate for running my program. Pam Miller offered to help me coordinate my schedule with the various students and the computer lab schedule; she also helped me find a distribution of students across different computer literacy levels and grade levels, both boys and girls. The Mill Valley Middle School runs a "legal" computer lab, which means every piece of software in the lab is properly licensed - not an easy thing to maintain in this setting. The school has site licenses for Microsoft Works, Hyperstudio and Filemaker. Since the lab must purchase all of the software on its computers and, of course, budgets are tight, the staff are always looking for ways to stock the computers with additional legal software. In some cases they have agreed to help with software evaluations so that the students will be exposed to new instructional software. It was my understanding that I would leave my software for them to use even after the testing phase had ended.
The Macintosh computer lab is open after school Monday through Thursday from three o'clock to four-thirty. No programming classes are taught at the school, but some highly motivated students use the "LOGO-like" scripting language in Hyperstudio, a multimedia authoring program similar to Hypercard; this is the closest they will get to programming in grades six through eight. Students using the lab during the after-school hours seemed highly motivated; many were regulars who were creating their own elaborate multimedia presentations with Hyperstudio. Some of the interviews were conducted during this hectic after-school lab time, and the setting, with its background noise and activity, was somewhat distracting for the study participants. When possible, interviews were conducted during the lunch hour, when the lab was officially closed. Lunchtime was ideal because both the students and I were better able to focus on the software in this more controlled environment.
The evaluation methods I used at the Mill Valley Middle School were substantially different from those used at the Exploratorium. To better focus attention on the content of the program I constructed a workbook which outlined tasks for the students to perform. The workbook was organized so that activities were associated with each of the four sections of the program; completion of these activities was used as a motivator for the students interviewed. In addition to the guidance provided by the workbook, the students were given direct supervision, which included navigational hints when necessary. I began each testing session by asking several questions about the user's experience with computers and by summarizing the purpose of the project. I briefly told the students what they would be expected to do: they were to work in pairs to complete the workbook tasks. The software itself was altered slightly between phase I (Exploratorium) and phase II (MVMS) so that the screen leading to the questionnaire would be bypassed.
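In outline, a change of this kind can be sketched in HyperTalk as follows, assuming the transition to the questionnaire was handled in the stack's scripts; the global flag and card names are hypothetical stand-ins, and the actual scripts may have differed:

  on openStack
    global gSkipSurvey
    -- phase II configuration: raise the flag so the survey screen is skipped
    put true into gSkipSurvey
  end openStack

  -- in the script of the card that leads to the questionnaire:
  on openCard
    global gSkipSurvey
    if gSkipSurvey is true then go to card "Main Menu"
  end openCard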
My evaluation methodology for this phase was, once again, a hybrid of software evaluation techniques and ethnographic study. My field notes contain my observations of the students using the program as well as my interviews with the students. In these semi-structured interviews, questioning occurred both before and after the program was used, and both open-ended and closed-ended questions were asked. The interview questions evolved over the course of the weeks during which I conducted these interviews; as I saw patterns of behavior or unusual responses I adapted my questions to probe these matters further. The interviews were documented using a small audio tape recorder. I recorded the answers to my questions before and after the students finished the workbook tasks, and I recorded each entire student session with the program. After leaving the site each day I transcribed what I considered to be the most significant parts of these interviews and noted student observations and reactions to the program where these seemed important. The transcriptions of these sessions appear in Appendix H; the workbook I designed follows the interviews in Appendix G.
My original plan was to try to interview fifteen pairs of students, allowing them twenty minutes to run through the workbook and answer the interview questions. This time period proved inadequate and my plan was modified immediately to accommodate the needs of the students; most pairs required approximately forty minutes to complete the workbook activities and questions. In order to give all students adequate time to complete the requested activities I reduced my target number of interviews; even so, I was working within rigid time limitations. Students were not able to spend more than forty minutes during the lunch hour or after school. Due to these time constraints I actively discouraged the students from spending too much time on any one screen and from reading the Info screens unless there was a direct reference to the workbook activity. All participants received navigational instructions and varying degrees of guidance. All students were told to position the cursor over the help button and read the directions - if these directions proved inadequate I intervened and provided more guidance.
Although questions varied and were modified to fit the particular working methods of each pair of students, I did try to ask each student some basic questions. I tried to discover: