Demystifying Computer Science: An Approach Using Interactive Multimedia
New York University, Gallatin School, MA Thesis



Chapter 7: Middle School Project

Introduction
Interview Process
Problem Solving
Drawing and Encoding
Programming and Hardware
XOR
Software Evaluation: Interface Issues
Software Evaluation: Pedagogical Issues


Introduction

In order to provide a framework for student testing of Computer Magic I created a workbook. This workbook directed the students to complete specific tasks using the Computer Magic software. Printed screen shots were used in the workbook to show the students which screens were needed for each exercise. The directions were for students to complete the "To Do" activity for each screen pictured. Before giving students the workbook I asked them a few basic questions about their prior experience with computers. I also told them a little bit about the purpose of the program and their role in evaluating it. Once the students had completed the tasks in the workbook I asked them a few more questions and gave them the opportunity to have any of their questions answered. A small tape recorder was used to capture responses to my questions and comments made while the students were solving the workbook problems. The highlights of these taped sessions were transcribed and appear in appendix H.

The twenty-one students interviewed for this phase of the study were enrolled at the Mill Valley Middle School in Marin County, California. The grade level and gender distribution are indicated in the table below:

            Male                            Female
Grade 6     Patrick, Tyler, Aaron           Allison, Sophie, Erica
Grade 7     John, Will, Breen, Sam          Jane, Alyssa
Grade 8     Brian, Ian, Sam, Jacob, John    Lauren, Joyce, Caitlin, Kayetana

One of the first questions I asked students was whether they had access to a computer at home. As in the Exploratorium survey, the proportion of students who had computers at home was overwhelming. Of all the students who took part in this evaluation only one did not have a computer at home. Furthermore, it sounded as though the absence of a computer at that one home was a temporary aberration; the student's family had owned one previously and was now in the process of getting a new one. Because of the availability of computers, performance - or lack of it - on the assigned tasks cannot be attributed to a lack of familiarity with computers. Most of these students have some experience with both Macintosh and DOS/Windows platforms. All of these students have been required to use computers in school and were specifically familiar with the computers in this lab.

I tried to have the students work in pairs, but in one case I worked with a student whose partner had to leave early. This student (Alyssa) was motivated and interested, but working alone she had trouble with some things that another student would have been able to help her with. My assistance produced an unbalanced partnership; I could see that she was relying on my knowledge to complete the exercises rather than pushing herself. It was a convincing demonstration of the benefits of peer collaboration. Working in pairs, students were able to help each other and make up for each other's weaknesses. Alyssa was constantly asking questions; if she had been working with another student, most of her questions could have been answered by her partner - but since a partner wouldn't have known all the answers, Alyssa would have been forced to participate in the discovery process.

The dynamics of working in pairs produced interesting learning behaviors. Erica and Sophie worked so cooperatively that they seemed to be holding each other back rather than propelling each other toward success; sharing the success equally appeared to matter more to them than achieving it. They were adept at following directions, always remembering to get "help" if they were stuck. They passed the mouse back and forth very consciously, not wanting to be unequal partners - or to be accused of hogging the mouse. Although these two worked well as a team, the dominant voice of the team was the better manager (Sophie), not necessarily the more proficient student (Erica). I saw a similar partnership evolve between Aaron and Tyler. Aaron seemed to understand the directions better, but since Tyler had control of the mouse, problem solving took a bit longer.

Two of the girls interviewed, Erica and Caitlin, seemed particularly reluctant to talk about computers although both seemed to have a natural inclination for programming. These two did not work together, but of the students who lacked programming experience they were the most adept at solving the programming puzzle. When asked for a description of the difference between hardware and software, Erica shrugged off her ability and interest in computers and responded "I'm not really into computers." Perhaps she said that only because she didn't know the answer - just a few minutes earlier she had told me that she uses computers every day, at night "to play on and look into new things and I use them at school too... " Caitlin also told me "I'm not really into computers," yet Caitlin's explanations proved especially useful because she solved the workbook problems so efficiently. After my interview with Caitlin, the computer lab teacher told me that Caitlin had had unpleasant experiences with computers: she'd created elaborate work only to lose it to technical problems beyond her control, and had spent considerable time recreating that work. I was also told that Caitlin, possibly as a result of these bad experiences, acts totally helpless in the lab. Although it's difficult to tell without further evaluations, such denial of ability and interest could be attributable to the adolescent perception that playing with computers is for boys.

Interview Process

On one of my very first trips to the Mill Valley Middle School a student approached me in the computer lab and asked whether I could explain the difference between hardware and software to her. She had heard the terms used but didn't have a clue as to how they applied to the computers that she was using in the lab and at home. She asked this question as though she was embarrassed that she didn't know the answer - as if it were a simple concept that everyone else knew. I had a hunch that she was not alone and I added her question to my list of introductory interview questions. After asking many students this question I realized that very few of them knew the correct answer. A revealing misconception was that "Hardware is what's on your hard drive" (Breen) because it illustrates how the terms themselves can be misleading. Actually, software is what's on your hard drive, and your hard drive is a piece of hardware. With this kind of confusing jargon, it's not surprising that even those with a reasonable grasp of the differences were insecure in their understanding. Only one of the students I asked was able to give a response that I could recognize as valid; Ian explained the differences as "Hardware is stuff that is actually part of the computer, software is the stuff you install." By the last few interviews I had settled on a simple way to describe the difference between hardware and software: if you can hold it in your hand then it's hardware. I told Tyler and Aaron "Hardware is something you can touch, you can hold it in your hand. Software you can't [hold]... Like, you can hold a hard disk in your hand, but you can't hold Hyperstudio in your hand." It's similar to the idea that you can't hold a story in your hand - but you can hold a book.

During one of the interviews (Sam and Breen) I was asked to explain the difference between RAM and ROM. I realize that there is an entire vocabulary of confusing terms used to discuss computers. As computer jargon enters the mainstream culture it is evident that the terms are being used without widespread understanding of their meaning. Terms like hardware, software, RAM, ROM, digital - even the word programming, are routinely misused.

The misuse of the word "programming" was a noticeable pattern in these interviews. Among the students there was a tendency to say they had "programmed" something when, in fact, they had simply used an existing program - the expressions were used interchangeably (Allison, Patrick, Kayetana). For example, Kayetana, an eighth grader, confused changing the settings on a program with programming. Allison told me that she was "programming a whole bunch of stuff about our family into a program called Family Tree Maker" when what she really was doing was entering data. This particular mix up could have been sorted out neatly by using David Harel's cooking metaphor to explain the difference: the program - in this case software to build a family tree - is like a recipe, the ingredients are the names and relationships in Allison's family. The inability to separate programming from program use was also evident in descriptions students gave of the type of program they would want to write for themselves.
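Harel's recipe metaphor can be put in concrete terms with a small code sketch (the function and data here are hypothetical, invented purely for illustration; Python is used as a stand-in for any language): writing the procedure is programming, while supplying names to it is data entry - the activity Allison was actually doing.

```python
# The "recipe": a fixed procedure, written once by a programmer.
# Writing this function is programming.
def describe_person(name, parent):
    return f"{name} is a child of {parent}"

# The "ingredients": data a user might type in while *using* the
# program. Entering these names is data entry, not programming.
family_data = [("Allison", "Mom"), ("Patrick", "Dad")]

for name, parent in family_data:
    print(describe_person(name, parent))
```

The same recipe works unchanged on anyone's family data; only a programmer changes the recipe itself.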

Interviews revealed that students in this age group were very focused on the computer as a provider of games. In response to "what kind of programs do you use?" students often named a game or type of game. An interview with two sixth graders (Allison and Patrick) indicated that it was difficult for them to imagine creating a program that wouldn't be some kind of game. Possibly this focus is due to the limited exposure they've had to software production; in the computer lab at their school the attempts made by students to create ambitious programs with Hyperstudio invariably resulted in some sort of game. Other students, Sophie, for example, focused on multimedia applications. The kinds of programs they'd like to make were clearly not the kind of projects that could be completed by a single programmer. It is difficult to determine whether they had any understanding of what might be involved in producing a new multimedia tool. Erica seemed to have an intuitive grasp of how programming could be used, but her suggestion for what kind of special program she'd write was less flashy; she suggested "Just a special kind of research thing for something specific... " Erica said that she uses America Online and it seems likely that her program idea is related to use of the computer as an information retrieval tool. With the expectations students have of creating high-end multimedia programs, it seems likely that they will be frustrated in any beginning programming class.

Problem Solving

After interviewing the third pair of students I changed my process slightly, and the change had a positive impact on the quality of feedback I received from the users. The change was made out of necessity; when I arrived at the computer lab to interview the fourth pair of students I realized that I had forgotten to bring copies of the workbook with me. I was forced to improvise. I asked students to perform the tasks I described for them - tasks which corresponded to the exercises from the workbook. Then, when they'd completed each exercise I asked for a description of the problem solving approach they had employed. This actually provided more information about the type of learning that was occurring. Asking students to articulate the process produced an additional benefit: it seemed to help them understand the significance of each task - it was sort of an exercise in metacognition. One particularly productive variant of this question was: "If you were going to tell someone how to do this, how would you describe how to do it?"

Many of the students were inclined to see how far trial and error methodology would get them before they applied analytical thinking. For example, Ian and Brian used a problem solving method which involved clicking on almost everything on the screen; finally when that didn't work they tried to crack the real problem. The trial and error approach was clearly a case of cognitive economy - also known (and frowned upon) as thinking only when necessary. Perhaps this was just a sign of laziness, but their approach had some merit; using trial and error as the first method could help them learn about the interface and reveal shortcuts. To some students "trial and error" embodies the systematic application of a method. As Aaron was setting to work on the programming problem he said "Let's do it systematically, do it one by one." To him this meant trying each line in the slot and solving the puzzle by process of elimination. In truth, this would have yielded a solution fairly quickly since there weren't that many lines or slots; however, he wouldn't have gained any insight about how programs are constructed.

Another problem solving technique used on the programming exercise was to group lines together that had to do with the same subject, thereby breaking down the eight possibilities to a smaller number. For example, Jane and Kayetana reasoned "Well, these three have to do with memory... " This method was effective in isolating the variables; then, within these sub-groups, they might reason through the process described if they understood it, or apply trial and error if they didn't. In a number of cases I told students what the first line of the program was to get them started. Once they had the idea that the computer needed to know where the cursor was, the rest of the program seemed surprisingly easy to assemble. One cause for this surprising ease was the application of a simple type of associative reasoning; once they knew that the first line included information about the cursor they looked for a second line that also mentioned the cursor. Then once they knew that memory addresses were involved in the process, they looked for lines that mentioned memory addresses. Of course, this was essentially a multiple choice situation; if they had been asked to write the program from scratch this problem solving technique would have been ineffective.

I wondered how much of an advantage the students with extensive computer experience would have in solving the problems. Jacob and Sam were experienced computer users - the two of them sounded like they knew more about programming than I do. At least one of them was learning the C++ programming language outside of school. On the Encoding activity they figured out the process but were sloppy with their addition and lost time redrawing pixels because of it. They solved the programming problem very quickly once they figured out what the first line should be - they didn't reason through the process or place the lines correctly on the first try, but they were so quick with the mouse it didn't matter. It seemed as though previous computer experience helped them figure out how to use the different interfaces quickly. Perhaps experience allowed them to apply the trial and error method quickly, so that their approach was fast even when it was not well reasoned.

Drawing and Encoding

All of the students completed the first exercise successfully; they were able to draw a picture and then use the program to zoom in and out on their drawing. While this exercise was not difficult, it did prepare them for the next exercise which demonstrated how the drawing might be encoded with ones and zeros. The "Painting by Numbers" activity was designed to demonstrate how current drawing programs hide some processes with a graphical user interface. However, for the Middle School phase of the project, the workbook exercise put a different spin on the "Painting by Numbers" activity: it became an exercise in the programming concept of "repetition," also known as looping.

In George Milbrandt's recent article about teaching programming he stated "... it is necessary to consider the use of problems with solutions that contain one or more of the three basic programming structures: sequence, selection and repetition" (Milbrandt 1995, 27). The programming exercise in Computer Magic addresses the issues of sequence and selection (sometimes referred to as the "if-then-else" structure); the workbook Encoding exercise addresses repetition. The encoding exercise required that students draw a diagonal line using the "Pixel by Pixel" part of the "Painting by Numbers" screen. Students who solved this problem successfully discovered that they needed to identify a pattern of actions and apply them repetitively. In describing the looping problem Caitlin identified the most important step in solving the problem as finding the pattern. Sam and Breen solved the problem easily, leading to Sam's exclamation "I got it! It's a simple pattern and you just follow it!" Once Ian and Brian understood how the interface of the "Painting by Numbers" screen worked they were able to figure out the solution very quickly. Brian snapped out of the giggles long enough to solve the problem; once he figured out the pattern he counted the boxes in the grid and began feeding Ian the memory addresses in rapid succession. It took Kayetana and Jane a long time to solve the looping problem, but once they understood it they were clearly excited and were more motivated to work with the program from that point on. Sophie and Erica explored the "Painting Tools" portion of this activity screen even though there was no exercise associated with it; they were very excited to discover that the big squares in the grid were blow ups of patterns shown in the "actual size" box.
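The pattern the students had to find - each step in the diagonal advances the memory address by a fixed amount - can be sketched as a loop. This is an illustrative reconstruction, not the actual behavior of the Computer Magic screen: it assumes a square grid stored row by row, so moving one row down and one column right adds the grid width plus one to the address.

```python
WIDTH = 8  # assumed grid width; the actual screen layout may differ

# One memory "bit" per pixel, all starting white (0)
grid = [0] * (WIDTH * WIDTH)

# Repetition: the same two actions, applied over and over, draw the
# diagonal. This is the pattern Brian counted out for Ian.
address = 0
for step in range(WIDTH):
    grid[address] = 1       # turn this pixel black
    address += WIDTH + 1    # next row down, next column right
```

Done by hand, as in the workbook, this is the same process: find the pattern once, then apply it repetitively instead of reasoning each pixel from scratch.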

Programming and Hardware

Almost all of the students were able to complete the programming exercise, in part because the interface of this exercise would not allow failure. However, the speed with which students solved the problem varied as did their understanding of the exercise's value. I made a significant discovery which indicated that it was necessary for students to see the first encoding screen (the one called "Encode It") for them to make sense of the programming activity. I realized this when Caitlin was unable to complete the puzzle without additional information. This conclusion was supported by Tyler's description of how he solved the programming puzzle. Tyler said: "We had done this a little earlier and so we already kind of knew the order." I asked "When you say you did this earlier you mean when you were doing the other screen?" and he replied "Yes." In some cases I had allowed them to skip the "Encode It" screen since there was no specific workbook exercise associated with it. What I later realized was that the terms used in the programming activity were introduced in that first encoding screen and that the eight line program was modelled on it. So, although the medium is designed to allow for random access there are some elements that need to be presented in a certain order for students to build on new concepts sequentially.

To solve the programming problem Caitlin read the directions thoroughly, looked at all the possibilities, and then began dropping the bars into the correct slots. She placed the first four lines without a single error and then seemed stuck. Because she'd been thinking out loud while she was working, I could tell she was having a problem reasoning through a part that seemed relatively easy: the point in the program after the first "if" statement, where the next line obviously had to begin with "then." After a moment I realized that she had never seen the first screen in the encoding section, which allows you to toggle the squares from black to white and white to black. As soon as I described for her what she would have observed on that screen she was able to finish the rest of the task immediately and without a single flaw. Just about the only other users I've seen complete the task with this efficiency are those with a strong programming background. Caitlin explained how she solved the programming puzzle:

I was trying to figure out if there was a story almost. And they have stories where you have different sentences and you have to put the story in order and so I kind of did it like that... Not just this goes on, that goes off - more like a story type of thing (Caitlin).

According to this description, she was successful because she analyzed the program as if it were a logical story that needed to be told in a certain order.

Caitlin used the logic of story-telling to help her sequence the program; other students tried to apply their knowledge of logical sentence construction to put the program lines in order. Such an approach is evident in Kayetana's reaction to the pseudocode; she said that it drove her "nuts" because it wasn't grammatically correct. Clearly this is a response that occurs because the pseudocode is written in "natural language." Similarly, when Tyler began working on the programming puzzle and didn't know what to do first, he looked for a clue in the syntax of the phrases; he observed "There's no capital letters... " Parallels between the pseudocode and the English language led Aaron to reason "then change... that has to be after something... " In this example, he too was using clues extracted from English syntax to solve the problem. When asked to describe how they solved the programming puzzle another student, Breen, simply stated "we tried to figure out what made sense."
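The kind of logic the pseudocode puzzle walks through - find the cursor, compute a memory address, then use an if-then-else to toggle the bit - can be sketched as follows. This is my reconstruction of the general shape of the exercise, not the actual eight program lines; the grid width and addressing scheme are assumptions.

```python
WIDTH = 8                       # assumed grid width
memory = [0] * (WIDTH * WIDTH)  # one bit per pixel, all white

def click(cursor_x, cursor_y):
    # Sequence: first the computer needs to know where the cursor is,
    # then it maps that position to a memory address...
    address = cursor_y * WIDTH + cursor_x
    # ...and selection (if-then-else) toggles the square:
    if memory[address] == 1:
        memory[address] = 0   # black becomes white
    else:
        memory[address] = 1   # white becomes black

click(2, 3)   # the square turns black
click(2, 3)   # clicking again turns it white
```

Read top to bottom, the steps form exactly the kind of "story" Caitlin described: each line only makes sense after the one before it.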

The workbook's hardware exercise asked students to "1. Circle the transistors in the two diagrams on this page. [AND and OR diagrams] 2. Draw the path followed by the supply current when you test the circuit as shown." The diagrams are both configured so that the inputs are zero and one. The object of the exercise is to clearly show how these two types of gates handle the same inputs differently. I wanted students to make the connection between transistors - an electronic component they've all heard mentioned - and how transistors actually operate on inputs in a computer circuit. Although students were able to recognize which part of the diagram represented a transistor and they were able to trace the path of the supply current, this did not necessarily indicate an understanding of the differences between the two types of gates. When they were asked to draw the supply current they could simply copy what was on the screen without thinking about its significance. I found that I was better able to determine whether the students understood the ideas of AND and OR by asking them to compare the two types of circuits. Asked to describe the behavior of the OR gate, Breen observed "See it still got through because it's split open so it didn't need both gates." When there were two different inputs applied to the AND gate Alyssa saw that "it only partially goes through" but when the same inputs are applied to the OR gate "it goes through but it uses this one for all of the work."

XOR

Sam and Breen, two seventh graders, were the only pair to attempt a solution of the XOR puzzle as I originally conceived it. They dropped gates into the slots one at a time and tested the outputs at each stage. The XOR truth table included on the "info" screen was used as the model they needed to match. Although they weren't able to solve the problem in the available time, the technique used was solid and I'm sure it would have led to a correct solution if they'd been given enough time. Regardless of whether students were able to successfully complete the exercise as requested in the workbook, I received no confirmation that students understood the significance of the XOR gate.
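One standard way to compose XOR from the AND, OR, and NOT gates the puzzle provides is sketched below; whether this particular arrangement matches the puzzle's slot layout is an assumption on my part, since more than one composition is possible.

```python
def not_gate(a):
    return 1 - a

def and_gate(a, b):
    return a & b

def or_gate(a, b):
    return a | b

def xor_gate(a, b):
    # XOR outputs 1 when exactly one input is 1:
    # (a AND NOT b) OR (NOT a AND b)
    return or_gate(and_gate(a, not_gate(b)),
                   and_gate(not_gate(a), b))

# Checking each row against the truth table from the "info" screen,
# the way Sam and Breen tested their configurations:
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_gate(a, b))
```

Testing every input pair against the truth table, as Sam and Breen did stage by stage, is exactly the verification loop at the end of the sketch.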

Software Evaluation: Interface Issues

Many of the students who attempted to solve the XOR problem had problems understanding the task itself because the interface differed slightly from the interface they had familiarized themselves with in the programming exercise. I designed the XOR activity to be more difficult than the programming activity. The programming interface suggests that if an object is moved into a slot and it sticks there, then the placement must be correct; however, in solving the XOR puzzle it is possible that a logic gate may be placed in an incorrect spot. The XOR interface won't let the user place AND or OR gates in a slot with only one input or a NOT in a slot with two inputs - so there are some constraints - but it is possible to mix up the AND and OR gates and the program will not reject the configuration. The puzzle was designed so that the user can alter the configuration of gates and test the output results with different inputs. This process seemed too complex for most students; they weren't familiar with the process to begin with and then the added complexity of the interface baffled them. Based on this experience I'd favor a "no fail" interface concept similar to the one used in the programming exercise.

Another interface break was the difference in behavior between the first encoding screen, "Encode It," and the "Painting by Numbers" screen. It took students a while to realize that clicking in the Painting by Numbers grid would have no effect because a similar set of squares had been active buttons in the previous screen. Users required more extensive directions before they could use this screen successfully.

One pair of students came up with a fun application for one of the interfaces I designed - one I hadn't even considered. When Aaron and Tyler solved the programming problem they were really impressed by the "flash" feedback I had used to indicate a successful solution. They found that they could cause it to flash over and over again by moving one of the bars in and out of its slot. The other effect of this removal and replacement was even better: as designed, each time a program line was dropped into the correct slot, one of four chords would play at random. Aaron and Tyler found that if they rapidly moved a line in and out of its correct slot, the chords would get cued up one after another so that it sounded like music was playing. When they had cued up a bunch of chords Tyler pretended that he was playing the piano on the keyboard.

Software Evaluation: Pedagogical Issues

One of the key goals of this evaluation was to determine whether use of the Computer Magic program caused a change in a participant's awareness of the material's significance. The questions I asked to help make this determination evolved over the several weeks I interviewed students; questions needed to be flexible enough to probe the specific points of interest of each session. Before using the program some of the students were asked to describe the processes that occur in the computer when they use a drawing program. Then after they used the program I tried to get them to articulate how the processes they just learned about would fit into this scheme. When I asked "What kinds of things do you think it [the computer] does to let you draw a picture?" Sophie responded "Little sections go on, you know, run little messages, go through the computer." As inarticulate as this may sound, I liked the idea that there are "little messages" running through the computer; it suggests that she is conscious that processes, not magic, may be occurring invisibly. Sophie's response to this question after using the program demonstrated an expanded awareness of the computing concepts presented. She said "... and it's got to kind of change the memory bits before it can actually show it up on the screen... and the insides have to be in the correct order and if they're not then you don't get your picture you get something else."

Ian began with a seemingly more sophisticated understanding of the relationship of electronics to computers. Initially he said "I know there are electrical impulses running from one thing to another but I'm not sure which and where." His description of the processes after using the program indicated that he had integrated the concept of logic gates and transistors into his existing model of how computers work. When I asked him to add to his previous explanation that there were electrical impulses involved, his response was "It goes through one of those gates where it gets stopped or goes on."

I asked Sam and Breen to make the connection between clicking something on the computer and processes they had seen demonstrated in the Computer Magic software. I asked "what kind of messages does the computer send itself... if I... click on the button where does it go?" Their response was that it goes to the gates [logic gates] and when I asked them what gates are made of, after a moment, they realized that gates are made of transistors.

Another of the project's goals was to assess user satisfaction with the method of instruction. What I found was that students in this environment were uniformly excited about trying out new software and that their curiosity seemed to increase the level of interest in the subject matter. There were exclamations of "cool!" and "I get it!" as students tried different activities. One of the students, Kayetana, said of the program "that doesn't look like math to me - but it is and that's cool." Unlike my experiences at the Exploratorium, attention span was not the issue in this school setting. Once students began solving the problems the program was generally able to hold their attention. What was an issue, however, was the lack of time available. In general, it was regrettable that students could spend such a limited time using the program, because it seemed obvious that more time was required for students to thoroughly explore it. This was particularly problematic on the XOR exercise. Solving this puzzle required more time than the others, but by the time most students reached this exercise they had only a few minutes left - not enough time to complete it or reason through the concepts.

Next Chapter

Copyright © 1996 by Lisa H. Weinberg