INTERNATIONAL COLLABORATION

The Runestone Project

Mats Daniels

 

 

 

 

 

Content

1. Introduction

1.1 The Scope of this Report

1.2 Motto

1.3 The Runestone Team

1.4 The Runestone Project

1.5 Further Aims with the Runestone Project

1.6 Peer-Teaching

1.7 The Pilot Study

2. The Group Project

2.1 The Task

2.2 The Hardware Set-up

2.3 The Website

2.4 The Game Server

2.5 The Video Server

3. Data Collection

3.1 Learning Style Questionnaire

3.2 Project Background Questionnaire

3.3 The Video Conference

3.4 Weekly meetings

3.4.1 Teacher Led Meetings

3.4.2 Student Meetings

3.5 Email and Web

3.6 Interaction Log Entries

3.7 The Instructor Logs

4. Preliminary Observations: Is the International Project a Good Education Form?

4.1 Performance (Syllabus Coverage)

4.2 Time Spent on Task

4.3 Staff Time and Costs

4.4 Student Development

4.5 Student Motivation

4.6 Additional Observations

4.6.1 Discrepancies Between the Groups

4.6.2 Student-Identified Problem Areas

4.6.3 Communications Technology

4.6.4 Language

4.6.5 National Culture

4.6.6 Team Coherence and Roles

4.6.7 Social Interaction

4.6.8 Peer-Teaching

5. Supervision and Assessment

5.1 Issues

5.2 Loosely vs. tightly coupled projects

5.3 Local Educational settings

5.4 Supervision set-up

5.5 Assessment

6. Researching the Learning/Teaching Experience

6.1 Why Integrate Research into CS Teaching

6.1.1 What distinguishes the current academic climate?

6.1.2 What distinguishes teaching in CS?

6.1.3 Why research CS Ed?

6.1.4 Why build research into teaching?

6.2 Rigour? — What and How?

6.2.1 Accommodating different perspectives; combining techniques

6.2.2 What gives research value?

6.3 Applied to Runestone

7. Conclusion

Acknowledgements

References

Appendix: Runestone Project Background Questionnaire

Appendix: Results from the Project Background Questionnaire

Appendix: Learning Style Questionnaire

Appendix: Result of Learning Style Questionnaire

Appendix: Presenting Runestone

 

1. INTRODUCTION

1.1 The Scope of this Report

International collaboration within the university environment has in the past been more or less reserved for a small number of researchers. The falling cost of communication has, however, made international collaboration feasible for a much broader group, and this is likely to become even more pronounced in the future.

International collaboration at the student level is in its infancy today, but there are examples. This report focuses on one such experiment: the Runestone project. This project is developing and evaluating the notion of incorporating international group projects into the undergraduate Computer Science curriculum. Runestone adds new dimensions to student teamwork, requiring students to handle collaboration that is remote, cross-cultural, and linguistically challenging. Runestone is a three-year project, with the prototype version running in winter 1998 with students at Uppsala University, Sweden, and Grand Valley State University, Michigan, USA. The 1998 pilot study will be followed by a full-scale implementation in 1999 and another in 2000.

The focus of this report is on the student collaboration, but this collaboration would not be possible without international collaboration at the teacher level. That aspect is also touched on, as is the question of viewing the project from the perspective of Computer Science Education Research.

 

1.2 Motto

Our students will eventually work in a global market; what better preparation can we provide for international collaboration than … international collaboration?

 

1.3 The Runestone Team

This report is mainly about the student aspects of the collaboration, but nothing would have happened if it were not for the teachers and researchers involved. The Runestone team collaboration in some ways functions as a role model for the whole project. The Runestone team consists of the following people:

• Vicki Almstrum, University of Texas at Austin, Texas, USA. Vicki has a special competence in the evaluation area, where her Ph.D. Thesis (Limitations in the Understanding of Mathematical Logic by Novice Computer Science Students) is an illuminating example.

• Lars Asplund, DoCS, Uppsala University. Lars has experience with large software projects and has played a central role in the part of this project concerned with course content and with how collaboration between the students is organized. Lars is well experienced in, and genuinely interested in, technical issues and takes an active part in the process of examining technical solutions.

• Christina Björkman, DoCS, Uppsala University. Christina has experience from participating in a Software Engineering course at University of Arizona, Tucson, Arizona. Christina has been involved in the planning of the project, where her knowledge about distance education and experience with project courses has been valuable.

• Carl Erickson, Grand Valley State University (GVSU), Allendale, Michigan, USA. Carl teaches courses in operating systems, networking, and distributed computing at Grand Valley State University. He is well experienced in using Internet technology in classes and for student projects. He is intimately involved in all stages of this project, especially the development and running of the courses, as well as the development of technical solutions.

• Bruce Klein, GVSU, Allendale, Michigan, USA. Bruce is the chairman of the Computer Science Department at GVSU. He brings a wealth of experience with student software development projects and is currently participating in a two year workshop on collaborative learning sponsored by the National Science Foundation. Bruce leads the USA end of the prototype project.

 

1.4 The Runestone Project

The project’s primary aim is to introduce real international experience into undergraduate Computer Science education in a way that has value for all participants. Group projects (typically 5-10 students per team, 5-10 weeks per project) are incorporated into courses at Uppsala University and Grand Valley State University. The international project students will collaborate closely with their foreign counterparts using appropriate communications and computing technology to solve a given problem. Because the students come from different specializations within CS, they have different knowledge to contribute to the project. Problems will be designed to cover the spectrum of backgrounds. Runestone’s secondary aim is to identify effective support structures for remote international collaboration, encompassing strategies for communication, management, and technology use. Runestone will evaluate pedagogical and technical solutions for collaboration, will examine the costs, both in time and money, and will investigate how students learn in such a setting and what they learn. This report introduces the Runestone project, describes the support and pedagogic mechanisms used, and presents preliminary observations from the first year.

 

1.5 Further Aims with the Runestone Project

The Runestone project aims to:

Another goal is to create a well-organized setting with courses that, after the initially higher start-up costs, run at normal or lower costs. One example of cutting costs without compromising quality is the use of student peer-teaching, which can reduce the demand for staff hours. Another example of cost cutting is that the costs for renewing the course can be distributed across the departments involved.

In carrying out the Runestone project, we will establish results that address the issue of transferability to other departments and institutions. For this reason, the evaluation will aim to distinguish between domain-specific and general lessons, particularly with respect to the impact of international collaboration on group interaction and personal development, the extent of peer-teaching, and the costs of using this form of education. For example, the project shall examine questions such as how much time is spent on becoming acquainted with new techniques for communication and in what ways (if any) using non-native language impairs learning.

 

1.6 Peer-teaching

Based on anecdotal evidence from our own experience as teachers, we believe that having students explain concepts and solutions to one another is a powerful learning technique. Our conjecture is that there will be plenty of occasions for the students involved with the Runestone project to help each other with activities such as explanations, clarification, sharing knowledge or rehearsal of ideas. Occasions for peer-teaching can be formal or informal. Formal occasions arise when students at site X present information for the students at site Y. Informal occasions include questions that arise during day-to-day e-mail or simple study sessions.

The Runestone project will systematically examine peer-teaching by considering which settings tend to encourage or discourage peer-teaching as well as factors that contribute to the effectiveness of peer-teaching in these situations. One of our hypotheses is that the rather different educational backgrounds of the two sets of students involved in the project will encourage peer-teaching. The differences in backgrounds should motivate the students to articulate their reasoning, rather than assuming that there is mutual tacit understanding between them and their foreign counterparts.

 

1.7 The pilot study

From early January through late March 1998, the Runestone project ran a pilot study, which involved a group of eight students: four in Uppsala and four in Michigan. These students were all in their third or fourth year of university studies. For the Swedish students, the group project was part of a course that started in September, whereas for the Americans it was the major part of a course that started in early January. The problem that was specified for the group project was fairly advanced, involving study areas such as real-time systems, networking, and distributed systems. A major goal of the Runestone project is to examine the influence of the group project and factors in how the project is set up upon what the students learn and how they learn it.

 

2. THE GROUP PROJECT

2.1 The Task

The actual project was to navigate a steel ball through a maze by tilting the maze in two dimensions with stepper motors. The user submits a navigation algorithm, defines a path for the ball to follow, requests the server to execute the algorithm, then waits for access to the game. When the user gains access, the game server resets the ball in the maze, executes the user's navigation algorithm, then provides feedback to the user on the result of the run. Feedback includes information on how the navigation code executed, and a graphical display of the path which the ball traced through the maze. The input to the navigation algorithm is the position of the ball. The output is the rotational positions of the motors as a function of time. Video images of the maze and ball are available from a black and white digital video camera.
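
The report does not record the exact programming interface the teams adopted for submitted navigation algorithms. As a purely illustrative sketch in Java (which the supplied software set-up supported), the contract described above (ball position in, timed motor positions out) might be captured roughly as follows; all type and method names here are invented for the sketch.

```java
// Hypothetical sketch of the navigation contract described above: the input
// is the observed ball position, the output is the rotational positions of
// the two stepper motors as a function of time. Names are invented.

/** Position of the ball in maze coordinates, stamped with the frame time. */
class BallPosition {
    final double x;
    final double y;
    final long timestampMillis;

    BallPosition(double x, double y, long timestampMillis) {
        this.x = x;
        this.y = y;
        this.timestampMillis = timestampMillis;
    }
}

/** Target rotational positions of the two motors at a given time offset. */
class MotorSetting {
    final long offsetMillis;   // offset from the start of the run
    final int stepsX;          // rotational position of the x-axis motor
    final int stepsY;          // rotational position of the y-axis motor

    MotorSetting(long offsetMillis, int stepsX, int stepsY) {
        this.offsetMillis = offsetMillis;
        this.stepsX = stepsX;
        this.stepsY = stepsY;
    }
}

/** A user-submitted navigation algorithm. */
interface NavigationAlgorithm {
    /**
     * Given the latest observed ball position, return the motor settings the
     * server should drive towards over the next interval of the run.
     */
    MotorSetting[] nextSettings(BallPosition current);
}
```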

 

This is how the group project [the Brio project] was presented to the students:

Goal

Navigate a steel ball through a maze by tilting the maze in two dimensions with motors. Use a web interface to submit a navigation algorithm, select a path for the ball to follow, request the server to execute your algorithm, wait for your turn, and watch the results of a run via video images. The input to the navigation algorithm is the position of the ball. The output is the rotational positions of the motors as a function of time.

Design

It is up to each team to decide on the architecture and design of their project. What you will be given is a requirement specification describing what your project must do, a set of hardware components (described below), and some useful software components (described below).

As the time frame for the project is quite short (approximately 8 weeks) we have scheduled a regular weekly meeting time for you to report progress, ask questions, advise of problems, etc.

We will specify a set of deliverable documents and their deadlines for the duration of the project. The deadlines on the documents will be to help you stay on track and make regular progress towards the completion of the project.

Hardware components

The following components are available for your projects. The unique components will live in room 1219 and students will have keys to this room.

Software components

These components will be available locally and will be properly configured, tested, and if appropriate, installed.

Software technology

The following technologies or concepts will be important for the project. It would be a good idea to review these ideas, secure references (books and the web) for them, and ponder their uses over the Christmas break.

 

2.2 The Hardware Set-up

The hardware components available for the projects were located in Uppsala. The central piece consisted of a desktop computer, with a black and white digital video camera attached to its parallel port and an Ethernet connection to a laptop computer (the Swedish students each had access to a personal laptop). One of the laptops was used to communicate with the two rotational stepping motors via its serial port. The camera was permanently mounted over the Brio maze game, as was a light source. A second laptop, connected via Ethernet, was used as a client computer, running a web browser for playing the game.

The software components included a C library for reading video signals from the parallel port and controlling camera settings, a Motif application (Ximprov) for viewing camera data and experimenting with camera settings, an example C program using camera data (Ximprov), an Apache HTTP server for Linux, and Linux JDK 1.1 with RMI support.
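
The mention of JDK 1.1 with RMI support suggests one natural way for a Java front end to reach the game server, namely a remote interface. The following is only a sketch under that assumption; the interface name, the methods, and the byte-array representation of a submitted algorithm are all invented, not the students' actual design.

```java
import java.rmi.Remote;
import java.rmi.RemoteException;

/**
 * Hypothetical RMI interface a web front end might use to talk to the game
 * server. Everything here is an assumption made for the sketch.
 */
public interface GameService extends Remote {
    /** Queue a run: a navigation algorithm plus the path the ball should follow. */
    long submitRun(byte[] algorithm, String path) throws RemoteException;

    /** Poll the status or result summary of a previously queued run. */
    String runStatus(long runId) throws RemoteException;
}
```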

 

2.3 The Website

The starting point for playing the game (running the maze) was a website. This website had to:

Optional extras were to:
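
The list of website requirements has not survived in this copy of the report. The pilot set-up used the supplied Apache HTTP server rather than a Java web server, but as a rough, self-contained illustration of the kind of front end involved (submitting an algorithm and a path, then waiting for a run), a minimal sketch using the JDK's built-in com.sun.net.httpserver could look like this; the port, the form fields, and the page itself are all assumptions.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

/** Hypothetical front-end sketch: serves a form for queueing a maze run. */
public class MazeWebsite {
    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        server.createContext("/", exchange -> {
            String page = "<html><body><h1>Brio maze</h1>"
                    + "<form method='post' action='/submit' enctype='multipart/form-data'>"
                    + "Navigation algorithm: <input type='file' name='algorithm'><br>"
                    + "Path: <input type='text' name='path'><br>"
                    + "<input type='submit' value='Queue my run'></form>"
                    + "</body></html>";
            byte[] body = page.getBytes(StandardCharsets.UTF_8);
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream out = exchange.getResponseBody()) {
                out.write(body);      // the /submit handler is left out of the sketch
            }
        });
        server.start();               // Apache played this role in the pilot
    }
}
```
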

2.4 The Game Server

The game server needed to be a concurrent system, either multiple processes or multiple threads, and had to:

Optional extras were to:
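
The requirement and optional-extra lists for the game server are likewise missing here, but the constraint stated above is clear: the server had to be concurrent (multiple processes or multiple threads), with many clients able to submit runs while only one run at a time can use the single physical maze. A minimal thread-based sketch of that structure, with invented names and not the pilot implementation, might be:

```java
import java.util.ArrayDeque;
import java.util.Queue;

/**
 * Illustrative thread-based skeleton: many client-handling threads may submit
 * runs concurrently, but a single worker thread owns the physical maze and
 * executes queued runs one at a time. Names are invented for the sketch.
 */
class GameServer {
    private final Queue<Runnable> pending = new ArrayDeque<>();

    /** Called concurrently by client-handling threads. */
    synchronized void submit(Runnable run) {
        pending.add(run);
        notifyAll();                      // wake the maze worker
    }

    private synchronized Runnable take() throws InterruptedException {
        while (pending.isEmpty()) {
            wait();                       // block until a run is queued
        }
        return pending.remove();
    }

    /** Start the single worker thread that drives the maze hardware. */
    void startMazeWorker() {
        Thread worker = new Thread(() -> {
            try {
                while (true) {
                    take().run();         // reset ball, run the algorithm, report results
                }
            } catch (InterruptedException e) {
                // interrupted: shut the worker down
            }
        });
        worker.setDaemon(true);
        worker.start();
    }
}
```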

 

2.5 The Video Server

The video server had to consist of one or more processes which must:

Optional extras were to:
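
The video server's requirement list is also missing from this copy. What the report does establish is that frames came from a black and white digital camera on the desktop machine's parallel port, read via the supplied C library, and had to be made available to remote players. One simple, purely hypothetical arrangement would be to loop, grab a frame through an assumed command-line wrapper around that library, and overwrite a file under the Apache document root; both paths below are placeholders.

```java
import java.io.IOException;

/**
 * Hypothetical video server loop: repeatedly invoke an (assumed) external
 * frame grabber built on the supplied C camera library and drop the latest
 * frame where the Apache HTTP server can serve it as a static file.
 */
class VideoServer {
    public static void main(String[] args) throws IOException, InterruptedException {
        String grabber = "/usr/local/bin/grabframe";       // assumed wrapper around the C library
        String target = "/var/www/html/maze/latest.pgm";   // placeholder path under the Apache document root

        while (true) {
            Process p = Runtime.getRuntime().exec(new String[] { grabber, target });
            p.waitFor();                 // wait until the frame has been written
            Thread.sleep(200);           // a few frames per second is enough for feedback
        }
    }
}
```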

 

 

3. DATA COLLECTION

Data in a variety of forms was collected during the pilot study. This section presents the different forms used and some of the results. Data collection was carried out throughout the group project and covered all types of interaction between the students except their informal face-to-face meetings (which were covered by the project logs kept by the students). For a detailed exposition of these and other techniques see, for example, [Isaac and Michael, 1989; Gall, Borg, and Gall, 1996; Denzin and Lincoln, 1994]. This data will be used to make some preliminary observations about what occurred and about how to run the collaboration in the coming year.

 

3.1 Learning Style Questionnaire

All students in the Swedish class were asked to complete the Honey and Mumford Learning Styles Questionnaire. This allowed us to link up with research at the Open University which is investigating individual factors affecting success in using technology in learning CS.

 

3.2 Project Background Questionnaire

The learning style questionnaire was followed by a second questionnaire, also given to the whole class, focused more on group and project work. One aim was to better understand how to run group projects for students. It was pointed out that the replies would be treated with strict confidentiality and would have no impact on the assessment for the course. The questionnaire covered roughly four different areas:

The actual form used and a summary of the answers are given in the appendix.

 

 

3.3 The Video Conference

The first meeting between the students on both sides of the Atlantic was via video conference, with both ends recorded on videotape. While we had planned to hold a second video conference with all of the students after the project was over, this meeting was abandoned due to problems in synchronizing schedules.

The following are the instructions we gave the Swedish students in order to prepare them for the video conference [the American students got similar instructions]:

A. Each group (Swedish and American) should describe their course of study, their department, and their university. Here's a way to get started, feel free to add to it:

B. Each student will introduce themselves. Here are some ideas:

C. Decide on what you think the hardest part of the project is. We'll discuss everybody's ideas. Look at the project web site for the project intro and the latest version of the project spec:

http://www.docs.uu.se/docs/undergrad/instances/spring98/it_proj

D. If you're American/Swedish, think of something you've heard about Sweden/US that you really wonder if it could be true. Could it be true, or is it just a myth? Alternatively, think of something you'd really like to know about US/Sweden that only a native could tell you.

E. What is your real strength in computing? Describe what you do best and like best. This might also include project issues (coordination, planning, etc) as well as technical issues.

F. What will it mean for the project to be a success for you? Be specific. Does the code need to work flawlessly? Should the Brio Net game hit the top 10 Internet games of the month? Do you want to come out with a pen pal from another country? Think about what a really successful outcome would be.

G. Every self-respecting project at UU's Dept of Computer Systems has a logo and a T-Shirt. What should our T-shirts have on them? Here's a primer on what rune stones are (for the Americans)

http://www.hovdata.se/hovdata/scatrun.html

We'll finish the video conference brainstorming on T-shirt designs.

The following are the notes taken by Marian Petre at, and after, the video conference:

Connection

Connection was of poor quality (Swede_1, afterward: "You have to pay when it's that bad?!"): 50% video at best (apparently better at the US end) and a high dropout rate on audio. Hence: no 'face-to-face feeling'. Discussion was artificially slow. The shortcomings meant that discussion was strongly directed by Carl.

Technical details

Microphone placement was not ideal.

How clear were the overheads? Partially legible?

Room arrangements were formal:

Swedish students behind table, with Carl to one side. American students in rows behind tables, with Bruce and Mary behind. Suggest that in subsequent video-conferences, arrangement is less formal: no tables, participants all visible to each other (semi-circle of chairs?)

Attention:

Swedish students appeared to watch the screen more. American students appeared to stare at the table more (this may have been an illusion, given the poor video)

Consciousness of medium:

Swedish students appeared more video-attuned: they waited for the camera before speaking. (e.g. Swede_2: "Hello, what are you doing?" when Mats moved the camera while he was speaking.) All students appeared nervous about the medium. All appeared uncertain about what was expected of them. Students needed lots of encouragement. Carl moved discussion away from sources of embarrassment quickly.

Comprehension:

The shortcomings made it difficult to concentrate on what was being said. Just about all the jokes, from both ends, seemed to be understood and appreciated, even the 'low-key' ones; it seems likely that the couple of exceptions simply weren't heard.

 

Dynamics of discussion:

Some uncertainty about the locus of control in the discussion, among the American students in particular (e.g., "We should..." "That's later." "Shhh!"). Perhaps complicated by the detailed structure of the script? Carl waited for the laughter to synchronize the discussion. A slightly more natural exchange appeared to be beginning during the discussion of the project, when the American students posed questions that the Swedish students answered, but time was limited and the discussion was cut short. There was a bit of local discussion, particularly at the US end, with sotto voce remarks – something to look for in future interactions is whether such remarks are made audibly, to include the whole group.

Group dynamics:

Presumably most of the students were 'putting on a good show' and trying to behave 'correctly'. For example, there were a number of 'solidarity statements' (e.g., Swede_3 saying that the hardest part will be "communicating to each other...but we will fight to solve it" and American_1 saying that getting the project to work is secondary to "the transatlantic experiment").

There were some declarations of independence:

American_2's remark that he prefers it when "I can do my own thing." American_1's statement that he prefers working alone (counterbalanced by a vision of how he might fit into a group: "I'm pretty good at asking questions when I don't understand" and that he's a good writer and Java competent.)

There were several 'power statements':

The Swedish students' presentation of a block diagram of the project (a pre-emptive strike?). Swede_4's overt bid for project leadership. ("...pushing people where maybe they don't want to go — but someone has to do it..." and later, when Mats asked who would end up running the project: "It is inevitable that we do."). Swede_2's less forceful expression of interest in project management.

There were several 'group making' statements:

American_3 described himself as "one of the people who will get people moving". Swede_3 seemed to be trying to bridge perspectives with his strengths/weaknesses statement; he started by agreeing with the divide-and-conquer approach voiced by the Americans, gave it a cooperative twist "and when you look up again someone has solved another chunk..." and then talked about group activities like sketching the solution and determining protocols. Swede_1 offered herself as a team player, saying she is "quite adjustable...so I think I'll get along..."

It will be a matter of interest how roles of the two participants most timid at the video-conference (Swede_1 and American_4) develop over the course of the project.

Cultural issues:

There were a fair number of terms and bits of knowledge taken for granted on both sides. Carl offered a couple of translations. American_3 made an effort, explaining that 'senior' meant 'in his fourth year' – but he missed explaining 'UPS'.

 

 

Post-hoc discussion with Swedish students:

Carl remarked that the American students are somewhat intimidated by the Swedish students' perceived technical superiority. The students felt that they had benefited from the conference, even though the quality was so poor. They felt that subsequent interactions would be easier, especially connecting names, faces, and voices. Swede_1, Swede_2 and Swede_4 all felt they recognized the American voices after a while, American_1 clearly, American_2 probably. All of the Swedish students were impressed by American_3, with his multiple responsibilities. Swede_3 felt that cultural differences were less noticeable in the video-conference than they would be face-to-face "...but if we were to meet then we see those tiny details"

 

3.4 Weekly Meetings

3.4.1 Teacher led meetings

Each week, the teachers at both sites held a meeting with their local group of students, where they reflected on how the project had gone during that week. The debriefing followed a standard script, see below, but was sufficiently flexible to allow the teachers to immediately explore the students' observations and any new developments. The meetings were audio-recorded. At the end of these meetings, each student filled out a quick, one-page questionnaire about the meeting. The questionnaire asked about the meeting organization and the outcomes (decisions, learning, conflict resolution, clarifications, etc.), as well as about the respondent's satisfaction with the proceedings, both overall and in terms of their own role in the meeting. This material has not been analysed yet.

There are, broadly, three categories of information, and the questions that follow can be addressed in each category:

i) technical stuff (communications media, etc.)

ii) academic stuff (both learning and progress of the project work)

iii) interaction stuff (social interaction, collaboration, culture issues)

questions:

1. Were there any new insights? If so, from where? Under what circumstances?

2. Were there any surprises?

3. Were there any disasters?

4. Were there any culture clashes — or developments?

5. How was the quality of the interaction: were students satisfied overall? Were they satisfied with their own role in the interaction?

6. How does their attitude about the project compare to how they felt last week? To how they felt at the start of the project?

Emergent issues?

 

3.4.2 Student meetings

The whole trans-Atlantic project group held weekly meetings. One was held using speaker phones, but most were held using Internet Relay Chat (IRC). All these meetings were audio recorded, logs of the IRC meetings were collected, and every meeting concluded with the completion of a short, individual questionnaire. These one-page questionnaires focused on that meeting, asking about the organization of the meeting, the outcomes (decisions, learning, conflict resolution, clarification, etc.), and the students’ satisfaction with the proceedings, both overall and in terms of their own role in the meeting. The students were assured that no one involved in the assessment of the project would have access to the tapes.

The questionnaire for the telephone meeting is found below. This meeting did not function well and was cut short. The students felt uneasy, mainly because they felt pressed to say something while at the same time trying not to talk while someone else was speaking.

Please take 5 - 10 minutes to give your immediate impressions of the project meeting.

Name:

How was the meeting organized; what was the structure of the meeting?

Were there clear roles (e.g., was someone ‘in charge’, was someone taking notes, etc.)? Please specify.

Who talked the most?

Who talked the least?

Who made the biggest contribution?

Did you learn anything? (If so, please indicate briefly what you learned)

Did you get information about what’s going on in the project that you didn’t know before? (If so, please indicate briefly what)

Was anything made clearer than it was before the meeting? (If so, what?)

Were there any problems of definition — different people using the same word or phrase to mean different things? (If so, who and what term?)

Were any decisions made? (If so, please indicate what they were)

Were all decisions unanimous?

Were there any disagreements?

If so, were they resolved?

How?

How satisfied are you with how the meeting went overall? Why?

How does it compare to previous meetings?

How satisfied are you with your role in the meeting? Why?

Were there any language problems?

Could you recognize the voices over the phone?

Were there any technical problems?

 

3.5 Email and Web

Much of the student interaction about the project was via electronic mail and documents shared on the Web. All student mail relating to the Runestone project was collected and copied to Marian, who kept an eye on it and will ultimately analyze it for things like decision strands, student roles, evidence of peer-teaching, and culture issues; the Web site was also monitored.

The homepage for the Swedish part of the Runestone project was used for synchronization. It contained information about the project at several levels as well as some useful information about relevant topics, e.g. Java. There were several pointers, e.g. to the American homepage, the individual students’ homepages, and their work diaries [see 3.6], as well as to the IRC interaction logs.

 

3.6 Interaction Log Entries

The students completed weekly project logs where they kept a daily log of their time on the project, their activities and interactions during that time, and the outcomes. Other students in the Swedish project course, i.e. those in groups that consisted of only Swedish students, were also asked to keep project logs, in order to provide a basis for comparison. (This was not feasible for the rest of the American contingent.)

The following was handed out to set a standard for how to keep the diary. Log blanks were made available both electronically and on paper and the students could use either or both media.

Runestone: Some guidance for the project log

The log focusses on two things: time-on-project and interactions. This is meant to be an on-the-spot log that you complete for each period of time you spend on project work, as you finish that period. (However, if you fill it in from memory later, please indicate that you’ve done so. An ‘M’ in the margin will be enough.) It is meant to be a quick, habitual exercise, about a minute per entry. You might find it easiest to keep a log file open whenever you work. You are welcome to keep the log electronically or on paper, or both.

 

You will be asked to hand the week’s log in at the end of every project class.

Here’s roughly what the columns mean:

date                 Note the date.

time & duration      Note when you start and how long you work.

medium               Note the communication medium for collaborative work: email, Internet phone, audio conference, etc.

brief description    Give a brief description of your activity during the period (what...what accomplished...with whom...anything notable...etc.). If you did more than one type of thing (like having a meeting and then working on some code afterward), please make separate entries.

Please note:

- What were you doing? What were you working on?

- Were you working alone or with someone else (if so, who?)

- Did you accomplish what you set out to do? Or is this part of ongoing work or ongoing discussion?

- Were any decisions made?

- Did you learn anything? (If so, note briefly)

- anything else?

 

Here’s an example log:

date      time & duration      medium                  brief description (what...what accomplished...with whom...anything notable...etc.)

20 Jan    14:30, 2 hours       video-conf. (1 hour)    Observed the students’ start-up video conference (all students plus Carl, Bruce, Mary, Mats, Christina). Terrible connection. Seemed like everyone caught all the jokes. Gathered the Swedish students’ impressions afterward.

          17:00, 30 mins       telephone               Compared notes on the video-conference with Christina — no discrepancies. Talked about what we should be looking at.

          17:30, 30 mins       email                   Wrote up impressions of the video-conference for the rest of the project team. Asked Bruce how it went at the US end.

21 Jan    18:30, 15 mins       email                   Emailed notes to Mats about data collection. Answered Christina’s questions about what notes she should take. Email from Anders with the time log from his project.

          19:00, 2 hours                               Reading on analysis techniques. Learned what ‘phenomenography’ means. Reviewed the project notes; made a list of things to do. Tested the tape recorder (need an adaptor for the microphone connector).

22 Jan    14:30, 30 mins       face-to-face meeting    Planning data collection with Mats and Carl — discovered we had different models for how the meetings would be organized! Need to ask Christina about availability.

          16:15, 30 mins                               Tried to draft a log blank...will have to revise it.

          16:30, 15 mins       face-to-face meeting    Discussed data collection with Mats and Christina. Sorted out the confusion about models of how the meetings will be organized. Agreed a draft spec. of who does what when.

3.7 Instructor Logs

Each teacher kept a journal of their observations, particularly with respect to peer-teaching, culture clashes or developing sensitivities, collaboration, effective or ineffective procedures, and technology issues.

 

4. PRELIMINARY OBSERVATIONS: IS THE INTERNATIONAL PROJECT A GOOD EDUCATION FORM?

The fundamental question for Runestone is whether – and in what respects – this is a good education form, meaning that:

We address each of these points below. Because the pilot study is just that – a pilot study – any observations we make are necessarily limited and preliminary. Moreover, detailed analysis of the data is not yet complete; the comments given below are based on on-going examination of the data and a first-pass, topic-based review of the material, as well as on a more extensive examination of the data generated on the Swedish side.

 

4.1 Performance (syllabus coverage)

The coverage of the syllabus is a special case here, because the primary aim of this part of the course in Sweden is to provide experience in the use of concepts covered in the earlier, more theoretical parts. Hence, the completion of the project task is perhaps a better measure. Based on their performance on previous projects, the Swedish students involved in Runestone are strong students. Under normal circumstances, their project would have been predicted to be among the first completed and the best produced in the class. This was not the case here, which was, in our opinion, due to difficulties in coordination and synchronization among the students involved.

 

 

4.2 Time spent on task

The project logs of the Swedish students in the international group show that they spent roughly the expected number of hours on the course: the equivalent of three weeks of full time studies, i.e. 120 hours. The American students spent on average somewhat less, i.e. roughly 100 hours, but this is in line with the expectations for the course the American students followed. It is interesting to see how these hours were actually spent, especially compared to each student's individual estimates from the background questionnaire. One question had asked the student to estimate, for courses taken prior to the pilot study, the percent of their total course time they generally spent studying alone and in groups.

Percent of study time spent alone vs. in a group

                    Alone                    In a group
               estimated   reported     estimated   reported
Swede 1            70          23            30         77
Swede 2            30          57            70         43
Swede 3            80          30            20         70
averages                       38                       62
American 1         90          53            10         47
American 2         70          53            30         47
American 3         95          50            05         50
averages                       52                       48

For these summary figures, emailing is considered as working alone. Recategorizing emailing as a group activity would make the emphasis on group work even stronger. The lower average percentage of time spent working in a group among the Americans is due to a higher rate of local group work among the Swedes. This is not surprising, because the Swedes knew each other very well before this course. The fourth Swedish student has not yet filled in the report for the last month and is thus not included here. One of the American students also has an incomplete time log. Swede 2’s reported time included considerable time searching the web for useful information, which is both time consuming and solitary.

 

4.3 Staff time and costs

Because this course has required new development, staff time spent on the course cannot be considered typical. The greatest development cost was in setting up the project, which is standard overhead for any project course. This offering certainly involved fewer lectures than usual and less involvement from teaching assistants. There were some special costs, for example running the video-conferences and obtaining special hardware for the project. None of the costs was discouraging.

4.4 Student development

It is too early to say much about the effect of recent project work on students' personal development. It is likely that the project outcomes for the students were not what they would have been had the project been individual or purely local. In either case, the students would have expected to complete the project, and some of them to excel. Hence, we speculate that the outcomes in personal development are likely to be different in kind from those of a ‘conventional’ project. Our experience as teachers suggests that the experience and frustration of working in a relatively large group with unknown persons is likely to be counted as a key lesson in the long term. The students have had to deal with problems that were different, and in many cases more inter-personal, than usual. Each student appeared to reflect on his or her individual responsibility for communication or other problems with the project. For some, insurmountable frustration and failure to complete a project were new experiences.

After the course, both the American and Swedish students talked about lessons in project and time management; ideas for improving the experience included alternative group structures, having more milestones, and better indicators of progress. The American students described a ‘lack of closure’: they knew some parts of the project worked, but they hadn’t seen it working and didn’t know if it worked. (The Swedish students, on the other hand, were certain that it didn’t.) The students all realized the value of communication skills (including how to conduct a meeting and set an agenda); perhaps the clearest lesson for the students was the need to acknowledge all email and to answer promptly. All of the students rated the project as being more successful in terms of acquiring knowledge and experience than in terms of producing a product.

 

4.5 Student motivation

Three factors enhanced the initial motivation of students in this international group:

In the initial meetings, some students stated that the real challenge was to make the group work as a team, and to demonstrate the viability of the experiment; others cited both the teamwork and the challenge of the project itself. During the project, motivation was neither constant nor evenly distributed; students cited differences in expectations and motivation within the groups as one of the main problems. At times the awkwardness of physical separation and different time zones impaired student motivation and enthusiasm. Nevertheless, seven of the eight students report that they would be willing to participate in such a project again.

 

 

4.6 Additional observations

4.6.1 Discrepancies between the groups

Much of the observed frustration can be attributed to discrepancies between the two groups of students, in terms of expectations, sense of urgency, time available, local cohesion (and hence local group dynamics), technical skill, and access to a key, charismatic lecturer (an American working for the year in Uppsala). The American students felt that they were "a step behind all the way". The Swedish students felt that the American students lacked "passion". One American student expressed regret at not being able to contribute to the extent wanted, for the reason that there was too much else (i.e., job and family commitments) going on. The American students perceived the Runestone project as bigger than those they normally undertake; they felt that future international projects should make clear that all students must participate fully in order for the project to succeed.

 

4.6.2 Student-identified problem areas

 

4.6.3 Communications technology

None of the students considered the communication media as problematic. We tried a number of different forms of collaboration; IRC and email were the preferred modes of communication. [Guzdial, M., and Turns, J. (1997) and Kehoe, C., Guzdial, M., and Turns, J. (1997)] report experiences with tools that support project-based learning.

 

4.6.4 Language

Language per se was not a barrier for these students. The Swedish students are highly competent English speakers (with 8-9 years of study and English usage required in many university courses), although they are not necessarily fully confident. The students’ email and IRC logs are full of jokes – but the students expressed low confidence that their jokes were understood. Everyone was fiercely polite.

 

 

4.6.5 National culture

The students noticed a few cultural differences between the two groups, specifically in these areas:

Nevertheless, the students were emphatic that culture was a non-problem; each group described their counterparts as being "just like" or "pretty much like" them.

 

4.6.6 Team coherence and roles

The American cohort was a collection of individuals, whereas the Swedish cohort worked in concert as a team. The American students described a sense of working on an individual basis.

While roles were assigned rather late, there was a good international distribution of responsibilities. The groups have recommended that in the future we appoint a student at each site as local project leader. Designated responsibilities for these two students would include acting as principal liaison and watching for problems within the local cohort or the overall interaction. This monitoring function might catch problems earlier and help to defuse them; for example, this year the Swedish students helped one another with programming and technical difficulties, preventing these factors from becoming problems.

 

4.6.7 Social interaction

There was relatively little social interaction between the cohorts; the students felt that they didn’t know their counterparts very well, and the project didn’t help them to get to know each other. Some interactions would probably have been more efficient if the participants had known each other better. Social interaction – jokes and talk about personal topics – increased toward the end, during the hectic efforts to make the project fly. Yet, for each of the students, some part of the process or of their counterparts’ actions or interpretations remained mysterious.

 

 

4.6.8 Peer-teaching

Peer-teaching between the cohorts was limited; it was largely related to craftsman skills, e.g., better technical solutions. This may be accounted for by the lack of familiarity between the students and possibly by the nature of the project, which could be sub-divided in a way that avoided the need to learn about what the others were doing. Some of the Swedes reported peer-teaching within the Swedish cohort, but this occurred largely in face-to-face interactions about which no data was collected.

 

5. SUPERVISION AND ASSESSMENT

5.1 Issues

A few issues arose concerning supervision and assessment in the pilot version. These were the "normal" problems of assessing groups, for example the difficulties inherent in assessing individuals within a group and the problems of nationally and culturally heterogeneous groups, but there were also some uniquely interesting assessment features.

Firstly, much of the interaction between supervisor and students – and increasingly so in the full-scale version to run in 1999 – is not face-to-face, so the problems of formative feedback and assessment are magnified. Secondly, the same piece of work, produced by a single trans-national group, is assessed under two separate assessment systems, one in Sweden and one in the US. It is therefore possible that the same piece of work will receive different grades for different students, calling into question the parity and equity of the assessment schemes.

 

5.2 Loosely vs. tightly coupled projects

There are several ways of running projects of this type. A useful conceptual categorisation which we have developed is based on how similar the local implementations are. The more similar (and therefore the more constrained) the two ends are, the more tightly coupled the project; the more dissimilar the two ends can be allowed to be (and still work on the same project), the more loosely coupled it is. This concept of coupling can be applied to many aspects of the work: the necessary equipment, supervision methods, course content, assessment mechanisms, requirements on students (in terms of individual contribution, knowledge background, etc.) and even scheduling – whether the component may extend over different time periods at each institution.

 

Some of the benefits of loose-coupling are:

The aim of the Runestone project is to be loosely coupled wherever possible in order to facilitate transferability of the project. However, experience from the pilot study suggests that some aspects can be more tightly coupled than others. A probable evolution is that once an instance has been trialled, successive years will tend towards tighter coupling, in the interests of efficiency.

 

5.3 Local Educational settings

The Runestone project component is designed to be fitted into distinct courses. There is no requirement that the courses at separate institutions should "match" in terms of size, assessment practice or technical content, as the linking feature is the project work itself, which is the same for all participating institutions. Because this is common work, students will have to have sufficient background to be able to contribute, but not every student needs to have equal knowledge or expertise in all areas. The intention is that the Runestone model should be easy to implement as a context-independent component into any combination of institutions and local conditions.

 

Within the pilot project, the institutional contexts were different in a number of respects.

 

 

Year
  Sweden: 3rd year (of four and a half)
  US: 4th and final year

Courses within which the project was situated
  Sweden: Computer Systems II (networks, real-time systems and distributed systems)
  US: Capstone project

Notional hours to be spent on the project component (per student)
  Sweden: 120 hours
  US: 100 hours

Course time period
  Sweden: September to March
  US: January to May (condensed for practical purposes)

Assessment
  Sweden: The grading scheme for the whole course was on a scale of Fail, 3, 4 and 5; the project component itself was pass/fail. There was an option to undertake a set of additional assignments within the project for extra credit in the course.
  US: The grading scheme was Fail, D, C, B and A.

Grading
  Sweden: Group
  US: Individual

For the second iteration, the project will be run with the full cohort (taking the specified course) from both sites. This will involve circa 25 students from each institution divided into 6-person teams, three from each continent. Neither the task nor the educational settings will change. However, building on findings from the pilot project, the supervisory mechanisms will be modified and this will have an influence on the assessment process. Another improvement will be to duplicate the equipment, so that all students have physical access. Due to the increased numbers, there will be two mazes at each site.

The pilot project was based on a loose coupling between the separate instantiations of the project. As far as possible, no constraints were placed on the context at either institution, and as few additional requirements (on staff or for equipment) as possible were imposed. For practical purposes in the given situation, the full-scale version will be based on a more tightly coupled model.

 

5.4 Supervision set-up

The supervision model was deliberately designed to be a "lightweight" open system. The fewer the costs and constraints required of participating institutions, the easier it would be to extend the use of the model to other ideas and settings. Consequently, supervision was designed to mirror traditional practices as much as possible and, in the original model, students would have face-to-face access to a supervisor at their site.

Within the pilot project the supervisory set-up was not ideal. The US supervisor did not have the same level of technical knowledge and was not acquainted with the equipment. As the physical equipment was in Sweden, the US students felt disadvantaged and, in fact, obtained indirect supervision from the Swedish supervisor(s) via the Swedish students. Interestingly, the US students were reluctant to contact the Swedish supervisors directly (via e-mail or the like), even though they were well acquainted with one of them.

In the full-scale version, the supervisory model will be more constrained, to provide equality of contact with supervisors for all students. At each site a single person is assigned to the project component of the course. They will act in a dual capacity with respect to the students, manifest in the ways in which they interact with the students: either face-to-face or virtually.

The virtual supervision is designed to be the only team-specific contact that is available to the students. We are anxious to avoid problems manifest in the pilot project, where only students at one site were able to obtain face-to-face advice, which they then relayed to their counterparts. A positive side-effect is that it provides a mechanism for defining the expected supervisory hours of staff.

 

5.5 Assessment

The pilot version was, throughout, an example of a loosely coupled project, but this was especially apparent when it came to assessment. Assessment in the pilot version was done completely according to the local standards, by the local academics, and was not part of the collaboration. There was no attempt to influence local practices, systems, standards or quality assurance mechanisms. At both sites (and as is common practice with project work) assessment was the responsibility of the supervisors. For both institutions this included staged deliverables and project documentation; additionally the Swedish students undertook an oral technical presentation.

In this iteration of the project, assessment will still be loosely coupled in the sense that the US students will be graded on US criteria, and similarly for the Swedish students. However, a practical consequence of the revised supervision mechanism is that each supervisor will be responsible for the assessment of every member of all their teams. This means that each institution will delegate the responsibility for grading some of their students to the other supervisor. Hence, this is a more tightly coupled scheme, since we are reducing Location Independence: knowledge of the local grading criteria is required at both locations.

There are several options available for doing assessment in joint projects, and there are benefits and drawbacks with them all. In Runestone, the shift from a loosely coupled scheme to a more tightly coupled one was an evolutionary response to increased familiarity between the institutions involved in our project. This makes assessment in this instance easier and more efficient, but will make it harder to expand the project to other institutions.

 

6. RESEARCHING THE LEARNING/TEACHING EXPERIENCE

Building research into teaching has particular value in the current academic climate; this section considers why, and reflects on the particular need for Computer Science education research in its own right. Co-operation between research and teaching is needed in order to understand learning in the Computer Science context. The accelerating convergence of technologies for computing, communications and teaching affords an opportunity to integrate research and teaching objectives in Computer Science education. The section also presents example projects in which research was tied to changes in teaching, so that general lessons could be drawn from individual experiences.

 

6.1 Why integrate research into CS teaching?

In many Computer Science (CS) departments, there is a tension – often divisive – between teaching demands and research expectations, and a concomitant failure of communication between research and teaching. Yet understanding learning in context cannot be done without co-operation between the ‘two halves’ (research and teaching), in two forms: technical research (about the content of the discipline) and educational research (about the learning of the discipline). Technical research has an impact on teaching objectives, most effectively if we understand the learning process – through CS educational research – in order to bridge the gap between research outcomes and input to teaching. Making that bridge can improve motivation among ‘research-oriented’ teachers and increase clarity among ‘teaching-oriented’ teachers. Computer Science educators often claim their teaching to be ‘research-led’, but they typically mean led by CS technical research; by ignoring CS educational research, we impoverish our provision. Computer Science is sufficiently distinctive as a discipline to require that educational research come from ‘within’ the discipline. The accelerating convergence of technologies for computing, communications and teaching affords an opportunity to integrate research and teaching objectives. This section presents the case for doing so, and then presents some example projects in which this is being done.

The effort to marshal the latest technology and to implement the latest teaching methods to serve educational aims must be balanced with the need to seek out and address questions about which concepts, strategies, and techniques are fundamental to a Computer Science education. The fast pace of technological change poses a double challenge for Computer Science education: developments affect both the subject and the mechanisms of teaching. Educational methods race to keep pace with the opportunities afforded by technology. We must understand ‘what Computing is’ in order to teach it – we must marshal appropriate tools and methods to teach it well – and what we teach will influence what Computer Science becomes. This requires that research looks deeper than merely evaluating implementations, deep enough to examine what changes in teaching practice reveal about underlying issues such as concept acquisition, development of skills and expertise, sources of misconception and superstition, learning processes, the roles of different types of interaction between teachers, students, and materials, and so on. We need to know not just the effect of introducing new technology or methodology, but also the price.

 

6.1.1 What distinguishes the current academic climate?

As computers infiltrate virtually every domain, the demand for education in computing concepts and skills increases. Computer Science education has knock-on effects to all other domains, not just to engineering domains. The past several years have seen changes in the academic climate in many countries, characterized roughly by:

Often, the university administration looks to technology as a panacea; the scramble to offer ‘distance education’ courses via the Web is an obvious example.

 

6.1.2 What distinguishes teaching in CS?

In Computer Science, the fast pace of change is not just technological, but also intellectual and methodological. The discipline of Computer Science, without a firm traditional underpinning or a firm educational tradition, is buffeted by changing definitions of the domain itself. The academic discipline is characterized by many tensions: between science and engineering; between theory and practice; between training and education. The tensions are exacerbated by the current climate; in the face of income-oriented institutional perspectives, the push to satisfy future employers, the competition for students, and so on, the tensions are a matter of continual debate. Hence, the discipline is characterized by an almost unmanageable diversity:

This diversity and the pace of change mean that, not only must we provide students with a solid foundation, but we must also equip them for continual learning subsequently.

More fundamentally, the nature of what we study – that our tools are also our objects of study and are also a means of teaching – sets Computer Science apart:

In this context of abstract, difficult to observe, dynamic, interacting objects of study, it is a particular challenge for educators to make theory concrete – without confusing technology with theory.

More fundamentally still, Computer Science is about thinking. The constraints to thinking within the discipline are not physical, but human: our artefacts are constrained primarily by our ability to invent. Hence Computer Science teaching is about what computer scientists have managed to think about so far, and in what manner: algorithms, paradigms, languages, engines, tools, solutions are all thought products. But they are thought products that interact crucially with the physical world, and the relationship between the reasoning discipline of Computer Science and its technology is central to its particular character.

 

6.1.3 Why research CS Ed?

Even so, why should Computer Science educators involve themselves in Computer Science education research? It makes sense for them as members of a discipline: both to inform their practice, and to draw upon their practice to inform discourse in their discipline. Even though not all CS educators need engage in CS education research, all teachers should be aware of the research that pertains to their practice – as part of professional "scholarship".

Given the current conditions, it is especially important to distinguish truth from assumption, to have practice that is well-founded. Evolving teaching practice is normal to good teaching, but evaluation reliant on anecdote is not good enough. Adding a research perspective allows educators to learn more from their practice, e.g., to consider how much of local practice generalizes, and to identify the important parameters governing effectiveness in given situations. It allows educators to combine individual experiences in a meaningful way in order to address bigger issues, e.g., assessing the balance – understanding the trade-off – between practice and theory in courses and programmes.

Moreover, undertaking CS education research makes sense for many educators as individuals, adding a dimension to their practice of teaching. It focusses on questions close to their vocation, and close to their daily work. It allows their investment in teaching to contribute to their research record, one of the principal factors by which they are judged.

 

6.1.4 Why build research into teaching?

Combining research and teaching objectives has a number of advantages: economy of effort, opportunity, credibility, and added value.

 

 

6.2 Rigour? – What and How?

The key question is how to achieve rigour in the face of human complexity and variability, and subject to the practical constraints of the educational setting. Rigour is plausible when research is viewed as a means of learning (i.e., adding information to the discourse in the community), rather than a means of proving. Scientific research is often not so much a process of getting answers as one of finding even better questions. This view leads to a healthy pragmatism, based on identifying the research question, considering what evidence is sufficient to address it, and accepting that constraints (and consequently trade-offs) are inevitable in the asking. It admits a variety of methods, theories, and accounts.

 

6.2.1 Accommodating different perspectives; combining techniques

There may be no single reality to which claims made in research reports correspond; phenomena are ‘constructed’, not just ‘discovered’. Rarely can one sort of evidence reveal everything that an educator wishes to know about the impact of introducing a change to practice, but less-than-exhaustive evidence may provide what an educator needs to know in order to gain insight about educational roles and value. A single technique reveals something of what’s needed, and a broader examination often relies on a succession of techniques that build a collection of evidence. Research can take place in a variety of formal and informal settings, including classroom and laboratory. It can also be conducted according to a variety of learning, teaching, and methodological paradigms.

The space of research techniques is large [for a detailed exposition, see, e.g., Isaac and Michael (1989); Denzin and Lincoln (1994); Borg and Gall (1989)], including:

Overcoming the constraints of the educational setting is often a matter of combination: accumulating results from a series of studies; compiling results from studies at different sites; mixing qualitative and quantitative techniques [e.g., Bryman (1988), Brannen (1992)]. Combining methods allows a sort of ‘triangulation’ among multiple perspectives, which can achieve a more complete account. Consolidating results from different sources can improve representativeness and strengthen conclusions, as long as the data arises from cognate studies.

 

6.2.2 What gives research value?

Typical criteria for research are relevance, importance or significance of topic, and contribution to existing knowledge. The value of a research contribution rests on its validity, the extent to which an account accurately represents the phenomena to which it refers. Our aim must be to establish that research results are valid ‘beyond reasonable doubt’: that they are plausible, credible, and supported by well-documented evidence. The value of evidence relies on clear and honest reporting of data that has been collected in a systematic manner using appropriate qualitative and/or quantitative methodologies. The crucial issues for research are whether it is representative, generalizable, replicable, predictive, honest – and whether it recognizes its own assumptions and considers alternative interpretations or accounts. ‘Good’ evidence is appropriate evidence, i.e., data relevant to the question; ‘bad’ evidence fails to provide relevant information. We learn from ‘failure’ (i.e. refutation of our hypotheses) if the study is well-reported. Flawed evidence (i.e., evidence which is compromised by some limitation of the study) can still be useful, as long as we understand its limitations. A pragmatic approach to rigour relies on researchers knowing the value of different sorts of evidence and using available evidence within the limits of its value.

 

6.3 Applied to Runestone

6.3.1 Evaluation

The evaluation component will both assess Runestone in terms of its own aims, and draw from the experience any lessons that enable the Runestone format to be transferred to other departments and institutions. The evaluation will for this reason aim to distinguish between domain-specific and general lessons. Key evaluation issues include: the efficacy of the method – whether students learned and to what extent they collaborated; the cost of using this form of education, both in money and time, both for students and staff, e.g., how much time is spent on becoming acquainted with new techniques for communication; whether (and if so how much) language problems impede progress; which aspects of the set-up contribute to or detract from the collaboration.

 

6.3.2 Research

The two major aspects of the research component are peer-teaching and (changes in) cultural attitudes. The research will use largely qualitative methods to examine the processes students use in pursuing the project, the patterns of communication among the students and what their communication reveals about their attitudes and understanding.

One hypothesis of the project is that the different educational backgrounds between the two sets of students will promote interaction, because each group will need to draw on the other’s knowledge. Another hypothesis is that peer-teaching will contribute substantially to students’ understanding during the project. There are likely to be plenty of occasions for these students to explain things to each other. The project will examine occurrences of peer-teaching in order to try to determine what characterizes effective occasions (i.e., those in which understanding is improved), whether there are patterns of occurrence, and what factors might be used to promote effective peer-teaching. The anticipation is that peer-teaching will occur both spontaneously and in more formal settings when students in one group teach students in the other about topics not covered at that site.

There are, as described in section 3, several sources for data collection in this pilot study. A short summary:

The analysis of the pilot study will be qualitative and largely data-driven within the key topics. Results from the pilot study will be used to design more focussed research instruments for the subsequent years, facilitating more efficient analysis on a much larger student pool.

 

7. CONCLUSION

No reliable conclusions can be drawn from a small pilot study. However, this trial does suggest changes for the next phase in the Runestone project. Some example changes include having equal resources at the two sites, incorporating a clearer technical and project management briefing in order to achieve a faster and appropriately structured project start, making clearer suggestions about targets and milestones, and recommending a different team management structure.

Other means of collaboration will be investigated. For example, we are intrigued by the "CoWeb" (Collaborative Website) concept from Georgia Tech (see http://pbl.cc.gatech.edu:8080/myswiki). In a CoWeb, any page (including both text and graphics) can be edited by any user.
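To make the CoWeb idea concrete, the sketch below shows a toy "any visitor can view and edit any page" web server. It is purely illustrative and is not the Georgia Tech implementation; the in-memory page store, URL scheme, port, and lack of HTML escaping are simplifying assumptions.

```python
# Minimal sketch of the CoWeb idea: every page is readable and editable by any user.
# Illustrative only -- not the Georgia Tech CoWeb/Swiki; page store and URLs are assumptions.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import unquote, parse_qs

pages = {}  # page name -> page text, kept in memory for this sketch

PAGE = ("<h1>%s</h1>"
        "<form method='POST'>"
        "<textarea name='body' rows='20' cols='60'>%s</textarea><br>"
        "<input type='submit' value='Save'></form>")

class CoWebHandler(BaseHTTPRequestHandler):
    def _page_name(self):
        # The URL path is the page name; the root maps to a default front page.
        return unquote(self.path.lstrip("/")) or "FrontPage"

    def _send(self, html):
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(html.encode("utf-8"))

    def do_GET(self):
        name = self._page_name()
        self._send(PAGE % (name, pages.get(name, "")))

    def do_POST(self):
        # Anyone may overwrite any page: that is the essence of the CoWeb concept.
        name = self._page_name()
        length = int(self.headers.get("Content-Length", 0))
        fields = parse_qs(self.rfile.read(length).decode("utf-8"))
        pages[name] = fields.get("body", [""])[0]
        self._send(PAGE % (name, pages[name]))

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), CoWebHandler).serve_forever()
```

Graphics, page history, and persistence, which a real CoWeb provides, are omitted from this sketch.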

There is still much to understand about how international collaboration influences the learning process. We hope that the more detailed analysis of the full data will reveal more about factors that affected the nature of the social interactions within the team and will provide examples of peer-teaching opportunities taken and missed. The data collection schemes themselves will be evaluated, with the goal of introducing new methods that are faster to evaluate and more tuned in to this particular project. Scaling the project up to include the whole class in Uppsala next winter appears feasible.

Overall, the pilot study was a qualified success: the technology and interaction were demonstrated to be feasible. However, the project was not quite completed, and the students experienced frustration associated with group interaction (particularly with the international interaction). Nevertheless, the students all report that they learned a great deal, and all but one reported that they would volunteer again for an international group project. Interestingly, the frustrations were largely attributable to individual differences (style, personality, commitment, and expectation) and a perceived imbalance of resources (key resources being located only in Sweden), rather than to language, technical, or cultural factors. The need for faculty on both sides who know the technical content of the project was apparent, but it would be interesting to find ways around this constraint in future offerings.

 

ACKNOWLEDGEMENTS

The Runestone team and the American and Swedish student teams.

The Swedish Council for Renewal of Undergraduate Education, which sponsors the Runestone project.

NyIng.

REFERENCES

 

Berglund, A. and Daniels, M. (1997) Improving education quality, a full-scale study. Proceedings ACM SIGCSE Symposium (San Jose).

 

Brannen, J. (Ed.) (1992) Mixing Methods: Qualitative and Quantitative Research Theory and Practice. Avebury.

 

Bryman, A. (1988) Quantity and Quality in Social Research. Allen and Unwin.

 

Daniels, M., Petre, M., and Berglund, A. (1998) Building a Rigorous Research Agenda into Changes to Teaching, Proceedings ACM Australasian Conference on Computer Science Education (Brisbane).

Daniels, M., Petre, M., Almstrum, V., Asplund, L., Björkman, C., Erickson, C., Klein, B., and Last, M. (1998) RUNESTONE, an International Student Collaboration Project, Proceedings IEEE Frontiers in Education Conference (Tempe, Arizona).

 

Denzin, N.K. and Lincoln, Y.S. (Eds.) (1994) Handbook of Qualitative Research, Sage.

 

Gall, M.D., Borg, W.R., and Gall, J.P. (1996) Educational Research: An Introduction (Sixth edition), Longman.

 

Guzdial, M., and Turns, J. (1997) Supporting sustained discussion in computer-supported collaborative learning: The role of anchored collaboration. Journal of the Learning Sciences (Submitted).

 

Isaac, S. and Michael, W.B. (1989) Handbook in Research and Evaluation for Education and the Behavioral Sciences, EdITS Publishers.

 

Kehoe, C., Guzdial, M., and Turns, J. (1997) What We Know About Technological Support for Project-Based Learning, Proceedings IEEE Frontiers in Education Conference (Pittsburgh).

Runestone Project Background Questionnaire

 

Please answer the questions honestly; your replies will help us to understand how best to run group projects for students. Your replies will be treated with strict confidentiality and will have no impact on your assessment for the course. Feel free to add additional comments, if you wish.

 

Name: ____________________________________________________________________

 

How much have you used the following media?

For each medium, please tick the column which best describes your experience:

 

                                                       never      at least once      many times
video-conferencing
audio-conferencing
shared applications (e.g., electronic whiteboards)
electronic mail
computer conferences

 

How comfortable do you think you will be using the following media to do collaborative work?

Please circle the rating that best applies (from 1 = uncomfortable to 5 = entirely comfortable):

                             uncomfortable ..................... comfortable
video-conferencing           1     2     3     4     5
audio-conferencing           1     2     3     4     5
shared applications          1     2     3     4     5
electronic mail              1     2     3     4     5
computer conferences         1     2     3     4     5

 

Rank each of the media in terms of how helpful you think it will be in the following activities:

1 = no help at all ... 5 = very helpful

 

                                              video-        audio-        shared         electronic    computer
                                              conference    conference    application    mail          conference
interpreting the problem to be solved
considering alternative solutions
agreeing on a solution
establishing roles / dividing up the work
knowing your collaborators
getting work done
resolving misunderstandings

 

In general, how do you feel about working with other students?

Please tick the one that best applies:

______ prefer it

______ like it sometimes

______ tolerate it – it’s OK

______ don’t like it much

______ strongly prefer to work alone

 

In a typical week, what percentage of your study time (excluding lectures and laboratories) do you:

spend studying alone ______ %

spend studying with one friend ______ %

spend studying with a group of friends ______ %

 

How often have you worked in a group?

For each activity, please tick the column that applies best to you:

 

                                               never      at least once      many times
to do a team project
to solve problems (e.g., to work examples)
to do a laboratory exercise

 

Please name three advantages that you get from working with other students:

1. ________________________________________________________________________

 

2. ________________________________________________________________________

 

3. ________________________________________________________________________

 

 

Please name three disadvantages that you get from working with other students:

1. ________________________________________________________________________

 

2. ________________________________________________________________________

 

3. ________________________________________________________________________

 

 

How important do you think it is to be able to speak a foreign language well?

not important 1 2 3 4 5 very important

 

How important do you think it is to be able to understand a foreign culture well?

not important 1 2 3 4 5 very important

 

 

What do you think you can learn from foreign students that you can’t from students in your own country?

 

 

 

 

How likely do you think it is that you will learn the sort of things you’ve written above during this project?

not at all likely 1 2 3 4 5 very likely

 

For each of the following questions, please answer for group work in general, and international group work in particular (if you have no experience, please speculate):

 

                                                  group work in general      international group work
What’s the best part?
What’s the worst part?
What’s the most useful in getting work done?
What do you expect is most likely to go wrong?

 

 

 

 

 

When you have worked on teams before, which aspect has given you the most satisfaction?

______ the social outcomes (making new friends, interacting, and so forth)

______ the technical outcomes (the job was well done)

______ both the social and technical outcomes in fairly equal amounts

______ simply completing the assignment or doing your job

______ have never really worked on a team

______ other, please specify: __________________________________________

 

When you work in a group (whether formally or informally), which of the following activities do you find yourself doing?

Please tick the column that best describes your behaviour:

 

 

                             never     less than       about the same     more than       usually
                                       other people    as others          other people
initiating ideas
explaining
resolving differences
asking for information
asking for explanations
listening
summarizing
taking notes
leading
doing the work

 

Please describe an effective group:

 

 

 

 

For each attribute pair, please circle the one that best describes you:

Would you characterise yourself more as:

theoretical ... or practical?
an introvert ... or an extrovert?
a do-er ... or a thinker?
methodical ... or intuitive?
analytic (breaking things down into parts) ... or holist (considering things as-a-whole)?
calm ... or excitable?
easy-going ... or stubborn?
tolerant of risks ... or avoiding risks?
doing things because you want to ... or doing things because they are expected?
talkative ... or quiet?
enjoying the beginning of a project more than the end ... or enjoying finishing a project more than beginning one?

 

 

 

Why did you – or didn’t you – volunteer to join the international group?

 

 

What do you expect to gain from doing this group project?

 

 

 

 

Thank you!

 

Results from the Project Background Questionnaire

 

A. Media

The students were asked to state their experience with using different media for collaboration. The results were (the Runestone group students within parentheses):

We also asked the students how comfortable they were with using the media to do collaborative work. The results were (min 1, max 5; the Runestone group students within parentheses):

The final question about media asked how helpful the students thought each of the five media above would be for a number of named activities (min 1, max 5; the Runestone group students within parentheses):

 

                                             Video         Audio         Shared          e-mail       Computer
                                             conference    conference    applications                 conferences
Interpreting the problem to be solved        4.5 (4.5)     2.9 (3.0)     3.3 (4.2)       3.1 (4.2)    3.1 (2.5)
Considering alternative solutions            3.9 (4.0)     2.9 (3.0)     2.9 (4.0)       3.4 (4.0)    3.2 (2.5)
Agreeing on a solution                       4.4 (4.8)     3.0 (3.0)     2.4 (3.2)       3.1 (4.0)    3.1 (3.0)
Establishing roles / dividing up the work    3.6 (3.5)     2.9 (2.8)     2.4 (3.2)       3.0 (3.5)    2.9 (2.2)
Knowing your collaborators                   4.4 (4.5)     3.4 (3.8)     1.8 (2.0)       2.5 (3.2)    2.6 (2.0)
Getting work done                            3.0 (2.0)     2.5 (2.2)     2.8 (3.5)       2.8 (4.0)    3.2 (2.8)
Resolving misunderstandings                  4.1 (4.0)     2.9 (3.5)     3.0 (3.8)       3.5 (4.2)    3.0 (2.5)
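As an indication of how figures like those in the table above can be produced, the short sketch below averages 1–5 helpfulness ratings for a given activity and medium, both over all respondents and over the Runestone group (the parenthesized values). The data layout and the sample records are assumptions for illustration, not the actual questionnaire data.

```python
# Sketch: compute "overall (Runestone group)" mean ratings from questionnaire records.
# The records below are invented placeholders, not the real survey data.
from statistics import mean

# Each record: (student_id, in_runestone_group, activity, medium, rating 1-5)
responses = [
    ("s1", True,  "agreeing on a solution", "video-conference", 5),
    ("s2", False, "agreeing on a solution", "video-conference", 4),
    # ... one record per student, activity, and medium
]

def summarise(activity, medium):
    overall = [r for (_, _, a, m, r) in responses if a == activity and m == medium]
    runestone = [r for (_, rs, a, m, r) in responses if rs and a == activity and m == medium]
    # Reported in the table as "overall (Runestone group)"
    return round(mean(overall), 1), round(mean(runestone), 1)

# Prints the overall mean and the Runestone-group mean for that table cell.
print(summarise("agreeing on a solution", "video-conference"))
```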

 

 

B. Collaboration

We wanted to know the attitude the students had towards working in groups. They were asked to choose the statement that best suited their attitude (the Runestone group students within parentheses).

We also asked them to consider a typical week and state how they divided their study time (excluding lectures and laboratories), i.e. what percentage they spent studying alone, with one friend, and with a group of friends (the Runestone group students within parentheses):

Next we asked about their past experience with working in a group. They stated how often they had worked in a group to do a team project, to solve problems, and to do a laboratory exercise (the Runestone group students within parentheses).

Finally we asked them to name three advantages and disadvantages of working with other students. The suggested advantages were:

and when asked specifically about an international group they stated:

The suggested disadvantages were:

and when asked specifically about an international group they stated:

We asked them which aspect had given them the most satisfaction in their past experience of working on teams, and gave a few examples.

The students also stated the activities they normally ended up doing when they worked in a group (whether formally or informally). For each of the following activities, they picked the description that best matched their own behavior:

 

 

C. Language and Culture

The Swedish students needed to use a non-native language, and both sets of students would be confronted with a different culture. We asked them to state how important they thought it was to be able to speak a foreign language well. One marked 3 on a scale of 1 to 5; all the rest marked 5 (i.e. very important). We also asked how important they thought it was to be able to understand a foreign culture well; here two marked 4 and the rest marked 5 (i.e. very important).

 

D. Personal profile

For each of the attribute pairs below, the students chose the one that best characterized them.

 

 

 

LEARNING STYLES QUESTIONNAIRE

There is no time limit to this questionnaire; it will probably take you less than 10 minutes.

The accuracy of the results depends on how honest you can be. There are no right or wrong answers.

If you agree more than you disagree with a statement tick the item number.
If you disagree more than you agree mark the item with a cross.

Be sure to mark each item (unmarked items will be assumed to be crossed).

1. I have strong beliefs about what is right and wrong, good and bad.

2. I often act without considering the possible consequences.

3. I tend to solve problems using a step-by-step approach.

4. I believe that formal procedures and policies restrict people.

5. I have a reputation for saying what I think, simply and directly.

6. I often find actions based on feelings are as sound as those based on careful thought and analysis.

7. I like the sort of work where I have time for thorough preparation and implementation.

8. I regularly question people about their basic assumptions.

9. What matters most is whether something works in practice.

10. I actively seek out new experiences.

11. When I hear about a new idea or approach I immediately start working out how to apply it in practice.

12. I am keen on self discipline such as watching my diet, taking regular exercise, sticking to a fixed routine etc.

13. I take pride in doing a thorough job.

14. I get on best with logical, analytical people and less well with spontaneous, ‘irrational’ people.

15. I take care over the interpretation of data available to me and avoid jumping to conclusions.

16. I like to reach a decision carefully after weighing up many alternatives.

17. I’m attracted more to novel, unusual ideas than to practical ones.

18. I don’t like disorganised things and prefer to fit things into a coherent pattern.

19. I accept and stick to laid down procedures and policies so long as I regard them as an efficient way of getting the job done.

20. I like to relate my actions to a general principle.

21. In discussions I like to get straight to the point.

22. I tend to have distant, rather formal relationships with people at work.

23. I thrive on the challenge of tackling something new and different.

24. I enjoy fun-loving, spontaneous people.

25. I pay meticulous attention to detail before coming to a conclusion.

26. I find it difficult to produce ideas on impulse.

27. I believe in coming to the point immediately.

28. I am careful not to jump to conclusions too quickly.

29. I prefer to have as many sources of information as possible - the more data to think over the better.

30. Flippant people who don’t take things seriously enough usually irritate me.

31. I listen to other people’s points of view before putting my own forward.

32. I tend to be open about how I’m feeling.

33. In discussions I enjoy watching the manoeuverings of the other participants.

34. I prefer to respond to events on a spontaneous, flexible basis rather than plan things out in advance.

35. I tend to be attracted to techniques such as network analysis, flow charts, branching programmes, contingency planning, etc.

36. It worries me if I have to rush out a piece of work to meet a tight deadline.

37. I tend to judge people’s ideas on their practical merits.

38. Quiet, thoughtful people tend to make me feel uneasy.

39. I often get irritated by people who want to rush things.

40. It is more important to enjoy the present moment than to think about the past or future.

41. I think that decisions based on a thorough analysis of all the information are sounder than those based on intuition.

42. I tend to be a perfectionist.

43. In discussions I usually produce lots of spontaneous ideas.

44. In meetings I put forward practical realistic ideas.

45. More often than not, rules are there to be broken.

46. I prefer to stand back from a situation and consider all the perspectives.

47. I can often see inconsistencies and weaknesses in other people’s arguments.

48. On balance I talk more than I listen.

49. I can often see better, more practical ways to get things done.

50. I think written reports should be short and to the point.

51. I believe that rational, logical thinking should win the day.

52. I tend to discuss specific things with people rather than engaging in social discussions.

53. I like people who approach things realistically rather than theoretically.

54. In discussions I get impatient with irrelevancies and digressions.

55. If I have a report to write I tend to produce lots of drafts before settling on the final version.

56. I am keen to try things out to see if they work in practice.

57. I am keen to reach answers via a logical approach.

58. I enjoy being the one that talks a lot.

59. In discussions I often find I am the realist, keeping people to the point and avoiding wild speculations.

60. I like to ponder many alternatives before making up my mind.

61. In discussions with people I often find I am the most dispassionate and objective.

62. In discussions I’m more likely to adopt a ‘low profile’ than to take the lead and do most of the talking.

63. I like to be able to relate current actions to a longer term bigger picture.

64. When things go wrong I am happy to shrug it off and ‘put it down to experience’.

65. I tend to reject wild, spontaneous ideas as being impractical.

66. It’s best to think carefully before taking action.

67. On balance I do the listening rather than the talking.

68. I tend to be rough on people who find it difficult to adopt a logical approach.

69. Most times I believe the end justifies the means.

70. I don’t mind hurting people’s feelings so long as the job gets done.

71. I find the formality of having specific objectives and plans stifling.

72. I’m usually one of the people who puts life into a party.

73. I do whatever is expedient to get the job done.

74. I quickly get bored with methodical, detailed work.

75. I am keen on exploring the basic assumptions, principles and theories underpinning things and events.

76. I’m always interested to find out what people think.

77. I like meetings to be run on methodical lines, sticking to laid down agenda, etc.

78. I steer clear of subjective or ambiguous topics.

79. I enjoy the drama and excitement of a crisis situation.

80. People often find me insensitive to their feelings.

THANK YOU for taking time to complete this questionnaire.

If you wish to see which ‘learning style’ your answers reflect, please make a note here.

If you have comments or questions, feel free to contact the research team.

The Learning Styles Questionnaire is © Honey and Mumford, 1986

 

Results of the Learning Style Questionnaire, spring 1998

 

             Activist                  Reflector                 Theorist                  Pragmatist
Swede_1      moderate preference       moderate preference       strong preference         low preference
Swede_2      low preference            strong preference         moderate preference       moderate preference
Swede_3      very strong preference    moderate preference       strong preference         moderate preference
Swede_4      very strong preference    moderate preference       low preference            moderate preference
Swede_5      moderate preference       strong preference         strong preference         strong preference
Swede_6      very strong preference    moderate preference       very low preference       low preference
Swede_7      moderate preference       low preference            strong preference         low preference
Swede_8      moderate preference       moderate preference       moderate preference       low preference
Swede_9      moderate preference       strong preference         low preference            low preference
Swede_10     moderate preference       very strong preference    moderate preference       very low preference
Swede_11     very strong preference    moderate preference       strong preference         moderate preference
Swede_12     moderate preference       moderate preference       low preference            low preference
Swede_13     very low preference       low preference            very low preference       very low preference

 

Presenting Runestone

Runestone, or aspects of Runestone, has been presented and/or discussed both nationally and internationally since the start in 1997. Below is a list of the most notable occasions: