International Journal of Educational Technology



Online Instruction Versus Face-to-Face Instruction at UNIMAS

- Bing Hiong Ngu, Universiti Malaysia Sarawak

Abstract

This article compares online instruction and face-to-face instruction at Universiti Malaysia Sarawak (UNIMAS). Participants were full-time undergraduate students enrolled in the Human Resource Development program. The learning material used was ‘How to write research proposals and reports’, a subtopic of the ‘Research Methods’ course. The design of the online learning environment emphasized four types of interaction: learner-content interaction (topic notes), learner-self interaction (multiple choice exercises), learner-learner interaction, and instructor-learner interaction (online discussion of case studies and a group project).

The ‘QuickPlace’ software was customized to incorporate multiple choice exercises, which were written in HTML and linked to ‘QuickPlace’. The face-to-face group attended conventional lectures and tutorials on the same topic. Test results indicate that the online discussion helped students learn the case studies slightly better than face-to-face instruction did. This may be due to the learning interactions above, which placed greater emphasis on self-oriented and group-oriented learning than the instructor-oriented face-to-face learning experience. However, feedback from the students indicates a need to further improve the design of the online course.

Introduction

With advances in learning technologies, online instruction has become increasingly popular (Newton, Marcella, & Middleton, 1998; Liu, Walter, & Brooks, 1998). Learners who follow online instruction are expected to adopt a self-paced learning strategy (Fischer & Scharff, 1998) and, in addition, to engage in a variety of online communications such as asynchronous (or synchronous) interaction with other learners and the instructor, virtual field trips, email, and voice communication through Internet audio streaming (Wang, Hinn, & Arvan, 2001; Kumari, 2001; Carr-Chellman & Duchastel, 2000; Fischer & Scharff, 1998). The collaborative nature of online course design normally encourages group participation to produce a joint project or to generate ideas for solving an issue.

In contrast, face-to-face instruction requires academic staff to give lectures and students to attend face-to-face tutorials. It also requires lecturers to engage in interpersonal contact, social contact, and non-verbal communication with students. It is possible that direct contact with the instructor in traditional face-to-face settings still carries some stigma that prevents students from communicating freely with their instructor.

Interviews with a few students at Universiti Malaysia Sarawak (UNIMAS) showed that they were dissatisfied with some online courses that were merely lecture notes placed online. This indicates that more effort is required in designing the learning environment, which should be an integral part of developing quality online course content. The next section discusses three online cases at UNIMAS.

Cullip (personal communication, 1999) used Microsoft FrontPage 2000 to design an online discussion forum to replace the normal tutorial sessions in learning English grammar. He attributed the low participation rate of on-campus students to the fact that they were too busy completing assignments for other courses. As Cullip did not evaluate how much students learned from the online discussion, he did not know whether the online tutorial was as good as, or better than, the face-to-face tutorial.

Another lecturer, Joblie (personal communication, 1999), used ‘QuickPlace’ to encourage on-campus students to participate in online discussion and complete an online group project, with the lecturer (i.e., Joblie herself) acting as facilitator. In contrast to Cullip, Joblie reported satisfactory student performance on the group project (in which students were required to write an essay), which formed part of the course assessment. According to Joblie, it was possible that all 24 students who actively participated in the online discussion were satisfied with her method of monitoring. During the two weeks of the online discussion, she replied to students’ queries (technical matters, course content, and so on) within two days. Apart from the performance outcome, Joblie did not evaluate learners’ perceptions of the ease of use of ‘QuickPlace’. In both Cullip’s and Joblie’s cases, the online discussion forum was limited to learner-learner interaction (discussion among online participants) and learner-instructor interaction (the lecturer monitoring the discussion) (Moore, 1989; Moore & Kearsley, 1996).

With the assistance of a technical staff member from the Centre for Applied Learning and Multimedia at UNIMAS, NoorShah (2001) used ‘QuickPlace’ to create a website for teacher trainees undergoing teaching practice at schools in different geographical areas of East Malaysia. Email communication was available for the trainees as well as the supervising lecturers from UNIMAS. On the website, a Notice Board served as an avenue for announcements; school maps gave the supervising lecturers directions to each school; and class timetables indicated when a teacher trainee would be in a classroom. Further, an online discussion forum in which teacher trainees could share and contribute ideas on issues related to teaching (e.g., how to prepare a lesson plan, how to manage classroom discipline) proved to be a meaningful learning experience for them. However, the author noted that trainees separated by wider geographical distances tended to exchange more messages online than trainees placed closer together. Also, some trainees in remote schools did not have Internet access.

This research attempts to address the above issues in designing and evaluating online courses. A research team designed and evaluated online instruction for the topic ‘How to write research proposals and reports’. The project represents a pilot study intended to offer suggestions for lecturers and policy makers at UNIMAS who plan to offer online undergraduate courses.

Computer-Mediated Collaborative Learning

Phillip, Fairholme and Luca (1998) successfully used an email listserver to encourage students to participate in a group project. Marks were allocated for appropriate listserver use, which served to encourage inter-team and inter-student communication. Each team followed the progress of other teams through weekly progress reports. In contrast, Pearson (1999) reported that the use of an electronic network by trainee teachers did not facilitate active discussion among participants. Fear of public comment and criticism may have deterred them from expressing educational concepts and issues over the network.

The use of asynchronous learning networks (ALN) by undergraduate on-campus students who took a summer course while off campus provided some insights into learning outcomes and students’ satisfaction with the ALN (Wang, Hinn, & Arvan, 2001). The success of online learning relies on learners’ motivation, self-discipline, and time-management skills. Factors that contribute to learning outcomes include a reliable Internet connection, the availability of technical support, and the course design.

The technology of collaborative learning enables cross-cultural interaction in which participants from different countries collaborate to produce a joint project. For example, part-time accountancy students from the City University of Hong Kong and business engineering students from the Eindhoven University of Technology in the Netherlands participated in a joint project related to information technology (Vogel, Genuchten, Lou, Van Eekhout, Verveen, & Adams). Both synchronous and asynchronous interactions were made available through e-mail, videoconferencing and GroupSystem. Although eight of the ten teams (72 students) succeeded in producing a report related to software engineering, the participants had to overcome technical problems, differences in background knowledge, and cultural adaptation.

A Comparison Design

In developing an interactive computer programming learning environment, Catenazzi and Sommaruga (1999) provided students with lecture notes and exercises so that they could edit, compile, and run programs, and evaluate their learning performance. In the final examination, the test group performed slightly better than students who did not participate in the program. The authors argued that measuring the effectiveness of the computer learning environment required a comparison group that did not participate in it (presumably students who attended the traditional face-to-face mode of instruction).

Closely related to this is the development of distance learning computer courses (e.g., Programming & Software Engineering, Database Design and Analysis) at the University of New South Wales, Australia (Lambert, Shepherd, Ngu, Ho, Whale & Geissinger, 1996). Results from this research indicate that the distance learning group performed as well as the control group, who attended face-to-face lectures and tutorials. According to the authors, the comparison group was included to assess whether a distance computer learning course targeting full-time workers enrolled as part-time students could achieve performance as good as or better than that of the face-to-face group.

The study reported by Schutte (1966) indicated that the virtual class mode of delivery was superior to the traditional classroom for learning social statistics. Students were motivated to engage with the new technology of the virtual class, spending more time completing their class work in the virtual environment than the traditional group did. In contrast, Smith and Taylor (1995) reported no significant difference in mastery of a physics course presented to students in either a web version or a lecture version.

Collins (2000) compared web, correspondence and lecture versions of a second-year biology course over four different semesters. Analysis of the course evaluations revealed that students in the web version were satisfied with the web approach, though they scored no better on the final test than students in either the correspondence or the lecture version. Because the author intended eventually to replace the correspondence version with the web version, he included the correspondence version as a comparison to assess whether there was any significant difference between the two groups on the final test.

There is, however, an opposing view concerning comparative studies of computer media and traditional modes of learning. Lockee, Moore and Burton (2001) argued against the use of media comparison studies. They identified a range of variables (such as learner characteristics, media attributes, instructional strategy choices, and psychological theories) that make a comparative design inadequate for justifying the learning effects of the two modes of instruction. That is, it is difficult to establish cause and effect in a comparative design because it is almost impossible to match these variables across participants.

In sum, research evidence points to two different views on evaluating computer-mediated learning environments. As noted above, several researchers adopted an evaluation strategy emphasizing learners’ satisfaction and the learning outcomes of the course design. Other researchers compared the effectiveness of an online course and a traditional face-to-face course, with the intention of replacing the traditional course with the online course should the latter supersede the former. This investigation focused on full-time UNIMAS undergraduate students’ responses to online instruction. Like the researchers who adopted a comparative design, the researcher in this study included face-to-face instruction as a control to test whether online instruction would be superior in terms of learning outcomes. Further, data on students’ views about the online instruction were collected to examine whether they were satisfied with it.

Method

Materials

The topic investigated was ‘How to write research proposals and reports’, part of a ‘Research Methods’ course taught to undergraduate students enrolled in the Human Resource Development program. The learning environment provided topic notes, multiple choice exercises, and structured group activities (see Appendix 1). Each topic had its own learning aims and activities; the latter included multiple choice exercises and/or online discussion (case studies). Links to relevant websites provided additional support materials. Materials for evaluating the online instruction included a test comprising 7 multiple choice questions resembling the practice questions, and a case study (similar to the online group discussion questions) in which students were required to answer 8 short questions. In addition, there were a 12-item questionnaire to elicit students’ perceptions of the online class, a diary sheet to track the amount of time invested in completing the online class, and a usability evaluation form requiring students to rate the ease of use of the system as well as the usefulness of the topic notes. Apart from the online group project, the diary sheet, questionnaire, usability evaluation and test were paper-based. Lastly, the computer generated feedback that included (1) the number of messages posted and (2) the number of contributors and replies generated. This provided a means to evaluate whether students participated in the online discussion.

Figure 1:  Interaction learning model

 

Interaction learning model

The design emphasized a student-centered, activity-based learning environment (Carr-Chellman & Duchastel, 2000). As shown in Figure 1, the design of the online learning environment emphasized learner-content interaction (Moore, 1989), learner-self interaction (Soo & Bonk, 1998), learner-learner interaction (Benson & Rye, 1996; McGill, Volet, & Hobbs, 1997), and instructor-learner interaction (Braggett, Retallick, Tuovinen, & Wallace, 1995; Brown, 1996; Stephenson, 1997-98).

When students studied the topic notes, they engaged in learner-content interaction. Since the topic notes were presented in text only, this represented one-way learner-content interaction (Tuovinen, 1999): learners could not manipulate (e.g., edit) the presentation mode of the content as they sought to understand it. A multiple choice question was provided after each subtopic. It required students to tick the correct statements, and the computer provided immediate feedback (see Appendix 1). This frequent intervention of multiple choice exercises (learner-self interaction) aimed to help students reflect on (try to understand or make sense of) the topic content (Soo & Bonk, 1998). Note that in learner-self interaction, learners were required to decide which statement(s) in the multiple choice questions were correct, whereas in learner-content interaction, learners were not required to judge the accuracy of the topic materials; rather, all the topic materials were presumed to be correct. However, it is possible that learner-self interaction overlaps with learner-content interaction to a certain degree: in both cases, learners were required to reflect on and comprehend the materials presented to them.
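The exercises themselves were HTML pages linked into ‘QuickPlace’, and the article does not reproduce their code; the tick-the-correct-statements logic with immediate feedback, however, amounts to something like the following sketch (Python used purely for illustration; the statement labels and feedback wording are hypothetical).

```python
def check_exercise(ticked, correct):
    """Compare the statements a learner ticked against the answer key
    and return immediate feedback, as the online exercises did."""
    ticked, correct = set(ticked), set(correct)
    if ticked == correct:
        return "Correct - well done."
    missed = sorted(correct - ticked)   # correct statements left unticked
    wrong = sorted(ticked - correct)    # incorrect statements ticked
    parts = []
    if wrong:
        parts.append("incorrectly ticked: " + ", ".join(wrong))
    if missed:
        parts.append("missed: " + ", ".join(missed))
    return "Try again (" + "; ".join(parts) + ")."

# Example: a learner ticks statements A and C when the key is A and B.
print(check_exercise(["A", "C"], ["A", "B"]))
# -> Try again (incorrectly ticked: C; missed: B).
```

The point of the design is visible even in this toy form: feedback is generated at the moment of response, so the learner can immediately re-read the relevant subtopic notes.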

The online case studies (learner-learner and instructor-learner interaction) and the group project (learner-learner interaction) represented structured group activities. In both the learner-learner and the instructor-learner interaction, the communication was two-way and asynchronous. Asynchronous communication did not depend on participants being present together at a specific time to exchange messages; in other words, it allowed participants time to reflect on responses to the questions posed, which may help them generate fruitful discussion (Berge, 1994).

For the case studies, students were presented with discussion questions (see Figure 3). This provided an opportunity for them (learner-learner interaction) to debate, ask questions, and contribute to a discussion question. A lecturer acted as facilitator (instructor-learner interaction), monitoring students’ participation in the discussion and steering its course when students were side-tracked. Note that this instructor-learner interaction applied to the online case studies only; the lecturer did not participate in the online group project discussion (see Figure 4). It was anticipated that online group discussion would facilitate a constructivist learning approach (Fischer & Scharff, 1998; Papert, 1980; Resnick, 1996). Its emphasis was on collaborative learning, in which all students were required to generate ideas and to construct and build knowledge as they exchanged information with their peers or the facilitator. In other words, this would help students integrate multiple perspectives through reflection on diverse views of the subject matter.

Traditional Classroom Interaction 

Students were provided with a handout of lecture notes (including the multiple choice exercises) that matched the online content. Using overhead transparencies, the lecturer presented the main points of each subtopic and its multiple choice exercise. Students were free to ask questions about the lectures and to indicate which statement(s) in the multiple choice exercises were correct. Presumably students attempted to understand the content through interaction with the lecturer, the notes, or the multiple choice exercises. The interaction was presumed to be quite substantial given the small number of students (i.e., 19). For the tutorial materials, which were the same as the online group discussion questions, students were divided into groups to discuss different tutorial questions. This group discussion represented interaction among students, and the results of the discussion were shared among the groups.

Designing the Learning Environment

The research team comprised staff from the Faculty of Cognitive Sciences and Human Development and the Centre for Applied Learning and Multimedia (CALM) at UNIMAS. One academic staff member (the researcher), who had never taught the subject before, wrote the content; two others, each with three years’ experience teaching the subject, reviewed it. The content writer worked closely with a technical staff member from CALM to create the online learning environment.

Lotus QuickPlace Software

The Lotus ‘QuickPlace’ software was used. It was initially developed to help business people work on group projects in a web-based environment. To customize ‘QuickPlace’ for this online instruction, the multiple choice exercises were written in HTML and incorporated into ‘QuickPlace’. As can be seen in Figure 2, a menu on the left-hand side contained several folders. The Bulletin Board was a folder containing announcements. The Course Guide folder provided information such as an introduction to the topic, the lecturers’ contact details, assessment requirements, and a study schedule. The Study Help folder provided tips on online learning strategies. The Project Room was for students to discuss the online group project, while the Discussion Room catered for the online case studies. Each online case study had its own folder(s), depending on the number of discussion questions. The Links folder enabled students to obtain support materials from useful websites. The security and customize folders could be accessed only by the lecturer, to modify the course content.

Figure 2: Screen shot of the main page

 

Procedure

Participants were second-year full-time undergraduate students at UNIMAS enrolled in the Human Resource Development program. They were randomly assigned to two groups: 18 to the online group and 19 to the face-to-face group. As this research project was conducted during the students’ actual course timetable for that topic (i.e., 24/02/00 to 02/03/00), all students had the same amount of time to complete the topic.

Online group. Students in the online group were assembled in a computer laboratory at UNIMAS. The researcher briefed them on the aim and procedure of the experiment. Each student was then assigned a password to access the course content on a computer. Students were told to browse through the Bulletin Board, Course Guide, and Study Help folders before studying the topic notes. The programmer who customized ‘QuickPlace’ was available during the first day to assist students in case any computer problem arose. Students then followed the online course in a self-paced manner and were required to record all their activities in a diary (see Appendix 2). In the study schedule (Course Guide folder), students were informed of a group project assignment (how to write a research proposal) to be submitted online at the end of the topic (i.e., 02/03/00). Each team of 6 students worked together to write a research proposal that was to include the different components of a research proposal, be 4-5 pages in length, in 12-point font, and double spaced.

At the end of the topic, students were given 20 minutes to complete a test comprising the multiple choice questions and a case study, and 15 minutes to fill in a questionnaire and usability evaluation form. The allocation of marks for the test and group project was intended to encourage active participation in the online learning.

Face-to-face group. Students attended a two-hour lecture and a one-hour tutorial given by a lecturer (the content writer). The lecture notes matched the online content and (including the multiple choice exercises) were presented using overhead transparencies. The tutorial materials were the same as those given in the online case studies. Students were required to sit a test and complete a group project similar to those of the students who followed the online instruction.

Results and Discussion

Objective Evaluation

Data analyses were based on the 37 students who took part in the study. Students were required to answer 7 multiple choice questions and 8 short questions on a case study. One mark was awarded for each correct answer to a multiple choice question or a short question.

Table 1 presents the means and standard deviations of correct answers. A t-test indicates that the online group outperformed the face-to-face group on the case study, t (35) = 3.10, p = .004, but not on the multiple choice questions, t (35) = 1.31, p = .2. For the group project, one mark was allocated for each correct component of the research proposal. The scores for the three online groups were 88%, 66%, and 68% respectively; for the three face-to-face groups, the respective scores were 74%, 76%, and 68%. Thus, the two groups did not show differential performance on the group project.

Table 1: Means and (standard deviations) of correct answers

                                 Online instruction    Face-to-face instruction
                                 (n = 18)              (n = 19)
                                 M (SD)                M (SD)
Proportion of correct answers
  Multiple choice                4.78 (1.44)           4.21 (1.18)
  Case study                     4.11 (1.54)*          2.68 (1.25)

Note: * indicates t-test significant at the .05 level
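The reported t values can be reproduced from the summary statistics in Table 1 with a pooled-variance two-sample t-test; a minimal sketch in plain Python, assuming equal variances (as the reported df = 35 = 18 + 19 − 2 implies):

```python
import math

def pooled_t(m1, sd1, n1, m2, sd2, n2):
    """Two-sample t statistic with pooled variance, computed from
    summary statistics (group means, standard deviations, sizes)."""
    df = n1 + n2 - 2
    pooled_var = ((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / df
    se = math.sqrt(pooled_var * (1 / n1 + 1 / n2))
    return (m1 - m2) / se, df

# Summary statistics taken from Table 1 (online n = 18, face-to-face n = 19).
t_case, df = pooled_t(4.11, 1.54, 18, 2.68, 1.25, 19)  # case study
t_mc, _ = pooled_t(4.78, 1.44, 18, 4.21, 1.18, 19)     # multiple choice
print(f"case study: t({df}) = {t_case:.2f}")       # t(35) = 3.11
print(f"multiple choice: t({df}) = {t_mc:.2f}")    # t(35) = 1.32
```

The small discrepancies from the reported values (3.10 and 1.31) are consistent with the means and standard deviations in Table 1 being rounded to two decimal places.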

Figure 4: Online Group Project 

Subjective Evaluations

Table 2 shows the usability evaluation, which revealed the perceived usefulness of the content and the perceived ease of use of the system. In general, students were more satisfied with the course content than with the tools. Students used a Likert scale, from Excellent (5) to Poor (1), to rate the content items, and rated them favorably: fewer than 10% of students rated the Topic Notes or the Quality of exercises as poor. A similar rating pattern emerged for the online case studies. In contrast, most students rated Navigation using buttons and Links as not useful (see also Table 3, comments on question 8), but rated the Online study help and Online discussion page as quite useful.

Comments from the questionnaire (see Table 3) indicate that students were frustrated by the frequent server failures (see questions 2, 4, 7, 11). Comments on question 8 also indicate that some students found the design of the buttons not user friendly. For instance, when a student responded to a message, a separate screen appeared in which the original message was no longer in view. This represents a split-attention effect (Tarmizi & Sweller, 1988; Sweller, Chandler, Tierney, & Cooper, 1990), in that students had to switch back and forth to keep track of the original message while answering it.

The diary sheets (see Appendix 2) were not analyzed. A number of students filled in the diary sheet incorrectly, indicating time spent on certain activities during periods when the campus computer servers were down. (Note that most students lived in university hostels and used computers on campus.) Other students recorded the type of activities but forgot to include the time taken to complete them.

Table 2: Usability evaluation

Percentage response on each rating, from Excellent (5) to Poor (1):

Content Items                      5     4     3     2     1
Topic Notes                        0    56    44     0     0
Quality of exercise                0    39    44    11     6
Online Case Studies
  1. Conceptual framework          6    44    44     6     0
  2. Design and research method    6    44    39     6     6
  3. Results                       0    39    61     0     6
  4. Discussion                    6    39    39    17     6

Percentage response on each rating, from Very useful (5) to Not useful (1):

Tools                              5     4     3     2     1
Navigation using buttons           6     1    44    22    11
Links                              6     6    28    56     6
Online study help                  0    33    44    17     6
Online discussion page             6    33    44    11     6

 

Table 3: Questionnaire to elicit students’ perceptions of the online class (% Yes / % No / % No response, with comments made)

1. Did you find the exercises prepared you for the online discussion and the group project? (Yes 66 / No 22 / No response 11)
   Comments: Exercises help understanding.

2. Were there sufficient learning activities to help you learn? (Yes 55 / No 38 / No response 5)
   Comments: Not enough time to do those activities; server down; too many activities.

3. Do you think six students in a group project is a good choice? (Yes 83 / No 16 / No response 0)
   Comments: We can compare ideas; prefer 4 instead of 6.

4. Did you discuss the online discussion with your friends (and the group project with your group members) before you posted your message? (Yes 61 / No 33 / No response 5)
   Comments: No time; sometimes; discussed because the server was down; discussion enables me to get feedback from friends.

5. Did you read all the messages posted? (Yes 77 / No 22 / No response 0)
   Comments: I copy the messages and read them in the hostel; can get information from the messages.

6. Did putting the learning materials online motivate discussion between you and your friends? (Yes 55 / No 44 / No response 0)
   Comments: Their comments and contributions lead to more effective learning; it is not the learning materials, it is the technology and facilities that did not motivate us; it is time consuming; we add knowledge when we exchange information.

7. Did you download and print out the learning materials? (Yes 72 / No 27 / No response 0)
   Comments: To understand better; downloaded some materials; makes my life easier because of the server being down; easier for me to refer to.

8. Did you find the organization of the different parts of the system (topic notes, exercises, online discussion pages) clear? (Yes 72 / No 27 / No response 0)
   Comments: I have to turn from page to page; the design of the buttons is not user friendly; the online discussion needs to be improved.

9. Would you recommend these online materials to a friend? (Yes 44 / No 44 / No response 11)
   Comments: It is OK if the computer and technology are working; only if UNIMAS has a good server supplier; takes a long time; the online materials are not user friendly.

10. For you personally, what was the best thing about this online study?
    Comments: Online discussion; need not attend lectures; can discuss with others; do not need to write on paper; the notes and exercises; this new method of study attracts students’ interest; can access anytime if the server is OK.

11. For you personally, what was the worst thing about online study?
    Comments: Power failure; bad server; server down; system failure.

12. Any other comments?
    Comments: Provide a user-friendly interface; make the server better; online study is good, but the server problems made us very stressed; need sufficient time; the online study was too near the semester examination, felt very stressed.

Computer-Generated Feedback

Table 4 presents the conferencing exchanges from 24/02/00 to 02/03/00. With respect to the online case studies, apart from the Conceptual Framework, the other three (Design and Research Method, Results, and Discussion) each had slightly less than 50% of participants contributing. Most messages posted were original, and the facilitator contributed almost all of the replies. The questionnaire (see Table 3, question 5) shows that 77% of the students read all the messages posted. Table 3 also indicates that power and server failures (questions 2, 7, 9, and 11) caused the greatest dissatisfaction among the students. Indirectly, this suggests that students did not engage much in learner-learner interaction because of the difficulty of accessing computers. Nonetheless, students probably benefited from the instructor-learner interaction, in that the replies provided by the lecturer helped them compare and correct their answers. As shown in Figure 3, students (indicated by their student numbers, 2651 and 2436) discussed the ISM survey form, and the lecturer (indicated by bhngu) supplied further information about it. Regarding the group project, students were active in posting and replying to messages (see Figure 4). Had computer access not been a problem (as mentioned above), this active learner-learner interaction might have resulted in a greater total number of messages posted and better performance on the group project.

Table 4: Conferencing exchanges on online case studies and group project (24/02/00 to 02/03/00)

                            Online Case Studies                             Group Project
Conference                  Conceptual  Design and  Results   Discussion    A     B     C
                            framework   Research    (2 Qs)    (2 Qs)
                            (4 Qs)      Method
                                        (4 Qs)
No. of messages posted      83          39          23        19            17    18    4
Online contributors         62          31          17        14            6     6     3
% of online participants    86          43          47        39            100   100   50
Lecturer contribution       21          7           6         5             0     0     0
Original messages           60          27          16        14            2     2     1
Replies                     23          13          7         5             15    16    3

Figure 3:   Online Discussion 

 

Conclusion

Test results indicate that the online group performed slightly better than the face-to-face group on the case studies only. It seems that the online group was not disadvantaged despite frequent interruptions from power and server failures on campus. The positive online learning effect may be due to the different types of interaction involved (learner-content, learner-self, learner-learner and instructor-learner). These various types of learning interaction may foster a more self-oriented and group-oriented learning experience than face-to-face instruction (Maher, 1998). As a consequence, this self-oriented and group-oriented learning experience may promote learning (especially of the case studies) more than face-to-face instruction does.

In contrast, the face-to-face class did not provide an opportunity for every student to interact with the lecturer during the lecture. Due to the nature of a traditional lecture, only some students were given the opportunity to respond to the multiple choice exercises incorporated in the lecture notes. Whereas all students in the online group were encouraged to participate in all the online discussion topics, students in the face-to-face group worked in separate groups, with each group discussing only one tutorial topic.

However, before UNIMAS embarks on offering online courses, more research needs to be done to ensure their quality. First, a more user-friendly interface, in particular one that eliminates the split-attention effect, would enhance the overall design. Second, the use of students’ names (more personalized) instead of student numbers may lead to more active group discussion. Third, a training session on the online discussion forum at the beginning of the semester may better prepare students for serious study later.

Also, a pretest-posttest design would better measure the gain in transfer performance. In addition, an online diary sheet rather than a paper-based record sheet would help the instructor monitor students’ progress more systematically. In this design, only one lecturer attended to students’ online discussion; more staff would be required to cater for a larger group of students.

More importantly, technical problems such as server and power failures suggest an institutional implication for online course servers. There is little point in trying to provide online courses that rely on faulty delivery servers critical to the process. Thus, a backup server is essential to ensure the smooth delivery of online materials.

Acknowledgement

The author thanks Elaine Guat Lien Khoo (glkhoo@fcs.unimas.my) and Roger Harris for helpful comments on an earlier version of this paper.

References

          Benson, R., & Rye, O. (1996). Visual reports by video: An evaluation. Distance Education, 17(1), 117-131.

          Berge, Z. L. (1994). Electronic Discussion Groups. Communication Education, 43(2), 102-111.

          Braggett, E., Retallick, J., Tuovinen, J. E., & Wallace, A. (1995). Distance Education Project NATCAP. Report on the establishment of Telematics delivery systems in one priority cluster area in NSW. 1993-94. Wagga Wagga: Charles Sturt University.

          Brown, K. M. (1996). The role of internal and external factors in the discontinuation of off-campus students. Distance Education, 17(1), 44-71.

          Carr-Chellman, A., & Duchastel, P. (2000). The ideal online course. British Journal of Educational Technology, 31(3), 229-241.

          Catenazzi, N., & Sommaruga, L. (1999). The evaluation of the Hyper Apuntes interactive learning environment. Computers & Education, 32, 35-49.

          Collin, M. (2000). Comparing web, correspondence and lecture versions of a second-year non-major Biology course. British Journal of Educational Technology, 31(1), 21-27.

          Cullip, P. F. (1999). Learning pedagogical grammar. Centre for Language Studies, University Malaysia Sarawak.

          Fischer, G., & Scharff, E. (1998). Learning technologies in support of self-directed learning. Journal of Interactive Media in Education, 98(4). Retrieved February 10, 2000 from the World Wide Web: www-jime.open.ac.uk/98/4.

          Joblie, F. (1999). Computer assisted language learning. Centre for Language Studies, University Malaysia Sarawak.

          Kumari, D. S. (2001). Connecting graduate students to virtual guests through asynchronous discussions – analysis of an experience. Journal of Asynchronous Learning Networks, 5(2).

          Lambert, T., Shepherd, J., Ngu, A., Ho, P., Whale, G., & Geissinger, H. (1996). Bridging the Gap: Computer Science meets Distance Education at UNSW. School of Computer Science and Engineering, University of New South Wales.

          Liu, D., Walter, L., & Brooks, D. (1998). Delivering a chemistry course over the Internet. Journal of Chemical Education, 75(1), 123-125.

          Lockee, B. B., Moore, D. M., & Burton, J. K. (2001). Old concerns with new distance education research. Educause Quarterly, 24(2), 60-62.

          Maher, E. (1998). Does the delivery media impact student learning? A case study. Open Learning, November, 27-32.

          McGill, T. J., Volet, S. E., & Hobbs, V. J. (1997). Studying computer programming externally: Who succeeds? Distance Education, 18(2), 236-256.

          Moore, M. G. (1989). Editorial: three types of interaction. The American Journal of Distance Education, 3(2), 1-6.

          Moore, M. G., & Kearsley, G. (1996). Distance education: A systems view. Belmont: Wadsworth.

          Newton, R., Marcella, R., & Middleton, I. (1998). NetLearning: creation of an online directory of Internet learning resources. British Journal of Educational Technology, 29 (2), 173-176.

          NoorShah, M. S. (2001). Overcoming communication issues (Learning from the experience of PKPG Teaching Practice website in UNIMAS). The Internet and Higher Education, 4(3-4).

          Papert, S. (1980). Mindstorms: Children, Computers and Powerful Ideas. New York: Basic Books.

          Pearson, J. (1999). Electronic networking in initial teacher education: is a virtual faculty of education possible? Computers & Education, 3, 221-238.

          Philips, R., Fairholme, E., & Luca, J. (1998). Using email to improve the experience of students doing a group-based project unit. Retrieved February 21, 2000 from the World Wide Web: http://cowan.edu.au/eddev/98case/phillips.html.

          Resnick, M. (1996). Distributed Constructivism. Proc. International Conference of the Learning Sciences, Chicago, IL.

          Schutte, J. G. Virtual Teaching in Higher Education: The new intellectual superhighway or just another traffic jam? Retrieved February 2001 from http://www.csyb.edu/sociology/virexp.htm

          Smith, R. C., & Taylor, E. F. (1995). Teaching physics online. American Journal of Physics, 63(2), 1090-1095.

          Soo, S.-K., & Bonk, C. J. (1998). Interaction: What does it mean in online distance education? Paper presented at ED-MEDIA & ED-TELECOM 98, Freiburg, Germany.

          Stephenson, S. D. (1997-98). Distance mentoring. Journal of Educational Technology Systems, 26(2), 181-186.

          Sweller, J., Chandler, P., Tierney, P., & Cooper, M. (1990). Cognitive load and selective attention as factors in the structuring of technical materials. Journal of Experimental Psychology: General, 119, 176-192.

          Tarmizi, R. A., & Sweller, J. (1988). Guidance during mathematical problem solving. Journal of Educational Psychology, 80, 424-436.

          Tuovinen, J. E. (1999). Research framework and implications for online multimedia education practice based on cognition research. Paper presented at the Communications and networking in education: Learning in a networked society, Aulanko, Hameenlinna, Finland.

          Vogel, D., Genuchten, M., Lou, D., Van Eekhout, M., Verveen, S., & Adams, T. (2001). Exploratory research on the role of national and professional cultures in a distributed learning project. IEEE Transactions on Professional Communication, June, forthcoming.

          Wang, X. C., Hinn, D. M., & Arvan, L. (2001). Stretching the boundaries: Using ALN to reach on-campus students during an off-campus summer session. Journal of Asynchronous Learning Networks, 5(1).


Appendix 1

 How to write research proposals

 After studying the purpose of writing a research proposal, you should know:

            Why a research proposal is written and to whom the research proposal is targeted.

Learning Activities

Multiple choice exercises

Online discussion (all participate)

Topic Note 1. Purpose of a research proposal

            A research proposal is a plan for a research investigation. The first challenge in writing a research proposal is to propose a study that will contribute to theory and research in a particular field. The second challenge is to demonstrate the feasibility of the research investigation. This depends on judgments about the sufficiency of available resources (time, money) and access to the population of interest.

Research proposals are written for various purposes. Research proposals are written for theses; this enables thesis supervisors to offer suggestions for improving the study. Conducting research requires money; thus, research proposals are written to obtain research grants from government agencies, university research funds, and private agencies. Occasionally, a research proposal describing a project is required to obtain permission to collect data.

 Exercise

Please tick the correct statements:

 The general purpose of any proposal is:

·        To persuade a committee of scholars to fund a research project

·        To implement a program that you would like to launch

·        To convince your supervisor that the research project is feasible in terms of resources (time, sample, etc.)

·        The research project can provide new insight into a particular field of study


Appendix 2:  Diary Entry

 Name of the student: ____________________

How much time do you spend in each of the following activities?

                                                     Day 1   Day 2   Day 3   Day 4   Day 5   Day 6   Day 7
                                                     24/02   25/02   26/02   28/02   29/02   01/03   02/03
Reading topic notes and trying exercises
  from the notes
Taking part in online discussion (e.g., posting,
  reading, and contributing messages)
Taking part in group project (e.g., posting,
  reading, and contributing messages)
Searching the web for materials on writing
  research proposals
Downloading relevant reading material
Looking at the computer screen
Any other activities (please specify):
  1.
  2.

Author Note

Correspondence concerning this paper should be sent to Dr. Bing H. NGU, Faculty of Cognitive Sciences and Human Development, Universiti Malaysia Sarawak, Malaysia. Electronic mail may be sent via Internet to bhngu@fcs.unimas.my


Copyright © 2002. All rights reserved.
Last Updated on 20 June 2002