
Information Technology Journal

Year: 2008 | Volume: 7 | Issue: 4 | Page No.: 667-672
DOI: 10.3923/itj.2008.667.672
Performance Evaluation of Reusable Learning Objects in e-Learning System-A Blended Approach to Make-Learning to Work
M.K. Jayanthi, S.K. Srivatsa and T. Ramesh

Abstract: This study discusses the analysis, design and evaluation of School e-Learn, an interactive e-Learning tool. It covers the usability evaluation in detail: the design of the evaluation, its objectives and respondents, the study itself, the questionnaire questions asked and the basic assumptions, what needed to be evaluated, the execution of the evaluation, its results and conclusions. The evaluation results indicated that the e-Learn system produced more rehearsal from students than traditional teaching and improved their marks. It was easier and more interesting to use, with greater facilities to research and rehearse knowledge. There was a general belief in the blended approach of the e-Learn system, that it did indeed assist knowledge retention; this in itself is an important factor for the students' psyche. Compared with a neutral system, the interactive system held interest longer and was more capable of interacting at the student's own level.


How to cite this article
M.K. Jayanthi, S.K. Srivatsa and T. Ramesh, 2008. Performance Evaluation of Reusable Learning Objects in e-Learning System-A Blended Approach to Make-Learning to Work. Information Technology Journal, 7: 667-672.

Keywords: blended approach, e-Learning, reusable learning objects, techniques and results, case study and usability evaluation

INTRODUCTION

"The dirty little secret of e-Learning is that learner usage rates are dismally low," commented Paula Young of PricewaterhouseCoopers at a recent conference. Behind the hype and the genuine potential of online learning, the reality is that not that much is going on. As a lecturer, if I ask my students "How many of you have started an online course?", a lot of hands go up. "How many of you have completed an online course?" I continue. Very few hands stay up: it was much easier to provide e-Learning materials than to get students to use them. The general move to embrace learning objects is driving all e-Learning vendors to create increasingly modular courses that offer access to ever shorter, ever more specific course units.

The weakness of the classroom, and a key strength of e-Learning, lies in flexibility. e-Learning clearly has the potential to be more effective: you can learn at your own pace, when you want, and if a topic needs only two hours of study, the student need spend only two hours on it. But it is only effective if people actually use it. So let us go on to examples of organizations that have made it work. In the first part, the reusable learning object based School e-Learn system is described and the results of the assessment made by students are presented.

Has e-Learning truly arrived in Tamil Nadu: To answer this and other questions regarding e-Learning in Adi Dravida Welfare (ADW) schools of Tamil Nadu State, India and to study the impact of e-Learning using reusable learning objects, we prepared an e-Learning course for students of Standard I to Standard V, following their syllabus in the Tamil and Mathematics subjects, and we taught it to 150 students. The performance of the students was evaluated based on exams and multiple-choice quizzes. We want to implement e-Learning in village schools because India lives in its villages: about 80% of the population, and of the students, are from villages. Although it is as much a victim of hype as the rest of the New Economy, it cannot be denied that e-Learning is being widely and successfully used in organizations throughout India and particularly in the ADW village schools of Tamil Nadu. That is not to say India has nothing else going for it: the government's commitment to new educational technologies is something the Americans can only dream of. e-Learning has arrived, but its implementation is patchy and understanding of what it is, let alone what it can do for organizations, is limited by misunderstanding and clouded by hype.

Fig. 1: The School e-Learn mathematics course

Reusable learning objects in the e-Learning system: This case study discusses the initial evaluation of the e-Learning system in the schools in which it is used. The e-Learning model is the web-based interactive student e-Learning model shown in Fig. 1. The School e-Learn system, which uses reusable learning objects, was built on top of the ProFlexLearn system that supports interactive e-Learning (Ghaoui and Janvier, 2004). To get everyone to use e-Learning systems in their lifelong learning process, a blended approach, learning through a combination of e-Learning and the classroom, is a must. To make e-Learning work, we developed the School e-Learn system using reusable learning objects and we used a blended approach for e-Teaching.
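Although the paper does not specify the internal format of the School e-Learn objects, a reusable learning object can be pictured as a self-contained bundle of content, metadata and an assessment. The following Python sketch is a hypothetical illustration only; the field names (topic, subject, level, content_html, quiz) are assumptions, not the actual ProFlexLearn data model.

# A minimal sketch (an assumption, not the ProFlexLearn format) of a reusable learning object.
from dataclasses import dataclass, field
from typing import List


@dataclass
class QuizQuestion:
    prompt: str
    choices: List[str]
    answer_index: int              # index of the correct choice


@dataclass
class LearningObject:
    topic: str                     # e.g., "Addition of two-digit numbers"
    subject: str                   # e.g., "Mathematics" or "Tamil"
    level: str                     # e.g., "Standard I" to "Standard V"
    content_html: str              # lesson content shown in the browser
    quiz: List[QuizQuestion] = field(default_factory=list)

    def score(self, responses: List[int]) -> float:
        """Fraction of quiz questions answered correctly."""
        if not self.quiz:
            return 0.0
        correct = sum(1 for q, r in zip(self.quiz, responses)
                      if r == q.answer_index)
        return correct / len(self.quiz)

Because such an object carries its own metadata and assessment, the same unit can in principle be reused across the Tamil and Mathematics courses and across the different standards.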

THE BLENDED APPROACH

The blended approach is how to make e-Learning work. NETg (probably the largest seller of e-Learning materials in the UK) uses a blended approach that includes both the classroom and e-Learning; NETg staff themselves learn through a combination of e-Learning and classroom teaching. One can place learners on a spectrum from dependent learners to independent learners. School has taught us, and still teaches our children, to be dependent learners, expecting to be told when to learn and to be fed with information. Independent learners work out what they want to know and search for the courses and sources of information. For e-Learning to succeed fully we need people to be independent learners. However, Lavinia Hill and Julie Davidson recognized that you cannot wave a magic wand and turn your people into independent learners. So they created a flexible, but structured, approach that worked.

The 5 secrets of making e-Learning work:

Organizational support: Does the learner's organization and manager give a clear message that online learning is encouraged? Do they recognize and support the results of online learning? Is it part of the appraisal and salary review process?
Motivation: This is the key to ensuring learners complete their online courses. Sometimes the motivation is external (e.g., your job depends on it), but the strongest motivation comes from an internal desire to complete a course.
The human touch: E-tutors are central. Only 3% of the population wants to learn online alone (according to research by the Campaign for Learning); the rest of us need human support. Trainers are central to making e-Learning work.
The classroom makes online learning work: The hype of pure online learning has now been replaced with recognition that a 'blended' approach is the most successful, mixing classroom, online learning and online monitoring. Research shows that a classroom introduction, for instance, increases the success of e-Learning.
Dependent or independent learners: We are trained to be dependent learners, expecting to learn when we are told to. For e-Learning to succeed we need to move people towards being independent learners, or to provide a strong structure to make it work.

USABILITY EVALUATION

Usability evaluation assesses usability: the effectiveness, efficiency and satisfaction with which specified users can achieve specified goals in a particular environment (ISO 9241-11, 1998).

Effectiveness: Our School e-Learn system is effective: it achieved the required goals with accuracy and completeness.
Efficiency: It operates quickly and in an organized way, finishing tasks in the least possible time and with as little effort as possible, taking into account the accuracy and completeness of the tasks relative to the resources expended.
Satisfaction: It fulfilled the requirements; the students stated that it is comfortable and acceptable.
Learnability: Even a new student finds that the system makes the concepts easy to understand through play-way teaching (a rough sketch of how these four attributes can be quantified follows this list).
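As a rough illustration of how these four attributes could be quantified, the sketch below follows common ISO 9241-11 practice of measuring completion rate, time on task and Likert-scale satisfaction. The function names and example numbers are assumptions for illustration, not values measured in this study.

# A minimal sketch of ISO 9241-11 style usability measures.
# Function names and the example numbers are illustrative assumptions.
from statistics import mean


def effectiveness(tasks_completed: int, tasks_attempted: int) -> float:
    """Completion rate: accuracy and completeness of goal achievement."""
    return tasks_completed / tasks_attempted


def efficiency(tasks_completed: int, total_time_minutes: float) -> float:
    """Tasks completed per minute of effort expended."""
    return tasks_completed / total_time_minutes


def satisfaction(likert_ratings: list) -> float:
    """Mean of 1-5 Likert satisfaction ratings."""
    return mean(likert_ratings)


# Example usage with made-up numbers for one student session:
print(effectiveness(9, 10))        # 0.9  -> 90% of tasks completed
print(efficiency(9, 30.0))         # 0.3  -> tasks completed per minute
print(satisfaction([5, 4, 5, 4]))  # 4.5  -> average satisfaction rating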

The design prescriptions: Design prescriptions emerge from a "how to do it" approach. Unfortunately, the theory behind these methods, prescriptions, procedures or tools is rarely provided. The prescriptions we had at the beginning of the process were those of Anderson et al. (1993) and Wright and Lickorish (1994).

Case study questions and basic assumptions

The evaluation objectives were as follows. The School e-Learn evaluation was carried out through the interactive multiple-choice question and answer (Q and A) section, which offers some 10 questions for each modular topic.

The questionnaire used in the evaluation comprised the following questions; the note under each question explains what it was intended to capture:

Q 1: What is your sex? [m/f]
Q 2: Which class are you studying in?
Enables level of experience to be related.
Q 3: Did you understand the subject?
This allows each respondent to report on personal feelings specific to the topic.
Q 4: Do you have any experience with computers?
Enables level of experience to be related.
Q 5: Do you consider that e-Learning was easy?
Allows comparative evaluation to be extrapolated and provides some indication of how well the system was received.
Q 6: Is there any part that you consider could be improved?
This question allows each respondent to provide Personal feedback and ideas for improvement.
Q 7: Do you consider that Topics are easy to understand?
Q 8: Were the topics taught to you clearly?
Q 9: Do you consider that the topics are explained clearly compared to the book?
Q 10: Is the blended approach of classroom teaching and e-Learning useful for your learning?
This question allows each respondent to provide personal feedback and ideas for improvement.
Q 11: Are there any other comments that you would like to make?
This question is designed to expand on the system and its use by the respondent with ideas of users.
Q 12: Are the examples useful?
This question allows each respondent to report on general personal feelings whilst also allowing the interviewer free rein to expand on areas not previously specifically covered.
Q 13: Do you consider that the e-Learn materials match your own ideas?
This allows the quality of the results for Learning Styles to be quantified and compared.
Q 14: Was this method of teaching easy to study with?
This rates the feelings of each respondent for definitive comparison.
Q 15: Did the e-Learn adequately reflect your subject?
This allows the respondent to rate the quality of output against expectations and personal understanding of his/her psyche.
Q 16: Overall was the sequencing well presented?
This rates the way the interface sequencing works.
Q 17: How was the overall usage of the blended approach to teaching?
This rates how the blended approach of teaching worked overall.
Q 18: How intuitive did the student find the interface?
Rated observation by the evaluator.
Q 19: Did the student find any problems using the web site?
Comments were made here when the student encountered any difficulties. This picks up any small design errors.
Q 20: Was understanding and comprehension demonstrated?
Rated observation by the evaluator to report on content design.
Q 21: Was the flow seen to be as planned?
Rated observation by the evaluator to report on the sequencing design.
Q 22: Did the student understand the facilities offered?
Rated observation by the evaluator to report on the use of facilities - in some instances the student demonstrated good understanding without using the various facilities; thus non-use or limited use of the facilities did not result in a low rating - as with all ratings, these are subjective to the evaluator.
Q 23: Was the use of screen language correct for the student?
Rated observation by the evaluator to report on the Language Pattern used for the student.
Q 24: How many times was each facility used in the Learning section?
This is the header question for the control topic learning section. Each facility use was scored using the five-bar gate (tally) system. This section allows use of the various facilities to be quantified and extrapolations to be made.
Q 25: Are the topic number, hyperlinks, bibliography, images, pictures, help, FAQs, revision and revision question answers useful to you or not?
The facility-use tallies and answers gathered here can be recorded and summarized as sketched below.
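The sketch below illustrates one way the facility-use tallies of Q 24 and the questionnaire answers could be recorded per student. It is a hypothetical structure, not the software actually used in the evaluation; the facility names and example data are invented.

# A minimal sketch of tallying facility use (Q 24) and questionnaire answers.
# Facility names and the example data are illustrative assumptions.
from collections import Counter


class EvaluationRecord:
    """Collects one student's facility-use tallies and questionnaire answers."""

    def __init__(self, student_id: str):
        self.student_id = student_id
        self.facility_use = Counter()   # e.g., "Hyperlink" -> 7 uses
        self.answers = {}               # question number -> response

    def tally(self, facility: str, times: int = 1) -> None:
        """Record uses of a facility (the five-bar gate tally kept as a count)."""
        self.facility_use[facility] += times

    def answer(self, question_no: int, response) -> None:
        self.answers[question_no] = response


# Example usage with made-up data:
rec = EvaluationRecord("student-001")
rec.tally("Hyperlink", 5)
rec.tally("Revision Question Answers", 3)
rec.answer(5, "yes")        # Q 5: e-Learning was easy
rec.answer(10, 5)           # Q 10: blended approach usefulness, 1-5 scale
print(rec.facility_use.most_common(2))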

THE EVALUATION TECHNIQUES USED

Experimental evaluation: Scientific experimental practice was used to test hypotheses about the user interface of the School e-Learn system.
Survey evaluation: Used to discover subjective information both from the user and about the user's interaction with the School e-Learn system, using questionnaire and/or interview techniques.
Co-operative evaluation: Used to get feedback from users about good/bad points and/or problems encountered while working with a prototype (Macaulay, 1995).
Cognitive walkthrough evaluation: Requires an expert to go through a task or tasks, acting as a pseudo-user with all the knowledge that the user is likely to possess while ignoring any knowledge of the system that the expert may have but the user is unlikely to possess (Faulkner, 2000). It was used to measure the overall performance of our School e-Learn system.

THE USABILITY EVALUATION TECHNIQUES

The following usability evaluation techniques were used in our School e-Learn system during the different stages of the software development life cycle.

Design stage: Wizard of Oz is a method of testing a system before it is built, with the user interacting with the wizard (a person who simulates the system's proposed functionality).
Early prototyping stage: Conformance to guidelines, expert walkthrough, heuristic evaluation and a small pilot study using video.
Advanced prototyping stage: Usually in the user domain; checklists, controlled experiments, co-operative evaluation, focus groups, interviews, questionnaires, software probes, usability metrics and videoing.
Delivery stage: Observation in the workplace.

EVALUATION DESIGNED IN SCHOOL E-LEARN

During the design stage a Wizard of Oz approach was adopted and the potential requirements were developed using co-operative evaluation. At the inception of the implementation, Cognitive Walkthrough Evaluation and Brainstorming were adopted. Observational Evaluation was used to test sections of the development with both computer-literate and non-literate subjects: feedback was noted and incorporated into the design. At all stages the tenets of Force Field Analysis (Lewin, 1946) were used. The Force Field Analysis method is used to get a whole view of all the forces for or against a development (Fig. 2). It helps to identify probable, prioritized and effective changes using the following steps:

Describe goals: Where do we want to be? [Goals/Objectives].
Describe position: Where are we now? [Current position].
List all forces: Two columns [for change | against change].
Score each force: For likely effectiveness [1 weak to 5 strong].
Draw force field analysis diagram: Show scaled forces for and against.
Analyse forces: In particular, look for hidden agendas which will negate action.
Prioritise change: List in practical change order (the scoring and prioritisation steps are sketched below).
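A minimal sketch of the scoring and prioritisation steps is given below. The force descriptions and their 1 to 5 scores are invented examples for illustration; Lewin's method itself prescribes no particular tooling.

# A minimal sketch of Force Field Analysis scoring (Lewin, 1946).
# The force names and scores below are invented examples, not study data.

# Each force: (description, score 1 = weak .. 5 = strong)
forces_for = [("Government support for educational technology", 4),
              ("Student interest in interactive lessons", 5)]
forces_against = [("Limited computer access in village schools", 4),
                  ("Dependent-learner habits", 3)]

total_for = sum(score for _, score in forces_for)
total_against = sum(score for _, score in forces_against)

print(f"For change: {total_for}, against change: {total_against}")

# Prioritise: list the strongest forces first, so change effort targets them.
for desc, score in sorted(forces_for + forces_against,
                          key=lambda f: f[1], reverse=True):
    print(f"{score}: {desc}")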

Once the prototype was sufficiently well developed to present to students, a definitive evaluation form was developed using Cognitive Walkthrough Evaluation. This provided the answers to:

What methods should be used and why?

What methods: Personal observation, questionnaire and interview.
Why?: Keyboard keystrokes to quantify screen facility usage, Likert scales to quantify subjective feedback and user feedback to obtain personal user feelings (summarizing such Likert feedback is sketched below).
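For example, Likert-scale feedback gathered from the questionnaires could be summarized per question as in the sketch below. The five-point scale and the sample responses are assumptions for illustration only.

# A minimal sketch of summarizing Likert-scale feedback per question.
# The 1-5 scale and the sample responses are illustrative assumptions.
from statistics import mean, median

# responses[question] -> list of 1-5 ratings gathered from the questionnaires
responses = {
    "Q 10 blended approach useful": [5, 5, 4, 5, 4],
    "Q 16 sequencing well presented": [4, 3, 4, 5, 4],
}

for question, ratings in responses.items():
    agree = sum(1 for r in ratings if r >= 4)       # ratings of 4 or 5
    print(f"{question}: mean={mean(ratings):.1f}, "
          f"median={median(ratings)}, agreement={agree}/{len(ratings)}")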

Fig. 2: Force field analysis

EVALUATION RESULTS

The initial evaluation results indicate that the School e-Learn interactive system is likely to make a significant improvement to student learning and remembering. The trade-off between the time needed to complete the questionnaires and providing more than a bipolar option was justified by the general consensus that the reports were accurate. Introducing more choices into the questions would require the user to spend substantially more time and yet would not be likely to add significantly to the accuracy of the results. Table 1 presents the results of the assessment of the School e-Learn system using the blended approach. More specifically, e-Learning students have experienced a wide range of quantifiable learning benefits, and the present study with school students shows that:

Time spent in learning: reduced by 70%
Understanding and learning ability: increased by 50%
Overall training costs: reduced by 75%

Table 1: Results of the assessment of school e-Learn system using blended approach

Fig. 3: The assessment results of mathematics course

Exam performance and pass rate: increased by 40%
Absentee rate: reduced by 36%
Motivation and learning interest in subjects: tripled (how such before/after changes are derived is sketched below)
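As a check on how such before/after changes are expressed, the sketch below computes signed percentage changes from baseline and post-study values. The numbers are invented placeholders, chosen only so that the output reproduces some of the percentages quoted above; they are not the measured data behind Table 1.

# A minimal sketch of computing the percentage changes reported above.
# The before/after values are placeholders, not the study's measured data.

def percent_change(before: float, after: float) -> float:
    """Signed percentage change from the traditional-teaching baseline."""
    return (after - before) / before * 100.0


metrics = {
    # metric: (traditional teaching baseline, blended e-Learning result)
    "learning time (hours per topic)": (10.0, 3.0),   # -70%, matching "reduced by 70%"
    "absentee rate (%)":               (25.0, 16.0),  # -36%, matching "reduced by 36%"
    "pass rate (%)":                   (50.0, 70.0),  # +40%, matching "increased by 40%"
}

for name, (before, after) in metrics.items():
    print(f"{name}: {percent_change(before, after):+.0f}%")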

The case study, conducted among 150 students, aimed at finding, through empirical data, the actual meaning of the qualitative attributes mentioned in the prescriptions listed earlier. Based on the questionnaire, the feedback and the multiple-choice question answers, all respondents (100%) opted for the blended approach: it gives more information and understanding, makes it easier to remember and recall information from memory, helps them give the correct answers and gave them better results in the final exam. The blended approach has more learning advantages; the students found it good to learn from, it excited them and they judged it an excellent development: dynamic, easy to use, interactive and data-driven for both students and teachers, giving all the advantages of the technology (Fig. 3).

CONCLUSION

This case study covered the rational way in which the evaluation was designed, built and executed. In summary, the evaluation of the School e-Learn interactive multiple-choice question and answer section indicates that the system does in fact aid memory retention and recall. With a clear deadline, strong motivation, organizational support, online encouragement and fellow staff to work with towards a common goal, they made e-Learning work. All the students passed the exams, with 5% of the students attaining an unprecedented 100% score on one exam. The motivation element is crucial and sometimes it is created externally. When I ask questions about completing e-Learning courses, I often ask those whose hands stay up what enabled them to complete. Compulsion is one way to make e-Learning work: by making the online course a compulsory part of the curriculum, they are hoping to ensure students will find the time to do it. The term e-Learning is just two years old and we are still experimenting with how it works best. However, what is clear is that the blended approach works best and that it takes a lot of hard work and organizational support to get the most from it. Greater government involvement, more emphasis on creative and immersive approaches to learning, more blending of e-Learning with other forms, greater use of learning communities (mainly by southern Tamil Nadu users) and large technology infrastructures, in particular intranets, are required before e-Learning can be widely deployed.

REFERENCES

  • Anderson, A., A. Tolmie, E. McAteer and A. Demissie, 1993. Software style and interaction around the microcomputer. Comput. Educ., 20: 235-250.


  • Faulkner, X., 2000. Usability Engineering. Palgrave.


  • Ghaoui, C. and W. Janvier, 2004. Interactive e-Learning. J. Distance Educ. Technol., 2: 26-35.


  • ISO 9241-11, 1998. Ergonomic requirements for office work with Visual Display Terminals (VDTs). Part 11: Guidance on Usability. International Organization for Standardization, Geneva, Switzerland.


  • Lewin, K., 1946. Force field analysis. http://www.accelteam.com/techniques/force_field/analysis.html.


  • Macaulay, L., 1995. Human-Computer Interaction for Software Designers. International Thomson Computer Press.


  • Wright, P. and A. Lickorish, 1994. Menus and memory load: Navigation strategies in interactive search tasks. Int. J. Man-Mach. Stud., 40: 965-1008.
