Citation Details: Thomas, R.C. and Milligan, C.D. (2004). Putting Teachers in the Loop: Tools for Creating and Customising Simulations. Journal of Interactive Media in Education (Designing and Developing for the Disciplines Special Issue), 2004 (15). ISSN: 1365-893X
Published: 28 Sept. 2004

Putting Teachers in the Loop: Tools for Creating and Customising Simulations

Ruth C. Thomas and Colin D. Milligan

Scottish Centre for Research into Online Learning and Assessment
School of Mathematical and Computing Sciences
Heriot-Watt University
Edinburgh, EH14 4AS

Abstract: When designing learning materials, great emphasis is put on creating a 'definitive resource' - but this focus can often lead to the production of inflexible content which follows a fixed pedagogy and fails to cater to individual learning styles and teaching situations. If this is recognised, tools can be produced that allow the teacher to customise generic components to provide a tailored learning experience supporting different teaching approaches and scenarios and addressing a wider range of learning styles. This paper will relate these ideas to the use of online simulations in science and engineering education. In support of this, the educational benefits of simulations are outlined, followed by a review of research into factors influencing their effective use.

The complex nature of these factors leads to the conclusion that the notion of a 'definitive' simulation interface is a myth. Simulation users must be empowered by tools that allow them to take control of the design process. The range of changes that could be facilitated by giving teachers the tools to alter simulation visualisations is discussed and demonstrated with examples of simulations and online learning materials produced using JeLSIM, a suite of tools for creating and customising educational simulations.

Keywords: Java, e-Learning, simulations, education.

Interactive Demonstration: Java applet based simulations are linked as examples from this paper for which you will need a Java-aware web browser. The tools used to create the simulations are freely available from the JeLSIM (Java e-Learning SIMulations) website.

1 Background

When Bill Gates said "Content is king" [1], he understood that simply moving printed information to the web isn't enough. Sadly, many producers of online learning don't seem to have listened. Instead of exploiting the real benefits of computer based delivery (calculation, sorting, querying, retrieving and presentation) to create interactive environments that engage learners, online learning materials are typified by static content and linear structures and amount to little more than electronic books (e.g. WBEC (2000) [2]). e-Learning materials are still routinely judged by quantitative criteria (how many hours of learning online?) rather than qualitative ones (has the student learning experience been improved?). As the cost of producing 'text and images' (or converting from existing materials) is far less than producing interactive components, this diet of 'text and images' continues to predominate.

There is however a specific need for interactive content in an online setting. Helping the learner to engage with the learning material is vital where access to a tutor may be restricted. By creating content that is task based, or designed to challenge existing understanding, the student is encouraged to adopt a more active role in their learning. No matter how flexible and interactive the learning material, however, the teacher cannot simply give the student a resource and leave them to get on with it. One of the key skills of an educator (wherever they fall on the constructivist-behaviourist spectrum) is in tailoring or customising the material that is available to the student to reflect their understanding of the learner's needs.

Current technological advances, which standardise the exchange and storage of blocks of learning content and provide effective ways of discovering content [3], empower teachers by giving them a greater degree of control over what gets delivered to students. If teachers want to change the material, they may alter the text or replace other components. However, they wouldn't normally be able to alter interactive content such as simulations, as its appearance and functionality are fixed.

This paper examines the possibilities that are created if simulations are made as easy to customise as text or images, allowing simple interactive resources to be tailored for use in a wide range of teaching situations. The paper considers the needs of the learner, but focuses primarily on the needs of the teacher and begins with a review of the educational value of simulations and the factors influencing their effectiveness.

1.1 The Educational Value of Simulations

The term 'simulation' is applied in e-Learning in an increasingly broad manner and is sometimes used synonymously with 'animation'. For the purposes of this paper, however, a simulation is defined as having the following two key features:

  1. There is a computer model of a real or theoretical system that contains information on how the system behaves.
  2. Experimentation can take place, i.e. changing the input to the model affects the output.
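These two defining features can be illustrated with a deliberately trivial sketch: a hypothetical exponential-decay system (not one of the simulations discussed in this paper), where the model encapsulates the behaviour and changing an input changes the output.

```java
// Feature 1: a computer model containing information on how the system behaves.
// Feature 2: experimentation -- changing an input affects the output.
public class DecayModel {
    // Inputs the experimenter may change.
    public double initialAmount = 100.0;
    public double decayRate = 0.5;   // per unit time

    // Output computed by the model from the current inputs.
    public double amountAt(double time) {
        return initialAmount * Math.exp(-decayRate * time);
    }
}
```

Altering `decayRate` and re-reading `amountAt` is, in miniature, exactly the experimentation the definition requires.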

As numerical models of systems, presented for learners to manipulate and explore, simulations can provide a rich learning experience for the student. They can be a powerful resource for teaching: providing access to environments which may otherwise be too dangerous, or impractical due to size or time constraints; and facilitating visualisation of dynamic or complex behaviour.

Educationally, simulations have a unique role in supporting learning as they allow learners to directly manipulate a system and to observe the effect of the change, providing a form of intrinsic feedback. This interaction between the learner and learning material allows students to develop a feel for the relationship between the underlying factors governing a system, promotes an appreciation of appropriate ranges for system parameters, and gives a qualitative feel for the system before the introduction of theory (Thomas and Neilson, 1995). Simulations can be used as cognitive tools, allowing students to manipulate parameters to test hypotheses and to try out 'what if' scenarios without real consequence or risk, in a time frame which is convenient and manageable to them; in doing so they enable the learner to ground their cognitive understanding of their actions in a situation (Laurillard, 1993).

A constructivist view of learning (e.g. Jonassen, 1994; Duffy and Cunningham, 1996; Wilson, 1997) encourages educators to recognise their students' strongly held preconceptions and knowledge, and to provide learners with experiences that will help them revise and build on their current understanding of the world. The student should not be a passive participant, but should actively engage in the experience, which should allow exploration and encourage reflection. Clearly, simulations have the potential to form an important component of a constructivist learning environment (Jonassen, 2003). They have a central role in scientific discovery learning (SDL) (van Joolingen and de Jong, 1997), which is characterised by learners discovering concepts for themselves by designing and performing scientific experiments. Techniques like "Prediction-Observation-Explanation" (POE) can be used with simulations to challenge learners' alternative conceptions (e.g. White and Gunstone, 1992; Jimoyiannis, 2001).

Simulations can be used to provide realistic problems and scenarios. Advocates of situated learning (Winn, 1993; Brown, Collins and Duguid, 1989) believe it is easier for learners to apply new concepts if they are acquired whilst undertaking authentic tasks in real world situations or representations of them. Schank (1995), a proponent of case-based reasoning, believes that learning by doing in meaningful situations is an important component of education.

Simulations have the advantage over other media that they can bring both reality and interactivity to eLearning. They provide a form of feedback that facilitates exploration in a manner that can mimic scientific method, allowing students to explore and build their own understanding.

2 Factors Affecting Simulation Design

Simulations clearly have educational potential. However, research shows that the educational benefits of simulations are not automatically gained and that care must be taken in many aspects of simulation design and presentation. It is not sufficient to provide learners with simulations and expect them to engage with the subject matter and build their own understanding by exploring, devising and testing hypotheses. Rieber (1996) cautions designers of interactive learning environments "not to assume that explicit understanding will follow even if users are successful at completing a task". Learners must be guided and supported in their use of simulations.

Educators may wish to focus the learner's use of the simulation by setting the scene, providing objectives, directions, context, tasks and problems. The need for additional support, in the form of guidance, feedback and scaffolding, has been recognised for some time (Thomas and Neilson, 1995; Pilkington and Grierson, 1996). From within a simulation, it is possible to provide feedback and guidance in the form of hints, corrective feedback, tips on rollover, highlighting or the addition of elements to augment reality (de Jong and van Joolingen, 1998). Feedback is important, but not all teachers will agree on how much to provide in a given subject; for example, providing too much feedback wouldn't support constructivist goals.

Support for simulation use can come from human experts (teachers, coaches, guides) or from peers, as well as from electronic help and guidance mechanisms. Experts can provide considerable support during simulation use, but when they are not present, support provision can attempt to duplicate their role (e.g. Acovelli and Gamble, 1997). Two techniques used in this area are coaching and scaffolding. Coaching involves monitoring and regulating the learner's performance, provoking reflection and perturbing the learner's models (Jonassen, 1999). Scaffolding involves manipulating the task to supplant the student's ability to perform it, either by changing the nature or difficulty of the task or by imposing the use of cognitive tools. Some support and guidance can also be provided by linking the simulation with other multimedia resources, which provide answers to frequently asked questions; but within typical simulation software, feedback related to the student's immediate need or current task, or assessing their performance, is often limited or absent.

It has been noted that learning with simulations in a discovery learning environment is often not as effective as expected (Lee, 1999; de Jong and van Joolingen, 1998). A particular problem seems to be the difficulty students have in generating hypotheses (Shute and Glaser, 1990; Njoo and de Jong, 1993). Quinn and Alessi (1994) suggest that "Prior understanding of the scientific method may be necessary to learn from simulations". Howe and Tolmie (1998) demonstrated the benefits of contingent prompting in hypothesis formation.

It is important that students engage with the underlying simulation model, not just with the user interface. As Davies (2002) points out, interactivity is not synonymous with engagement. Pilkington and Parker-Brown (1996) noted the tendency for students to concentrate on manipulating objects without generating a deeper understanding of the model or the principles behind the observed behaviour. Laurillard (1993) draws a distinction between qualitative reasoning, incorporating knowledge of real world objects, and quantitative reasoning, referring only to quantities and processes explicitly presented on the screen, and suggests that the more interpretive approach of qualitative reasoning must be encouraged. One method of doing this is to allow students to construct the models themselves, for example using systems dynamics software such as STELLA [4] or PowerSim [5], though Alessi (2000) is of the opinion that for most educational purposes such models should be enhanced with an instructional overlay. Alessi also points out that, depending on the instructional paradigm adopted, it may be advantageous to expose the model (glass-box), e.g. in expository learning, or to hide the model (black-box), e.g. in discovery learning where students must discover it for themselves. Thus, educators may wish to control the degree of access the learner has to the internal model.
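As a rough illustration of what a learner-built systems dynamics model amounts to, the following is a hypothetical stock-and-flow sketch using simple Euler integration; it is illustrative only and is not STELLA or PowerSim code.

```java
// A minimal stock-and-flow model: a single stock (e.g. a population)
// updated at each time step by a flow that depends on the stock itself.
public class StockFlow {
    // Advance the stock over 'steps' Euler steps of size 'dt'.
    public static double simulate(double stock, double birthRate,
                                  double dt, int steps) {
        for (int i = 0; i < steps; i++) {
            double inflow = birthRate * stock; // flow proportional to stock
            stock += inflow * dt;              // Euler update
        }
        return stock;
    }
}
```

Exposing this loop to the learner would be a glass-box presentation; hiding it and showing only inputs and outputs would be black-box, in Alessi's terms.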

Allen, Hays and Buffardi (1986) suggest that physical fidelity (look and feel) is more important in cases of development of skills which involve little or no cognitive effort in their execution, while functional fidelity (realistic cause-effect relationships) is more important in tasks that depend on deeper cognitive processing of task information. However, providing an authentic experience by maximising the fidelity with which it duplicates the real world does not always engender learning. Alessi (1988) suggested that increased fidelity could inhibit initial learning for novices by overwhelming them with details that they cannot assimilate, but conversely decreasing fidelity may engender learning that is not transferable to the real world. He proposed a solution of dynamic fidelity that is determined by instructional effectiveness and changes based on learner performance. Examples of functional fidelity that can be introduced to students as they progress are experimental error and malfunctioning equipment, e.g. Magin and Reizes (1990) and Thomas and Milligan (2003). Again, the educator could benefit greatly from having control over the fidelity of the simulation.

Clearly, there is no 'right' way to present a topic using a simulation. Learners of different age, ability and level require different approaches in their use of simulations. Teachers may wish to adopt different instructional strategies. Decisions about the type of task, the type of support and how to ensure engagement will be reflected in the visualisation presented to the learner. Ideally, these decisions should be made by educators, as it is they who have the subject expertise, teaching experience, and knowledge of typical student misconceptions. If a teacher finds a particular approach does not work, he or she may well wish to modify a simulation to remedy this. Consideration of these points leads to the conclusion that a single fixed interface designed by a non-teacher is never going to be the optimal solution.

3 Modification of Simulations

Simulations generally require programming skills to produce and, as a result, are expensive to create and can be difficult to modify. In the past, therefore, teachers have been effectively disenfranchised from involvement in simulation design. A teacher may wish to alter any of the features relating to the factors identified in the previous section. The following section looks at the specific reasons why teachers may wish to modify visualisations for a given model. The modifications are considered in three categories: simple presentational issues, the scenario addressed by the visualisation, and the instructional overlay applied to that scenario.

1. Presentational issues.

  • Learning Styles and Preferences: Learners have different learning styles, preferences and abilities. Different media, forms of internal feedback in the simulation (Rieber, 1996), graphical user interface metaphors (Cates, 2002) or multiple representations (Ainsworth, 1999) can be utilised to suit specific needs.
  • Aesthetics: By altering the look and feel of an interface, changing colour or fonts, adding logos and images, a teacher can personalise an interface and gain a sense of ownership of the resource. Learners, too, are more at ease with materials with a familiar 'look and feel'.
  • Minor Details: Sometimes teachers may wish to make minor changes to the interface, for example to naming conventions, units and symbols appropriate to the curriculum they are teaching. Minor changes can counter a 'not invented here' attitude, and aid localisation.

2. Simulation scenario:

  • Re-using the model: There are many examples of situations where the same underlying mathematical model can be used to describe different phenomena in different disciplines, (e.g. feedback control theory, exponential growth). Tailored visualisations of a general purpose model can provide resources in a range of disciplines.
  • Scenario Focus: Teachers may wish to modify a scenario to focus the student on a particular aspect of the model, or to provide a different task or problem to teach the same subject in different disciplines (e.g. chemistry for biologists). This can be facilitated by changing the input variables the student is allowed to modify or by providing a different default starting condition for the simulation. Teachers may also wish to set up scenarios that stimulate learners to test 'what if' questions in an attempt to understand models and processes at a deeper level.
  • Student expertise: The same model may be of use at different educational levels within the same subject. Dynamic fidelity, mentioned in the previous section, can be used to change the interface to suit the learner's needs. The interface can be simplified for novice users, allowing them access to a limited number of the model variables, or tailored to their needs and skills, for example by limiting the use of technical terms for the novice or avoiding the use of graphs and numbers for the less numerate learner. Introducing the form of a system before discussing the complexity of the model is of general use when introducing students to any new topic.
  • Exposure of the model: As highlighted in the previous section, depending on the educational aims, the teacher or designer may wish to vary the degree to which the learner is allowed to view and even manipulate the underlying model.

3. Instructional Overlay:

  • Educational context: By altering the instructional overlay a simulation can be re-used within a variety of contexts, i.e. in lectures, as preparation for a real laboratory exercise or as a computer-based laboratory, for tutorials, as a resource in problem-based learning environment, as a game, for self-study or in assessment.
  • Educational Approach: As can be seen from sections 1.1 and 2, there is a spectrum of educational uses for simulations, from open-ended discovery-based learning through POE to more directed, expository learning. Teachers would benefit greatly from the ability to produce their own visualisation of a pedagogically neutral simulation.
  • Integration: Teachers may wish to integrate simulations into other learning material, in which case they will need to contextualize it and provide instructions for the simulation use in the surrounding material. In a constructivist or problem based learning scenario, the simulation can be made available as one of a range of resources relevant to the domain being investigated.
  • Modifying the learner support: As discussed in sections 1.1 and 2, different support mechanisms are required when teachers are present in a classroom, when students are learning with others and have an opportunity to discuss and negotiate their understanding, and when students are using simulations for self-study. The ability to change the feedback style, i.e. the degree to which it is natural vs. artificial or immediate vs. delayed (Alessi and Trollip, 2001), and the actual feedback wording would be of great benefit in tailoring resources.

A system that empowers the teacher to create and modify simulations benefits learners too. By providing learners with multiple representations, different forms of feedback and choice in data presentation, learners can be empowered to choose the style of visualisation that best suits them. More ambitiously, learners could take over the role of teachers and construct interfaces to learn by teaching (Ploetzner, Dillenbourg, Praier and Traum, 1999).

4 Tools as an Answer

The JeLSIM toolkit is a suite of software written to support easy creation and customisation of simulations. It provides a toolkit for the rapid creation of Java applets through a visual interface not unlike a traditional drawing package. The simulations produced by the toolkit are small to medium desktop applications usable as components within an eLearning module or problem solving environment. The underlying model can be any type (e.g. continuous, discrete or logical). The key feature of the tools is that they separate the model from the visualisation. The JeLSIM tools were originally developed as the MultiVerse toolkit with funding from the JISC Technologies Application Program (JTAP).

Following Reigeluth and Schwartz (1989), the JeLSIM toolkit makes a distinction between the computer model and the visualisation. The following is a typical scenario for simulation production. A model is written by a Java programmer based on a design specification agreed after consultation with a design team (consisting variously of subject domain experts, teachers, educational designers and graphic designers). In producing the design specification, the design team broadens the problem out to consider more than just the immediate requirements. The aim is that the model produced will be re-usable for a wide range of scenarios and levels; thus even though the immediate need may require a low fidelity interface where much of the model complexity can be hidden, a model will generally be designed to maximise its future re-usability. The design of the JeLSIM toolkit means that even novice programmers can create effective models because the Java code describes only the algorithm and not the user interface. The intention is that once written, the model code doesn't change and the programmer's involvement ends. The model code is then loaded into the JeLSIM tools and provides a template for all visualisations of the model.
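A model written for this scenario might look like the following sketch. The class and property names here are purely illustrative and do not reflect the actual JeLSIM API, but the key point holds: the code describes only the algorithm, never the user interface.

```java
// A hypothetical model in the JeLSIM style: named input and output
// properties plus an algorithm, with no UI code whatsoever. A
// visualisation author decides which properties to expose and how.
public class ProjectileModel {
    // Input properties a visualisation might expose as controls.
    public double speed = 20.0;    // launch speed, m/s
    public double angleDeg = 45.0; // launch angle, degrees

    // Output property a visualisation might display.
    public double range;           // horizontal range, metres

    private static final double G = 9.81; // gravitational acceleration

    // The algorithm: all a model author needs to write.
    public void run() {
        double a = Math.toRadians(angleDeg);
        range = speed * speed * Math.sin(2 * a) / G;
    }
}
```

A novice visualisation might expose only `angleDeg`; an advanced one could expose both inputs and plot `range` on a graph, all from this one unchanged model.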

A visualisation is created by a teacher or instructional designer. The tools themselves present a visual environment for the creation of new visualisations, rather like a computer drawing package. A set of common visualisation objects (graphs, tables, digital inputs and outputs, etc.) is provided, which the visualisation author can use to display the value of any of the input and output properties of the simulation. A typical interface would consist of a number of input parameters and visualisation objects to illustrate the outputs. Once finished, the simulation is easily deployed (as a Java applet embedded in a web page, or even an IMS Content Package).

As the simulations are created as Java applets, they are easily integrated into existing web based learning materials. The context for the simulation can then be provided by altering the web content in the surrounding pages. The rationale is summarised in figure 1, below: one model written in Java becomes the basis for many visualisations, created in the JeLSIM tools and embedded in web pages.

Figure 1: The Separation of Model and Visualisation

Over time, a library of models will be generated which can be used by others. At present around forty models are available.

4.1 Examples

The separation of model and visualisation outlined above allows the JeLSIM tools to support many of the types of customisation identified in section 3. This section provides examples of work where a series of individual models have been customised to produce a diverse range of resources. Three examples are examined: in chemical kinetics, solar geometry and entrepreneurship.

4.1.1 Solar Geometry

The solar geometry model calculates the location of the sun in the sky and the angle at which it would shine on a building, given a time, date and location on the earth's surface. A simulation of this model may be relevant to young children being taught about the seasons or the solar system, or to architects who need precise calculations to help them predict the incidence of sunlight on buildings. Clearly the same visualisation would not be appropriate for both sets of users but as the examples show, a single model can be used to provide alternate interfaces suited to the needs of these different groups. A number of interfaces have been constructed for this; they are listed below together with some of the customisation features they demonstrate:

Calculation interface: For a building engineer, the model can be used purely as a calculation tool; the interface assumes prior knowledge of the subject, uses technical terms and provides numeric output. The calculation interface can be viewed and explored online [6]. A screenshot is provided for illustration (Figure 2).

Figure 2: Calculation Interface

Primary school interface: In contrast, in the second example, to cope with the different learning styles required for primary school use, minimal text is employed, numbers are not used and the sun's location is represented by an image of the sun that moves through the sky over the period of a day. The visualisation is still generic, in that a teacher can use it as a tool for exploration of a number of questions, e.g. the variation of sunrise and sunset through the seasons, or what happens at the equator or the North Pole over the year. The default latitude and longitude as well as the images used for the scenery can be changed by teachers to mimic the local area. The primary school interface can be viewed and explored online [7]. A screenshot is provided for illustration (Figure 3).

Figure 3: Primary School Interface

Exploratory interface: The third example has been devised for more advanced learners who can cope with graphs. Here an exploratory interface that provides access to all variables has been provided. From this interface, all aspects of the simulation can be explored. No tasks are set or guidance provided in the interface; it is anticipated that these would be provided externally, either by the teacher or in surrounding web-based material. The exploratory interface can be viewed and explored online [8]. A screenshot is provided for illustration (Figure 4).

Figure 4: Exploratory Interface

More details of the solar geometry model are available [9].
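The kind of calculation a solar geometry model performs can be sketched using standard solar-position approximations (Cooper's declination formula combined with the usual elevation equation). This is illustrative textbook material, not the actual JeLSIM model code.

```java
// Standard approximations for solar position, the sort of algorithm
// a solar geometry model encapsulates.
public class SolarGeometry {
    // Solar declination in degrees for a day of the year (Cooper's formula).
    public static double declination(int dayOfYear) {
        return 23.45 * Math.sin(Math.toRadians(360.0 * (284 + dayOfYear) / 365.0));
    }

    // Solar elevation in degrees for a latitude, day of year and
    // local solar time (hours, 12.0 = solar noon).
    public static double elevation(double latitudeDeg, int dayOfYear,
                                   double solarTimeHours) {
        double decl = Math.toRadians(declination(dayOfYear));
        double lat = Math.toRadians(latitudeDeg);
        double hourAngle = Math.toRadians(15.0 * (solarTimeHours - 12.0));
        double sinEl = Math.sin(decl) * Math.sin(lat)
                     + Math.cos(decl) * Math.cos(lat) * Math.cos(hourAngle);
        return Math.toDegrees(Math.asin(sinEl));
    }
}
```

The same two methods serve every visualisation: the calculation interface would display their numeric output directly, while the primary school interface would use the elevation only to position the sun image in the sky.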

4.1.2 Chemical Kinetics

The chemical kinetics model explores the rate at which chemical reactions occur at different concentrations and temperatures. The original remit of the project was to simulate chemical kinetics practical experiments suitable for students ranging from Scottish Higher or equivalent through to 2nd year university level. The underlying model is generic (i.e. it can handle any chemical reaction) and can be used in teaching contexts other than practical experiments.
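The standard rate laws such a model is built on can be sketched as follows. This is textbook chemistry (the Arrhenius equation and a general rate expression), not the project's actual model code.

```java
// Textbook chemical kinetics: Arrhenius temperature dependence
// combined with reaction orders in the reactant concentrations.
public class ReactionRate {
    static final double R = 8.314; // gas constant, J/(mol K)

    // Rate constant k = A * exp(-Ea / (R * T)).
    public static double rateConstant(double preExpA, double eaJPerMol,
                                      double tempK) {
        return preExpA * Math.exp(-eaJPerMol / (R * tempK));
    }

    // Rate = k * [A]^m * [B]^n for a reaction of order m in A and n in B.
    public static double rate(double k, double concA, int orderA,
                              double concB, int orderB) {
        return k * Math.pow(concA, orderA) * Math.pow(concB, orderB);
    }
}
```

Because the orders and Arrhenius parameters are just inputs, one such model can represent any simple reaction, which is what makes the generic re-use described above possible.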

The exploratory interface: This interface exposes all the variables for the learner to investigate. On its own, it could appeal to a highly motivated student who learns best by attempting to understand the whole system. It can be used to compare output from two different sets of inputs and is designed to be used as a series of controlled experiments to allow the student to build up a picture of the relationship of different system variables. By providing visualisations where exploration is more constrained, a more directed style of learning, which investigates the effects of changing one variable at a time, can be produced. The exploratory interface can be viewed and explored online [10]. A screenshot is provided for illustration (Figure 5).

Figure 5: Chemical Kinetics Exploratory Interface

Practical experiments: A novel approach was taken in producing visualisations of practical experiments in chemical kinetics. The idea was to move towards authentic practicals: rather than making interfaces that mimic the dexterity required in practical experiments, the aim was to duplicate the thought processes involved. The chemical reaction simulated by the model occurs at the same speed as in the real world. Students, not the computer, take and analyse the readings. Students can make measurement errors and analyse data incorrectly, just as in the real world. The work was funded by the Scottish Qualifications Authority (SQA [11]) and the University of Cambridge Local Examination Syndicate (UCLES [12]). Different interfaces were required to suit the two sponsors. Localisation issues such as differing conventions for chemical naming and units were easily dealt with using the tools. The generic nature of the model means that a wide range of chemical reactions can be simulated and the model re-used for other experiments. The whole chemistry resource developed for SQA and UCLES is available [13], with documentation. A representative practical experiment interface can be viewed and explored online [14] along with a completed analysis [15]. A screenshot is provided for illustration (Figure 6).

Figure 6: Practical Experiment Interface

4.1.3 Entrepreneurship

The model used in this example was designed to assist those starting a small business in the production of a business plan. A single model covers cash flow projections, breakeven analysis and trading forecasts. The same model is utilised to provide all the interactive content in the business start up course. The visualisations begin with a simplified view to allow the learner to explore the principles. They then take the learner through to a realistic case study of a cash flow and trading forecast which can be explored either through a number of pre-defined 'what if' scenarios or completely openly. Additional case studies that show the typical difficulties in setting up businesses of other types can be produced from the same model. The course can be viewed and individual simulations explored online [16]. A screenshot is provided for illustration (Figure 7).

Figure 7: Entrepreneurship Interface
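The breakeven component of such a business-planning model reduces to simple arithmetic; the following is a minimal sketch, illustrative only and not the course's actual model.

```java
// Breakeven analysis: the number of units that must be sold before
// the contribution per unit (price minus variable cost) covers
// the fixed costs.
public class Breakeven {
    public static double breakevenUnits(double fixedCosts, double unitPrice,
                                        double unitVariableCost) {
        double contribution = unitPrice - unitVariableCost;
        if (contribution <= 0) {
            throw new IllegalArgumentException("price must exceed variable cost");
        }
        return fixedCosts / contribution;
    }
}
```

In a visualisation, exposing price and costs as inputs and plotting breakeven units as an output gives learners exactly the 'what if' exploration the case studies describe.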

These examples do not represent definitive answers to teaching a particular subject; they are illustrative suggestions. Any one of them can be tailored to suit individual teaching preferences.

5 The Future

It has been shown that teachers can have considerable control over the ways in which they use simulations in their teaching. However, further research and development is required. The system is being actively developed:

  • Prototype systems exist demonstrating simple model building and collaborative simulation use;
  • A system is being developed to allow use of simulations within computer aided assessment systems.

5.1 Learner Support

It is one thing to provide support and guidance in anticipation of the learner's needs; it is quite another to provide it contextualised and on demand. An ultimate aim must be to duplicate the role of teacher or other expert for the remote learner (Thomas and Milligan, 2003). Further research is required into mechanisms for providing context sensitive feedback, support, scaffolding and contingent tutoring as the learner uses a simulation.

Another way of providing support to the isolated learner is to facilitate communication with other learners and with experts. In terms of simulations, this doesn't just mean providing text based chat or email facilities to allow learners to ask questions of an expert, but collaborative access to the simulation to allow teachers to understand the context of a student's question or even to take control to demonstrate a point. Such functionality would be valuable to students as well as experts. An early prototype of a simple collaborative system exists.

5.2 Learner Control

Developments that allow learners greater control over their learning are also desirable. Providing learners with a choice of visualisation styles and objects to suit their personal needs and abilities would improve learners' choice and autonomy and could even be tied to a learner profile. Techniques that allow learners to annotate and save simulation states or save a learning history (Parush, Hamm and Shtub, 2002) are of benefit. Mechanisms for recording and replaying activities in a simulation for post task analysis or to learn from others are also desirable. IMS specifications currently in development explore the possibility of standardising the functionality offered by virtual learning environments to facilitate such support. Work is ongoing in this area and a 'Use Case' submitted to IMS illustrates a possible solution [17].

5.3 Assessment

Computer Aided Assessment is under pressure to become more than a method for testing knowledge through simple question types such as multiple choice; new question types are needed that test higher-order skills. Using simulations within assessment engines to pose questions and record answers opens up a range of new question types, and may also provide context-sensitive feedback and support (Thomas and Milligan, 2003). Simulations offer an ideal opportunity for authentic assessment: students are assessed in the same environment in which they learn, whilst performing a meaningful task, giving a truer measure of a student's potential. Research and development work on the integration of simulation and assessment technology is currently being undertaken by the authors.
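
One simple form such a simulation-based question could take is to ask the learner to manipulate the simulation until it reaches a target condition, and then score the final recorded state against target values within a tolerance. The Java sketch below illustrates this scoring idea under those assumptions; it is not the interface of any particular assessment engine.

```java
import java.util.Map;

// Scores a simulation-based question: the learner's final simulation
// state is compared against target variable values within a tolerance.
class SimulationQuestion {
    private final Map<String, Double> targets;
    private final double tolerance;

    SimulationQuestion(Map<String, Double> targets, double tolerance) {
        this.targets = targets;
        this.tolerance = tolerance;
    }

    // Score is the fraction of target variables the learner set
    // within tolerance, giving partial credit for partial success.
    double score(Map<String, Double> finalState) {
        long correct = targets.entrySet().stream()
            .filter(e -> finalState.containsKey(e.getKey())
                && Math.abs(finalState.get(e.getKey()) - e.getValue()) <= tolerance)
            .count();
        return (double) correct / targets.size();
    }
}
```

Because the score is computed from the same state the learner explored while learning, the assessment remains authentic in the sense described above: the task, not an abstraction of it, is what is measured.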

5.4 Model Building

One area where the teacher cannot currently participate in controlling simulation resources within the JeLSIM toolkit is the construction of the simulation model itself. Prototype tools exist for the construction of simple simulations such as action mazes. However, providing non-programmers with the capability to construct more mathematically complex models, in an environment in which they can also control the visualisation and the instructional overlay, would meet some of the needs for simulation environments identified by workers such as Alessi (2000).
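
A model-building tool for non-programmers would need some declarative representation of a model: for example, each variable paired with an update rule evaluated at every time step, which an authoring tool could assemble from teacher-entered expressions. The Java sketch below shows this representation for a simple stepped model; it is a hypothetical illustration, not a JeLSIM facility.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.ToDoubleFunction;

// A declarative model: each named variable has an update rule that maps
// the current state to its next value. An authoring tool could build
// such rules from expressions typed by a teacher, rather than from code.
class SimpleModel {
    private final Map<String, ToDoubleFunction<Map<String, Double>>> rules = new LinkedHashMap<>();
    private Map<String, Double> state = new LinkedHashMap<>();

    void setVariable(String name, double value) {
        state.put(name, value);
    }

    void addRule(String name, ToDoubleFunction<Map<String, Double>> rule) {
        rules.put(name, rule);
    }

    // One time step: evaluate every rule against the old state, then
    // commit, so all updates see a consistent snapshot.
    void step() {
        Map<String, Double> next = new LinkedHashMap<>(state);
        for (Map.Entry<String, ToDoubleFunction<Map<String, Double>>> e : rules.entrySet()) {
            next.put(e.getKey(), e.getValue().applyAsDouble(state));
        }
        state = next;
    }

    double get(String name) {
        return state.get(name);
    }
}
```

For instance, a first-order decay could be entered as the rule "c loses 10% per step"; the tool, not the teacher, would translate that into the update function. Separating the model representation from the visualisation in this way is what would let the teacher control both independently.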


References

Acovelli, M. & Gamble, M. (1997). "A coaching agent for learners using multimedia simulations." Educational Technology 37(2): 44-48.

Ainsworth, S. (1999). "The functions of multiple representations." Computers and Education 33: 131-152.

Alessi, S. M. (1998). "Fidelity in the design of instructional simulations." Journal of Computer-Based Instruction 15(2): 40-47.

Alessi, S. M. (2000). "Designing educational support in system-dynamics-based interactive learning environments." Simulation & Gaming 31(2): 178-196.

Alessi, S. M. & Trollip, S. R. (2001). Multimedia for Learning: Methods and Development (3rd Edition). Allyn and Bacon.

Allen, J. A., Hays, R. T. & Buffardi, L. C. (1986). "Maintenance training simulator fidelity and individual differences in transfer of training." Human Factors 28(5): 497-509.

Brown, J. S., Collins, A. & Duguid, P. (1989). "Situated cognition and the culture of learning." Educational Researcher 18(1): 32-42.

Cates, W. M. (2002). "Systematic selection and implementation of graphical user interface metaphors." Computers and Education 38: 385-397.

Davies, C. H. J. (2002). "Student engagement with simulations." Computers and Education 39: 271-282.

Duffy, T. M. & Cunningham, D. J. (1996). "Constructivism: Implications for design and delivery of instruction." In D. H. Jonassen (Ed.), Handbook of Research for Educational Communications and Technology. New York: Simon and Schuster: 170-198.

Howe, C. & Tolmie, A. (1998). "Computer support for learning in collaborative contexts: prompted hypothesis testing in physics." Computers and Education 30(3/4): 223-235.

Jimoyiannis, A. & Komis, V. (2001). "Computer simulations in physics teaching and learning: a case study on students' understanding of trajectory motion." Computers and Education 36: 183-204.

Jonassen, D. H. (1994). "Thinking technology: Toward a constructivist design model." Educational Technology Research and Development 34(4): 34-37.

Jonassen, D. H. (1999). "Designing constructivist learning environments." In C. M. Reigeluth (Ed.), Instructional-Design Theories and Models, Volume II: A New Paradigm of Instructional Theory. Lawrence Erlbaum Associates.

Jonassen, D. H., Howland, J., Moore, J. & Marra, R. M. (2003). Learning to Solve Problems with Technology: A Constructivist Perspective. Pearson Education Inc.

de Jong, T. & van Joolingen, W. R. (1998). "Scientific discovery learning with computer simulations of conceptual domains." Review of Educational Research 68(2): 179-201.

van Joolingen, W. R. & de Jong, T. (1997). "An extended dual search space model of scientific discovery learning." Instructional Science 25(5): 307-346.

Laurillard, D. (1993). Rethinking University Teaching: A Framework for the Effective Use of Educational Technology. Routledge.

Lee, J. (1999). "Effectiveness of computer-based instructional simulation: a meta analysis." International Journal of Instructional Media 26(1): 71-85.

Magin, D. & Reizes, J. (1990). "Computer simulation of laboratory experiments: the unexplored potential." Computers and Education 14(3): 263-270.

Njoo, M. & de Jong, T. (1993). "Exploratory learning with a computer simulation for control theory: learning processes and instructional support." Journal of Research in Science Teaching 30(8): 821-844.

Parush, A., Hamm, H. & Shtub, A. (2002). "Learning histories in simulation-based teaching: the effects on self learning and transfer." Computers and Education 39: 319-332.

Pilkington, R. & Grierson, A. (1996). "Generating explanations in a simulation-based environment." International Journal of Human-Computer Studies 45: 527-551.

Pilkington, R. & Parker-Jones, C. (1996). "Interacting with computer-based simulation: The role of dialogue." Computers and Education 27(1): 1-14.

Ploetzner, R., Dillenbourg, P., Praier, M. & Traum, D. (1999). "Learning by explaining to oneself and to others." In P. Dillenbourg (Ed.), Collaborative Learning: Cognitive and Computational Approaches. Oxford: Elsevier: 103-121.

Quinn, J. & Alessi, S. (1994). "The effects of simulation complexity and hypothesis-generation strategy on learning." Journal of Research on Computing in Education 27(1): 75-91.

Reigeluth, C. & Schwartz, E. (1989). "An instructional theory for the design of computer-based simulation." Journal of Computer-Based Instruction 16(1): 1-10.

Rieber, L. P. (1996). "Animation as feedback in a computer-based simulation: Representation matters." Educational Technology Research and Development 44(1): 5-22.

Schank, R. & Cleary, C. (1995). Engines for Education. Lawrence Erlbaum Associates.

Schank, R. C., Berman, T. R. & Macpherson, K. A. (1999). "Learning by doing." In C. M. Reigeluth (Ed.), Instructional-Design Theories and Models, Volume II: A New Paradigm of Instructional Theory. New Jersey: Lawrence Erlbaum Associates.

Shute, V. J. & Glaser, R. (1990). "A large scale evaluation of an intelligent discovery world: Smithtown." Interactive Learning Environments 1(1): 51-77.

Thomas, R. & Neilson, I. (1995). "Harnessing simulations in the service of education: The Interact Simulation Environment." Computers and Education 25(1/2): 25-29.

Thomas, R. C. & Milligan, C. (2003). "Online assessment of practical experiments." Proceedings of the 7th International Computer Assisted Assessment Conference, Loughborough.

WBEC (2000). Web-Based Education Commission: Commission to the President and Congress of the United States.

White, R. & Gunstone, R. (1992). "Prediction-Observation-Explanation." Chapter 3 in Probing Understanding. London: The Falmer Press.

Wilson, B. G. (1997). "Reflections on constructivism and instructional design." In C. R. Dills & A. J. Romiszowski (Eds.), Instructional Development Paradigms. Englewood Cliffs, NJ: Educational Technology.

Winn, W. (1993). "Instructional design and situated learning: Paradox or partnership." Educational Technology 33(3): 16-21.


Links

[1] Bill Gates, "Content is King",

[2] The Power of the Internet for Learning: Final Report of Web-Based Education Commission.

[3] IMS Global Learning Consortium Inc.,

[4] STELLA home page.

[5] PowerSIM home page.

[6] Solar Geometry: Visualisation for the expert.

[7] Solar Geometry: Visualisation for the novice.

[8] Solar Geometry: Exploratory interface.

[9] Details of the solar geometry model.

[10] Chemical Kinetics exploratory interface.

[11] Scottish Qualifications Authority

[12] Local Examinations Syndicate, University of Cambridge.

[13] SQA-UCLES Chemistry Kinetics Project.

[14] Chemical Kinetics laboratory experiment.

[15] Completed Chemical Kinetics experiment.

[16] Business start-up module.

[17] CETIS Use Case for IMS Interactive Content SIG. CETIS-005: Managing and Annotating Simulation States within LMS, Milligan, C., Currier, S. and Cross, R.

