Introduction

Technology-Enhanced Learning is a term that invites unpicking in order to understand how the neutral infrastructure of technology can bring evaluated and positive changes to learning. As practitioners experienced in working as educational technologists, we have been involved in the design, implementation and evaluation of learning experiences of many types. Some of our work has been driven by pedagogical principles (see e.g. Scanlon, 2010b; McAndrew, 2010; and Scanlon and O’Shea, 2007). Inevitably, though, some aspects of the work of an educational technologist are technology driven: as Scanlon (2010b) notes, ‘making use of what becomes practical or possible due to advances in contemporary technologies.’ The related field of Human Computer Interaction (HCI) has tended to focus on making the technology of the day meet usability criteria. The pleasure in using many modern interfaces testifies to the successes of work in HCI. However, attention to such matters as interface styles may sometimes overlook the ways that design can impact on the important motivational and long-term aims of the learner. The examples and narratives of successful learning design included in this paper raise questions such as: how can we determine what data is useful for understanding barriers to student learning or for improving student learning? And how can we use this to identify changes to the interface, or to introduce alternative technology, to make improvements?

The area of user experience (UX) has made some moves towards greater recognition of a range of users. However, Bargas-Avila and Hornbaek (2011) point out that this focus has moved towards leisure and that ‘Context of use and anticipated use, often named key factors of UX, are rarely researched’ (p. 2689). Working with learning as an aim provides a context of use that refocuses the approach on the user; for example, how a Wikipedia page is read can be expected to change depending on the homework it is being used to answer. Reading about how calculus can be used to calculate a vehicle’s velocity is a different experience if you are then expected to spend an hour trying to solve a related problem. For the educational technologist considering what works best for the user in relation to learning events, this raises issues such as the particular task structure adopted and the long-term, intermittent nature of the interactions that can lead to learning. This suggests that it would be useful to consider how work in educational technology can inform and be informed by work in human computer interaction, and how both impact on the growing area of interest in design for learning, often called learning design.

Links between educational technology and human computer interaction and design for learning

Educational technologists are living in interesting times. The growing awareness of the potential reach of large-scale systems of technology-enabled instruction has focused attention on the history of how such systems developed and on the knowledge accumulated over the past forty years about how learning with such systems can be maximized. The application to education of a systems-based approach to instructional design in the mid-1960s was influential in the development of the UK Open University (see Daniel, 1996).

Earlier work in the field used the term instructional design, and was based on the assumption that learning depends in some predictable way on the instruction that a learner undergoes. After 50 years, the paradigms applied to research in this field have shifted, moving from behaviourist theories to a recognition of cognitive and social perspectives. Conole (2013a; 2013b), among others, uses the term technology enhanced learning (TEL), which is current in European research. There is a rich mix of complex and interesting questions (cognitive, technological and social) that need to be addressed in the study of learning. Conole (2013a) draws attention to the developing field of research in learning design, which has emerged from the instructional design, computing and learning sciences communities. Learning design can refer both to the range of actions involved in the specification of learning activities and to the representation of the design. It confers benefits by making the process of planning explicit and by providing a means of describing the design.

In reviewing the history of instructional design and the place of research studies on the effectiveness of different styles of instruction, Romizowski (2002) reflects that:

‘as is often the case in the human sciences, the hard research studies, such as the work by Kessels [e.g. (Kessels and Plomp, 1997)], come after and corroborate what the reflective practitioners have already identified from their praxis and transformed into a working paradigm, or at least a set of heuristic principles, or “maxims.”’

(Romizowski, 2002, p. 26)

Recognising the role of practice-based research has been important to the development of educational technology at the Open University. The Institute of Educational Technology is positioned to have in-depth access to evidence, research and practice about how one particularly large and influential institution of distance and online education works. Evaluation and research, together with direct involvement in teaching and course design, give the Institute an excellent practice base from which to develop. In terms of evaluation, an early conceptualization of the CIAO (Context, Interactions and Outcomes) model for conducting educational evaluation, as outlined in Scanlon et al. (1998) and Jones et al. (1999), ensured our evaluations considered context and a wide range of data. In addition, a recognition that the Institute’s mission includes a duty to consider the future impact of technology on learning led to informal horizon scanning and experimentation, more recently formalized in our Innovating Pedagogy reports (see e.g. Sharples et al. 2012, 2013, 2014).

The business of predicting the future is fraught with difficulties. Predictions about future trends (e.g. Forbus and Feltovich, 2001) do not always turn out to be right:

The next major technology to change the face of education will be based on the widespread use of artificial intelligence (AI). Progress in AI has led to a deeper understanding of how to represent knowledge, to reason, and to describe procedural knowledge. Progress in cognitive science has led to a deeper understanding of how people think, solve problems, and learn. AI scientists use results from cognitive science to create software with more humanlike abilities, which can help students learn better.

(Forbus and Feltovich, 2001, p. 26)

With 14 years of hindsight, this quote reflects an over-optimistic view of a specific technology, while missing the fact that the predominant influence would come from widespread access, even though underlying models are slow to change.

Perhaps a more reliable view needs to be holistic. Educational technologists are drawn from a range of backgrounds. Issroff and Scanlon (2002a) considered the influences of different disciplines on the work of educational technologists. They note the importance of the move in the 1990s towards design science, described by Collins as follows: ‘a design science of education must determine how different designs of learning environments contribute to learning, cooperation, motivation, etc.’ (Collins, 1992, p. 24).

In their review of the influence of theories from different disciplines on educational technology they also noted that the field of HCI ‘is evolving to include interpretations and explanations of the culture and context which surrounds the use of systems. The goal of HCI has not changed, in that the aim is to design usable and effective systems, but researchers are recognising the role of context and culture and considering these in their evaluation of systems’ (Issroff and Scanlon, 2002a, p. 8). They extended their consideration of methods and influences on work in educational technology by adopting an Activity Theory approach to the evaluation of higher education (Issroff and Scanlon, 2002b). Their recognition of the complexity of learning settings where technology had been introduced and where learning had taken a social turn led to their adoption of an Activity Systems approach. There was a corresponding move in HCI towards a recognition of the benefits of an activity systems approach (see e.g. Nardi, 1996). McAndrew et al. (2010) use a similar activity systems approach to discuss the difficulty of satisfying stakeholders with diverse interests in the technology, the pedagogy and the overall system. They draw attention to two different needs for evaluation: one focusing on examining the nature and quality of the learning that occurs, the other taking a user-centred approach to understanding interactions with the systems. Their method applies task analysis to examine the conflicts that emerge when learners interact with technological systems in learning settings.

Scanlon (2010a) outlines how educational technology research has absorbed working methods from HCI such as participatory design, design-based research and socio-cultural approaches, building on an evaluation model featuring context, interactions and outcomes (see earlier). In their discussion of theories in use in educational technology, Issroff and Scanlon (2002a) describe those relating to ‘principled decisions about the design of learning materials’ and others which ‘influence the way we frame our research on learning’. The work related to the design of materials has close links with the paradigm of design-based research. Barab and Squire (2004) provide a description of design-based research that captures the spirit of participatory design as iterative cycles of improvement. They describe a process that expects researchers to ‘systemically adjust various aspects of the designed context so that each adjustment served as a type of experimentation that allowed the researchers to test and generate theory in naturalistic contexts’ (Barab and Squire, 2004, p. 3). This principle can be adopted in some cases: see, for example, its application to technology and pedagogy adjustments in developing approaches to inquiry learning (Jones et al., 2013).

Scanlon et al. (2013) conducted an in-depth examination of the processes of innovation in technology-enhanced learning. Innovation was defined ‘as the practical implementation of new ideas and technologies with the intention of having an observable impact on teaching and learning’ (p. 36). The study combined a consideration of the findings of case studies with a systematic analysis of data collected from in-depth interviews with key figures from research and industry. ‘Technology-enhanced learning consists of much more than a set of research-informed products. It is a complex system, which includes communities, technologies and practices that are informed by pedagogy (the theory and practice of teaching, learning and assessment).’ (p. 5) The work involved in successful TEL innovation can be characterised as bricolage. ‘This is a productive and creative innovation process that involves bringing together and adapting technologies and pedagogies, experimentation to generate further insights and a willingness to engage with local communities and practices.’ (p. 6) Bricolage is sometimes referred to as tinkering, and in that regard relates to Lévi-Strauss’s (1962) use of the term bricoleur, refined in Scanlon et al. (2013) as follows:

Bricoleurs do not typically start a project and then consider which tools and materials will be required to achieve their goals. Rather, they review their available materials and tools and work out how to use them to achieve their goal or something close to their goal. Above all, bricolage is rooted in engagement with the concrete properties of a situation and the available materials, rather than with an abstract model of how they will behave.

(Scanlon et al., 2013, p. 31)

Interviews conducted during the project revealed that ‘successful TEL innovators (do) not simply (act) as inventors or as scientists proposing and testing hypotheses but also as bricoleurs who achieve educational goals by bringing together diverse technological elements, frameworks and social practices’ (Scanlon et al., 2013, p. 7).

Viewing innovation as a form of tinkering creates a tension with focused design work on planned adjustments that can be evaluated. However, the concept is very much in line with a realistic view of a process in flux and subject to multiple influences. It seems clear that critiques of research in education point to the need for an approach that goes beyond randomized controlled trials and, as stated in the goals for research funded by the Institute of Education Sciences, provides not only ‘information about the practical benefits and the effects of specific interventions on education outcomes but also contribute[s] to the bigger picture of scientific knowledge and theory on learning, instruction, and education systems’ (IES, 2015, p. 1).

Context

This paper draws on our experience of how the Open University in particular, as a distance teaching university, has engaged with technology enhanced learning. We describe how this experience has implications for the adoption of more open approaches to education, such as Massive Open Online Courses (MOOCs) (Daniel, 2012), and how taking a design approach influences the way we think about how learning works and how we can measure performance (McAndrew and Scanlon).

The Open University (OU), a distance teaching institution which has been providing open education for over 40 years, is one of the oldest and most experienced distance learning organizations (see Daniel, 1996 for an account). It pioneered higher education courses that use different media intended specifically for distance learners working on their own at home. Its Charter specified that it should teach by a diversity of means, such as broadcasting and technological devices appropriate to higher education, and promote the educational wellbeing of the community generally (Scanlon, 2003, p. 134). By 2000, the OU was producing 9% of all UK graduates.

The model it uses is ‘supported open learning’, which combines content with tutor support and assessment to guide learners through their programme of study. The key strengths of the OU are its mission to be open and its long history of success in providing accessible distance education. From its start the OU operated at scale, running courses for thousands of learners. By 2012 it was supporting approximately 260,000 registered learners, with cohorts that have reached as many as 15,000 on an introductory technology course, and it reaches millions more through open content and shared environments. In December 2012 the OU founded FutureLearn, a MOOC platform company; two years later it had a large number of partners, including major UK universities, Australia’s Monash University, Ireland’s Trinity College Dublin and three non-university institutions: the British Museum, the British Council and the British Library.

Ross and Scanlon (1995) defined the open learning experience as involving: the opening up of opportunities as a result of the removal of constraints to learning, whether practical or educational; a learner-centred approach in which individuals are empowered to take responsibility for their own learning; and educational flexibility (Ross and Scanlon, 1995, p. 30).

Lessons from the Open University’s large-scale online teaching experiments

The Open University first experimented with online courses in 1976, but much of its design was initially focused on providing printed material. Sclater (2008) describes how at that stage much of the content was developed in house for print, but more and more material was being accessed via the internet, both interactive content and communication with and between students. The OU has a large amount of expertise in online conferencing and e-assessment, gained using a variety of different systems (see e.g. Whitelock, 2008). There was a need for a learning management system (or virtual learning environment) to provide consolidation, functionality and an improved user experience. Sclater writes:

‘The selection of the open source learning management system Moodle for this purpose has allowed the University to develop an effective open learning platform for its students.’ (Sclater, 2008, p. 8)

This proved to be an important move towards open approaches for the OU, as described in the next section.

Open Educational Resources

The OU has been offering open educational resources since 2006 through its OpenLearn website, and via its courses on iTunesU and YouTube. The OpenLearn website is an Open Educational Resources (OER) repository hosted by the OU. It is entirely online, free to use, and accessible to all, and consists of extracts from the past and present OU fee-paying curriculum, including text-based resources as well as audio and video materials, together with resources especially created for OpenLearn. It was launched with the support of the Hewlett Foundation with the aim of making Open University learning materials and courses widely available, providing online learning that is open to anyone and free to use. In the first 18 months after its launch, 75,000 users registered with OpenLearn. In January 2010 the platform had its 10 millionth visitor. It runs on Moodle as an open-source virtual learning environment.

Coughlan and Perryman (2011) describe OpenLearn as ‘organised on a modular basis, categorised by level and by the number of study hours associated with each learning resource’ (p. 13). Coughlan et al. (2013) refer to the belief that open practices in education are essential to changing the way we learn to meet 21st-century challenges. This will be examined below. However, detailed interactions around these practices have received little attention to date. The Hewlett Foundation has sponsored two further research initiatives on OER based on the OpenLearn platform. First, it sponsored an Open Learning research network (OLNET) working on collective intelligence about what works in open learning. Subsequently, an Open Educational Resources Research Hub was funded (McAndrew and Farrow, 2013) as a focus for international research activities, conducting a schedule of targeted collaboration with existing OER projects worldwide with the aim of centring research outside the project, ‘framing an “evidence gap” relating to the benefits of, and barriers to, widespread OER adoption’ (p. 2). The OER Research Hub provides a focus for research designed to answer the overall question ‘What is the impact of OER on learning and teaching practices?’ and, in the process of answering this question, to identify the particular influence of openness. This approach involved working in collaboration with projects across four education sectors (school, college, higher education and informal) and extending a network of research with shared methods and shared results, including established methods and instruments for broader engagement in researching the impact of openness on learning. All project collaborations address two key hypotheses: first, that the use of OER leads to improvement in student performance and satisfaction; and second, that the open aspect of OER creates different usage and adoption patterns from other online resources (see Atkins et al., 2008 for a further account of previous work on OER).

Falconer et al. (2013) conducted a review of the use of OER in adult education. They report that: ‘The twin ideals of providing open access to knowledge and of enhancing pedagogy through collaborative development and sharing of resources are another major strength, engendering strong altruistic commitment among initiative staff and stakeholders which has contributed largely to the initiatives.’ (Falconer et al., 2013, p. 38)

For those who lack recent study experience, entry into learning organisations can be daunting. Online learning offers new opportunities to build environments that could suit new learners. In the Bridge to Success (bridge2success.aacc.edu) project, funded by the Next Generation Learning Challenge (nextgenlearning.org), materials were generated that take OER from the Open University and make them available as reworked OER with a focus on use in US community colleges. This project showed how introductory Open University courses designed for learners without qualifications could be reworked for use in the USA and used in a variety of contexts (in access classes, supporting disadvantaged adults, alongside other courses). ‘Learning to learn’ and ‘Succeed with Math’ were found to be useful in confidence building and skills development. The OpenLearn project provided the platform for Bridge to Success. The content is released on OpenLearn’s LabSpace (now renamed OpenLearn Works). This material is available for use on other systems under CC-BY, the permissive form of the Creative Commons license. Transfer to other servers is permitted and supported by the release of downloadable content packages and by clear messages that material can be copied and reused as permitted by the CC-BY license.

In 2008 the OU also began releasing audio-visual material from its courses in edited form onto its iTunesU and YouTube channels. There has been recognition that these informal channels are part of a potential learning journey for lifelong learners. This journey from informal to formal, as noted by Lane and Law (2013), can be ‘very varied and occur(s) over many years but recognizes that as lifelong learners … people will want to move between informal and formal learning opportunities at different times or even at the same time’ (p. 5). They also describe the strategic approach to the use of open media as follows: ‘to provide the most appropriate and effective learning experience for registered students seeking qualifications to enable a wider public set of audiences to have informal learning opportunities’.

An inquiry into the users of the OU iTunesU site (Rosell-Aquilar, 2013) drew on a large survey (over 2,000 responses) carried out over two years. External iTunesU learners are very different from internal users: there are more men than women, they are mostly middle-aged, and they use the resources mostly for personal reasons.

The OU approach works at scale and goes beyond the face-to-face model of most educational institutions, with implications for cost and speed of decision making. For open and free resources, where solutions need to reach an even larger scale, compromises need to be made, but accessibility remains a key factor. Survey data indicates that 19% of the users who engage with opportunities for open learning through OpenLearn declare a disability (Law et al., 2013). In meeting the needs of open users the university cannot rely on staff advisors or on personalization, especially as analytics data indicates that the majority of users will bypass any barriers to content, including splash screens and logins. Rather, two key steps are to meet accessibility standards of provision for the environment and content, and to offer a choice of material that is suitable for expected needs. This can seem onerous. However, a consistent result from projects investigating accessibility is that planning for use by disabled students leads to content that serves all users better; for example, making instructions clearer for dyslexic students will also make them clearer for all.

As content is made more open it is likely to be used by those for whom the material is not in their first language, a further access issue. Information from the iTunesU survey showed that nearly half (47%) were non-native speakers of English. The Open Translation MOOC (Beaven et al., 2013) offered an innovative way to address this by developing translation skills around the task of translating content from open courses. This was only viable because the selected Creative Commons license allows such reversioning.

This section illustrates the ways in which the approach of openness has implications for design, but also that apparent constraints can lead to improvement in learning designs and improved learning outcomes.

Learning analytics, learning design and OU MOOCs

The call for papers of the First International Conference on Learning Analytics and Knowledge in 2011 adopted the following definition of learning analytics: ‘the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs’ (Long and Siemens, 2011, p. 34).

The OU currently collects a large amount of data about its students and uses it to make adjustments to curriculum design and to the policies required to provide effective student support. Clow (2014) describes the way the Open University is making use of a complex collection of data through a programme led by staff known as ‘Data Wranglers’, whose role is to make sense of a range of data sources related to learning, analysing that data in the light of their understanding of practice in individual faculties and departments. Based in the Institute of Educational Technology, they produce reports that summarise important points and make recommendations that lead to action. This activity recognizes the need for, and importance of, human sense-making to turn data into something actionable. The data involved includes survey feedback from students, activity data from the VLE, data about the mode of delivery and the structure of courses, and completion, pass rate and demographic data. Evaluation of this project has been positive.
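
To make the shape of this work concrete, the sketch below (in Python, using the pandas library) illustrates the kind of joining and summarising such reporting might automate before the human sense-making stage. The file names and column names are hypothetical assumptions for illustration; the OU’s actual pipeline and schemas are not described here.

```python
# A minimal, illustrative sketch: combining hypothetical extracts of VLE
# activity, outcomes and survey feedback into a module-level summary of the
# kind a 'Data Wrangler' might start from. All names are assumptions.
import pandas as pd

vle = pd.read_csv("vle_activity.csv")   # student_id, module, week, clicks
outcomes = pd.read_csv("outcomes.csv")  # student_id, module, completed, passed
survey = pd.read_csv("survey.csv")      # student_id, module, satisfaction

# Total VLE activity per student per module, joined with outcomes and feedback.
activity = vle.groupby(["student_id", "module"], as_index=False)["clicks"].sum()
merged = (activity
          .merge(outcomes, on=["student_id", "module"])
          .merge(survey, on=["student_id", "module"], how="left"))

# Module-level figures for a report: numbers to be interpreted by a person
# who knows the faculty's practice, not acted on automatically.
report = merged.groupby("module").agg(
    students=("student_id", "count"),
    completion_rate=("completed", "mean"),
    pass_rate=("passed", "mean"),
    median_clicks=("clicks", "median"),
    mean_satisfaction=("satisfaction", "mean"),
)
print(report)
```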

In addition, if patterns can be identified in the data that are highly predictive of OU learning outcomes, and especially if those patterns occur at particular points in a module, then in theory interventions can be made and the module redesigned. Lockyer, Heathcote and Dawson (2013) argue that learning design provides information about pedagogical purpose, which allows the interpretation of learning analytics to be informed by the teaching and learning context.

Clow (2014, p. 50) gives more detail on the ways that data wranglers help teaching Faculty as follows:

‘to act as human sensemakers, facilitating action on feedback from learners, making better sense of what the feedback means and how the data can be improved and helping to develop the community of practice round the use of learning analytics.’

The analytics provided by the data wranglers are used for a variety of purposes. Academic teams are provided with faculty- or module-level information. This can include data about learners’ participation in online forums and student feedback. One possibility is to compare this data with the student behaviour intended in the learning design of the course. The information is also useful for actions to improve retention. The RETAIN project (http://retain.open.ac.uk/) found that the level of activity (such as clicks on the Virtual Learning Environment) did not predict success or failure, but that a relative drop in activity was an indicator of a student in difficulty (Wolff and Zdrahal, 2012). Students could be successful without being active online, but if a previously active student stopped being so, they were unlikely to complete. The VLE system now produces information tracking student behaviour and engagement.
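
The essence of that finding can be sketched in a few lines: flag a student whose latest activity has collapsed relative to their own recent baseline, rather than comparing them against an absolute level. The threshold and the weekly-clicks representation below are illustrative assumptions, not the RETAIN project’s actual model.

```python
# Illustrative sketch of drop-in-activity detection (after Wolff and Zdrahal,
# 2012): a student's own recent weeks form the baseline, so consistently
# quiet students are not flagged. Threshold values are assumptions.
from statistics import mean

def at_risk(weekly_clicks, baseline_weeks=3, drop_ratio=0.3):
    """True if the latest week's activity falls below drop_ratio times
    the student's own average over the preceding baseline_weeks."""
    if len(weekly_clicks) <= baseline_weeks:
        return False  # not enough history to establish a baseline
    baseline = mean(weekly_clicks[-(baseline_weeks + 1):-1])
    if baseline == 0:
        return False  # a quiet student can still succeed; nothing to drop from
    return weekly_clicks[-1] < drop_ratio * baseline

print(at_risk([2, 3, 2, 2]))         # False: low but stable activity
print(at_risk([40, 35, 42, 38, 6]))  # True: previously active, sharp drop
```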

Recent developments in effective student support have led to a new model of curriculum-based support teams. A considerable amount of effort has gone into developing the analytics and models of effective tutor-student support.

Since 2005, the OU has captured all outward and inward communications with students and tutors. Currently, over 7.5 million contacts have been recorded, each categorized to reflect the nature of the contact and the resultant outcomes. Until recently, this dataset has largely been a repository for student information and has not been widely exploited to extract cohort information, patterns of behaviour or useful insights into commonalities between programmes of study, approaches to assessment and modes of delivery. In the last two years, greater use has been made of this information and data captured at registration, to develop a fuller understanding of the reasons which lead to student contact and the triggers for student behaviours, which can then be matched to a variety of anticipatory support behaviours. (Prinsloo et al., 2012, p.132)

Suthers and Verbert (2013, p. 2) stress that there is a need for work on learning analytics to properly connect to changes in pedagogic practice and theory building. They write:

“all research in Learning Analytics should address the “middle space” by including both learning and analytic concerns and addressing the match between technique and application. Advances in learning theory and practice are welcome, provided that they are accompanied with an evaluation of how existing or new analytic technologies support such advances. Advances in analytic technologies and methods are welcome, provided that they are accompanied with an evaluation of how understanding of learning and educational practices may be advanced by such methods.”

This investment in learning analytics as part of the business of producing courses is linked to an equivalent investment in the design of learning, and in the use of visualisations and formulations of pedagogical patterns, so that research can be done on which learning designs are successful.

Cross et al. (2012) report on the research and evaluation of the Open University Learning Design initiative, undertaken between 2008 and 2012. In particular, they consider the impact of new curriculum design tools and approaches piloted by the project on institutional processes and design cultures. These include tools for sharing learning design expertise, for visualising designs, and for supporting design and reflection in workshops. The project adopted a learning design approach so as to help foreground pedagogy and the learner experience. Alongside the work at the OU, nine pilots have been completed across a further six UK universities.

Also, Cross (2013) reports on the evaluation of the Open Learning Design Studio MOOC (OLDS MOOC), ‘Learning Design for a 21st Century Curriculum’, written and facilitated by staff from The Open University (lead partner), Goldsmiths, University of London, the London Knowledge Lab, the University of Greenwich, the University of Leicester, the University of Oxford and the University of Georgia (see http://www.olds.ac.uk/ for access to the course materials). It was designed with further and higher education professionals in mind (lecturers, qualification teams, awarding bodies, learning technologists, library and student support staff, and learning and teaching specialists) but was also suitable for a wider group with an interest in curriculum and learning design, such as teachers in secondary schools or facilitators in other informal learning settings. The course started with 2420 registered learners, but over the 8-week period numbers slowly decreased and the course ended with 97–300 participants visiting the course space in the last week.
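
As a small worked example of the arithmetic behind such attrition figures, the final-week numbers above correspond to roughly 4–12% of initial registrants, with the range reflecting the uncertainty in the final-week count:

```python
# Illustrative calculation using the OLDS MOOC figures reported above:
# 2420 initial registrations; 97-300 participants visiting in the final week.
registered = 2420
final_week_low, final_week_high = 97, 300

print(f"Final-week participation: {final_week_low / registered:.1%}"
      f" to {final_week_high / registered:.1%} of registrants")
# -> Final-week participation: 4.0% to 12.4% of registrants,
#    of the same order as completion rates typically reported for MOOCs.
```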

Participants’ use of Web 2.0 tools was interesting: findings indicated that a greater proportion of those completing the course rated their knowledge and understanding of Web 2.0 tools as moderate to expert, compared to those who started the course. However, prior MOOC experience did not show a similar advantage. Blake and Scanlon (2014) suggest that ‘this may be an indication that the suitability of present open and freely available tools for supporting large scale learning needs to be carefully considered’ (p. 9). Jordan (2012, 2013, 2014) provides a useful lens with which to study completion in MOOCs.

Evaluation focused on the registered learners’ expectations of the course, analysis of participation rates, use of course resources, use of badges and collaborative group working. Data sources used for the evaluation included pre- and post-course surveys, discussion forums, participants’ social media contributions, the public spaces of the course and blogs created by the participants. Buckingham Shum and Ferguson (2012) emphasize the importance of social learning analytics in considering such experiences.

Another OU MOOC experiment took place as an addition to an online course (H817, part of the Masters in Open and Distance Education offered by the Open University UK). In 2013, H817 ran from February to October, over nine months; however, the MOOC component consisted of 100 learning hours spread over seven weeks from March 2013 and was open to a wider audience than those registered on the course. The course adopted an ‘activity-based’ pedagogy, with an emphasis on communication through blog postings and the forum. Participants had the opportunity to acquire badges for accomplishments.

There have been other accounts comparing student performance on large-scale conventional OU online courses with MOOCs (see Lane, Caird and Weller, 2014). Also, Clow (2012, 2013a, 2013b, 2014) has outlined how analytics might be developed to encompass journeys between formal and informal learning settings, basing much of his work on the experience of tracking users’ behaviour and learning outcomes in iSpot (see also Scanlon et al., 2014). In each of these examples we can see the links between the design, the technology used and the outcomes for learning.

Conclusions

In this paper we have illustrated the ways in which distance education for formal learning is undergoing a period of rapid change. We have drawn on examples of research and practice from the Open University on open, online and distance learning.

Our tracing of the development of educational technology and the links between research and practice has illustrated how interdisciplinary research can throw some light on the complex interplay between technology and pedagogy in the design of learning. Although in this paper we have only provided a partial account of the ways in which the different elements of a distance learning experience operate to benefit the lifelong learner, we have tried to show that open education is offering alternative ways of supporting learning at a distance. From the experience of developments in technology and pedagogy in the Open University over the last 40 years, we would like to highlight three areas of opportunity and changing practice that provide insight into the benefits and drawbacks in learning at scale. The three areas are Open Educational Resources, Learning Design and Learning Analytics.

The study of Open Educational Resources has only recently been seen as an area for research; however, through the work of the OER Research Hub we have indicated how a more open approach impacts on the motivation of social learners and on teachers’ adoption and adaptation of material. OER can also bring benefits to groups that are under-served by more traditional routes, such as learners with disabilities.

Learning Design provides a way to set out and describe the intent in learning material and makes it possible to make judgements about what works. This has a strong analogue in the realisation that the way we interact with computers is an important element in how we experience the services they facilitate. Learning design is a less mature field; however, there is an emerging understanding of the different forms of learning that can be represented.

The value of learning design depends to a large extent on parallel work on learning analytics. Work on learning analytics can identify problem areas and motivate interventions to improve retention and maximise the impact of different support models. However, these gains can only be transferred to other contexts by tracking the impact of teaching interventions, expressed as revised learning designs, on student outcomes.

Across each of these areas (learning design, learning analytics and open educational resources) the greatest benefits will come through an integrated approach in which design, technology and pedagogy are combined.