The third day of the conference dawns fine and sunny. After the best poster and paper awards, we are on to the keynote of the day from Professor Carmel McNaught of the Chinese University of Hong Kong on technology-supported innovation. Do we have adequate evidence for claims of success? Are core principles from one's own ontology transferable? Why, when we have so many tested tools and strategies, is the uptake of educationally sound learning designs limited? She presented on changing contexts, evaluation research paradigms and mixed methods, the LEPO framework and the innovation life cycle, project examples, implications for institutional policy and processes, and a summary of principles. She recommended the New Media Consortium annual Horizon Report for 2014, and stressed the need for continual evaluation and research to cope with the fast pace of technological change.
See McNaught (2011) on evaluation as part of blended designs; Phillips, McNaught and Kennedy (2011; book 2012) on types of evaluation research and the LEPO framework; McNaught, Cheng and Lam (2006) on good evaluation research; and Reeves (2006) on design-based research instead of predictive research.
The LEPO framework covers the learning environment, processes and outcomes within an educational context: the learning environment (LE) facilitates learning processes (LP), which lead to learning outcomes (LO), which in turn determine the learning environment. Consider also the innovation life cycle, with project management evaluation for the baseline, messing, formative and effectiveness stages.
She provided five examples. As an overview, Lam, Lee and McNaught (2010; 2011) found positive experiences of e-learning when it was used to meet student learning needs. One example involved e-portfolios with students learning English, which needed buy-in from teachers.
Another looked at the use of forums across 13 courses, using the SOLO taxonomy to analyse the quality of postings; forums worked when structured but student-centred.
Mohan and Lam (2005) covered investment banking students working on peer learning evaluation and assessments.
Another example looked at teachers using social media.
The final example covered implementing collaborative interdisciplinary scenario inquiry tasks in large science classes, a University of Queensland and Australian Learning and Teaching Council project.
It used an interactive collaborative assessment system called IS-IT to allow group tasks and peer assessments to take place.
Implications: encourage students to do the work, and reward it through policies and processes.
After morning tea, I pick up the learning analytics theme with Associate Professor Cathy Gunn's (University of Auckland) paper on defining the agenda for learning analytics. She set the broad context of educational data mining, web analytics, learning analytics, recommender systems and business intelligence. Connecting educational design research to learning analytics can contribute to evidence-based educational practice; see McKenney and Reeves (2012) on educational design research. Learning analytics (LA) has the potential to add to data and to link assumptions about learning and learning design intent to the learner behaviour, learning processes and outcomes it produces, in meaningful and reliable ways. Not just asking students, but using LA in non-invasive ways to find out similar things. We need to think through what we can really achieve with LA and what we must do to exploit its potential. How can we translate data to be meaningful to teachers? Can we convince our institutions we have a duty of care to use LA?
The next presentation is on exploring students' interpretation of feedback delivered through learning analytics dashboards, with Dr Linda Corrin and Paula de Barba from the University of Melbourne. Current work tends to focus on retention and on providing data to academics; we also need to provide data to students and support them in interpreting and progressing from it. The study covered 2000-plus students across two first-year courses and one second-year course. LA needs to consider differences in the learning design of courses. The pilot consisted of a survey to capture demographics, think-aloud interviews in weeks 3 and 11, and a final survey on motivation to study. The dashboard covered access, learning activities, quizzes and so on. The themes were reflection, study strategies, motivation, class average, the consolidated view and interpretation. Students could work out strategies but could not explain why. Motivation improved for most students. The consolidated view was appreciated; interpretation was weak. Mostly positive outcomes.
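To make the "consolidated view" idea concrete, here is a minimal sketch of how a dashboard might compare one student's activity against the class average. All field names and numbers are invented for illustration; they are not from the Melbourne study.

```python
# Hypothetical sketch: a consolidated view pairing a student's activity
# counts with the class average, as a learning analytics dashboard might.

def class_average(records, metric):
    """Mean value of one metric across all students' records."""
    return sum(r[metric] for r in records) / len(records)

def consolidated_view(records, student_id, metrics):
    """Return {metric: (student value, class average)} for each metric."""
    me = next(r for r in records if r["id"] == student_id)
    return {m: (me[m], round(class_average(records, m), 1)) for m in metrics}

# Toy data: LMS accesses and quiz attempts per student (invented numbers).
records = [
    {"id": "s1", "accesses": 42, "quizzes": 5},
    {"id": "s2", "accesses": 18, "quizzes": 2},
    {"id": "s3", "accesses": 30, "quizzes": 4},
]

print(consolidated_view(records, "s1", ["accesses", "quizzes"]))
# {'accesses': (42, 30.0), 'quizzes': (5, 3.7)}
```

The study's finding that interpretation was weak suggests the numbers alone are not enough; the pairing with a class average at least gives students a reference point.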
The last paper was on the development of an application with process feedback to enhance student-centred learning, by Sudhakaran Edathil, Christopher Chin, Stephanie Zank, Dev Ranmuthugala and Susan Salter from the University of Tasmania, presented by Christopher Chin. It reinforced the efficacy of timely feedback and gave examples of available assessment and feedback tools. The tool was built in Desire2Learn but can be shifted across to other LMSs. Learning mode is fully guided, assessment mode is partially guided, and the assessment itself is not guided. The guided path scaffolds students by revealing each step in the algorithm, and each step also has a hint to assist self-learning. Student feedback on the process feedback tool was positive.
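The guided-path idea above can be sketched as a tiny class that reveals algorithm steps one at a time, each with an optional hint. The step texts and hints here are invented examples, not taken from the UTAS tool.

```python
# Hypothetical sketch of a "guided path": steps of a worked algorithm
# are revealed one at a time, each with an optional hint, mimicking the
# fully guided -> partially guided -> unguided progression described.

class GuidedPath:
    def __init__(self, steps):
        self.steps = steps      # list of (step_text, hint) pairs
        self.revealed = 0       # how many steps the student has seen

    def next_step(self, with_hint=False):
        """Reveal the next step; pass with_hint=True in fully guided mode."""
        if self.revealed >= len(self.steps):
            return "All steps revealed - try the unguided assessment."
        text, hint = self.steps[self.revealed]
        self.revealed += 1
        return f"{text} (hint: {hint})" if with_hint else text

# Invented worked example for a statics-style problem.
path = GuidedPath([
    ("Resolve forces along the x-axis", "Use F = ma"),
    ("Sum moments about the pivot", "Take clockwise as positive"),
])
print(path.next_step(with_hint=True))
# Resolve forces along the x-axis (hint: Use F = ma)
```

In partially guided assessment mode, the same structure could be used with `with_hint=False`, so students see the step sequence but must recall the detail themselves.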
After lunch, comes the drive back home to Christchurch.
Next year's ASCILITE is in Perth, hosted by Curtin University.