Sunday 13 May 2012

Thoughts on research

There are subtle differences across the research areas and their aims. All look at the use of technology in learners’ lives, but each seeks something particular. For Kennedy (2008) it is whether the ‘digital natives’ definition holds true in reality; for Conole et al (2006) it is more about how learners engage with and perceive e-learning; the JISC Ipsos MORI polls look at school leavers’ expectations of technology; LEX at how the interaction with technology may change throughout the learning journey; PB-LXP at the relationship between work and ICT learning; and ECAR at how technology affects the college experience.
All of these seem to be focused on expectations of and use of technology so that practitioners can learn to use technology better with their students.

Kennedy (2008) focused on the fact that embracing technology is not a universal experience and there is no consistency in what students use, although a great deal of technology is being used. There is an interesting discussion around whether learners are using the technology from their everyday lives for educational purposes, and I think that on the whole, learners are using it for the scaffolding elements of study – emails and messages to share, collaborate and stay in touch with each other. There does seem to be an expectation from students that their institution uses technology well (basic ICT should be functional) and guides the learner in how they use technology – but not necessarily leading the way to new forms of technology use. It is the courses that define how they might use technology. It would seem that learners adopt technology to suit themselves, their situation and ease of use, which is something echoed in my experience and the experience of fellow students. What was particularly pertinent to me was that there seemed to be a need for a blended approach – something I have been advocating a lot at work recently.
Does that mean we need to adopt a more blended approach? I know from experience that I will adopt technologies that I like, that help me to learn, and that offer the ease of use the research talks about. I don’t like using technology for technology’s sake (which is why I have always struggled with social bookmarking tools, as I don’t find they make my life easier). But that has further impact on institutions, as we know a more learner-orientated approach requires more support and scaffolding. Or maybe what the research says is that the majority of students use the technology we all use in our everyday lives. We like portability, access to communication and tailoring things to ourselves. So what teachers need to do is make sure they use this technology in the same way – well. The use should be realistic and appropriate (see the vignettes from PB-LXP, where the technology that was useful for work development was seen as good because it had a real-life context). I don’t think students expect cutting edge; they just expect professionalism. (Which is more poignant in an age where we pay increasingly for our education.)

Methodologies

As with some of the research and surveys last week, this week we are presented with evidence and data that is also augmented with audio and video logs, as well as interviews. The surveys range from external to online to paper, at times suggested by faculty, administered within the first week or at six-monthly intervals; so there is a great breadth of methods being used.
In my previous role, I was responsible for administering learner response surveys, so it’s interesting to see the different forms and approaches and compare them with my own experience. There will always be shortcomings. Working centrally, I could only reach those who were on email (it took too much time to pull off address data and process hand-filled documents) and who had completed their learning (about 5,000 people a year). We considered research with those in the middle of their learning, but this involved too much administration time for one person to make it worthwhile (our data systems were not sophisticated enough). On average about 70% of completed learners were contacted each year, with about 40% of those (around 1,500) completing the survey. The surveys asked them to give feedback on how they found their learning – from the general quality to the quality of the training advisers who helped them – and included both quantitative and qualitative questions. I have learnt that you have to give people the opportunity to speak, to have their say; that way you also hear the good and the bad. Having five years of data, with few changes to the questions, has meant that we can build up a bigger picture of trends. When creating reports I am always conscious to get the contexts right, and to explain the actual data in terms of percentage representation (as highlighted well in the Kennedy reports). But we don’t just use the surveys. We also use the self-evaluations of training managers; the anecdotal feedback gathered quarterly from those supporting training; the data from the database; and, like Richardson, our own understanding based on being immersed in the environment. This allows us to get ‘triangulation’ of a sort.
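As a quick sanity check, the response funnel described above can be sketched as a back-of-envelope calculation. The figures here are the approximate ones quoted (5,000 completers, roughly 70% contacted, roughly 40% responding), not exact records:

```python
# Back-of-envelope survey response funnel, using the approximate
# figures quoted in the text (rough estimates, not exact records).
completers_per_year = 5000   # learners completing each year
contact_rate = 0.70          # share reachable by email
response_rate = 0.40         # share of contacted who respond

contacted = completers_per_year * contact_rate   # ~3,500
responses = contacted * response_rate            # ~1,400 ("around 1,500" above)

print(f"Contacted: {contacted:.0f}, responses: {responses:.0f} "
      f"({responses / completers_per_year:.0%} of all completers)")
```

Multiplying the two rates together shows why the final sample is only about a quarter to a third of each year’s completers, even though each individual rate sounds reasonably healthy.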
So here’s the thing: there will always be flaws and imperfections. There will always be an imbalance towards the objectives of the research, or a bias in the reporting; it’s very hard to be completely unbiased. So we always need to bear this in mind when looking at data and research. Yorke (2009) has an interesting discussion about this in the context of students completing surveys, pulling together other research in the area and looking at student cognition and response times. Do students say what they think, or what they think they should think? An interesting question – especially as I must have completed about 30 psychometric surveys in the last 5 years alone. If I think about my own experience, I fill these in a lot quicker now than I used to. Certainly with the testing I do before I meet the psychiatrist for my job role, I try not to think too hard – especially as it’s timed! However, I hate end-of-course surveys and happy sheets. I need time to process what I thought and what I have learnt. I also like to give honest feedback, which can be hard if your name is on the top of the paper and you hand it to your tutor. But then there does need to be the right timing: too long and I don’t care about the feedback anymore. The most useful feedback is peer to peer, but you need to create a culture of feedback. Hence it is also good to consider when these research activities take place. Now that I understand how important it can be, I am probably more likely to take part in research. However, as an 18-year-old I probably didn’t have much time for it.
It’s interesting to see that one survey used an external company, and rewards. I fill in more surveys if I know there is a reward; I still do it honestly, but it’s an added incentive. We also commission a lot of external research, so that there is a more ‘professional’ approach, and maybe less bias. Often this also includes sampling from outside the organisation, so that we can draw comparisons.
Questions from the activities
How can researchers ensure that the responses they get from students about their expectations are meaningful?
This will really depend on the students’ age and context, and on students understanding why the research is being done. I would suggest that, depending on these, a different approach may be needed. For example, a masters student may respond better to being interviewed, because they are often asked to analyse their own feelings and experience, so they may have more cognitive capacity for it. A 16-year-old may need simpler direction. One thing not really mentioned in the research is the use of focus groups; often the sharing of experience helps students to be more open about their own. I have also experimented with this in an online context through conferencing. So researchers need to think about who they want to be involved in the research and what they want to get out of it, but also be clear about the limitations of the study, and not be afraid to talk about those limitations.
 Is there a need for new methodological approaches to try to elicit the real learner voice? Where the researcher is an insider-researcher, involved in the design/delivery of the module being studied, does this add an additional layer of bias? Can better ways be developed to analyse the digital traces students leave when traversing their modules?
I have lumped these questions together as they all lead to similar discussions for me. There is a new approach coming to the fore in some more developed online learning, where programmes allow students to do things like rate pages, ‘like’ them, or add notes. Behind the screen, the LMS can also help to track learners and their activities (as I am sure they are doing with us) to judge where they might get stuck or spend more time. We do this in face-to-face learning, where we watch the learners, check for understanding, change activities, hold regular feedback sessions with facilitators, and evaluate at the end. The same can be done in an online environment. It helps you to see what activities are being done, and the tone and emotional state of the group. Naturally, every mix of students will be different, but this analysis of the course itself can help with triangulation. I would like to see analysis of activities alongside students’ reflected experiences, and alongside tutor and trainer experiences. I know of many occasions when I have thought that someone hasn’t enjoyed a course, or they have been difficult throughout – yet when they get to the end they say it was amazing.
When we talk specifically about the role of technology in the learner’s life, Richardson’s (2009) study was great because he tried to control for some of the earlier anomalies – training tutors, setting student expectations and so on. For me, all research becomes more useful when it has depth of approach. Hence the use of audio or video logs adds a different, richer dimension – but they need to be used alongside other data.

Yorke, M. (2009) ‘“Student experience” surveys: some methodological considerations and an empirical investigation’, Assessment and Evaluation in Higher Education, vol. 34, no. 6, pp. 721–39.

Saturday 12 May 2012

Summary of research so far


 Story so far!
 
Oblinger and Oblinger (2005)

United States

‘Watershed’ in research – students interaction with technology may be changing the way they learn.
Net Generation:
·          Grown up with technology
·          Technology embedded in their world
·          Digitally literate
·          Virtually connected and socially orientated
·          Expect quick responses

Kennedy et al (2008)
Australia

Is the ‘net generation’ really tech savvy?
Embracing technology and tools is not the universal student experience and we cannot assume that the ‘net gen’ know how to employ technology or adopt a one size fits all approach.
No consistency in the technology used by students, but a breadth of experience
Must react to the diversity of our students
The study found that students were ‘overwhelmingly positive’ about the use of technologies, indicating that they used them for all aspects of their studies (finding information, communicating with teachers and peers, module administration and general study purposes).

The possibility of a digital divide
Sharpe et al
(2005)
United Kingdom
Reviews needed of current learner experience literature
Found that many of these studies focused primarily at the level of module evaluations, rather than on how learners actually use and experience technology. From this two projects commissioned (LEX and LXP)

LXP
Conole et al (2006)
UK
How do learners engage with and experience e-learning (perceptions, use and strategies) and how does e-learning relate to and contribute to the whole learning experience?
 Students are appropriating technologies to meet their own personal, individual needs – mixing the use of general ICT tools and resources with official module or institutional tools and resources.
·          the Web is unequivocally the first port of call for students
·          technologies are used extensively by students to communicate with peers and tutors
·          learners see technology as integral to all aspects of their lives.
·          increasing use of user-generated content in the form of sites such as Wikipedia is challenging the traditional norms of academic institutions as the key knowledge experts and providers.
·          learners do not necessarily use the same technologies for learning as for other aspects of their lives, although for some learners there is an overlap.

SPIRE
White 2007
The SPIRE report (White, 2007), from the JISC-funded ‘SPIRE’ project, focused on the use of Web 2.0 technologies.

The overall theme is that of sharing: materials, ideas, knowledge, friends and contacts. Another overarching principle that occurs across these services is the notion of a flow or flux of information and ideas.

Further research should be undertaken to gain a better insight into the motivations of those using Web 2.0 services.

The JISC Ipsos MORI polls 2008

school leavers’ views of technologies and their expectations of the kind of technological environment they expect at university
·          Overall, universities are perceived to be providing a basic level of ICT provision to a good standard. Expectations are met, and sometimes exceeded.
·          However universities are not currently perceived to be leading the way in developing new ways people can learn.
·          At the moment, technology training for students (and, one might suspect, for staff) tends to focus on how to use different systems.
·          There is little sense that for these students, the university has a remit to encourage them to think differently about information, research and presentation.
LEX
(2004 – 2009)

How learners interact with technology throughout their learning lives.

·          Nearly all students have access to a great deal of technology, not just that provided by their institution, but also their own laptops and mobile phones.
·          Learners are immersed in a technology rich environment and make use of the technology available to them in a wide range of ways.
·          Some learners feel disadvantaged by a lack of functional access to technology or the skills to use it properly
·          Others are making deliberate choices to adopt sophisticated technology-mediated learning strategies and finding and using a range of tools in personalized, creative ways to support their study.
·          Learners have high expectations of institutions to provide robust, reliable and accessible technology
·          Most learners do not have clear ideas of how courses could be using technology in educational and innovative ways.
·          Learners lead complex, time pressured lives and they need to develop organization skills to help them manage the multiple demands of study with home, family and employment
·          Learners are clear that most of their technology use for learning is defined by their courses and tutors. The powerful influence of context means that teachers and their institutions need to take the lead in supporting learners' developing digital literacies.

PB- LXP
(2007 – 2009)

the relationship between work/practice and student learning using ICT


·          Utility and ease of use are key factors in the appreciation of ICT tools provided by the module
·          The relevance of ICT tools to the work context can fuel study commitment
·          ICT elements in modules introduce a practical element into study, which is much valued by students
·          Online study methods are valued where they support students’ feelings of control and being able to make progress
·          ICT tool usage can help to connect study with application to practice in the workplace
·          Students’ different work contexts influence their attitudes towards the ICT in their module.
ECAR
Dahlstrom (2011)

undergraduate students and information technology and how information technology affects the college experience
  • Students are drawn to hot technologies, but they rely on more traditional devices
  • Students report technology delivers major academic benefits
  • Students report uneven perceptions of institutions' and instructors' use of technology
  • Facebook generation students juggle personal and academic interactions
  • Students prefer, and say they learn more in, classes with online components

PB- LXP

Project Aim
The relationship between work/practice and student learning using ICT
The PB-LXP project set out to address this issue of the relationship between work/practice and student learning using ICT, in the context of six modules where technology-enhanced learning plays a key role

Methodology
The students are older (typically between the ages of 30 and 50), they are studying while working (in most cases), and their modules require them to develop work-related practices.
Six courses from Business, Health and Social Care, Technology and Computing agreed to participate and volunteer students from each course were recruited and interviewed at the beginning, middle and end of their course. Thirty students participated in data collection although some did not complete a final interview.


·         Utility and ease of use are key factors in the appreciation of ICT tools provided by the module
·         The relevance of ICT tools to the work context can fuel study commitment
·         ICT elements in modules introduce a practical element into study, which is much valued by students
·         Online study methods are valued where they support students’ feelings of control and being able to make progress
·         ICT tool usage can help to connect study with application to practice in the workplace
·         Students’ different work contexts influence their attitudes towards the ICT in their module.
·         Convenience, flexibility and accessibility were mentioned by most students as a valued positive contribution of having courses online and therefore study-able at a variety of locations and times.
·         Further benefits directly related to more effective study were also identified by students – uploading notes to a wiki made students process what they were learning more actively and made end of course revision much easier; ICT brings activities into study and many students commented that reading alone was less effective than reading combined with activities using the tools introduced in their courses; students felt more able to judge their own progress where they could do activities enabled by particular tools, such as online quizzes, feedback on activities, reading other students’ work.
·          The relationship between work/practice and study was a rich area of influence in both directions. Where students could see direct benefits to their work context, they reported willingness to study intensively and to persist with difficult material. Experience with particular tools at work could feed into using similar tools on their course, though this was not uniformly positive. Students could be critical of the way some tools were used on their course. Some tools and processes learned on courses were taken into the workplace.
·         Most students reported increased confidence in using ICT as a result of having to use it extensively on their course – particularly (but not only) those who started off from a low skill level. Many reported increased usage in their work and social lives as a result.
·          Web tools enabled collaboration to play a positive role in study online, and in one case, a whole course was based on team working to construct a project for assessment. The course would not have been possible without the use of a range of online tools on a weekly basis, particularly FlashMeeting – audio-visual conferencing.
·          Virtually all students reported using search engines to supplement study and several downloaded course material and assignments on to a memory stick to enable working across several locations. The more experienced students used Skype and specialist software not specifically included within the course. Personalisation of environments did not play a strong role in any student’s experience for those included in our study.
·         Survey findings validated the Technology Acceptance Model of ICT usage derived from the literature (Davis, 1989). Students’ responses about their perceptions of technology used in their work, study and social contexts reproduced the same two factors: usefulness and ease of use. Students were also asked to list the technologies they used; this was used to construct a measure of actual technology use, and it was found that perceived usefulness of technology at work was the most valid predictor of ICT use.
When asked about critical moments, many said that it was doing assignments and getting the marks and feedback, so we did not find critical moments or influential factors that led to sudden major shifts in how they used technology.

The issue of highly skilled e-communicators also did not really manifest itself. Some of our students were highly skilled computer users, but the idea that they had special strategies that we could find out about and document did not materialise.

These highlight the interpenetration of work and study and the specific role that technological tools can play in bridging study and work to beneficial effect for the student. The survey findings replicate a model of technology perception from the literature, and underscore the important role played by the work context in orienting students towards technology. Universities and course designers can do more to highlight this two-way relationship, with ICT bridging both the work and study contexts.

LEX (2004 – 2009)

Project Aim
How learners interact with technology throughout their learning lives.

Methodology
The programme spanned two phases over four years from 2005 to 2009. It comprised nine research projects in total (two in phase one and seven in phase two), employed mixed-method approaches, and had the sustained involvement of over 200 learners and more than 3,000 survey respondents. Five national workshops were run disseminating the methods and findings.

We looked at learners from HE, FE and work based learning. In total, these projects involved 186 learners in some form of sustained engagement over an extended period, such as interviews, audio or video diaries, or production of case studies. In addition, 2921 learners participated in the surveys conducted by the LEAD, Thema and PBLXP projects.

To achieve these aims each project adopted a mixed-method approach, employing a variety of data collection techniques. The primary data collection method was some form of interviewing or diary keeping; some also made use of surveys.

Main Findings
·         Nearly all students have access to a great deal of technology, not just that provided by their institution, but also their own laptops and mobile phones.
·         Learners are immersed in a technology rich environment and make use of the technology available to them in a wide range of ways.
·         Some learners feel disadvantaged by a lack of functional access to technology or the skills to use it properly
·         Others are making deliberate choices to adopt sophisticated technology-mediated learning strategies and finding and using a range of tools in personalized, creative ways to support their study.
·         Learners have high expectations of institutions to provide robust, reliable and accessible technology
·         Most learners do not have clear ideas of how courses could be using technology in educational and innovative ways.
·         Learners lead complex, time pressured lives and they need to develop organization skills to help them manage the multiple demands of study with home, family and employment
·         Learners are clear that most of their technology use for learning is defined by their courses and tutors. The powerful influence of context means that teachers and their institutions need to take the lead in supporting learners' developing digital literacies.


Responding to Learners Pack

Creanor, L., Trinder, K., Gowan, D. and Howells, C. (2006) ‘LEX: The learner experience of e-learning’, Final Project Report; also available online at http://routes.open.ac.uk/ixbin/hixclient.exe?_IXDB_=routes&_IXSPFX_=g&submit-button=summary&%24+with+res_id+is+res19282
Mayes, T. (2006) ‘LEX Methodology Report’, JISC-funded LEX Project, Glasgow, University of Strathclyde; available online at http://www.jisc.ac.uk/media/documents/programmes/elearningpedagogy/lex_method_final.pdf

ECAR visual


http://www.educause.edu/studentsAndTechnologyInfographic

ECAR

Project Aim
ECAR (EDUCAUSE Center for Applied Research)

A study of undergraduate students and information technology that sheds light on how information technology affects the college experience.

Methodology
Online survey, run over 10 days in June 2011
(using external company)

The 2011 study differs from past studies in that the questionnaire was re-engineered and responses were gathered from a nationally representative sample of 3,000 students at 1,179 colleges and universities. (ECAR has been running the undergraduate study since 2004.)

Main Findings
  • Students are drawn to hot technologies, but they rely on more traditional devices
  • Students report technology delivers major academic benefits
  • Students report uneven perceptions of institutions' and instructors' use of technology
  • Facebook generation students juggle personal and academic interactions
  • Students prefer, and say they learn more in, classes with online components
Recommendations:
·         Investigate your students' technology needs and preferences and create an action plan to better integrate technology into courses and information systems.
·         Provide professional development opportunities and incentives so instructors can better use the technology they have.
  • Expand or enhance students' involvement in technology planning and decision making.
  • Meet students' expectations for anytime, everywhere, Wi-Fi access on the devices they prefer to use.
  • Nail the basics. Help faculty and administrators support students' use of core productivity software for academic work

Commentary
The most recent report we have. Initially it looks expansive, but the sample of 3,000 students from 1,179 institutions does make it seem less credible in some ways – that’s roughly two or three students per institution. It is more far-reaching though, and there is a body of data that they have built up over the years.