10

Understanding the information literacy competencies of UK Higher Education students

Jillian R. Griffiths and Bob Glass

Abstract:

There has been some interesting debate regarding the assessment of students’ information literacy skills. Key questions have arisen, such as: what standards and criteria should we use to assess students, what are we actually trying to measure, what type of test is most appropriate, what do the results mean, how do we measure improvements and what are the effects of intervention? Our research project, funded by the LearnHigher Centre for Excellence in Teaching & Learning, addressed these questions through the use of psychometric tests. An online information literacy audit, the ILT (Wise et al., 2005), was used to assess the information literacy skills of a cohort of undergraduate students in the Department of Information & Communications at Manchester Metropolitan University from 2008 to 2010. Based on the ACRL standards, the ILT measures four of the five information literacy competencies. Our specific research aims were to ascertain whether the testing methods were appropriate to UK students, identify areas for information literacy improvement raised in the test scores, identify practitioner intervention strategies and find out whether they could make a difference to students’ information literacy levels. Test results (presented and discussed in detail in the chapter) indicated that students struggled with (predictable) areas of information literacy, and that (a) identification of and intervention in these areas is useful to students and information literacy helpers (whoever they are); (b) intervention needs to be ongoing – not only in year 1; (c) even up to their final year students continue to struggle with the same particular activities; and (d) a variety of approaches to support may be needed to help students develop their skills. Following completion of the testing, discussions have taken place with the CILIP CSG for information literacy regarding the creation of a UK information literacy question bank.

Key words

information literacy

online

audits

tests

undergraduate

higher education

UK

Introduction

There has been some interesting debate recently regarding the assessment of students’ information literacy skills. A number of key questions have arisen such as: what standards and criteria should we use to assess students, what are we actually trying to measure, what type of test is the most appropriate, what do the results mean, how do we measure improvements and what are the effects of intervention? Is a ‘one test fits all’ solution practical? While a number of information literacy audits or tests exist, it was decided that this research would use one of the most widely trialled information literacy tests, the online ‘Information Literacy Test’ (ILT) from Steven Wise, Lynne Cameron and their team at the Institute for Computer Based Assessment, James Madison University, Virginia, USA (Wise et al., 2005).

This chapter details a longitudinal assessment of an undergraduate cohort at the Department of Information & Communications, Manchester Metropolitan University, undertaken in the context of the LearnHigher CETL (Information Literacy Learning Area) research activities.

The research aims were to:

(a) ascertain if the testing method was appropriate to UK undergraduate students;

(b) identify areas for information literacy improvement;

(c) identify practitioner intervention strategies and whether they could make a difference to students’ information literacy.

The work undertaken was longitudinal in nature and was repeated with the same group of students as they progressed from year 1 to year 2 to year 3 of their undergraduate programmes. The initial testing took place during January and February 2007 and was repeated in 2008 and 2009. The test was taken by students from the Common Undergraduate Programme of the Department of Information & Communications. The test measures performance in four of the five information literacy competencies identified in the US-formulated ACRL standards (ACRL, 2000). Analysis of the test results was undertaken using SPSS, and this chapter presents the findings and recommendations for future work.

The results indicated that there were indeed significant areas of weakness in students’ ability measured against four of the five ACRL competency standards and, while intervention by practitioners improved student performance, it was found that ongoing support was required to ensure continued progress. The following sections will discuss this work in greater detail.

Background and context

In recent years there has been a growing recognition of the importance of information literacy. Two high profile examples of such recognition are the Executive Order establishing a California ICT Digital Literacy Leadership Council and an ICT Digital Advisory Committee, issued by California Governor Arnold Schwarzenegger (May 2009), and the Presidential Proclamation on the National Information Literacy Awareness Month, issued in October 2009 (http://www.whitehouse.gov/the_press_office/presidential-proclamation-national-information-literacy-awareness-month/).

The concept of ‘information literacy’ was first introduced in 1974 by Paul Zurkowski, president of the US Information Industry Association, in a proposal submitted to the National Commission on Libraries and Information Science (NCLIS). He recommended that a national programme be established to achieve universal information literacy within the next decade. According to Zurkowski: ‘People trained in the application of information resources to their work can be called information literates. They have learned techniques and skills for utilizing the wide range of information tools as well as primary sources in moulding information solutions to their problems’ (Behrens, 1994; Bruce, 1997). In this definition Zurkowski suggested that information resources are applied in a work situation; techniques and skills are needed for using information tools and primary sources; and information is used in problem solving (Behrens, 1994: 310).

During recent years discussions about the terms information literacy and information skills, and the nature of the concepts, have intensified in the UK. There are different approaches, demonstrated by the use of differing terms such as ‘information literacy’ and ‘information skills’, and many definitions have been suggested by several organisations, institutions and authors (Virkus, 2003). Researchers on ‘The Big Blue’ project, funded by the UK’s Joint Information Systems Committee (JISC) and led by Manchester Metropolitan University and the University of Leeds, found that in many instances both terms are used to describe what is essentially the same concept: ‘information literacy’ and ‘information skills’ can be described as synonyms (The Big Blue, 2002). Stubbings and Brine (2003) also note that at Loughborough University the phrases ‘information literacy’ and ‘information skills’ are both used to convey the same meaning. The Glossary of Information Terms at the British Open University (OU) Library site seems to support the same approach, giving the following definition of information literacy: ‘a skill that involves being able to use information successfully, including finding information, searching using various tools (e.g., Internet, databases) and being able to critically evaluate the results’ (OU, 2003; Virkus, 2003).

Hepworth (2000a, 2000b) highlights two main approaches to information literacy: (1) attempts to identify discrete skills and attitudes that can be learnt and measured, for example Doyle (1992), the Information Literacy Competency Standards for Higher Education (ACRL, 2000) and the SCONUL approach (SCONUL, 1999); (2) an emphasis on the information literate mindset associated with how an individual experiences and makes sense of his/her world, illustrated by the work of Bruce (1997) and described in terms of the behavioural, constructivist and relational approaches to information literacy (Virkus, 2003).

In the UK two critical definitions have been presented, by SCONUL and by CILIP. The broadly-based definition of information skills in higher education of the Society of College, National and University Libraries (SCONUL) Information Skills Task Force (now the SCONUL Advisory Committee on information literacy (Alvestrand, 2003)) reflects the twin dimensions of the ‘competent information user’ at the base level and the ‘information-literate person’. For the latter level of information skills, the term ‘information literacy’ is used. Therefore, both information skills and information technology (IT) skills are seen as essential parts of the wider concept of information literacy. For the development of the information-literate person SCONUL proposes seven sets of skills. The outline model of information skills generated in the briefing paper has become known as the Seven Pillars Model. The pillars show an iterative process whereby information users progress through competency to expertise by practising the skills (SCONUL, 1999; Bainton, 2001).

In 2003 the Information Literacy Executive met to agree a definition of information literacy for use by CILIP (the Chartered Institute of Library and Information Professionals) members. The definition was approved by the CILIP Council in December 2004 as CILIP’s definition of information literacy:

Information literacy is knowing when and why you need information, where to find it, and how to evaluate, use and communicate it in an ethical manner.

Further, this definition details a set of skills (or competencies) which are required for an individual to be information literate. These require an understanding of:

- a need for information

- the resources available

- how to find information

- the need to evaluate results

- how to work with or exploit results

- ethics and responsibility of use

- how to communicate or share your findings

- how to manage your findings.

The CILIP Community Services Group (CSG) sub-group for information literacy acts as an advocate and facilitator for the development of information literacy awareness and education within the UK and beyond through committee work, the LILAC conference (http://lilacconference.com) and its website (http://www.informationliteracy.org.uk/).

While there have been numerous national initiatives to address information literacy, a key challenge remains: how do we measure or assess an individual’s level of information literacy, and any progress made towards improving an individual’s information literacy skills? The research presented here used the US ILT and found that specific areas of concern could be highlighted, and thus targeted for intervention, and that levels of information literacy varied across the three-year longitudinal study, while recognising that the American bias in the style and focus of the questions might influence the understanding and subsequent performance of UK students taking the test.

The ILT is an online psychometric test comprising sixty-five multiple-choice questions, of which sixty are scored, assessing a range of information literacy competencies. The test was designed and validated by psychologist Steven Wise, with question content created by Lynne Cameron, each heading a team of associated contributors (Wise and Yang, 2003; Wise et al., 2005).

Information literacy testing at MMU – context

A successful consortium bid to create a ‘One Stop Shop’ for resources for learner development in HE resulted in the creation of the LearnHigher Centre for Excellence in Teaching and Learning (CETL) in January 2005 (www.learnhigher.ac.uk). LearnHigher received funding as one of 74 Centres for Excellence in Teaching and Learning created by HEFCE as part of their learning and teaching enhancement strategy (http://www.hefce.ac.uk/learning/tinits/cetl/).

Led by Liverpool Hope University, LearnHigher was the largest collaborative CETL with partners from 16 institutions. Each partner committed to improving student learning by providing resources to support learning development and, through practice-led research, to evaluate the effective use of those resources. LearnHigher aimed to create a network of expertise seeking to enhance professional practice and student learning, and to build capacity both within the network and across the wider sector.

LearnHigher members sought to identify, map and label the key issues and topics of learner development. Agreement on what the ‘learning areas’ should be, how they should be supported and what outcomes were expected was considered critical from the outset of the project. The learning areas below were chosen in the light of their relevance to the project aims, the expertise of the individuals and institutions taking part, and consortium discussion. Most of the HEIs involved in the LearnHigher CETL were allocated one learning area; a small number had two. Manchester Metropolitan University worked in the learning area for information literacy (Glass, 2006, 2007a, b). There were initially 19 learning areas in all:

- Academic Writing

- Assessment

- Critical Thinking and Reflection

- Doing Research

- Group Work

- Independent Learning

- Information Literacy

- Listening and Interpersonal Skills

- Learning for All

- Mobile Learning

- Note Making

- Numeracy Maths and Statistics

- Oral Communication

- Personal Development Planning

- Reading

- Referencing

- Report Writing

- Time Management

- Business and Commercial Awareness.

The information literacy learning area at MMU, led by Bob Glass, was administered via a team made up of academics, library practitioners, learning support advisers and research associates. Through this team a large number of activities and resources were created and uploaded to the Information Literacy Learning Area of the LearnHigher Website (http://learnhigher.ac.uk/Staff/Information-literacy.html).

A requirement of the project brief was that each learning area should contribute evidence of its research activities and disseminate the outcomes. As part of the research contribution to the Information Literacy Learning Area it was decided to run an information literacy audit for all year 1 students in the Department of Information & Communications, MMU, in late 2006. This project was also supported by a small grant from the Learning & Teaching Group in the Humanities, Law and Social Sciences faculty and by the Department of Information & Communications. Additionally it was decided to include around 20 students from another department in the faculty in order to generate comparative data.

The Information Literacy Test that we used was the ACRL-based (ACRL, 2000) James Madison University (JMU) ‘Online Information Literacy Test’ (ILT). There were a number of reasons for this, including the nature of the test, the similarities in the standards and the availability of a ‘product’ that was ready to use. Most of the testing took place between December 2006 and February 2007. Seventy-five students on the common undergraduate programme of the Department of Information & Communications and twenty from the Economics department in the same faculty were tested. The psychometric test (which is charged for on a per-student basis) is based on 65 multiple-choice questions: sixty of the questions are static, and five are used as ‘practice’ questions for development purposes and are varied as required by the test developers. Students take between 60 and 75 minutes to complete the test, depending on their speed. The test measures performance in four of the five information literacy competencies identified in the US-formulated ACRL standards (ACRL, 2000); written abilities cannot really be addressed by this kind of test. Students receive their score immediately at the end of the test, and tutors are provided with an extensive range of statistics relating to student performance, question scores and overall test results. The data file is provided in Access, Excel, SPSS or other formats. We undertook our analysis using SPSS as this was the most convenient format to use at MMU. Previous results have been presented by Glass and Griffiths (2008, 2009, 2010).
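The analysis itself was conducted in SPSS, as noted above. Purely by way of illustration, the minimal Python/pandas sketch below shows the kind of summary involved; it assumes a hypothetical CSV export (‘ilt_results.csv’) with one row per student and columns for year of study and raw score, none of which are part of the ILT product itself.

```python
# Illustrative sketch only: the study's analysis was carried out in SPSS.
# Assumes a hypothetical CSV export with columns 'year_of_study' (1-3)
# and 'raw_score' (correct answers out of the 60 scored items).
import pandas as pd

results = pd.read_csv("ilt_results.csv")  # hypothetical file name
results["percent_score"] = results["raw_score"] / 60 * 100

# Mean final score (%) for each year of study, as plotted in Figure 10.1.
mean_by_year = results.groupby("year_of_study")["percent_score"].mean().round(2)
print(mean_by_year)
```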

Research methods

Longitudinal testing was undertaken with a cohort of undergraduate students over the three-year period of their academic life at MMU. The numbers of students participating were:

- Year 1 = 62

- Year 2 = 28

- Year 3 = 45

In year 1, students undertook the test in computer teaching labs within the Department of Information & Communications during one-hour ‘seminar’ sessions (allowing for ‘over-run’ time if necessary). The sessions were formal, supervised and run as part of a year 1 compulsory unit (‘Information Literacies for the Digital Age’), facilitated by Bob Glass and Chris Dawson, both departmental tutors.

In year 2, participation was semi-voluntary and unsupervised. Fewer students took part than in year 1, and those who did were predominantly on library and information-related degree routes.

In year 3 the tests were administered, once again, in one-hour seminars as part of a compulsory unit and supervised by a tutor. Participation increased, although an element of test fatigue was noted.

The ACRL standards assess information literacy competencies using five measures. However, one of the standards (Standard 4: the student is able to use information effectively to accomplish a specific purpose, e.g. writing an essay) cannot be measured using a multiple-choice item format and was excluded from these assessments. The four standards assessed via the ILT are therefore:

- Standard 1: defines and articulates the nature and extent of the information needed.

- Standard 2: accesses needed information effectively and efficiently.

- Standard 3: evaluates information and its sources critically and incorporates selected information into his or her knowledge base and value system.

- Standard 5: understands many of the ethical, legal and socio-economic issues surrounding information and information technology.

Results

Two performance level standards have been defined by the creators of the ACRL-based ILT: Proficient and Advanced. A score of 65–89 per cent (39–53 out of 60) is required for a student to be assessed as Proficient, and a score of 90 per cent or more (54 or more out of 60) for a student to be assessed as Advanced; a minimal worked illustration of these score bands is sketched after the two criteria lists below. A Proficient student will be able to:

- describe how libraries are organised;

- define major library services;

- choose appropriate types of reference source for a particular information need/identify common types of citations;

- employ basic database search strategies;

- locate a variety of sources in a library or online;

- discriminate between scholarly and popular publications;

- legally and ethically use information.

An Advanced student meets the criteria for Proficient and, in addition, will be able to:

- modify and improve database search strategies;

- employ sophisticated database search strategies;

- interpret information in a variety of sources;

- evaluate information in terms of purpose, authority and reliability;

- understand ethical, legal and socioeconomic issues relating to information access and use.
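As a minimal worked illustration of the score bands described above, a raw ILT score can be mapped to a performance level as in the sketch below; the function and label names are ours and not part of the ILT itself.

```python
# Maps a raw ILT score (out of the 60 scored items) to the performance
# bands described in the text: 90%+ = Advanced, 65-89% = Proficient.
def ilt_level(raw_score: int, scored_items: int = 60) -> str:
    percent = raw_score / scored_items * 100
    if percent >= 90:
        return "Advanced"
    if percent >= 65:
        return "Proficient"
    return "Below Proficient"

print(ilt_level(38))  # Below Proficient (63.3%)
print(ilt_level(39))  # Proficient (65.0%)
print(ilt_level(54))  # Advanced (90.0%)
```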

The mean scores of students across the three years of study are shown below, demonstrating some improvement in the information literacy of the students over the period of the research (see Fig. 10.1).


Figure 10.1 Mean final scores for all students (%)

The mean score of students in each year of study fell just short of the score required to be Proficient, with a peak score of 64.29 per cent noted in year 2. From field observations it was noted that the year 2 sample comprised the most focused and committed of the students, so the overall score was biased towards the more capable students. By year 3 a broader spread of participants was again recruited. However, overall improvement was observed from year 1 to year 3.

The following sections will provide detail on student performance across each objective.

Overall results across the ACRL objectives

Results across these four standards show that the majority of students (see Fig. 10.2):


Figure 10.2 Results for each standard by year of study

- were able to answer correctly questions concerning defining and articulating the nature and extent of the information needed: ACRL1 (69–74 per cent of students scoring correctly);

- were less able to answer correctly questions concerning accessing needed information effectively and efficiently: ACRL2 (49–53 per cent of students scoring correctly);

- were able to answer correctly questions concerning evaluating information and its sources critically and incorporating selected information into their knowledge base and value system: ACRL3 (61–65 per cent of students scoring correctly);

- were able to answer correctly questions concerning understanding many of the ethical, legal and socio-economic issues surrounding information and information technology: ACRL5 (65–73 per cent of students scoring correctly).

From these results it would seem that the majority of students have little or no difficulty identifying the information they need, are able to evaluate information and its sources and incorporate that information into their knowledge, and understand some of the legal and ethical issues surrounding the use of information. However, some areas are causing difficulties, and the results of these tests show that students in year 3 are scoring slightly lower across each standard; these areas are presented in detail below.
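The percentages of students answering correctly within each standard come from item-level results. As an illustration only (the study’s analysis was performed in SPSS, and the file and column names here are hypothetical), a per-standard breakdown of the kind shown in Figure 10.2 could be produced as follows:

```python
# Illustrative sketch: assumes a hypothetical item-level export with one
# row per student per question and columns 'year_of_study',
# 'acrl_standard' (1, 2, 3 or 5) and 'correct' (1 or 0).
import pandas as pd

items = pd.read_csv("ilt_item_responses.csv")  # hypothetical file name

# Percentage of correct answers per ACRL standard, by year of study
# (years as rows, standards as columns).
percent_correct = (
    items.groupby(["year_of_study", "acrl_standard"])["correct"]
         .mean()
         .mul(100)
         .round(1)
         .unstack("acrl_standard")
)
print(percent_correct)
```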

Results for ACRL standard 1 – define and articulate the nature and extent of the information needed

Two thirds of students encountered little difficulty in defining and articulating the nature and extent of the information they needed, and an improvement was apparent from year 1 to year 3, although there was a slight drop between years 2 and 3 (see Fig. 10.3). Detailed analysis identified that year 3 students struggled to identify correctly what primary sources are and where to find them (44 per cent incorrect).


Figure 10.3 Results for ACRL1 by year of study

Results for ACRL standard 2 – accesses needed information effectively and efficiently

Some students also appear to struggle with how to go about accessing that information (see Fig. 10.4). These skills are core to our profession, critical to information literacy and an area where the library can, and does, provide excellent training; this training can be further directed and targeted by understanding the results of this test. In year 3 the following areas caused particular difficulties:


Figure 10.4 Results for ACRL2 by year of study

- Boolean strategy choice (93 per cent incorrect);

- understanding types of reference, 5 (91 per cent incorrect);

- locating journal articles (89 per cent incorrect);

- locating a chapter in an edited book (76 per cent incorrect);

- phrase searching (73 per cent incorrect);

- understanding types of references, 3 (71 per cent incorrect);

- getting pdf/full-text of an article (71 per cent incorrect);

- narrowing a search (68 per cent incorrect);

- expanding a strategy to get other similar items (68 per cent incorrect).


Figure 10.5 Results for ACRL3 by year of study

Results for ACRL standard 3 – evaluates information and its sources critically and incorporates selected information into his or her knowledge base and value system

In year 3 students struggled with:

- understanding census tables (63 per cent incorrect);

- understanding the conclusions of a website (69 per cent incorrect);

- understanding journal references (63 per cent incorrect);

- understanding government websites (53 per cent incorrect).

Results for ACRL standard 5 – understands many of the ethical, legal and socioeconomic issues surrounding information and information technology

Students in year 3 particularly struggled with:

- understanding citation guidelines (71 per cent incorrect);

- understanding copyright law (59 per cent incorrect);

- understanding asking permission to use images (56 per cent incorrect).


Figure 10.6 Results for ACRL5 by year of study

Conclusions and recommendations

This longitudinal study of undergraduate students across the three years of their study has identified that students struggle with (predictable) areas of information literacy, and that:

- Identification of and intervention in these areas is useful to students and information literacy helpers (whoever they are).

- Intervention needs to be ongoing – not only in year 1.

- Even up to their final year students continue to struggle with the same particular activities.

- A variety of approaches to support may be needed to help students develop their skills.

The following illustrate the questions with which students had difficulty, and they form the basis for recommendations on areas to target for further student support and training.

Constructing and implementing effectively designed search strategies

- Students did not know how to expand searches or apply them to different databases, or how to identify additional relevant items from a citation that had already been retrieved.

Refining the search strategy if necessary

- Understanding how phrase searching works and what titles would be retrieved when presented with a specific example and possible results.

- Known item search (journal article): identifying which strategy would retrieve a specific journal article when presented with a number of different possibilities.

- Known item search (book chapter): identifying which strategy would retrieve a specific book chapter when presented with a number of different possibilities.

- Understanding how truncation works and what would be retrieved when presented with a specific example and possible results.

- Understanding how to narrow a search when presented with a number of different possibilities.

- Understanding how to refine a search to suppress irrelevant items when presented with a number of different possibilities.

- Understanding how Boolean operators work and what would be retrieved when presented with a specific example and a range of different possible strategies.

Extracting records and managing the information and its sources

Students could not recognise different item formats from the bibliographic search results presented to them from databases:

- Understanding the source of an item from its citation description (journal).

- Understanding the source of an item from its citation description (book).

- Understanding the source of an item from its citation description (chapter in a book).

- Understanding the source of an item from its citation description (newspaper).

- Locating the full text of an article from its citation which has a link to ‘Full-Text/PDF’.

Certainly, these results provide detailed identification of areas for practitioners to target in order to improve information literacy – results which we hope the community will find useful. However, in conducting this research other findings have emerged, and it is posited that:

- there is value in the activity of assessing the information literacy of students;

- it can provide a useful quantitative measure;

- student and tutor feedback provided can be helpful;

- it can provide baselines and progress indicators;

- it can be a cohesive/inclusive activity but could also be negative and we need to be aware of this;

- it can trigger test fatigue and there are task attention issues;

- there are presentation format and question set issues;

- there can be cost and time factors.

The test used is US-based and designed for a US context, and it is felt that this may cause bias and/or confusion among UK students. If a UK test were developed then it might prove to be an effective diagnostic tool for students, tutors and library practitioners. Discussions with the LearnHigher Information Literacy Learning Area and the CILIP CSG for Information Literacy have led to initial work to create a UK Information Literacy Question Bank. It is envisaged that this Question Bank may be used by different people in different ways, for example in online psychometric tests, printed quizzes or web-based interactive tutorials. This work is now in progress and it is hoped that it will provide the UK with a method to ensure appropriate assessment of information literacy and to enable meaningful intervention for students.

References

Alvestrand, V. Support for info literacy certificate grows. Information World. 2003; 191:1.

Association of College and Research Libraries. Information Literacy Competency Standards for Higher Education. ACRL; 2000.

Bainton, T., Information literacy and academic libraries: the SCONUL approach. Proceedings of the 67th IFLA Council and General Conference, 2001. [August 16–25.].

Behrens, S.J. A conceptual analysis and historical overview of information literacy. College and Research Libraries. 1994; 55(4):309–322.

The Big Blue: Final Report. 2002. http://www.library.mmu.ac.uk/bigblue/pdf/finalreportful.pdf

Bruce, C.S. The relational approach: a new model for information literacy. The New Review of Information and Library Research. 1997; 3:1–22.

Doyle, C.S. Outcome measures for information literacy. In: Final report to the National Forum on Information Literacy. Syracuse, NY: ERIC Clearinghouse; 1992:P148. [ED 351033.].

Glass, B., Griffiths, J.R. Online Information Literacy Audits: A Longitudinal Study (Year 2). Cardiff, UK: Cardiff University, 2009. [LILAC 2009, 30 March–1 April].

Glass, B., Griffiths, J.R. Understanding the Information Literacy Levels of Students: The Results of a Three Year Online Information Literacy Audit at Manchester Metropolitan University. Limerick, Republic of Ireland: Limerick Strand Hotel, 2010. [LILAC 2010, 29–31 March].

Glass, B., LearnHigher CETL: Information Literacy. The LearnHigher Suite at MMU. 2006. LearnHigher: www.learnhigher.mmu.ac.uk/learnhigher-suite/

Glass, B., LearnHigher CETL: Information Literacy. The Project. 2007a. LearnHigher: www.learnhigher.mmu.ac.uk/project/

Glass, B., LearnHigher CETL: Information Literacy. Resources. 2007b. LearnHigher: www.learnhigher.mmu.ac.uk/resources/

Hepworth, M. Developing information literacy programs in Singapore. In: Bruce, C.S., Candy, P.C., eds. Information Literacy around the World: Advances in Programs and Research. Wagga Wagga, NSW: Charles Sturt University; 2000:51–65.

Hepworth, M. Approaches to information literacy training in higher education: challenges for librarians. New Review of Academic Librarianship. 2000; 6:21–34.

Open University Library, Glossary of information terms. Open University, Milton Keynes, 2003. http://library.open.ac.uk/help/helpsheets/intglossary.html

The SCONUL Advisory Committee on Information Literacy. Information skills in higher education: a SCONUL position paper. The Society of College, National and University Libraries; 1999.

Stubbings, R., Brine, A., Reviewing electronic information literacy training packages. Innovations in Teaching and Learning in Information and Computer Sciences (ITALICS). 2003;2(1). http://www.ics.ltsn.ac.uk/pub/italics/issuel/stubbings/010.html

Virkus, S., Information literacy in Europe: a literature review. Information Research. 2003;8(4). Paper No. 159. http://informationr.net/ir/8-4/paper159.html [accessed 2 October 2005].

Wise, S.L., Yang, S., The Adaptex Assessment System (Version 1.5). US Department of Education Fund for the Improvement of Postsecondary Education, 2003.

Wise, S.L., Cameron, L., Yang, S., Davis, S. Information Literacy Test: Test Development and Administration Manual. Harrisonburg: Institute for Computer-Based Assessment; Center for Assessment & Research Studies, James Madison University; 2005.
