Information Literacy Game Design and Assessment Literature Review

 

Abstract

This literature review focuses on digital information literacy games played by students at the academic undergraduate level.  The application of digital games to college information literacy courses is a recent development.  Phetteplace and Felker (2014) state that "to date, efforts to create library games have been a ground-up endeavor, with librarians struggling to understand principles of game design as they produce their first game" (p. 21).  Information literacy instructors (almost all of whom are librarians) are new to designing and creating information literacy games that will effectively supplement an information literacy module or course.  Non-librarian educators have been using games far longer than librarians have (Tewell & Angell, 2015).  Markey, Leeder, and St. Jean (2011) note that while the literature on games for learning is extensive, the literature "specifically addressing games for IL [information literacy] education is significantly limited" (p. 47).  Only a few information literacy games have been evaluated through a research study (Guo & Goh, 2016a).  There is, therefore, considerable opportunity to develop an information literacy game for college students, because relatively few such games have been created so far.

 

Introduction

This literature review will examine why digital information literacy games should be included in a college-level information literacy course, given that instructors have difficulty getting students interested in the various subject matter of such a course (Young, 2016).  For example, students have been shown to disengage from information literacy classroom instruction in various ways, such as by texting, checking email, and even sleeping (Leach & Sugarman, 2006).  This review will also discuss two major issues underlying information literacy game design: meeting learning goals and objectives while providing enough fun that students want to play this type of game (Phetteplace & Felker, 2014).  Finally, this review will present research studies that assess student learning outcomes after students have played a digital information literacy game and will discuss what students say they actually want from such a game.

Literature Review

Reasons to Use Games in Information Literacy Courses

Squire and Jenkins (2003) point out that games offer an alternative learning environment to the traditional classroom instructional model, because they "encourage collaboration among players and thus provide a context for peer-to-peer teaching for the emergence of learning communities" (p. 29, as cited in Markey, Leeder, & St. Jean, 2011).  This social aspect of games aligns with the socio-constructivist view that learning occurs in a personal context and that students learn from each other and from their teachers (Nelson & Erlandson, 2012).

Information literacy games can help to "motivate and engage students throughout the information-gathering process" (Martin & Ewing, 2008; Kim, 2012).  Such games also tend to help students problem-solve (Kim, 2012; Young, 2016), which may prevent students from mentally and emotionally disconnecting from the information literacy course itself.

One reason students may disengage is the wide variety and amount of information that instructors cover in an information literacy session (Leach & Sugarman, 2006).  Smith (2007) states that "student boredom does not reflect negatively on the original information literacy instructor. Research shows this is a prevalent pedagogical obstacle" (para. 5).

Leeder, Markey, and Rieh (2010, October 8) state that a significant number of students have difficulties evaluating and using information and think that introducing a game into an information literacy course "can address many of these challenges, offering fundamental training in IL skills and critical evaluation which is integrated into coursework at the students point of need" (p. 5).  Information literacy games can also acquaint students with library sources they are not aware of and encourage the use of more scholarly sources (Leeder, Markey, & Rieh, 2010, October 8).

Games may offset the negative effects of an information literacy classroom environment by adding a dimension of fun (Pho & Dinscore, 2015).  Fun can be defined in terms of two aspects: immersion and emotions (Vieira & da Silva, 2017), with immersion described as "the increasing level of involvement with the task, that subsume all the sensory, cognitive and emotional systems" and emotions characterized as "experienced by performing the task, which can be considered pleasurable despite their general likeness in other contexts" (p. 133).  One dimension of assessing how much fun a game is involves measuring how pleasurable it is ("the hedonic aspect of fun") (Vieira & da Silva, 2017, p. 133).  The concept of fun will be discussed later in this literature review.

Selecting the most suitable subject matter content for an information literacy game is critical (Guo & Goh, 2016).  There is and will always be a tradeoff between designing the game to be fun and designing the game to meet specific curriculum learning goals.  The game designer must be very aware of the fun dimension in an information literacy game and of which learning-goal elements may diminish the fun to the point where students no longer wish to play.

Games may also provide for a more impartial evaluation than a traditional classroom assignment does.  Kim (2012) states that “unlike the real world, a game is transparent about what information and skills are needed for progress and how to obtain them, rewards efforts and achievement fairly, and provides immediate feedback on performance” (p. 466).

Information literacy games are increasingly used and increasingly popular (Tewell & Angell, 2015).

Information Literacy Digital Game Design

Gee (2005) states that “good video games incorporate good learning principles, principles supported by current research in cognitive science” (as cited in Smale, 2015, para. 5).  As will be seen in the studies that follow, information literacy game designers need to consider not only course learning goals and objectives, but also fun elements to motivate the learners while playing a digital game (Phetteplace & Felker, 2014).

An information literacy game should be designed to offset the boredom factor while also preserving the learning goals of an information literacy course.  This is a difficult design task facing any information literacy game developer, particularly because college game designers should pay attention to the Millennial Generation’s learning needs and styles when developing such college-level educational games.  Young (2016) notes that Millennials get bored easily, want "instant feedback and gratification" (p. 2), like working with computers, and desire classroom activities that provide interaction and socializing.  Kubiatko (2013) describes Millennials similarly:

They have grown up with computers; they are digitally or technologically literate. They tend to be inventive and are self-sufficient problem-solvers. They often desire support and feedback, but detest authoritative control, expect immediate answers and feedback….They learn differently as compared to past generations of students. They are considered to be active experimental learners, proficient in multitasking, and dependent on communication technologies for accessing information and for interacting with others (p. 1264).

It should be noted that as Millennials appear to learn differently, a different instructional strategy may be more effective.  Also, since Millennials tend to bore easily and like to interact and socialize (Young, 2016), information literacy game designers face the challenge of getting Millennials to be interested in a traditional information literacy course, which, as aforementioned, has tended to disengage students (Leach & Sugarman, 2006).

The Millennial Generation was born from 1981 through 1997 and is now the largest living American generation (even larger than the Baby Boomer Generation) (Pew Research Center, 2016, April 25).  The Millennial Generation can also be characterized as socially and demographically diverse (Broido, 2004):

One highly visible way in which Millennial students differ from earlier students is their racial and ethnic diversity. According to the 2000 U.S. census, 39.1 percent of people under eighteen are people of color (Asian; black; Hispanic, who may be of any race; or Native American), as compared to 28.02 percent of people eighteen and over (p. 73).

Since the Millennial Generation is relatively diverse, it would be more engaging to ask students information literacy questions, and present answer choices, that reflect this diversity of student interests.

Considine, Horton, and Moorman (2009) state that “the defining factor that leads to the Millennials distinctive character is that they are the first generation to be immersed in ICT [Information Communication Technology] for their entire lives” (p. 473).  Online digital information literacy games match Millennial students’ interest in and use of computer technologies, and so could be an appropriate instructional tool for motivating and engaging students, if designed effectively and used appropriately.

Designing a game that incorporates and integrates specific learning objectives is a challenge, even though playing games and conducting research have similar characteristics: both require skill, practice, and the completion of a series of steps (Smith, 2007).  Pho and Dinscore (2015) note that game-based learning involves the design of learning activities that “incrementally introduce concepts, and guide users towards an end goal,” and that information literacy game design can include competition, points, incentives, and feedback (p. 1).  A game is also a good way to determine whether students have achieved the learning outcomes for a particular information literacy class session (Leach & Sugarman, 2006).  In terms of learning design, information literacy questions and categories should be created based on lesson-outcome goals (Leach & Sugarman, 2006).

Young (2016) agrees that learning objectives are the most important element of a library game: “a successful library instructional game meets learning objectives, engages and motivates students, has a high level of player participation, and gives players a low level of frustration” (p. 4).  Young also notes that “simpler games are more likely to succeed in an educational setting,” adds the caveat that “just because a game is technically and graphically impressive does not mean that students will enjoy it and get the information they need from it” (p. 5), and observes that successful library games are easy to understand no matter how difficult the subject content is.  Young further warns that “if the elements of an effective library game cannot be achieved from a particular assignment, or if the assignment is not conducive to game play, then the game will ultimately fail to meet objectives” (p. 5).

Iten and Petko (2016) found from their study of the “relationship between anticipated enjoyment and willingness to play” a game that the most important factor was “the students’ expectation that the learning game would be easy and instructive” (p. 151).  Although Iten and Petko (2016) studied children, their findings are consistent with Young’s (2016) conclusion above that “simpler games are more likely to succeed in an educational setting” (p. 5) and Markey, Leeder, and St. Jean’s (2011) conclusion that students want to learn from the game how to conduct actual library research to find relevant sources and how to properly evaluate these sources for use in their research papers.  The Markey et al. (2011) study will be discussed in depth later in this literature review.

Iten and Petko (2016) recommended that “to achieve greater learning gains from playing serious games, the teacher should activate children’s prior knowledge and ensure that the software includes good scaffolding functions” (p. 152).  I agree with Iten and Petko about using scaffolding, and I plan to design scaffolding functions into my information literacy game so students will better understand the information literacy concepts presented to them and how to go about answering the questions asked of them — in order to increase the potential for student engagement with information literacy course content.

In terms of what I consider fun, I found an information literacy game somewhat similar to what I would like to create, called iCivics (Carnesi, 2014).  The iCivics website (at http://www.icivics.org) was founded by retired Supreme Court Justice Sandra Day O’Connor so students with “declining knowledge of civics” (p. 72) could learn more about the United States Government.  Each game is 15-60 minutes long, which is the length I would want to make my information literacy game.  There are seven games and 12 web quests.  In terms of my game design, I would ideally want to have about seven games as well.  My games would be designed as “mini-format” games that focus on specific information literacy tasks that a student should complete in order to become information literate in that task area (Markey, Swanson, Jenkins, Jennings, & St. Jean, 2009).  What makes this study so interesting to me is that Carnesi (2014) did exactly what I plan to do with my information literacy game, in terms of having it played in a library computer lab.  Carnesi (a junior high school librarian) had the students play the iCivics game in the library computer lab as follows:

Using the computer lab, the class was directed to the library’s wiki to access online documents. To better engage the students in this review, I updated all printable worksheets by transferring the info from the iCivics forms to online Google document forms, using the same fill-in-the-blank and multiple choice formats from the iCivics downloadable form. Completion of the Google forms allows teachers to see immediately who needs additional remediation on a specific concept. After students completed submission of the review form, we closed out the lesson, either at the computer stations or in the instructional space, depending on the time available (p. 74).

The iCivics game was a big success for Carnesi, who stated that “by the third session, students were hooked….They were enthralled with the opportunities the site provided for role-playing lawyers, judges, and political figures” (p. 74).  The important concept here is that an information literacy game can be successful if designed and administered effectively.  In terms of learning outcomes, Carnesi found that students met learning assessment goals:

According to many students, the chance to practice the concept is what helped them to clarify detailed issues and retain information. In my school 96 percent of the eighth-graders passed their state-mandated civics test that spring! This was an improvement of 5 percent over the previous year (p. 74).

Overall, I think that what made iCivics a successful game is that the game is a multimedia experience in terms of the instructions provided; the audio that students can listen to in order to learn the relevant subject-matter content before answering questions (about, for example, Supreme Court decisions); the relevant contextual popups with explanatory text to clarify various game elements and choices; video to make the game experience engaging; background music; warning messages if a problem arises; interactivity by allowing students to press buttons to make decisions; and a Help button to explain what is going on in the game.  I think that this type of design concept is better suited to today’s educational environment because it offers more than just reading.  The iCivics game software also provides a lot of scaffolding to guide students throughout the game, as Iten and Petko (2016) recommend for game software.

In addition, for me, fun could be defined as being “enthralled” by playing an online digital game or wanting to keep playing the game (as Carnesi described the students who played the iCivics game).

The reason that I want to use a library computer lab (or any computer lab) as the learning environment for an information literacy game is that the social context in which the game takes place can increase social engagement and interaction among students.  As Kirk and Harris (2011) state:

the real value of games are the episodes of authentic play that unite groups and build communities—and school librarians can easily encourage that play. And if, while engaging in authentic play, students also happen to be using 21st-century learning skills like inquiry, evaluation, and synthesis, that isn’t a bad thing.

While Kirk and Harris focused on library games for school libraries, their reasons for using games in a library environment can also be applied to college libraries.  College students can play games and work with other students to answer information literacy questions by solving research and evaluation problems and so learn 21st century information literacy skills they need for their other courses.

Effective learning games “should be designed around custom learning outcomes” and should include “unambiguous questions” (Margino, 2013, p. 339).  “Narrowly focused mini games” may be a more effective solution for an information literacy game than “one fully encompassing game” (Markey et al., 2009; Margino, 2013).  This is a recommendation that I believe a first-time game designer should follow to keep the game “targeted” and “more closely related to course content” (Young, 2016, p. 5).

Information literacy games that are designed to provide quick feedback and social interaction help to satisfy the Millennial students’ needs for fast feedback and socializing with other students (Margino, 2013).

Martin and Ewing (2008) point out that traditional formal instruction does not entertain or engage students.  Students who like the gaming experience tend to be drawn to it by a different set of learning qualities, such as learning by trial and error, being able to fail at the game and start over after a mistake, and being able to learn from others during the course of a game (Martin & Ewing, 2008).  This is a crucial concept for a new game designer to be aware of: game design calls on a different set of learning qualities than traditional classroom learning does.

Phetteplace and Felker (2014) describe how an information literacy game could incorporate more complicated concepts such as open access to information and library sources.  Phetteplace and Felker envision students playing active roles, such as acting as researchers or publishers, and following game-based rules to find and retrieve open access materials, in contrast to closed-access materials that could cost a significant amount of money to access.  Phetteplace and Felker discuss the relative importance of the educational goals underlying an information literacy game as compared to how much fun a student should have.  They focus on a central issue underlying educational games by stating that:

Most educational games that fail do so for one basic reason: they aren’t fun.  [Game designer Gabe Zichermann] argues that the fundamental problem with most educational gaming is that the educational goal takes precedence at the expense of the fun of the gaming experience: in other words, educational games are so preoccupied with trying to get the game to teach, they fail to devote enough time and attention to perfecting the experience of playing the game (p. 21).

Iten and Petko (2016) point out that “serious games are generally considered to induce positive effects in the areas of learning motivation and learning gains….Yet few studies have examined how these factors are related” (p. 151).  Iten and Petko (2016) also note that “learning games must be regarded as goal-directed activities rather than being just for fun and enjoyment” (p. 151).  This is consistent with Margino (2013) and Young (2016) who focus on learning goals and outcomes over fun and enjoyment.  Iten and Petko (2016) assert that:

Enjoying the learning game does not automatically mean learning success. Indeed, learning games can encourage children’s motivation to learn about a subject, but engagement with content is essential for achieving cognitive learning gains (p. 152).

Iten and Petko (2016) also found that children’s general intention to use games for learning was influenced primarily by the anticipated usefulness of the game and by the ease of understanding the game and how to play it.

The lack of fun-quality elements in some information literacy games was one reason why the University of Michigan library’s first online game (Defense of Hidgeon: The Plague Years) was unsuccessful (Young, 2016).  Kim (2012) similarly notes that a lack of fun is problematic for a game environment: “when a goal other than fun is imposed, the game begins to lose its magical effect on our motivation and productivity” (p. 468).  The goal of information literacy games, in terms of the fun concept, is to make the learning task “less painful and even enjoyable” (Kim, 2012, p. 468).

Information literacy games also appear to correspond to the first four Association of College & Research Libraries (ACRL) standards (Margino, 2013; Young, 2016), in terms of being able to 1) determine one’s information needs; 2) find and retrieve information; 3) evaluate information; 4) organize and communicate information (ACRL, 2000).

Problems with Information Literacy Game Research and Evaluation

A librarian initially undertaking information literacy game design and development should be aware of the various issues and problems underlying past developments of information literacy games – in order to avoid making the same mistakes previous information literacy game developers did.

Leach and Sugarman (2006) have noted the research problems associated with studies on games in education and classroom learning, and that a researcher cannot readily apply the research results from one game type to a different game type:

Because there are many different types of games, for example, video, simulation, and online, it is difficult to generalize research findings from a study of one particular kind of game to all games in general.  In addition, many articles written about the use of games in classroom instruction, including some of these cited above, are descriptive or anecdotal in nature and lack an evaluation component such as a retention test to address the impact of the game on student learning (p. 192).

Therefore, an information literacy game designer should, for example, be careful about applying the results of a non-digital board game study to a digital game-based learning environment.

Another important issue is Leach and Sugarman’s (2006) finding that most game studies are descriptive and anecdotal rather than evaluative.  This can be observed in Broussard’s (2011) excellent descriptive review of 17 online library games: Broussard did not mention or discuss actual research conducted on these library games to determine how effective they are in improving student learning.  Guo and Goh (2016a) likewise state that “the evaluation of IL games relied mostly on qualitative anecdotal quotations, and lacked rigorous experimental comparisons and concrete measures on students’ learning performance” (p. 61).  Guo and Goh (2016a) point out that Utah Valley University’s Get a Clue game (to familiarize students with the physical library) and LibraryCraft game (to familiarize students with the library’s digital resources) did not receive a rigorous evaluation: the library administered only a short survey to students, and so did not collect sufficient data or use a research approach that validly and reliably measured what students actually learned from playing either game.  The important idea here is that a digital information literacy game designer should be careful about interpreting the results of the LibraryCraft surveys, because those surveys collected only a limited amount of information.

Tewell and Angell (2015) state that “little research has been done on whether playing games in academic library settings may in fact translate into learning” (p. 21).

One information literacy game mentioned by Guo and Goh (2016a) that did include a research evaluation component was BiblioBouts (an information literacy game designed at the University of Michigan).  BiblioBouts is a tournament-type game in which students compete against each other by completing various information literacy tasks in order to earn points (Markey, Leeder, & Hoffer, 2011).  BiblioBouts will be discussed later in this literature review.

Research Studies that Assess Information Literacy Digital Game-Based Learning

Markey, Swanson, Jenkins, Jennings, and St. Jean (2009) studied whether students will play games to learn how to do library research for their course needs.  Markey et al. (2009) introduced the game in a University of Michigan undergraduate class called Introduction to Information Studies (which was cross-listed as Sociology S1 110).  This Information Studies course required students to do library research.  The class had 75 undergraduate students who were majoring in a variety of academic subjects.  The games were played during November 2007.

The game designers decided to adopt a board game for the students to play.  The game, called Defense of Hidgeon, was designed to evoke a 14th-century European setting.  The goal of each student team was to conduct its library research faster and more accurately than the other teams.  The game-playing mechanics included rolling an e-die, moving around a board, and landing on libraries that then asked the student teams various questions.  During the four-week game play, students in the class were writing a 5-7 page paper on how information systems affect the information found by students.  The students’ play data were collected by a Perl script and exported to MS Excel.  The game asked 18 questions, and students who answered 40% or more of the questions correctly earned a half-letter grade reward.
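The turn mechanics described above can be sketched as follows.  This is a hypothetical simplification: the board size, die sides, and function names are illustrative assumptions rather than details from the study, but the sketch captures the roll-move-answer loop and the reported 40%-correct reward rule.

```python
import random

BOARD_SIZE = 24          # illustrative; the actual board layout is not specified here
NUM_QUESTIONS = 18       # the game asked 18 questions
REWARD_THRESHOLD = 0.40  # answering >= 40% correctly earned a half-letter grade

def roll_and_move(position, die_sides=6):
    """Roll the e-die and advance the team's marker around the board."""
    roll = random.randint(1, die_sides)
    return (position + roll) % BOARD_SIZE

def reward(correct, total=NUM_QUESTIONS):
    """Apply the reported reward rule: half-letter grade at 40% correct."""
    return "half-letter grade" if correct / total >= REWARD_THRESHOLD else "no reward"
```

Under this rule, a team answering 8 of the 18 questions correctly (about 44%) would earn the reward, while 7 of 18 (about 39%) would not.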

Markey et al. (2009) found from game logs that students, overall, answered web, encyclopedia, and database questions with the highest accuracy rates (67%, 62%, and 62%, respectively) and answered questions on online citation indexes, books, and edited works with the lowest accuracy rates (42%, 43%, and 39%, respectively).  Markey et al. concluded that students probably had low accuracy rates in these areas because answering these types of questions required stopping game play and going to the library.  A number of students admitted in interviews that they guessed at the answers because they did not want to exit the game and go to the library.  This finding is also supported by the fact that the 51 books placed on reserve were checked out only five times by students seeking answers to the game questions.

The importance of the Markey et al. (2009) study for my information literacy game project is that Markey conducted in-depth interviews with students playing the Defense of Hidgeon game and found that “most players were focused on the ‘how-to’ connected with library research such as learning names of databases, becoming familiar with a particular database’s content and interface, choosing databases, using Search Tools, and retrieving useful results” (p. 311).  These findings will serve as a guide for the learning objectives that I establish in my information literacy game because this is precisely what students should learn in an information literacy course in order to successfully complete their course research papers and other assignments.

Markey et al. (2009) also found that students “did not explicitly say that the game taught them how to think about what they were doing or gave them opportunities to do so…” (p. 311).  As a result, Markey et al. conclude that information literacy games “cannot stand on their own” and that instructors must integrate the game into the information literacy course by having students think about how the game relates to and supports other aspects of the course (Markey et al., 2009, p. 311).

Markey et al. (2009) also planned to develop a digital game that could be played on the computer as students looked for information in library databases, catalogs, and digital collections in the same way that they do to complete actual research assignments.  This is an excellent idea and one that I will be utilizing for my information literacy game.

Markey et al. (2009) recommended that libraries adopt a “…mini-game format that would enable students to discriminate between the many tasks that make up library research, and, possibly, by becoming specialists at certain research tasks, make it easy for them to apply their task expertise to related class assignments” (p. 312).  I also think that this is a good idea and will design it into my game.  The reason is that it keeps the student focused on the specific research activities that need to be completed instead of diverting the student’s attention to a variety of other information literacy concepts and tasks.  It also helps to keep the game sufficiently short so students do not complain that the game is too long, which students thought was the case with Utah Valley University’s LibraryCraft game (Guo & Goh, 2016a).

Markey, Leeder, and St. Jean (2011) studied the BiblioBouts information literacy game that was designed and implemented in the 2009-2010 academic year by the University of Michigan’s School of Information.  As aforementioned, BiblioBouts is a tournament-style game composed of a variety of mini-games, or bouts, in which students engage in a variety of hands-on information literacy tasks, such as searching the Web and library databases for appropriate sources; using Zotero to save their sources; evaluating their opponents’ sources; categorizing and sub-categorizing sources and sub-sources by subject; and formulating a research question (Markey, Leeder, & St. Jean, 2011).

Markey et al. (2011) found the following results from focus group discussions with students — in terms of what students think are the benefits of an information literacy game:

  1. Realizing that library databases yield sources that are qualitatively better than Google, Wikipedia, and the web
  2. Getting hands-on practice using a step-by-step approach for conducting library research and evaluating sources
  3. Finding relevant sources for their writing assignments that other students submitted to the game
  4. Reducing procrastination
  5. Giving access to many times more sources than they would have found on their own (p. 63).

 

Markey et al. (2011) also found that students want to have fun, but when game play is made a graded assignment, the fun element becomes problematic.  This finding is very relevant to my information literacy game because many information literacy games have been unsuccessful because they lacked a sufficient fun element (Phetteplace & Felker, 2014).  An educational game designer who is designing games for courses will continually face these apparently diametrically opposed values: meeting educational goals and objectives versus having fun.  This is clearly where the design challenge lies in creating an information literacy digital game.

Tewell and Angell (2015) studied whether playing an online information literacy game would increase understanding of keyword use (choosing an appropriate keyword to represent a topic) and of identifying citation types (such as APA and MLA citation formats).  The study was conducted at Long Island University with a non-probability sample of 86 students in seven freshman English composition classes that include an information literacy component.  Forty-three students were assigned to a control group of classes that played no games; 43 students were assigned to an experimental group that played a keyword-use game and a citing game in the first information literacy session (a 75-minute class).  The researchers administered a six-question pre- and post-test questionnaire and used a t-test to determine whether the difference between pre-test and post-test scores was statistically significant for each group.  The difference was statistically significant for the experimental group that played the games but not for the control group that did not.  While both groups increased their scores between pre- and post-tests, the experimental group improved by approximately 10 percentage points, whereas the control group improved by only about 2 percentage points (p. 25).  Tewell and Angell concluded that using games can increase student skills in the two areas studied (keyword use and identifying citation types).
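The kind of pre/post comparison Tewell and Angell describe can be illustrated with a paired t-test, in which each student’s post-test score is compared with that same student’s pre-test score.  The sketch below uses only Python’s standard library; the scores are hypothetical values invented for illustration, not data from the study, whose raw responses are not published.

```python
import math
from statistics import mean, stdev

# Hypothetical six-question quiz scores (0-6) for ten students in an
# experimental group, before and after playing the games.  These are
# invented values, not data from Tewell and Angell (2015).
pre = [3, 4, 2, 5, 3, 4, 2, 3, 4, 3]
post = [4, 5, 3, 6, 4, 6, 3, 4, 5, 4]

# Paired t-test: work with each student's individual gain, since the
# pre- and post-test samples are dependent (same students).
diffs = [b - a for a, b in zip(pre, post)]
n = len(diffs)
t_stat = mean(diffs) / (stdev(diffs) / math.sqrt(n))

# Critical value for a two-tailed test at alpha = 0.05 with
# n - 1 = 9 degrees of freedom is approximately 2.262.
print(f"t = {t_stat:.2f}")
if t_stat > 2.262:
    print("Gain is statistically significant at the 0.05 level")
```

With these made-up scores every student improves, so the t statistic is large and the gain registers as significant; with real data the test guards against mistaking ordinary score fluctuation for a learning effect.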

Guo and Goh (2016a) developed a user-centered approach to evaluating information literacy games and noted that such an approach is rare in this area.  Guo and Goh point out that information literacy games differ from other digital game-based learning (DGBL) approaches because “IL [information literacy] instruction involves higher-order thinking skills such as how to apply, analyze, or evaluate knowledge,” as opposed to lower-order thinking skills such as recall (p. 199).  From semi-structured research interviews with 10 students knowledgeable about gaming, Guo and Goh (2016) found that students by and large hold misconceptions about what constitutes good information literacy skills (for example, believing that being able to use Google is sufficient for academic research).


Conclusion

This literature review examined previous relevant studies on information literacy digital game design and assessment (or evaluation).  The research shows that information literacy games are a relatively new development and that many academic librarians are new to developing them.  It also shows that academic librarians who want to develop an information literacy game are continually challenged by two competing demands: creating a game that meets information literacy learning goals and objectives, and creating a game that is fun enough that students want to play it.  Doing both well in a single game is a genuine design challenge.

The research shows that many previous studies of information literacy games have been descriptive rather than research-oriented, lacking acceptable statistical analysis and design.  As a result, many studies have not properly assessed whether actual learning results from playing these games.  The few studies that were research-oriented and applied acceptable statistical methods have shown that information literacy games can increase both student engagement and learning results.

The research also shows that information literacy games have the potential to increase student learning outcomes.  Much more research needs to be done on how to more precisely and effectively use information literacy games in a college-level information literacy course to increase the desired student learning outcomes, while, at the same time, introducing a fun element into the game.

My goal, based on the literature reviewed, is to develop an information literacy game that requires the student to do actual library research and evaluate the sources found, because this is a student preference in an information literacy course (Markey et al., 2009).  To further motivate students to participate in such a game, a significant part of the student’s final grade in an information literacy course would be based on these information literacy activities.  My design goal is a focused mini-module that has the student doing hands-on library research and evaluation of sources (Markey et al., 2009) and that is simple and easy to follow (Young, 2016).


References

Association of College and Research Libraries (ACRL). (2000). Information literacy competency standards for higher education. Retrieved from http://www.ala.org/acrl/standards/informationliteracycompetency

Broido, E. M. (2004). Understanding diversity in Millennial students. New Directions for Student Services, 106, 73-85. Retrieved from http://eds.a.ebscohost.com.library.esc.edu/ehost/detail/detail?sid=19a10f06-b0e1-4914-a642-5e21bf5df18e%40sessionmgr4010&vid=8&hid=4205&bdata=JnNpdGU9ZWhvc3QtbGl2ZQ%3d%3d#AN=507920599&db=eue

Broussard, M. S. (2012). Digital games in academic libraries: a review of games and suggested best practices. Reference Services Review, 40(1), 75-89. Retrieved from http://eds.a.ebscohost.com.library.esc.edu/ehost/detail/detail?vid=3&sid=0050855a-e07c-4c36-a59a-a69ceac1f857%40sessionmgr101&hid=113&bdata=JnNpdGU9ZWhvc3QtbGl2ZQ%3d%3d#AN=72096493&db=eue

Carnesi, S. (2014). iCivics–A fun way to enhance learning. Knowledge Quest, 42(5), 72-74. Retrieved from http://eds.a.ebscohost.com.library.esc.edu/ehost/detail/detail?sid=30207401-10d6-447b-bde8-13cd69130e72%40sessionmgr4008&vid=3&hid=4105&bdata=JnNpdGU9ZWhvc3QtbGl2ZQ%3d%3d#AN=95795526&db=a9h

Considine, D., Horton, J., & Moorman, G. (2009). Teaching and reading the Millennial Generation through media literacy. Journal of Adolescent & Adult Literacy, 52(6), 471-481. Retrieved from http://eds.b.ebscohost.com.library.esc.edu/ehost/detail/detail?sid=1bde7413-b24b-4df5-b2f6-53bcbfe18ce0%40sessionmgr101&vid=13&hid=117&bdata=JnNpdGU9ZWhvc3QtbGl2ZQ%3d%3d#AN=37012193&db=a9h

Guo, Y. W., & Goh, D. (2016a). Evaluation of affective embodied agents in an information literacy game. Computers & Education, 103, 59-75. Retrieved from http://www.sciencedirect.com.library.esc.edu/science/article/pii/S0360131516301762

Guo, Y. W., & Goh, D. (2016). From storyboard to software: user evaluation of an information literacy game. The 31st Annual ACM Symposium, Pisa, Italy, April 4-8, 2016. New York: ACM. Retrieved from http://delivery.acm.org/10.1145/2860000/2851909/p199-guo.pdf?ip=167.206.128.10&id=2851909&acc=ACTIVE%20SERVICE&key=7777116298C9657D%2E43DB0337F9DDC627%2E4D4702B0C3E38B35%2E4D4702B0C3E38B35&CFID=899873103&CFTOKEN=91143574&__acm__=1486933507_2bace7e914f5f2d815c24b2de9c80d93

Iten, N., & Petko, D. (2016). Learning with serious games: Is fun playing the game a predictor of learning success? British Journal of Educational Technology, 47(1), 151-163. Retrieved from http://eds.b.ebscohost.com.library.esc.edu/ehost/detail/detail?sid=1810fbfe-0179-46e5-9ab3-544a11648509%40sessionmgr103&vid=3&hid=127&bdata=JnNpdGU9ZWhvc3QtbGl2ZQ%3d%3d#AN=112192250&db=a9h

Kim, B. (2012). Harnessing the power of game dynamics. College & Research Libraries News, 73(8), 465-469. Retrieved from http://eds.b.ebscohost.com.library.esc.edu/ehost/detail/detail?vid=4&sid=379763fc-cbfe-46b7-84b0-a1b568638454%40sessionmgr120&hid=117&bdata=JnNpdGU9ZWhvc3QtbGl2ZQ%3d%3d#AN=79730290&db=eue

Kirk, T., & Harris, C. (2011). It’s all fun and games in the library. Knowledge Quest, 40(1), 8-9. Retrieved from http://eds.b.ebscohost.com.library.esc.edu/ehost/detail/detail?sid=4b421c50-1379-4042-a061-bb1b2a221b5f%40sessionmgr104&vid=3&hid=127&bdata=JnNpdGU9ZWhvc3QtbGl2ZQ%3d%3d#AN=66835813&db=a9h

Kubiatko, M. (2013). The comparison of different age groups on the attitudes toward and the use of ICT. Educational Sciences: Theory & Practice, 13(2), 1263-1272. Retrieved from http://eds.a.ebscohost.com.library.esc.edu/ehost/detail/detail?vid=3&sid=19a10f06-b0e1-4914-a642-5e21bf5df18e%40sessionmgr4010&hid=4205&bdata=JnNpdGU9ZWhvc3QtbGl2ZQ%3d%3d#db=eue&AN=87343974

Leach, G. J., & Sugarman, T. S. (2005). Play to win! Using games in library instruction to enhance student learning. Research Strategies, 20(3), 191-203. doi:10.1016/j.resstr.2006.05.002

Leeder, C., Markey, K., & Rieh, S.Y. (2010, October 8). College student perceptions of learning academic research skills through an online game. Paper presented at the Library Research Seminar-V, Hyattsville, MD. Retrieved from http://lrsv.umd.edu/abstracts/Leeder_et_al.pdf

Margino, M. (2013). Revitalizing traditional information literacy instruction: Exploring games in academic libraries. Public Services Quarterly, 9(4), 333-341. Retrieved from http://eds.b.ebscohost.com.library.esc.edu/ehost/detail/detail?vid=10&sid=732f27f7-22cd-4c5b-b2c5-349aff5c798a%40sessionmgr102&hid=113&bdata=JnNpdGU9ZWhvc3QtbGl2ZQ%3d%3d#db=eue&AN=92006161

Markey, K., Leeder, C., & Hofer, A. (2011). BiblioBouts. College & Research Libraries News, 72(11), 632-645. Retrieved from http://eds.b.ebscohost.com.library.esc.edu/ehost/detail/detail?vid=17&sid=732f27f7-22cd-4c5b-b2c5-349aff5c798a%40sessionmgr102&hid=113&bdata=JnNpdGU9ZWhvc3QtbGl2ZQ%3d%3d#db=eue&AN=69590204

Markey, K., Leeder, C., & St. Jean, B. (2011). Students’ behavior playing an online information literacy game. Journal of Information Literacy, 5(2), 46-65. Retrieved from http://ojs.lboro.ac.uk/ojs/index.php/JIL/article/view/PRA-V5-I2-2011-3/1819

Markey, K., Swanson, F., Jenkins, A., Jennings, B., St. Jean, B., Rosenberg, V., … Frost, R. (2009). Will undergraduate students play games to learn how to conduct library research? Journal of Academic Librarianship, 35(4), 303-313. Retrieved from http://www.sciencedirect.com.library.esc.edu/science/article/pii/S0099133309000652

Martin, J., & Ewing, R. (2008). Power up! Using digital gaming techniques to enhance library instruction. Internet Reference Services Quarterly, 13(2/3), 209-225. Retrieved from http://eds.b.ebscohost.com.library.esc.edu/ehost/detail/detail?vid=13&sid=732f27f7-22cd-4c5b-b2c5-349aff5c798a%40sessionmgr102&hid=113&bdata=JnNpdGU9ZWhvc3QtbGl2ZQ%3d%3d#AN=502936240&db=eue

Nelson, B.C., & Erlandson, B. E. (2012). Design for learning in virtual worlds: Interdisciplinary approaches to educational technology. New York: Routledge.

Pew Research Center. (2016, April 25). Millennials overtake Baby Boomers as America’s largest generation. Retrieved from http://www.pewresearch.org/fact-tank/2016/04/25/millennials-overtake-baby-boomers/

Phetteplace, E., & Felker, K. (2014). Gamification in libraries. Reference & User Services Quarterly, 54(2), 19-23. Retrieved from http://eds.b.ebscohost.com.library.esc.edu/ehost/detail/detail?vid=7&sid=732f27f7-22cd-4c5b-b2c5-349aff5c798a%40sessionmgr102&hid=113&bdata=JnNpdGU9ZWhvc3QtbGl2ZQ%3d%3d#db=eue&AN=100188427

Pho, A., & Dinscore, A. (Spring 2015). Game-based learning. Association of College & Research Libraries Instruction Section. Retrieved from connect.ala.org/node/241217

Smale, M.A. (2015). Play a game, make a game: Getting creative with professional development for library instruction. The Journal of Creative Library Practice. Retrieved from http://creativelibrarypractice.org/2015/05/18/play-a-game-make-a-game/

Smith, F. A. (2007). Games for teaching information literacy skills. Library Philosophy & Practice. Retrieved from http://www.webpages.uidaho.edu/~mbolin/f-smith.htm

Tewell, E., & Angell, K. (2015). Far from a trivial pursuit: Assessing the effectiveness of games in information literacy instruction. Evidence Based Library and Information Practice. 10(1), 1-8. Retrieved from https://ejournals.library.ualberta.ca/index.php/EBLIP/article/view/22887/17963

Vieira, L. C., & da Silva, F. C. (2017). Assessment of fun in interactive systems: A survey. Cognitive Systems Research, 41, 130-143. Retrieved from http://ac.els-cdn.com.library.esc.edu/S1389041716300717/1-s2.0-S1389041716300717-main.pdf?_tid=236cd036-f243-11e6-a781-00000aacb361&acdnat=1487028265_fe3bf22fa7cac8a5ae69fd7f8f9e4a8d

Young, J. (2016). Can library research be fun: Using games for information literacy instruction in higher education. Georgia Library Quarterly, 53(3), 1-7. Retrieved from http://digitalcommons.kennesaw.edu/cgi/viewcontent.cgi?article=1973&context=glq
